Why do we call it a “computer bug”?
A “software bug” or a “computer bug” is a widespread term among IT experts and the general public. Read on to discover what it means and where it comes from.
Alongside the specialized technical terms that make up the ICT vocabulary, we often encounter "funny" expressions describing technology-related processes. The phrase "to find a computer bug" is a perfect example.
What does it mean, and where does it come from?
According to a Techopedia definition: "a bug refers to an error, fault or flaw in any computer program or a hardware system. A bug produces unexpected results or causes a system to behave unexpectedly. In short, it is any behavior or results that a program or system gets, but it was not designed to do." Simply put, a software bug is a mistake or problem in a computer program.
“Debugging”, the origin of the computer bug
Most computer bugs are due to human error. They arise for various reasons: mistakes in a program's design, flaws in its source code, or developers writing incorrect code. Experts say that even today, despite all technological progress, there is no such thing as a bug-free program.
But why do we call it a computer bug?
The term "computer bug" became popular among IT professionals with the rise of computers. However, the word "bug" was used much earlier to describe a problem occurring in a machine. As early as 1843, Ada Lovelace raised the idea of technical malfunction when she discussed the program cards used in Charles Babbage's Analytical Engine: her concern was that, given insufficient operative information, the cards might give wrong orders. She never called it a bug, but Thomas Edison did. In an 1878 letter to a partner, he described an error in his quadruplex telegraph system, calling the malfunction a bug. He went on to use the same word whenever he found a flaw in the design or operation of a technical system, and other members of the electrical community soon adopted it in the same context. It became popular enough among technical inventors that in 1892 it was included in the Standard Electrical Dictionary, which defined a bug as "[a]ny fault or trouble in the connections or working of electric apparatus".
What is the most famous bug?
The Y2K bug is probably one of the most notorious bugs. Many software programs of the time stored years using only two digits and so could not handle dates after 1999, which threatened to make systems display the wrong date at the turn of the millennium. So the term bug was already well established by the mid-20th century; technicians used it before computers even existed. But still, why a computer bug?
Probably because in 1947, computer programmer Grace Hopper and her team found a literal bug: a moth trapped in a relay of Harvard University's Mark II electromechanical computer, where it interfered with the machine's operation. The team removed the moth and taped it into the machine's logbook. Hopper and her colleagues continued to use the term bug to refer to the problems that complicated their work on the Mark I and Mark II computers. And the rest is history: the term "computer bug" took off in the following years with the emergence and growing accessibility of PCs. Nowadays, we call any error or glitch in a program a bug.