The Computer Bug

DEFINITION:
In computer science, an error in software or hardware. In software, a bug is an error in coding or logic that causes a program to malfunction or to produce incorrect results. Minor bugs (for example, a cursor that does not behave as expected) can be inconvenient or frustrating, but are not damaging to information. More severe bugs can cause a program to "hang" (stop responding to commands) and may leave the user with no alternative but to restart the program, losing whatever work had not been saved. In either case, the programmer must find and correct the error through the process known as debugging.

Because of the potential risk to important data, commercial application programs are tested and debugged as thoroughly as possible before release. Minor bugs found after a program becomes available are corrected in the next update; more severe bugs can sometimes be fixed with special software, called a patch, that circumvents the problem or otherwise alleviates its effects.

In hardware, a bug is a recurring physical problem that prevents a system or set of components from working together properly. The term reputedly originated in the early days of computing, when a hardware problem in an electromechanical computer at Harvard University was traced to a moth caught between the contacts of a relay in the machine.

(Entomologists will undoubtedly be quick to note that a moth is not really a bug.)
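To make the definition concrete, here is a minimal, hypothetical example (not from the original entry) of a classic logic bug in Python: an off-by-one error that silently produces an incorrect result, together with its debugged version.

```python
def sum_first_n(values, n):
    """Intended to sum the first n elements of values.

    BUG: range(1, n) skips index 0 and stops at n - 1, so the
    first element is dropped -- a classic off-by-one logic error.
    The program still runs, but its result is wrong.
    """
    total = 0
    for i in range(1, n):  # bug: should be range(n)
        total += values[i]
    return total


def sum_first_n_debugged(values, n):
    """Corrected version: range(n) covers indices 0 through n - 1."""
    total = 0
    for i in range(n):
        total += values[i]
    return total


data = [10, 20, 30, 40]
print(sum_first_n(data, 3))           # incorrect result: 50
print(sum_first_n_debugged(data, 3))  # correct result: 60
```

Bugs like this one neither crash nor hang the program, which is precisely why testing before release matters: the error only shows up when the output is checked against a known-correct answer.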
