In computing, a bit is a single binary digit, either 0 or 1. A bit is the smallest unit of data stored in a computer; all other data must be encoded as patterns of individual bits. A byte is the amount of computer memory needed to store a single character of data, and usually contains eight bits.
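As a small illustration of the idea that a character occupies one eight-bit byte, the sketch below (in Python, chosen here only for brevity) shows the bit pattern behind a single character:

```python
# A byte is 8 bits; a character such as 'A' is stored as one such pattern.
ch = "A"
code = ord(ch)               # numeric code of 'A' (65 in ASCII)
bits = format(code, "08b")   # the same value written as an 8-bit pattern
print(ch, code, bits)        # prints: A 65 01000001
```

The eight-digit string `01000001` is exactly the pattern of 0s and 1s a computer stores for that character.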
The maximum number of bits that a computer can normally process at once is called a word. Microcomputers are often described according to how many bits of information they can handle at once. For instance, the first microprocessor, the Intel 4004 (launched in 1971), was a 4-bit device. In the 1970s several different 8-bit computers, many based on the Zilog Z80 or Rockwell 6502 processors, came into common use. In 1981, the IBM Personal Computer (PC) was introduced, using the Intel 8088, which combined a 16-bit processor with an 8-bit data bus. Business microcomputers of the late 1980s began to use 32-bit processors such as the Intel 80386 and Motorola 68030. Machines based on the first 64-bit microprocessor appeared in 1993. Since the 2000s, processing speed has been improved by using multi-core processors (with more than one central processing unit) rather than by increasing the number of bits an individual processor can handle.
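The word size described above can be observed directly. The sketch below is one way to infer it in Python; it assumes a CPython build, where `sys.maxsize` is the largest value of the platform's native signed word:

```python
import sys

# sys.maxsize is 2**(n-1) - 1 for an n-bit native word,
# so its bit length plus the sign bit gives the word size.
word_bits = sys.maxsize.bit_length() + 1
print(word_bits)  # 64 on a 64-bit build, 32 on a 32-bit build
```

On a modern 64-bit machine this prints 64, matching the 64-bit processors the passage describes.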