The Word 'Byte' Was a Deliberate Typo, Misspelled to Prevent Accidents
The fundamental unit of digital information was deliberately misspelled as 'byte' instead of 'bite' to avoid confusion with the word 'bit.'
Key Takeaways
- Werner Buchholz coined 'byte' in 1956 at IBM
- Deliberately misspelled to avoid confusion with 'bit'
- A byte is typically 8 bits of information
- One of the most fundamental terms in computing history
Root Connection
Werner Buchholz at IBM deliberately misspelled 'bite' as 'byte' in 1956 to avoid confusion with 'bit', creating one of computing's most fundamental terms.
The word 'byte' is a typo. Deliberately. A computer engineer misspelled a common English word on purpose in 1956 to head off a mundane transcription problem, and the misspelling became the fundamental unit of digital information for the next seventy years.
You read megabytes, gigabytes, and terabytes every day. The etymology behind that word is a single engineer choosing to ruin the spelling of 'bite' because he was worried the word would mutate into 'bit' somewhere between his desk and the printed specification.
THE IBM STRETCH CONTEXT
In 1956, IBM was building the Stretch — officially the IBM 7030 — which was intended to be the most powerful computer in the world. Stretch was a supercomputer ahead of its era: it introduced pipelining, memory interleaving, 64-bit floating-point arithmetic, and instruction prefetching. When it finally shipped in 1961, Stretch cost $13.5 million (roughly $140 million in 2026 dollars) and landed at Los Alamos Scientific Laboratory (now Los Alamos National Laboratory) for nuclear weapons simulation.
Werner Buchholz, a German-American computer scientist who had joined IBM in 1949, was the project's lead architect for instruction set design. The Stretch team faced a vocabulary problem: they needed a term for a group of bits that could be addressed and transmitted as a single unit. Existing hardware used different sizes — 6 bits, 7 bits, 9 bits — for different purposes. No single standard word existed.
Buchholz coined 'byte' in a June 1956 internal IBM memo on the Stretch instruction set. The memo described a byte as a variable-length unit, between 1 and 6 bits. The eventual 8-bit standard did not solidify until the IBM System/360 in 1964.
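A minimal sketch in Python (purely illustrative; the function name and toy word below are not from the Stretch documentation) of what a variable-length byte means in practice: an addressable bit field of 1 to 6 bits pulled out of a larger word.

```python
# Illustrative sketch only: a "byte" in the 1956 sense is a bit field of
# 1-6 bits extracted from a machine word, not a fixed 8-bit unit.
def extract_byte(word: int, start_bit: int, width: int) -> int:
    """Return the `width`-bit field of `word` starting at `start_bit` (LSB = 0)."""
    assert 1 <= width <= 6, "the 1956 memo allowed bytes of 1 to 6 bits"
    return (word >> start_bit) & ((1 << width) - 1)

word = 0b101101_110010                   # a toy 12-bit word, not real Stretch data
print(bin(extract_byte(word, 6, 6)))     # 0b101101 -- the high 6-bit "byte"
print(bin(extract_byte(word, 0, 6)))     # 0b110010 -- the low 6-bit "byte"
print(bin(extract_byte(word, 0, 3)))     # 0b10     -- a 3-bit "byte" was equally valid
```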
WHY THE DELIBERATE TYPO
The natural English word was 'bite' — as in, a bite-sized piece of data. The problem was that 'bite' sits a single dropped letter away from 'bit.' In 1956, specifications and engineering correspondence were typed, retyped, and transcribed by hand, and an unfamiliar coinage was exactly the kind of word a typist or editor might silently 'fix.' In a document defining the architectural difference between a bit (1 binary digit) and a byte (a group of binary digits), letting 'bite' collapse into 'bit' would introduce errors into the specification itself.
Buchholz later explained the decision in print, most directly in a short note on the word's origin published in the Annals of the History of Computing: the spelling was changed to 'byte' specifically to prevent accidental mutation into 'bit' during transcription. The Y made the word visually and orthographically distinct. No one would 'correct' it without noticing.
In other words: the spelling 'byte' exists because Werner Buchholz did not trust typists.
THE BATTLE OVER SIZE
Throughout the 1960s, 'byte' was not strictly 8 bits. Early IBM machines used 6-bit character codes. The CDC 6600, which took over the top of the supercomputer market after Stretch, used 60-bit words subdivided into 6-bit characters. The PDP-10 had 36-bit words. The term 'byte' was contested, and for a time 'octet' — a precise 8-bit unit — was used in international telecommunications standards to avoid ambiguity.
The IBM System/360, launched April 7, 1964, standardized the 8-bit byte: eight bits gave 256 code points, enough for the digits, letters, and punctuation of the EBCDIC encoding, and a byte could also hold exactly two 4-bit packed-decimal digits. ASCII, first published in 1963, settled on a 7-bit character encoding that fit comfortably inside an 8-bit byte. By the late 1970s, 'byte' meant 8 bits worldwide.
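A small Python sketch of the arithmetic behind that choice (the packing helper below is hypothetical, not anything from System/360 documentation): an 8-bit byte holds any 7-bit ASCII character with a bit to spare, and packs exactly two decimal digits.

```python
# Why 8 bits was a convenient byte size (illustrative only):
text = "Byte"
encoded = text.encode("ascii")       # each ASCII character occupies one 8-bit byte
print(list(encoded))                 # [66, 121, 116, 101] -- every value fits in 0..255

# Packed decimal: two 4-bit digits per byte, one per nibble.
def pack_decimal(high: int, low: int) -> int:
    assert 0 <= high <= 9 and 0 <= low <= 9
    return (high << 4) | low

print(hex(pack_decimal(5, 6)))       # 0x56 -- the digits stay readable in hex
```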
Internet protocol standards still use 'octet' formally — RFC 791 (the Internet Protocol specification) uses 'octet' rather than 'byte' to eliminate any possible ambiguity. In practice, everyone else uses 'byte' and means eight bits.
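As a quick illustration of octets in practice (using Python's standard ipaddress module; the example address is a reserved documentation address, not anything taken from RFC 791 itself), an IPv4 address is exactly four octets:

```python
import ipaddress

# RFC 791 talks in octets: an IPv4 address is four of them.
addr = ipaddress.IPv4Address("192.0.2.1")   # 192.0.2.0/24 is reserved for documentation
print(addr.packed)                           # b'\xc0\x00\x02\x01'
print(list(addr.packed))                     # [192, 0, 2, 1] -- one integer per octet
```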
DID YOU KNOW?
The 4-bit half-byte has a name: 'nibble.' The spelling is also deliberately playful — some early documentation used 'nybble' specifically to match the Y in 'byte.' Modern usage has standardized on 'nibble,' which appears most often in hexadecimal representation (one nibble = one hex digit). Unlike 'byte,' the coinage of 'nibble' was informal and is not reliably attributed to any one person.
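A short sketch of the nibble-to-hex-digit correspondence (plain Python bit twiddling, nothing historical):

```python
value = 0xA7                        # one byte written as two hex digits
high_nibble = (value >> 4) & 0xF    # 0xA -> 10
low_nibble = value & 0xF            # 0x7 -> 7
print(high_nibble, low_nibble)      # 10 7
print(f"{value:02X}")               # 'A7' -- each hex digit names exactly one nibble
```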
WHY IT MATTERS
The byte is the atom of the digital economy. Every file size, every memory address, every network packet, every disk sector, every cryptographic key is measured in bytes or powers of bytes. Global data creation in 2025 was estimated at 181 zettabytes — that is 181 followed by 21 zeros, all measured in units coined by Werner Buchholz's 1956 typo-avoidance strategy.
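For scale, a quick back-of-the-envelope check of that figure (assuming decimal SI units, where one zettabyte is 10^21 bytes):

```python
ZETTABYTE = 10 ** 21                      # decimal (SI) definition
total_bytes = 181 * ZETTABYTE             # the 2025 estimate cited above
print(f"{total_bytes:.3e} bytes")         # 1.810e+23 bytes
print(f"{total_bytes * 8:.3e} bits")      # ~1.4e+24 bits, at 8 bits per byte
```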
The word itself is a tiny act of engineering discipline. Buchholz could have let 'bite' stand and trusted the system. He did not. He deliberately introduced a spelling variant to make the word immune to the error patterns of the communication infrastructure of his era. A generation of engineers, hardware architects, and standards writers followed his spelling without even knowing why the Y was there.
This is a pattern you see repeatedly in computing. Unusual spellings, weird capitalizations, deliberately odd punctuation: they are often defensive choices against the error patterns of their time. The Y in 'byte' is armor against 1950s transcription mistakes, still in service in 2026.
(Sources: Werner Buchholz, ed., 'Planning a Computer System: Project Stretch,' McGraw-Hill, 1962; Werner Buchholz, 'Origin of the Word Byte,' Annals of the History of Computing, 1981; IBM Archives, Stretch (IBM 7030) project records; Emerson W. Pugh, 'Building IBM: Shaping an Industry and Its Technology,' MIT Press, 1995; IBM System/360 Principles of Operation, 1964; RFC 791, Internet Protocol, September 1981)
BYTE
Coined by Werner Buchholz at IBM in 1956 during work on the Stretch computer.
DELIBERATE TYPO
Buchholz changed 'bite' to 'byte' to avoid confusion with 'bit' (binary digit).
DEFINITION
A byte is typically 8 bits, the standard unit for measuring digital information.
ETYMOLOGY
'Byte' was chosen because it's a 'bite-sized' piece of information that a computer can process.
LEGACY
The term 'byte' is now fundamental to computing, used in everything from file sizes to memory measurements.
ALTERNATIVES
The obvious spelling 'bite' never shipped; 'octet' survives as the unambiguous 8-bit term in telecom and Internet standards, and 'nibble' (or 'nybble') names the 4-bit half-byte. 'Byte' won out everywhere else.