Object-Oriented Programming Was Invented to Simulate the Real World — Then It Took Over the Real World
In 1967, two Norwegian scientists built a programming language to simulate ships. They accidentally invented the paradigm that now powers 80% of all software on Earth.
Key Takeaways
- Simula (1967) — the first OOP language — was built to simulate ship traffic through Norwegian fjords
- Alan Kay coined 'object-oriented programming' in 1972 while building Smalltalk at Xerox PARC
- OOP's four pillars: encapsulation, inheritance, polymorphism, and abstraction
- Java (1995) made OOP the world's default paradigm — 80%+ of production code today uses OOP
- Critics argue OOP's rigid hierarchies don't match how the real world actually works
Root Connection
OOP traces directly to Simula, a language built in 1967 by Ole-Johan Dahl and Kristen Nygaard at the Norwegian Computing Center — originally designed to simulate ship traffic through fjords.
[Chart: Programming Paradigm Usage in Production Codebases (2026) — OOP remains the dominant paradigm despite growing interest in functional programming. Source: Stack Overflow Developer Survey 2026]
Timeline
1960s: COBOL and Fortran dominate — programming is strictly procedural, step-by-step instructions
1967: Ole-Johan Dahl and Kristen Nygaard create Simula at the Norwegian Computing Center — the first language with objects, classes, and inheritance
1972: Alan Kay coins 'object-oriented programming' at Xerox PARC while building Smalltalk — he envisions computers as networks of communicating objects
1979: Bjarne Stroustrup begins work on 'C with Classes' at Bell Labs — it becomes C++ in 1985, bringing OOP to systems programming
1995: Java launches with the promise of 'Write Once, Run Anywhere' — OOP goes mainstream worldwide
2000s: Microsoft releases C# as its answer to Java; Python and Ruby gain popularity — OOP is now the default paradigm
2020s: OOP powers everything from banking systems to video games to AI frameworks — but functional programming is gaining ground
In 1967, two Norwegian scientists had a problem that had nothing to do with software.
Ole-Johan Dahl and Kristen Nygaard were working at the Norwegian Computing Center in Oslo. They'd been hired to simulate complex real-world systems — ship traffic through fjords, customer queues at service points, factory assembly lines. The problem was that existing programming languages made this nearly impossible.
Fortran and COBOL — the dominant languages of the 1960s — were strictly procedural. You wrote a sequence of instructions, and the computer executed them in order. This worked fine for calculations, but it was terrible for simulating a world full of independent entities interacting with each other.
A ship isn't an instruction. A ship is a thing — with properties (speed, cargo, position) and behaviors (dock, sail, load). In a simulation, you need hundreds of ships doing their own thing simultaneously. Procedural code couldn't model that cleanly.
So Dahl and Nygaard invented a new concept: the object.
Alan Kay once said: 'I invented the term object-oriented, and I can tell you I did not have C++ in mind.' His vision was biological — cells sending messages to each other, not rigid class hierarchies.
In their language Simula, a 'ship' wasn't a list of variables — it was a self-contained unit that bundled its data and its behavior together. You could create a Ship class that defined what every ship looks like and does, then stamp out individual ship objects from that template. Each ship knew its own state. Each ship could act independently.
They also invented inheritance. A CargoShip could inherit everything from Ship and add cargo-specific features. A Tanker could inherit from CargoShip and add tank capacity. You built hierarchies of increasingly specific types.
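The ship example can be sketched in modern Python. This is a hedged illustration, not Simula code: the class names echo the article's example, and all attributes and methods here are hypothetical.

```python
# Illustrative sketch of the Ship idea: data and behavior bundled
# together, with inheritance adding specificity at each level.

class Ship:
    def __init__(self, name, speed_knots, position):
        self.name = name                # each object carries its own state
        self.speed_knots = speed_knots
        self.position = position
        self.docked = True

    def sail(self):
        self.docked = False

    def dock(self):
        self.docked = True


class CargoShip(Ship):                  # inherits everything from Ship...
    def __init__(self, name, speed_knots, position, cargo_tons=0):
        super().__init__(name, speed_knots, position)
        self.cargo_tons = cargo_tons    # ...and adds cargo-specific state

    def load(self, tons):
        self.cargo_tons += tons


class Tanker(CargoShip):                # a deeper level of the hierarchy
    def __init__(self, name, speed_knots, position, tank_capacity):
        super().__init__(name, speed_knots, position)
        self.tank_capacity = tank_capacity


t = Tanker("Fjordland", 14, (60.4, 5.3), tank_capacity=50_000)
t.sail()          # behavior inherited from Ship
t.load(20_000)    # behavior inherited from CargoShip
```

Each Tanker object stamped out from the class knows its own position and cargo, which is exactly the independence a fjord simulation needed.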
This was revolutionary. For the first time, code could mirror the structure of the real world it was trying to represent.
But Simula stayed niche — mostly used in Scandinavian academia. The idea needed a champion.
It found one in Alan Kay.
The irony of OOP is that it was designed to model the real world, but the real world doesn't have inheritance hierarchies. A platypus doesn't inherit from Duck and Beaver.
In 1972, Kay was at Xerox PARC in Palo Alto, developing his concept for the Dynabook — a portable personal computer that was never built as envisioned but became the inspiration for laptops and tablets. He needed a programming language for it, and Simula's object concept fascinated him.
But Kay had a different vision. Where Dahl and Nygaard saw objects as data structures with attached procedures, Kay saw them as biological cells — independent organisms that communicated by sending messages to each other. The important thing wasn't the object's internal structure. It was the messages flowing between objects.
Kay built Smalltalk, and he coined the term 'object-oriented programming.' But he later regretted the name. 'The big idea is messaging,' he said. 'The key in making great and growable systems is much more to design how its modules communicate rather than what their internal properties and behaviors should be.'
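Kay's emphasis on messaging can be caricatured in a few lines of Python. This is a toy sketch, not Smalltalk: the `receive` dispatcher and the `Counter` example are invented here to show the difference between calling a method directly and sending a named message the receiver is free to interpret or reject.

```python
# Toy message-passing: callers send a message by name; the receiving
# object decides how (or whether) to respond.

class MessagingObject:
    def receive(self, message, *args):
        handler = getattr(self, "on_" + message, None)
        if handler is None:
            return self.message_not_understood(message)
        return handler(*args)

    def message_not_understood(self, message):
        return f"does not understand '{message}'"


class Counter(MessagingObject):
    def __init__(self):
        self.value = 0

    def on_increment(self, by=1):
        self.value += by
        return self.value


c = Counter()
c.receive("increment")     # handled: value becomes 1
c.receive("increment", 5)  # handled: value becomes 6
c.receive("reset")         # unhandled: falls through gracefully
```

The point of the sketch is that the interface is the set of messages understood, not the object's internal layout — the part of Kay's vision the industry largely skipped.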
Almost nobody listened to that part.
What the industry heard was: organize your code into classes and objects. Build inheritance hierarchies. Encapsulate data behind methods. That interpretation — simpler, more concrete, more immediately useful — is what spread.
In 1979, Bjarne Stroustrup at Bell Labs started building 'C with Classes,' which became C++ in 1985. It grafted OOP concepts onto C, the dominant systems programming language. Now you could write operating systems and games using objects. C++ made OOP practical for performance-critical code.
Then came the 1990s, and OOP exploded.
Java launched in 1995 with Sun Microsystems' bold promise: 'Write Once, Run Anywhere.' Java was OOP from the ground up — everything was a class, every program was a collection of objects. It was designed for the emerging internet era, and corporations adopted it en masse. Enterprise software became synonymous with OOP.
Microsoft responded with C# in 2000. Python and Ruby, object-oriented from their earliest releases, surged in popularity, and PHP bolted classes onto the web. University computer science programs rebuilt their curricula around objects and classes. By 2005, OOP wasn't just popular — it was the default way to think about code.
The four pillars became gospel: Encapsulation (hide internal state behind a public interface). Inheritance (build new types by extending existing ones). Polymorphism (treat different types uniformly through shared interfaces). Abstraction (model complex systems with simplified representations).
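All four pillars fit in one small Python sketch. The `Shape` hierarchy below is a stock illustration (not from any particular textbook), chosen so each pillar is visible on a single line.

```python
from abc import ABC, abstractmethod

class Shape(ABC):                      # abstraction: a simplified model
    @abstractmethod
    def area(self):
        ...

class Circle(Shape):                   # inheritance: extends Shape
    def __init__(self, radius):
        self._radius = radius          # encapsulation: internal state

    def area(self):
        return 3.14159 * self._radius ** 2

class Square(Shape):
    def __init__(self, side):
        self._side = side

    def area(self):
        return self._side ** 2

shapes = [Circle(1.0), Square(2.0)]
total = sum(s.area() for s in shapes)  # polymorphism: one interface
```

Callers never touch `_radius` or `_side` directly; they only know that anything in `shapes` answers `area()`.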
But cracks appeared.
Inheritance hierarchies got deep and brittle. Change one class at the top and everything below it might break. The 'diamond problem' — where a class inherits from two classes that share a common ancestor — created nightmares. Developers spent more time designing class hierarchies than solving actual problems.
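The diamond is easy to reproduce. In the sketch below (class names are the textbook placeholders), `D` inherits from both `B` and `C`, which share ancestor `A`; Python resolves the ambiguity deterministically via its method resolution order (MRO), while C++ forces the programmer to manage it with virtual base classes.

```python
# The classic diamond: which greet() does D inherit?

class A:
    def greet(self):
        return "A"

class B(A):
    def greet(self):
        return "B"

class C(A):
    def greet(self):
        return "C"

class D(B, C):   # inherits from two classes sharing ancestor A
    pass

winner = D().greet()                         # "B": B precedes C in the MRO
order = [cls.__name__ for cls in D.__mro__]  # D, B, C, A, object
```

Deterministic or not, a change to `B` can silently alter `D`'s behavior — the brittleness the article describes.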
The 'real world modeling' promise didn't hold up either. The real world is messy. A penguin is a bird that can't fly. A platypus is a mammal that lays eggs. Inheritance hierarchies assume clean taxonomies, but reality doesn't have clean taxonomies.
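The penguin problem shows up directly in code. In this hypothetical hierarchy, `Bird` promises `fly()`, so any code written against `Bird` assumes every subtype honors it — and `Penguin` breaks that assumption at runtime (a violation of the Liskov substitution principle).

```python
# A taxonomy that looks clean until a penguin arrives.

class Bird:
    def fly(self):
        return "airborne"

class Penguin(Bird):
    def fly(self):
        raise NotImplementedError("penguins can't fly")

def launch(bird):
    return bird.fly()     # callers assume every Bird flies...

launch(Bird())            # fine
# launch(Penguin())       # ...but this raises at runtime
```

The usual remedies — splitting out a `FlyingBird` subclass, or modeling capabilities as separate interfaces — all amount to admitting that biology is not a tree.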
Functional programming advocates pushed back: 'Just use functions. Pass data through transformations. No mutable state, no side effects, no inheritance headaches.' Languages like Haskell, Erlang, and later Clojure and Elixir offered alternatives. Even OOP-first languages started adding functional features — Java got lambdas in 2014, C# got LINQ in 2007, and Python added list comprehensions in version 2.0 (2000).
By 2026, the consensus has shifted. Pure OOP is rare. Most modern code is multi-paradigm — using objects where they make sense, functions where they don't, and mixing freely. Go rejected inheritance entirely. Rust uses traits instead of classes. The rigid OOP of the 1990s and 2000s has softened.
But the core insight survives: bundling data with behavior, creating reusable abstractions, modeling systems as interacting entities. That idea — born in a Norwegian lab trying to simulate ships — still structures the vast majority of software running today.
Dahl and Nygaard received the Turing Award in 2001 for their work on Simula. Alan Kay received it in 2003 for Smalltalk. Three pioneers, one idea: the world is made of objects. Code should be too.
The root of OOP isn't computer science. It's a fjord full of ships, and two scientists who decided that if the world is made of things, maybe code should be made of things too.