OO: what you learnt is probably wrong

Marcelo Camargo
Rung Developers
Dec 13, 2017

One of the most misrepresented concepts in computer science. This post doesn’t aim to make you sad, but to show that what you learned in university about object orientation ─ most probably ─ is wrong.

INB4: this refers to the academic definition, not to the one later misunderstood by enterprises.

Sorry for the capybaras, but I like to insert them in every article or presentation I do.

It’s common for universities to use languages such as C++ or Java to teach object orientation ─ or at least to try ─ but what is taught is not exactly OO; it is a programming model based on a class system merged with the imperative paradigm. My objective in this article is not to create controversy, but to demystify some concepts, explain what object orientation is really about, and show how most modern programming languages do not obey its main principles ─ including the ones that use OO support as a marketing advantage.

The OO history

The paradigm is old. The essential modern notion of object orientation comes from the 1950s, but its concepts were largely defined and represented by Alan Kay, and implemented as a reference in the language Smalltalk, which to this day is seen as the greatest reference in the paradigm. In general, the term OO is still viewed with ambiguity, and there are a lot of flamewars about it around the Internet.

OO is not about classes

Sorry to make you sad, but a language doesn’t need to have classes to be OO. The language Self introduced a model based on prototypes, avoiding classes by default ─ a concept that could already be used in Smalltalk since the 80’s! Do you want proof? JavaScript supports object-oriented programming better than Java, and does so purely through prototype-based programming! Another plot twist: ES6 also doesn’t have real classes. What you see is, in truth, syntactic sugar over the prototype model used inside the VM.
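To make this concrete, here is a minimal sketch in JavaScript. The names (`counterProto`, `CounterClass`) are illustrative, not from any library; the point is that objects can share behavior by delegating to another object, no class required, and that an ES6 `class` produces exactly this kind of prototype chain:

```javascript
// A "shape" object holding shared behavior -- no class involved.
const counterProto = {
  increment() { this.value += 1; return this; },
  report() { return `count = ${this.value}`; }
};

// New objects delegate to the prototype instead of instantiating a class.
const counter = Object.create(counterProto);
counter.value = 0;
counter.increment().increment();
console.log(counter.report()); // "count = 2"

// An ES6 class desugars to the same mechanism: a constructor function
// plus a prototype object that instances delegate to.
class CounterClass {
  constructor() { this.value = 0; }
  increment() { this.value += 1; return this; }
}
const c = new CounterClass();
console.log(Object.getPrototypeOf(c) === CounterClass.prototype); // true
console.log(typeof CounterClass); // "function" -- not a new kind of runtime thing
```

`Object.getPrototypeOf(c)` shows that instances of an ES6 class are just objects delegating to `CounterClass.prototype`, and `typeof CounterClass` shows the “class” itself is an ordinary function.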

But what is OO?

According to the definitions of the person who created the term, some core concepts must be respected:

  • Everything is an object: seriously. Everything must be an object. Classes must be objects; numbers, primitive values like strings and booleans, and also your dog must be objects.
  • Objects communicate by message passing: this means objects must receive and dispatch messages from ─ and to ─ other objects. For example, take the simple operation 1 + 1. Trivial, no? An OO language must treat 1 as an object, + as a message, and the other value as another object. This way, the sum is a message dispatched to the number, which returns another object; in this case, another number. Because of that, in Smalltalk there is no operator precedence: multiplication and addition are messages sent to the object, not primitive operations. Likewise, the control structures you know from other languages ─ such as if, for and while ─ do not exist the way you see them in Java or C++; you simulate their behavior using messages.
  • Every object is an instance of some class: not exactly a “class” as in Java, but a shape ─ and this shape must also be an object. This doesn’t rule out prototype-based programming.
  • Behavior sharing: that is, the shape ─ the class ─ holds the behavior shared by its objects. For example, arithmetic operations would be behaviors/messages shared by the instances of Number!
  • When the program starts, control goes to the first object: and everything else works through messages. The first object has the control.
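The principles above can be sketched in JavaScript. Everything here is illustrative ─ `makeNum` and `send` are hypothetical names, not a real API ─ but the sketch shows what it means for 1 + 1 to be a message dispatched to an object, and why that model has no operator precedence:

```javascript
// Shared behavior lives in one "shape" object: the + and * messages.
const numProto = {
  "+": function (other) { return makeNum(this.value + other.value); },
  "*": function (other) { return makeNum(this.value * other.value); }
};

// Numbers are objects delegating to the shared shape.
function makeNum(v) {
  const n = Object.create(numProto);
  n.value = v;
  return n;
}

// Dispatch: sending a message (a name plus arguments) to a receiver.
function send(receiver, message, ...args) {
  return receiver[message](...args);
}

// 1 + 1 becomes: send the message "+" carrying the object 1 to the object 1.
// The receiver answers with another object -- a new number.
const two = send(makeNum(1), "+", makeNum(1));
console.log(two.value); // 2

// No precedence: messages evaluate strictly left to right, as in Smalltalk,
// so 2 + 3 * 4 reads as (2 + 3) * 4 = 20, not 2 + (3 * 4).
const twenty = send(send(makeNum(2), "+", makeNum(3)), "*", makeNum(4));
console.log(twenty.value); // 20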

Alan Kay’s concepts for OO were also largely used as the basis for Carl Hewitt’s Actor model and concurrent programming. Tim Budd also proposed some additions to Kay’s specification:

  • Objects can have their own memory, made up of other objects;
  • Behavior sharing gains vertical inheritance control.

Why doesn’t Java satisfy OO?

This is maybe the most controversial point of this article. The first principle Java breaks is that everything should be an object.

  • Classes are not objects: however, every class carries an instance of the Class class to describe it. Classes per se are only abstractions. Classes should also be created by message passing.
  • Primitive types: all primitive values should be objects. Java provides wrappers for these types, like Integer for int, but int, per se, is not an object and thus does not receive messages. Arithmetic is native.
  • Imperative control structures: for example, if. true and false should be objects ─ Boolean instances ─ able to receive calls to ifTrue and ifFalse to continue the computation. The current concept of methods in Java is based on named methods, but how does it work in Smalltalk? The Boolean class has the “methods” ifTrue:ifFalse:, ifTrue: and ifFalse:. Dispatch is based not on the name alone, but on the message that is passed! Some languages, like Swift and Objective-C, provide good support for this behavior, and some support syntax sugar for it, like Ruby. Note that the Smalltalk-98 specification allows some pseudo-messages ─ like conditionals ─ to be implemented “natively”, and disallows overloading them at runtime.
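A sketch of that last point, modeling Smalltalk-style boolean messages in JavaScript. The names `True`, `False` and `ifTrueIfFalse` are illustrative stand-ins for Smalltalk’s true, false and ifTrue:ifFalse:; the key idea is that there is no if statement ─ the receiver itself decides which block of code runs:

```javascript
// The "shape" of the true object: it answers conditional messages
// by running the then-block.
const trueProto = {
  ifTrueIfFalse(thenBlock, elseBlock) { return thenBlock(); },
  ifTrue(thenBlock) { return thenBlock(); },
  ifFalse(elseBlock) { return undefined; }
};

// The "shape" of the false object: same messages, opposite answers.
const falseProto = {
  ifTrueIfFalse(thenBlock, elseBlock) { return elseBlock(); },
  ifTrue(thenBlock) { return undefined; },
  ifFalse(elseBlock) { return elseBlock(); }
};

// The boolean objects themselves -- each knows how to answer the message.
const True = Object.create(trueProto);
const False = Object.create(falseProto);

// No `if` keyword anywhere: control flow is just message dispatch.
const answer = True.ifTrueIfFalse(
  () => "it was true",
  () => "it was false"
);
console.log(answer); // "it was true"
console.log(False.ifTrueIfFalse(() => 1, () => 0)); // 0
```

The blocks passed as arguments play the role of Smalltalk’s square-bracket blocks: deferred computations handed to the boolean, which picks one to evaluate.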

However, an important detail: although Java per se doesn’t make everything an object, the JVM, internally, treats all primitive types as objects!

Cool! I want to see this in practice!

Some time ago, Matheus Albuquerque and I created a repository aiming to implement the same algorithm in different programming languages and paradigms, from LISP to Brainfuck, from Rust to COBOL! There you can compare the Java and C# implementations, for example, with the Smalltalk one, and see stars blinking! Renan Ranelli ─ our famous Brazilian Milhouse ─ also has a very interesting video, in Portuguese, with a human explanation of the “hipster” definition of OO.

Please, don’t kill me…

When you learn OO in university, in general, you are learning a class-based programming model for a specific language ─ or set of languages ─ which inherits and implements ─ though not totally, or at best partially ─ the core concepts of OO. Concepts like polymorphism, inheritance and visibility may be added, but they are not necessarily and intrinsically linked to the widely accepted definition of OO. I don’t want to create flamewars or to say that language designers are lying to you, but to show historical conceptual failures and make the original concept a bit clearer. Complaints, please redirect to Milton. Thanks!

References and useful links

This article is a translation of the Portuguese original, written by Marcelo Haskell Camargo.
