Programming Languages Break Math
Explaining why (most) programming languages break the commutative law of addition
Everyone knows that a + b = b + a, right? It’s something we know from elementary school, and it is completely reasonable. If I give you 3 apples and then 2 apples, it is the same as if I had first given you 2 apples and then 3.
Of course, when you take your first linear algebra course, you learn that this is not always the case for multiplication. With matrices, one of the first properties you learn is that in general A ⋅ B ≠ B ⋅ A, or, as a mathematician would say, matrix multiplication isn’t commutative.
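To see a concrete instance, take the 2-by-2 matrices

A = | 0 1 |        B = | 0 0 |
    | 0 0 |            | 1 0 |

Multiplying them in the two possible orders gives

A ⋅ B = | 1 0 |        B ⋅ A = | 0 0 |
        | 0 0 |                | 0 1 |

so swapping the order doesn’t just change the result slightly; it moves the only nonzero entry to the opposite corner.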
I have not taken any group theory classes, but from what I have read, non-commutative addition is actually really rare. So, when I heard that addition can fail to be commutative in programming, it seemed so unnatural. After some thought, of course, it is logical. Let’s see an example in C. We start by defining two functions, f and g, and a global variable “Global_var”.
int Global_var = 1;

int f(int x) {
    Global_var++;              /* side effect: mutates the global */
    return x + Global_var;
}

int g(int x) {
    return x + Global_var;     /* reads the global but never changes it */
}
What does f(1) + g(1) equal? First, f(1) is evaluated: Global_var becomes 2, so f(1) returns 1 + 2 = 3. After that, g(1) returns 1 + 2 = 3, so f(1) + g(1) = 3 + 3 = 6.
Now let’s see what g(1) + f(1) equals. Starting again with Global_var equal to 1, g(1) returns 1 + 1 = 2. Moving on to f(1), Global_var becomes 2, so f(1) returns 1 + 2 = 3. The result: g(1) + f(1) = 2 + 3 = 5.
The conclusion? f(1) + g(1) ≠ g(1) + f(1).
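Here is a minimal, self-contained version of the experiment that you can compile and run. One caveat: the C standard leaves the evaluation order of the two operands of + unspecified, so the sketch below sequences the two calls explicitly to pin down the left-to-right order assumed in the walkthrough above.

#include <stdio.h>

int Global_var = 1;

int f(int x) {
    Global_var++;
    return x + Global_var;
}

int g(int x) {
    return x + Global_var;
}

int main(void) {
    int a = f(1);                          /* Global_var: 1 -> 2, a = 3 */
    int b = g(1);                          /* b = 1 + 2 = 3 */
    printf("f(1) + g(1) = %d\n", a + b);   /* prints 6 */

    Global_var = 1;                        /* reset the global state */
    int c = g(1);                          /* c = 1 + 1 = 2 */
    int d = f(1);                          /* Global_var: 1 -> 2, d = 3 */
    printf("g(1) + f(1) = %d\n", c + d);   /* prints 5 */

    return 0;
}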
So why did I write “most” in the subtitle?
The key is immutability. In Haskell and some other languages, declarations are immutable: once you bind a variable to a value, you cannot change it again. The Global_var++; line simply would not be allowed.
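Even in C you can approximate this idea with the const qualifier. A minimal sketch (this is only an analogy to Haskell’s immutable bindings, not a translation of them):

const int Global_var = 1;    /* immutable: the compiler rejects any write to it */

int f(int x) {
    /* Global_var++;  <-- compile error: increment of read-only variable */
    return x + Global_var;
}

int g(int x) {
    return x + Global_var;
}

With no mutation possible, f(1) and g(1) each return 2 no matter when they are called, so f(1) + g(1) and g(1) + f(1) are both 4.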
The problem above comes from the fact that g(1) == 3 in the first example while g(1) == 2 in the second. In high school we learn that functions do not change over time: f(x) with the same input will always produce the same output, no matter what (in programming, functions with this property are called pure). But as we saw in the example above, programming languages do not always behave like our “traditional” mathematics.
You can find an article about the difference between “math” functions and “programming” functions here.
I couldn’t find a source stating that if a language’s objects are immutable, non-commutativity of addition is eliminated, but I strongly believe that this is the case. I would highly appreciate it if someone could provide me with a proof (or a reference that states it) or, of course, a counter-example.
I write because I want to engage with my readers and learn from them. If any of you would like to add anything, feel free to comment down below.