The Algebra of Programming

Kevin Welcher
Techspiration + Ideas + Making It Happen
9 min read · Feb 25, 2016


Have you ever stopped and thought, “Man, laws are really great!” Nope? Imagine something super mundane. How about driving to the store and buying some bread. Boring, right? Now, imagine doing that without laws. Driving would be madness. If you made it to the store, you would then have to convince the store to exchange bread for bits of paper. Laws make this seemingly boring task trivially easy, without much thought.

Am I mincing concepts, loosely interchanging laws, rules, and maxims? Yes… Just squint your eyes and bear with me. Why is this even worth thinking about? When our world has rules, we can trust those rules to hold true and spend our time doing other things. Laws even let us bridge paradigms such as languages and disciplines.

How can this benefit us in programming? To list a few things, laws can:

  • Let us think at a higher level
  • Make refactoring trivial
  • Help improve the modularity of our code

To show this, let’s start with another discipline, math.

Arithmetic Algebra

Programming is more than just coding; it is about solving problems. In How to Solve It, Pólya shows us that we can add to our repertoire of problem-solving techniques by linking problems together. In other words, we find patterns outside the realm of our current paradigm and apply them to our current situation.

So how can algebra help us? Let's look at a few of algebra's more interesting laws.

Identity: These are properties that do not modify the equation.
- Addition: x + 0 = x
- Multiplication: x * 1 = x
Associative: Within multiplication and addition we can group similar operations together. The following are equal.
- 1 + 2 + 3
- 1 + (2 + 3)
- (1 + 2) + 3
Distributive: We can distribute and factor out operations. The following are equal.
- 2(x + y)
- 2x + 2y

Why are these laws interesting? We can refactor equations within these operations with confidence that we are not actually changing anything. We can simplify equations just by applying laws.

3(x + 4xy) + 3yx   : Our starting equation
3x + 12xy + 3yx : Distributive
3x + (12xy + 3yx) : Associative
3x + 15xy : Addition
3x(1 + 5y) : Reverse Distributive

Scary? Not at all! We used our laws to simplify our equation. In fact, there are many, many ways we could have applied various laws to transform this equation. How cool is that? Take a moment to process this small feat. We have an equation which we have no idea what it models and are able to transform it with confidence. What if we could do this elsewhere…
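If you would rather let a computer double-check the algebra, here is a tiny sketch (purely illustrative; the two functions below are mine, not part of the original article) that evaluates the starting and simplified forms for a couple of sample values:

// Starting form: 3(x + 4xy) + 3yx
const original = (x, y) => 3 * (x + 4 * x * y) + 3 * y * x;
// Simplified form: 3x(1 + 5y)
const simplified = (x, y) => 3 * x * (1 + 5 * y);

console.log(original(2, 7) === simplified(2, 7));   // true (both are 216)
console.log(original(-1, 3) === simplified(-1, 3)); // true (both are -48)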

Boolean Algebra

Let’s move out of arithmetic into something we use more often, Boolean algebra. Instead of having numbers as our domain we have true and false, and instead of addition and multiplication we have AND (&&) as well as OR (||).

We use Boolean algebra all the time, whether it is writing if-statements or branching our code in other capacities. Just like we were able to refactor our equations in arithmetic algebra, we can do the same with Boolean algebra.

What laws are we going to use? The same laws as before!

Identity:
- A && true == A
- B || false == B
Associative: The following are equal.
- (A && B) && C
- A && (B && C)
- A && B && C
Distributive: The following are equal.
- A && (B || C)
- (A && B) || (A && C)

Using these laws, let’s take an if-statement you might come across and simplify it.

if (
  (isUser || isAdmin) &&
  hasPosts &&
  ((isUser && canMakePost) || (isAdmin && canMakePost))
)

A bit gnarly, right? I write this garbage daily, so let's try to clean it up. To simplify things, instead of names, we will use letters.

A = isUser
B = isAdmin
C = hasPosts
D = canMakePost
(A || B) && C && ((A && D) || (B && D))

Let’s churn this alphabet soup!

(A||B) && C && ((A&&D) || (B&&D)) // Our starting expression
(A||B) && C && (D && (A||B)) // Distributive: factor out D
(A||B) && C && D && (A||B) // AND Associativity
(A||B) && (A||B) && C && D // AND Commutativity
(A||B) && C && D // Idempotent law (A && A == A)

We start off by using the distributive law to factor D out of the inner OR expression. With that factored, we can more readily see how to shift and group the remaining terms with associativity and commutativity. We continue shifting, grouping, and factoring expressions until we can't go any further.

Plugging back in our variable names we now have:

(isUser || isAdmin) && hasPosts && canMakePost

Much easier to read! Was this process drawn out? Yes. Did we need to follow every step? No. We do most of these things intuitively. The fact that it can be done without much thought is an indication of the power of these laws. They let us trivially refactor expressions and know that those expressions will still work.
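If you would rather not take the algebra on faith, a brute-force check is easy to sketch (the two functions below are hypothetical stand-ins for the original and simplified conditions):

const originalCondition = (A, B, C, D) =>
  (A || B) && C && ((A && D) || (B && D));
const simplifiedCondition = (A, B, C, D) =>
  (A || B) && C && D;

// Try every one of the 16 combinations of A, B, C, D.
const bools = [true, false];
let allAgree = true;
for (const A of bools)
  for (const B of bools)
    for (const C of bools)
      for (const D of bools)
        if (originalCondition(A, B, C, D) !== simplifiedCondition(A, B, C, D))
          allAgree = false;

console.log(allAgree); // true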

Functional Algebra

How can we use these laws in functional programming? Just as we described the domain (true and false) and operations (&& and ||) for Boolean algebra, we will do the same here. Our domain is pure functions and functors, and our operations are map and compose.

Just a heads up, this section assumes familiarity with currying, composition, and functors. Don’t get discouraged if you have not heard of these things! Just like with the prior two examples, we are focusing more on the laws than the code itself.

Compose

First, a refresher on compose. Function composition is simply stringing functions together. We feed the result of one function to the input of another, starting from the right and flowing to the left.
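As a quick sketch (a bare-bones version, not a production-grade implementation), compose can be written like this:

// compose(f, g, h)(x) is the same as f(g(h(x))):
// the rightmost function runs first and its result flows left.
const compose = (...fns) => x => fns.reduceRight((acc, fn) => fn(acc), x);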

Want to know something really cool about composition? Functions in composition are associative!

Associative: The following are equal.
- compose(a, b, c)
- compose(compose(a, b), c)
- compose(a, compose(b, c))
- compose(add1, add1, double)
- compose(add1, compose(add1, double))
- compose(compose(add1, add1), double)
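The helpers add1 and double aren't defined anywhere above, so here is an illustrative version of them, along with a check that every grouping produces the same answer (using the compose sketch from earlier):

const add1 = x => x + 1;
const double = x => x * 2;

// All three groupings compute add1(add1(double(10))) = 22.
console.log(compose(add1, add1, double)(10)); // 22
console.log(compose(add1, compose(add1, double))(10)); // 22
console.log(compose(compose(add1, add1), double)(10)); // 22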

Remembering back to arithmetic and Boolean algebra, associativity is often the first step to refactoring. Once we group elements together we can factor, distribute, and more.

Functors

The second part of our domain is functors. In an earlier post, we talked about how functors are just objects that implement a map method that obeys a few laws. Here is a refresher of those laws.

// F is a Functor
function identity(x) { return x }
Identity:
- F.map(identity) == F
Composition: The following are equal.
- F.map(f1).map(f2)
- F.map(compose(f2, f1))

Let’s use a slightly more concrete example to see these laws play out.
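Arrays are probably the functor you reach for most often, so here is a small sketch of both laws using a plain array (the helper functions are just for illustration, and compose and identity are the sketches from earlier):

const inc = x => x + 1;
const square = x => x * x;

// Identity: mapping the identity function leaves the functor unchanged.
[1, 2, 3].map(identity); // [1, 2, 3]

// Composition: chaining maps is the same as one map over the composed function.
[1, 2, 3].map(inc).map(square); // [4, 9, 16]
[1, 2, 3].map(compose(square, inc)); // [4, 9, 16]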

Identity tells us that when we map over the identity function, our original functor is unchanged. Composition tells us that we can chain maps together or compose their functions within one map. Let’s rewrite these laws in a slightly different (point free) notation.

Identity: 
- map(identity) == identity
Composition:
- map(compose(f2, f1)) == compose(map(f2), map(f1))

Do they look familiar? With our composition rule, we can take similar maps and factor them out of our compositions. To be explicit:

Boolean Distributive: 
- (A && B) || (A && C) == A && (B || C)
Functional Composition:
- compose(map(f2), map(f1)) == map(compose(f2, f1))

Just like how we were able to factor out the A in the Boolean algebra example, we are able to factor a map out of compose!

Composing it Together

How about we render some blog posts to the screen? We have three hypothetical helper functions:

// getPosts : search -> [PostId]
// fetchPost : PostId -> Promise Error Post
// renderPost : Post -> HTML
  • getPosts: Takes in a search keyword and returns an array of post ids from somewhere.
  • fetchPost: Takes in a post id and returns a Promise of the server response*.
  • renderPost: Takes in post data and produces some HTML.

Using these three methods, we will use compose to build up a function to render a Promised array of HTML posts. Let’s make a new function, searchAndRenderPosts, with compose that takes in a search string and returns some HTML.
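Sketched out in code, it might look something like this. The three helpers are the hypothetical ones described above, and map is assumed to be a curried, functor-generic map (something like Ramda's R.map) rather than the Array method:

// A minimal curried map, assuming the functor exposes a .map method.
// (Arrays do; for Promises you would really want Tasks, see the footnote.)
const map = fn => functor => functor.map(fn);

// searchAndRenderPosts : search -> [Promise HTML]
const searchAndRenderPosts = compose(
  map(map(renderPost)), // outer map walks the array, inner map walks each Promise
  map(fetchPost), // [PostId] -> [Promise Post]
  getPosts // search -> [PostId]
);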

The search string is fed to getPosts which returns an array of PostId’s

The array of PostId’s from getPosts gets fed to the next function in the composition.

Since the value coming out of getPosts is a functor (an array in this case), we need to use map to modify its contents. Here we are running fetchPost over each value coming from getPosts. Now we have transformed our array of ids into an array of Promises.

Next we need to iterate over this array of Promises and turn it into some HTML.

Why do we have renderPost wrapped in two maps? Again, map(fetchPost) returns an array of Promises. We use the outer map to modify the array, and we use the inner map to run renderPost over the Promises within the array**.

If this was a bit hard to grasp, don’t fret. Just like with Boolean algebra, we could work with the equations without needing to know the details of each value. Similarly here, we will work within our laws to change this code without worrying about the details. Let’s remember our laws:

Associativity: Functions in composition are associative. The following are equal.
- compose(a, b, c)
- compose(a, compose(b, c))
- compose(compose(a, b), c)
Composition: Functors chain map calls. The following are equal.
- map(compose(f2, f1))
- compose(map(f2), map(f1))

Ok, on to the good stuff.
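Here is roughly how that refactoring plays out, step by step, using only the two laws above (again assuming the curried map and the compose sketch from earlier; the intermediate steps are shown as comments):

// Start with the original composition:
//   compose(map(map(renderPost)), map(fetchPost), getPosts)

// Associativity: group the two mapped functions together.
//   compose(compose(map(map(renderPost)), map(fetchPost)), getPosts)

// Functor composition: compose(map(f2), map(f1)) == map(compose(f2, f1)),
// so the outer map factors out of the grouped pair.
//   compose(map(compose(map(renderPost), fetchPost)), getPosts)

// Finally, give the extracted piece a name:
const fetchAndRenderPost = compose(map(renderPost), fetchPost); // PostId -> Promise HTML

// ...and searchAndRenderPosts becomes:
//   compose(map(fetchAndRenderPost), getPosts)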

Take a moment to digest what we just did. We started with a function that composed three other functions. We isolated two functions that were logically grouped together (renderPost and fetchPost), extracted them out, and made a new generic function, fetchAndRenderPost. Don't forget that we did not have any insight into the workings of the three helper functions. It didn't even matter; we were still able to work with the code.

In arithmetic and Boolean algebra, the laws helped us reason and massage our values. In functional programming, functions are our values. Who ever thought functions could be so fun?

Conclusion

Taking a step back, what did exploring algebraic laws actually give us? We learned that laws surrounding equations and operations allowed us to easily transform those equations from something complex to something more manageable. The types and operations in functional programming share many of these laws (and more), affording us enormous reasoning and refactoring power that spans languages.

For example, the other day I was talking with two coworkers, one uses Swift and the other uses Scala. We were playing around with some Swift code, refactoring some hairy logic. Even though only one of us knew Swift, we were all able to contribute.

This is one of the many things that is so exciting about functional programming. Could you ever imagine doing this in an imperative world where functions modify state willy nilly? It would be like hot potato with grenades! If something as simple as laws are this cool… what is next? :D

Footnotes

* For the sake of simplicity we use Promises; however, Promises are not functors. We would want to use Tasks instead.

** You might notice that the types don't match up in this example. We say we are returning HTML but are really returning an array of Tasks. We would normally sequence and flatten this array of Tasks.
