As a computer scientist, I know two fun facts.

One, insertion into a binary heap is O(lg(n)). And two, if you’re moderately clever, you can implement a structure like a binomial heap, which has O(1) amortized insertion time.
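(For the uninitiated, binary-heap insertion appends to a flat array and swaps the new element up while it beats its parent. A minimal sketch, just so we’re all picturing the same thing:)

```python
def heap_insert(heap, value):
    """Insert into a binary min-heap stored as a flat list.

    The parent of index i lives at (i - 1) // 2.
    """
    heap.append(value)                 # start at the bottom
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] <= heap[i]:
            break                      # heap property restored
        heap[i], heap[parent] = heap[parent], heap[i]  # swap up one level
        i = parent                     # worst case: climb all lg(n) levels
```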

As you can tell, I’m great fun at parties.

Also, I don’t get invited to many parties. So instead I implemented a binomial heap. And by “implemented a binomial heap”, I mean “fell asleep crying with a bottle of wine”. But eventually, I woke up to an implemented binomial heap.
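In the interest of full disclosure, here’s roughly what binomial-heap insertion looks like. This is a minimal sketch, not my tear-stained original: the heap is a list of binomial trees, one per rank, and inserting a key works like adding 1 to a binary number.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.children = []     # child trees, one of each smaller rank

def link(a, b):
    """Merge two binomial trees of equal rank; the smaller key wins the root."""
    if b.key < a.key:
        a, b = b, a
    a.children.append(b)
    return a

def binomial_insert(trees, key):
    """Insert key into a binomial heap stored as trees[rank] (or None).

    Like incrementing a binary counter: a carry propagates while a tree
    of the current rank already exists. This is why insertion is O(1)
    only in the amortized sense.
    """
    carry, rank = Node(key), 0
    while rank < len(trees) and trees[rank] is not None:
        carry = link(carry, trees[rank])
        trees[rank] = None
        rank += 1
    if rank == len(trees):
        trees.append(carry)
    else:
        trees[rank] = carry
```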

Now, you might imagine that my efforts would be rewarded. Well, we live in a cruel world.

For example, suppose we insert random numbers into our binary heap and our binomial heap. One might expect (or, having spent hours building a binomial heap, hope) that the binary heap runs into trouble. After all, we might have to swap the new element all the way up to the root if it’s the smallest the heap has seen so far.

But, to my surprise, the binary heap performed each insertion in O(1) time on average. Why? Well, an element is only swapped up if it is smaller than its parent. And since every element in the heap is a random number, there’s roughly a fifty-fifty chance that the new element swaps up at each level.
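(If you’d like to watch the fifty-fifty argument play out, here’s a hypothetical rerun of the experiment: the same sift-up as before, instrumented to count swaps.)

```python
import random

def insert_counting_swaps(heap, value):
    """Binary-heap sift-up, instrumented to return the number of swaps."""
    heap.append(value)
    i, swaps = len(heap) - 1, 0
    while i > 0 and heap[i] < heap[(i - 1) // 2]:
        parent = (i - 1) // 2
        heap[i], heap[parent] = heap[parent], heap[i]
        i, swaps = parent, swaps + 1
    return swaps

heap, total, n = [], 0, 1_000_000
for _ in range(n):
    total += insert_counting_swaps(heap, random.random())
print(f"average swaps per insert: {total / n:.3f}")  # hovers around a small constant
```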

We then find (summing the geometric series 1/2 + 1/4 + 1/8 + ⋯ = 1) that the expected number of swaps up is about 1, and the expected number of comparisons about 2. In other words, insertion on a binary heap is O(1) in expectation.

It’s only when you insert in decreasing order that our binary heap begins to struggle: every new element is a new minimum and must swap all the way up to the root. But even then, the fast array access of our binary heap beats following the breadcrumb trail of pointers that our binomial heap leaves strewn about our RAM.
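For completeness, the adversarial rerun, reusing insert_counting_swaps from the sketch above:

```python
heap, total, n = [], 0, 1 << 16
for value in range(n, 0, -1):      # insert in decreasing order
    total += insert_counting_swaps(heap, value)
# Every insertion is a new minimum and climbs to the root, so the
# average swap count grows like lg(n) instead of staying constant.
print(f"average swaps per insert: {total / n:.2f}")
```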

Basically, keep it simple, stupid.