Combining Arrays in JavaScript

Editor’s Note: This is a repost of a blog post by Akinjide Bankole, a 21-year-old developer from Lagos who joined Andela in May of 2015. Currently working as a backend developer with Atlas Money, a fintech startup focused on emerging markets, Akinjide began teaching himself to code after secondary school and is especially passionate about design.

As of late, I’ve been spending a fair amount of time learning new JavaScript techniques.

Three days ago, I was studying some Array methods. My idea was simple: I wanted a simple concatenation of two Arrays so that I could later use it for some graph data. But it turned out to be more frustrating than expected.

Let’s start with this scenario:

var numeric = [1, 2, 3, 4, 5, 6, 7, 8];
var alpha = ['foo', 'bar', 'baz', 'bar', 'bar', 'baz'];

The concatenation of numeric and alpha would obviously be:

[
  1, 2, 3, 4, 5, 6, 7, 8, 'foo', 'bar', 'baz', 'bar', 'bar', 'baz'
];

After taking a look at this, I realized that the simplest way to concatenate would be to use the Array.prototype.concat() method, which merges the two Arrays into a new one sequentially.

So, I sat down and wrote a first version that looked something like this:

var new_array = numeric.concat(alpha);

This is some pretty basic stuff:

  • numeric and alpha are defined? CHECK!
  • new_array, a whole new Array representing the combination of numeric and alpha? CHECK!

But to my dismay, I noticed that this small program could eat all the RAM on my laptop — what if the values above were centupled?!
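The memory pressure comes from the fact that concat() builds a brand-new Array and leaves both originals intact, so for a moment all three are alive at once. A quick check, using the same small Arrays from above:

```javascript
var numeric = [1, 2, 3, 4, 5, 6, 7, 8];
var alpha = ['foo', 'bar', 'baz', 'bar', 'bar', 'baz'];

var new_array = numeric.concat(alpha);

// concat() does not mutate its operands:
console.log(numeric.length);   // 8 — unchanged
console.log(alpha.length);     // 6 — unchanged
console.log(new_array.length); // 14 — a third Array, held alongside the others
```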

No problem! I’ll just unset numeric and alpha so they’re garbage collected, right? Problem solved!

numeric = alpha = null;

Meh, I realized that for only a couple of small Arrays, this would be fine. But I figured that when working with large Arrays and repeating this process a lot, or working in a memory-limited environment, this method leaves a lot to be desired.

So, being confused about what to do, I decided to dig a bit deeper. I narrowed my idea down to looped insertion. OK, let’s just append one Array’s contents onto the other, using the Array.prototype.push() method:

for (var i = 0; i < alpha.length; i++) {
  numeric.push(alpha[i]);
}

Now, numeric has the result of both the original numeric plus the contents of alpha. Better for memory, it would seem.

But what if numeric was minuscule and alpha was comparatively colossal? For both memory and speed reasons, you’d probably want to insert the smaller numeric onto the front of alpha rather than push the colossal alpha onto the end of numeric. No problem, I’ll just replace the Array.prototype.push() method with Array.prototype.unshift() and loop in the opposite direction:

for (var i = numeric.length - 1; i >= 0; i--) {
  alpha.unshift(numeric[i]);
}
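For clarity, here is the prepend version end-to-end, starting again from the original Arrays — walking numeric backwards so its elements land on the front of alpha in their original order:

```javascript
var numeric = [1, 2, 3, 4, 5, 6, 7, 8];
var alpha = ['foo', 'bar', 'baz', 'bar', 'bar', 'baz'];

// Iterate numeric from the end so each unshift() puts the
// current element in front of the previously prepended one.
for (var i = numeric.length - 1; i >= 0; i--) {
  alpha.unshift(numeric[i]);
}

console.log(alpha);
// [1, 2, 3, 4, 5, 6, 7, 8, 'foo', 'bar', 'baz', 'bar', 'bar', 'baz']
```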

Unfortunately, this still isn’t very efficient — every unshift() call has to shift all of alpha’s existing elements down to make room, and loops like these are ugly and harder to maintain. Can we do any better?

So then I tried the following:

numeric = alpha.reduce(function(prev, curr) {
  prev.push(curr);
  return prev;
}, numeric);

alpha = numeric.reduceRight(function(prev, curr) {
  prev.unshift(curr);
  return prev;
}, alpha);
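For instance, running just the reduce() version on fresh copies of the Arrays gives the same combined result as the push() loop, mutating numeric in place:

```javascript
var numeric = [1, 2, 3, 4, 5, 6, 7, 8];
var alpha = ['foo', 'bar', 'baz', 'bar', 'bar', 'baz'];

// `numeric` is the seed accumulator; each element of `alpha` is
// pushed onto it in order, and the same Array is threaded through.
numeric = alpha.reduce(function(prev, curr) {
  prev.push(curr);
  return prev;
}, numeric);

console.log(numeric);
// [1, 2, 3, 4, 5, 6, 7, 8, 'foo', 'bar', 'baz', 'bar', 'bar', 'baz']
```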

Array.prototype.reduce() and Array.prototype.reduceRight() are nice, and they read better than the raw loops. A tempting shortcut is to do the whole append in a single call by spreading the Array out as arguments — numeric.push.apply(numeric, alpha) — but the first major problem there is that we’re effectively doubling the size (temporarily, of course!) of the thing being appended, by essentially copying its contents to the stack for the function call. Moreover, different JavaScript engines have different implementation-dependent limits on the number of arguments that can be passed.

So, if the added Array had a million items in it, you’d almost certainly exceed the size of the stack allowed for that push() or unshift() call. Ugh. It’ll work just fine for a few thousand elements, but you have to be careful not to exceed a reasonably safe limit.
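One way to stay under that limit is to append in chunks, so no single push() call spreads more arguments onto the stack than any engine allows. This is a sketch — combineInto and the 5000 chunk size are my own illustrative choices, not anything from a standard:

```javascript
// Append `source` onto `target` in fixed-size chunks.
// CHUNK = 5000 is an arbitrary "reasonably safe" size, not a spec limit.
function combineInto(source, target) {
  var CHUNK = 5000;
  for (var i = 0; i < source.length; i += CHUNK) {
    // apply() spreads at most CHUNK elements per call
    target.push.apply(target, source.slice(i, i + CHUNK));
  }
  return target;
}

// A million-element Array — far beyond safe single-call apply() territory.
var big = [];
for (var i = 0; i < 1000000; i++) {
  big.push(i);
}

var combined = combineInto(big, ['start']);
console.log(combined.length); // 1000001 — no stack overflow
```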

Keep up with Akinjide on Twitter, Github, and his blog.