An Optimized Queue with Big O notation

yves songolo
5 min read · May 24, 2018


If you’re reading this, I imagine you’re familiar with basic abstract data structures like queues and stacks. You may understand them intuitively (everyone’s stood in a line or eaten a sandwich before), but do you know the mechanics behind how they work?

In this article, I will explain how queues work and how to implement one for optimal runtime and programmatic efficiency. This article is oriented toward beginner-level students of data structures; a good understanding of linked lists is preferred. (If you want to learn more about linked lists, read this article by .)

What is a Queue

Without using too much technical jargon, a queue is an abstract data structure that follows the principle of “First In, First Out,” or FIFO for short. Well, what does that really mean? Put simply, a queue works like a restaurant line: first come, first served. It's as simple as that.


What kinds of apps use queues?

You may have encountered several web and mobile applications that utilize queues in their core algorithms. Some apps, such as Spotify and Apple Music, use queues to let you add new music to your favorite playlist or collection in the order of your personal interest. When you start listening (unless you shuffle the order), the first song in the list will be the first to play.

Let’s rock’n roll

Structure

For this tutorial, we'll implement our queue with a simple linked list for optimal algorithmic runtime. (If you want a refresher on linked lists, click here: [INCLUDE LINK HERE].)
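To ground the discussion, here is a minimal sketch of the building blocks in Python. The class and attribute names (`Node`, `LinkedQueue`, `head`, `tail`) are my own choices for illustration; your naming may differ:

```python
class Node:
    """A single linked-list node holding one queue item."""
    def __init__(self, data):
        self.data = data
        self.next = None  # the node behind this one in the queue


class LinkedQueue:
    """A queue backed by a singly linked list with head and tail pointers."""
    def __init__(self):
        self.head = None  # front of the queue (removed first)
        self.tail = None  # back of the queue (added last)
        self.size = 0
```

Keeping both a head and a tail pointer is what lets us add and remove items without walking the whole list.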

First In, First Out (FIFO)

There are many ways to implement queues; two of the most common are linked lists and array lists. FIFO, as we discussed before, refers to the structured order of data in a queue: the items that enter the queue first are the items that are removed first. Sounds simple, right?

Big O Notation

Big O Notation is a simple way to explain algorithmic runtime.

Everybody loves when their algorithm takes less than a second to run, right?

O(1) or Order 1

O(1) runtime represents a programmatic action whose cost stays the same no matter how large the input is; we call this ‘constant runtime’. For instance, calculating 1 + 1 should NOT take more than a second. (If it does, you should probably throw away your calculator and try a little harder. 💁🏾 Just saying!)
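As a tiny illustration, grabbing the first element of a list takes one step regardless of the list's size (the function name `peek_first` is just for this example):

```python
def peek_first(items):
    """Return the first element of a list: one step, no matter the size. O(1)."""
    return items[0]
```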

O(n) or Order n

O(n) represents how an algorithm performs when it iterates through a list, where n is the length of the list. Consider a line of ten people: it will be pretty easy for you to shake everyone’s hand. Now imagine the line has over 10,000 people (n = 10000). Shaking all those people’s hands is now a tricky task, right? Any loop, function, or code sample that must iterate through an entire set of data at least once has a runtime of at least O(n).
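The handshake idea maps directly to a single loop; the work grows in step with the length of the line (the function name `shake_every_hand` is illustrative):

```python
def shake_every_hand(line):
    """Greet each person once. The work grows linearly with the line: O(n)."""
    greetings = []
    for person in line:
        greetings.append(f"Hello, {person}!")
    return greetings
```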

O(n²) or Order n²

Here is where things become crazy! Imagine having two lines of ten people each, where each person in the first line must shake hands with every person in the second line. With ten people, things may work somewhat smoothly. But I’m sure you can guess how inefficient this becomes with large numbers of people or data. If you have over 10,000 data values, your algorithm may take a very long time to finish the operation depending on your processor speed. We call this inefficient runtime O(n²), or Order n². Nested for loops often have this unintended effect, especially when each loop iterates across an entire collection. (That’s why it’s wise to use nested loops only when absolutely necessary!)
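The two-lines scenario above is exactly a nested loop. With two lines of n people each, we perform n × n handshakes (the function name here is just for illustration):

```python
def all_pairs_handshakes(line_a, line_b):
    """Every person in line_a shakes hands with every person in line_b.

    With n people in each line, this does n * n handshakes: O(n^2).
    """
    handshakes = []
    for a in line_a:            # outer loop: n iterations
        for b in line_b:        # inner loop: n iterations per outer iteration
            handshakes.append((a, b))
    return handshakes
```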

If you want to learn more about Big O Notation, check out this article: [INSERT LINK HERE].

Since we want to optimize our queue to run as fast as possible on large datasets, our best case to aim for is implementing our algorithm with respect to O(1) runtime or constant runtime. Let’s see how close we can get!

Implementation

A standard queue has three major component processes that we can write as explicit functions: enqueue(), front(), and dequeue().

Enqueue

Enqueue means adding a new data element to the back of our queue. Our linked queue appends each new value at the tail, just as a new person joins the end of a line. (If we added new items at the head instead, removing from the head would give us a stack, not a queue.) With a tail pointer, this runs in O(1).
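Here is a minimal sketch of enqueue() in Python, assuming a queue that keeps both head and tail pointers so the append stays O(1). The class and attribute names are my own, not necessarily those of the original implementation:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None


class LinkedQueue:
    def __init__(self):
        self.head = None  # front of the queue
        self.tail = None  # back of the queue

    def enqueue(self, data):
        """Append a new node at the tail of the queue. O(1) with a tail pointer."""
        node = Node(data)
        if self.tail is None:       # empty queue: new node is both head and tail
            self.head = node
        else:
            self.tail.next = node   # link behind the current last node
        self.tail = node
```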

Front

Front refers to the method by which we grab the data value at the head of our queue without removing it. It’s important to note that front() only works if our queue is not empty. In this code sample, we return the data at the head of our queue. Runtime: O(1).
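A sketch of front() under the same assumed structure (a `LinkedQueue` with a `head` pointer; names are mine for illustration). Note the empty-queue guard:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None


class LinkedQueue:
    def __init__(self):
        self.head = None

    def front(self):
        """Return (without removing) the data at the head of the queue. O(1)."""
        if self.head is None:
            raise IndexError("front() called on an empty queue")
        return self.head.data
```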

Dequeue

Dequeue is the method that deletes and returns the first data element in our queue. So where enqueue() adds an item to our queue, dequeue() removes one. In code, we follow three main steps. First, we save the head’s data value. Then, we move the head pointer to the next node in the queue. Finally, we return the original head’s saved data. Runtime: O(1).
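The three steps above can be sketched as follows (same assumed `LinkedQueue` structure; names are mine). One extra detail worth handling: when the last item is removed, the tail pointer must be reset too:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None


class LinkedQueue:
    def __init__(self):
        self.head = None
        self.tail = None

    def dequeue(self):
        """Remove and return the data at the head of the queue. O(1)."""
        if self.head is None:
            raise IndexError("dequeue() called on an empty queue")
        data = self.head.data        # 1. save the head's data value
        self.head = self.head.next   # 2. move the head pointer forward
        if self.head is None:        # queue is now empty: reset the tail too
            self.tail = None
        return data                  # 3. return the saved data
```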

Conclusion

Queues are abstract data structures that work like lines of people. With the right implementation, a queue can handle structurally massive datasets better than naïve solutions. (Think about the line of people waiting for the brand-new Jordans versus the line waiting for the brand-new Skechers. Can your algorithm handle both?) Thank you for reading my article; if you liked it, you can check out my full queue implementation and other CS-related projects on my GitHub.
