Published in Analytics Vidhya

# Time Complexity: The (Not So) Hidden Secret to Writing Faster Python Programs

When I was a beginning web developer, it was a thrill just to get a program up and running. There were plenty of times when I did a fist pump in the air (to no one in particular) after getting a particularly frustrating problem to work.

But as you progress further on the journey, you may want to begin to think about complexity and challenges to scaling programs up.

As you get further as a programmer, you begin to think less about how you’re going to solve a particular problem and more about why you’re solving it one way or another.

Ok… I lied. You still think about how, but you have to think about the why, too.

Sure, your matrix program can handle a 3 x 3 square, but what if you triple that to 9 x 9, or even 90 x 90? Are the expressions you’ve written lean enough to withstand performing on a bigger scale, or will they slow your run time down or, worse, cause your entire program to crash? You wouldn’t want to wait until production to find that out. Better to write lean and mean code the first time out and play it safe.

I came up against just this sort of question recently on a problem I was doing for Exercism.io. The problem involved assigning different point values to particular letters and reporting the score of any given word, as you would do in a game of Scrabble:

| Letter | Value |
| --- | --- |
| A, E, I, O, U, L, N, R, S, T | 1 |
| D, G | 2 |
| B, C, M, P | 3 |
| F, H, V, W, Y | 4 |
| K | 5 |
| J, X | 8 |
| Q, Z | 10 |

At the end of the program, you need to return the complete score for a given word. For example, the word “PYTHON” would get a score of 14 (3+4+1+4+1+1).

My initial solution to this problem was to throw all of the above values into a dictionary. Each row in the ‘letter’ column of the above table would be its own key, like this:

```python
def score(word):
    letter_values = {
        "AEIOULNRST": 1,
        "DG": 2,
        "BCMP": 3,
        "FHVWY": 4,
        "K": 5,
        "JX": 8,
        "QZ": 10
    }
    score = [letter_values[key] for letter in word.upper()
             for key in letter_values if letter in key]
    return sum(score)
```

In this solution, score is a list built by a nested list comprehension. It iterates through each letter in word and, for each letter, through each key in letter_values, appending the value of whichever key contains that letter.

The function then returns the sum of all the values in this score list for a perfectly serviceable solution.

If we were never planning on scaling this up.

You see, the nifty list comprehension that’s deployed in the above solution is pretty complex. That is, it calls on the computer to do a lot of iterating to get at the right solution. Here it is unraveled:

```python
score = []
for letter in word.upper():
    for key in letter_values:
        if letter in key:
            score.append(letter_values[key])
```

This is fine, but what if the word is 1,000,000 letters long? (I don’t think any language has a word that long, not even German, but just go with it.) That would require a lot of iterating, as the computer goes letter by letter through our million-letter word and, for each letter, iterates through each of the keys above to locate just the right one, then plugs in the corresponding value. It kind of makes me exhausted just thinking about it.
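To get a feel for how that work adds up, here’s a rough sketch of mine (the count_checks helper is not part of the original solution, just instrumentation) that counts how many key membership tests the nested approach performs as the word grows:

```python
# The grouped-key dictionary from the original solution.
letter_values = {
    "AEIOULNRST": 1, "DG": 2, "BCMP": 3, "FHVWY": 4,
    "K": 5, "JX": 8, "QZ": 10,
}

def count_checks(word):
    # Hypothetical helper: tally every "letter in key" test the
    # nested loop performs (the comprehension checks every key
    # for every letter, so we count all of them).
    checks = 0
    for letter in word.upper():
        for key in letter_values:
            checks += 1  # one membership test per key, per letter
    return checks

for n in (6, 600, 60_000):
    word = "PYTHON" * (n // 6)  # build a longer and longer "word"
    print(len(word), count_checks(word))
# → 6 42
# → 600 4200
# → 60000 420000
```

Every extra letter costs seven more key checks, and each check itself scans the characters of a key string, so the total work grows with both the word length and the size of the keys.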

The concept at play here is time complexity.

Time complexity is the quantification of how long it takes an algorithm to process or run as a function of the amount of input. Basically, it’s the measurement of how efficient your program is at runtime.

To be honest, I’m not entirely sure how much slower this program would run with a one-million-letter word; I’ve never tried it. But it is an inefficient program that can be refactored, and this is a good opportunity to explore the concept and make sure the code we write anticipates these kinds of issues. After all, an ounce of prevention is worth a pound of cure, as one printing press dude once said. (Incidentally, apparently he was referring to fire prevention when he said this? Thank you, internet.)
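If we did want to try it, Python’s built-in timeit module makes the experiment easy. Here’s a sketch (the word sizes and repeat count are arbitrary choices of mine, not from the original exercise) that times the grouped-key version on longer and longer inputs:

```python
import timeit

def score(word):
    # The original dictionary-of-grouped-keys solution.
    letter_values = {
        "AEIOULNRST": 1, "DG": 2, "BCMP": 3, "FHVWY": 4,
        "K": 5, "JX": 8, "QZ": 10,
    }
    return sum(letter_values[key]
               for letter in word.upper()
               for key in letter_values if letter in key)

for n in (1_000, 10_000, 100_000):
    word = "PYTHON" * (n // 6)  # repeat to simulate a very long word
    elapsed = timeit.timeit(lambda: score(word), number=10)
    print(f"{len(word):>6} letters: {elapsed:.4f}s")
```

On my assumption, the printed times should grow roughly in proportion to the word length, which is exactly what “linear in the input size” means in practice.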

So the million dollar (or million letter) question is: how to refactor this so the program is more efficient?

For starters, we could use more lines of code. It may seem counterintuitive, but by breaking each grouped key up into its own letter, giving us 26 single-letter keys, we actually help the program move a lot quicker: the entire inner loop over keys disappears from the list of tasks the computer has to perform.

We can then simply look up each letter’s value directly at its own key, like so:

```python
def score(word):
    letter_values = {
        'A': 1, 'E': 1, 'I': 1, 'O': 1, 'U': 1,
        'L': 1, 'N': 1, 'R': 1, 'S': 1, 'T': 1,
        'D': 2, 'G': 2,
        'B': 3, 'C': 3, 'M': 3, 'P': 3,
        'F': 4, 'H': 4, 'V': 4, 'W': 4, 'Y': 4,
        'K': 5,
        'J': 8, 'X': 8,
        'Q': 10, 'Z': 10
    }
    score = [letter_values[letter] for letter in word.upper()]
    return sum(score)
```

This program returns the correct value for each letter by plugging letter_values[letter] into the score list.

Instead of iterating through both the word and the keys of the letter_values dictionary, we’re just iterating through the letters in word and plugging them directly into the dictionary to elicit the values.
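As a side note, if typing out all 26 entries by hand feels tedious, the grouped dictionary could also be expanded programmatically with a dict comprehension. This is a sketch of mine, not part of the original solution:

```python
# The compact grouped table from the first version.
grouped = {
    "AEIOULNRST": 1, "DG": 2, "BCMP": 3, "FHVWY": 4,
    "K": 5, "JX": 8, "QZ": 10,
}

# Expand each grouped key into one entry per letter: {'A': 1, 'E': 1, ...}
letter_values = {letter: value
                 for letters, value in grouped.items()
                 for letter in letters}

def score(word):
    return sum(letter_values[letter] for letter in word.upper())

print(score("PYTHON"))  # 14
```

The expansion runs once, when the dictionary is defined, so every per-letter lookup afterwards is still a single constant-time dictionary access.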

It’s a lot simpler and a more elegant solution. And my guess is that if our computers could talk, they would thank us for reducing the time complexity in their lives.



## Dan Eder

Full Stack Web Developer, polyglot, songwriter, always learning