Julia vs Python: Which Is Faster for Deep Learning?

Ever tried a different programming language for AI?

Editorial @ TRN
The Research Nest
Jul 17, 2020


Photo by Safar Safarov on Unsplash

Are you an AI enthusiast?

Yes?

Then answer this question: Which programming language do you use for developing stuff for AI?

Your answer is Python. See! We knew it.

Alright, we are not magicians, but it is a well-known fact. Python is one of the most popular languages right now, not just in machine learning but also for general development. As of October 2019, over 8.2 million developers were using Python, which is about 1 million more than Java and over 6 million more than Swift.

Python ranks first when it comes to development in the domain of artificial intelligence.

To put it simply, most enthusiasts don't even know of a language other than Python (and perhaps R) that can be used for machine learning. And that's largely because it seems daunting to code ML algorithms in C or Java.

Python is used widely due to its simplicity and community. But we all know one big drawback of Python.

Yep. The speed of execution.

Pure Python can be hundreds of times slower than C++ on raw, loop-heavy benchmarks. In machine learning we largely get around this by using libraries written in more efficient languages like C.

Still, some overhead remains, and Python is a memory hog.

So, what can we do? We can’t get the best of both worlds!

Well, before you jump to conclusions, let us show you what the folks at MIT have to offer.

Introducing Julia.

No, it’s not another humanoid or chatbot. It is a programming language.

Just like Python, Julia can be used for general-purpose programming, but many of its features are tailored to scientific computing and numerical analysis. It first appeared in 2012, and its latest stable release (at the time of writing) came out in 2020.

But what does Julia have to offer? Well, first of all, Julia is fast. Just like Java, Julia uses a just-in-time (JIT) compiler. Secondly, Julia is easier to learn than other computationally efficient languages. And before you ask: yes, Julia's syntax is about as simple as Python's.
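A quick way to see the JIT at work (a toy sketch, not part of the experiment below): the first call to a function includes compilation time, while later calls run the already compiled code.

```julia
# Toy illustration of Julia's JIT: the first call compiles, later calls don't
square_sum(x) = sum(x .^ 2)

x = rand(10^6)
@time square_sum(x)   # includes one-time JIT compilation
@time square_sum(x)   # runs the already compiled code, noticeably faster
```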

But is it better than Python? If yes, then in what terms?

Let’s conduct an experiment and see.

We will train a simple CNN in both Python and Julia, using the most stable and efficient CNN implementations available in each language, with the same architecture.

For Python, we are using TensorFlow, and for Julia, we are using Flux.jl, which is a pure-Julia stack. You can refer to the official documentation of both frameworks if you want to conduct the experiment yourself.

We are going to check the time taken for each training epoch as well as the time taken for the pre-processing tasks.

We are going to train both CNNs on the MNIST dataset and compare the results. You can try a different dataset and see whether our observations hold there too.
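For reference, here is a minimal sketch of what a small CNN looks like in Flux.jl. The layer sizes below are illustrative assumptions for 28x28 MNIST images, not necessarily the exact architecture used in our experiment; the TensorFlow model mirrors it with the equivalent Keras layers.

```julia
using Flux

# A small CNN for 28x28x1 MNIST images (illustrative layer sizes)
model = Chain(
    Conv((5, 5), 1 => 8, relu),        # 28x28x1 -> 24x24x8
    MaxPool((2, 2)),                   # -> 12x12x8
    Conv((5, 5), 8 => 16, relu),       # -> 8x8x16
    MaxPool((2, 2)),                   # -> 4x4x16
    x -> reshape(x, :, size(x, 4)),    # flatten to 256 features per image
    Dense(256, 10),                    # 10 digit classes
    softmax
)
```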

To calculate execution times in Python, we use the time module. However, it is only needed for the pre-processing step, since the per-epoch training time is printed in the output of model.fit() itself.
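For instance, a minimal sketch of the Python side (the pre-processing shown is just a placeholder for whatever you actually do):

```python
import time

start = time.time()
# ... pre-processing: load MNIST, normalize, reshape, one-hot encode ...
print(f"Pre-processing took {time.time() - start:.2f} seconds")

# model.fit(...) then reports the time taken per epoch on its own
```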

For Julia, we will be using a very useful macro: @time. We apply it to the overall pre-processing code as well as to the train function we call for each epoch.

Something like this:
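A minimal sketch of that (the comments stand in for whatever pre-processing you actually run):

```julia
# Measure the whole pre-processing stage as a single block
@time begin
    # load MNIST, one-hot encode the labels, build mini-batches ...
end
```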

When timing a single step, we can just use the macro directly as a prefix to the call.
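For instance, something like the line below, assuming loss, train_data, and opt are the loss function, data iterator, and optimiser defined earlier (this uses the implicit-parameters Flux training API that was current at the time):

```julia
# Time one call to the training loop, i.e. one epoch
@time Flux.train!(loss, Flux.params(model), train_data, opt)
```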

Don’t worry if you can’t comprehend the Julia syntax yet. Just understand that we are trying to measure the execution times.

Some Results

The specifications of the system used for the experiment:

Intel Core i7-8550U, 16 GB RAM

  • The overall pre-processing time of Python was 2.6 seconds.
  • The overall pre-processing time of Julia was 4.2 seconds, including the time for the pre-compilation of the model.
  • The training time in Python was 20 seconds per epoch.
  • The training time in Julia was 16 seconds per epoch.

Now you may think that the difference is not big enough to call Julia meaningfully more efficient.

Well, there is one crucial difference in how the two runs used the hardware. While executing the Python code, the overall CPU utilization was 87 percent. While executing the Julia code, it was only 18 percent.

So, if we use parallel computation in Julia and take full advantage of the CPU, we can get the time per epoch as low as 3.6 seconds.

Yes, we can see that Julia is pretty fast in comparison to Python. But that doesn’t necessarily mean that it is going to take over Python anytime soon.

The reason Python is so widely adopted is its huge community, and Julia's community is nowhere near as big yet. Another point is that Julia doesn't use the full computational power of the machine by default; you need to launch it in a specific way and use certain constructs to make it do that.
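For example, one common way to give Julia more of the CPU is to start it with multiple threads and parallelise the independent parts of the work. This is only a rough sketch of the idea, not the exact setup we used to reach the 3.6-second figure:

```julia
# Julia runs single-threaded unless told otherwise. Set the
# JULIA_NUM_THREADS environment variable before launching Julia
# (newer versions also accept a --threads flag).

println(Threads.nthreads())   # how many threads this session has

# Toy example of spreading independent work across the available threads
results = zeros(8)
Threads.@threads for i in 1:8
    results[i] = sum(rand(1_000) .^ 2)   # placeholder for real per-chunk work
end
```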

Python is where it is today because of its simplicity and the ecosystem of libraries and modules built around it, and it has been with us for a long time. Julia is still young, so we may not be seeing its full potential yet. But it does show promise of becoming one of the most used languages for numerical and data-heavy computation.

Julia is faster. Julia is as simple as Python. The only question is how quickly the developer community adopts it and makes it even better. Is it really worth making that shift? We will probably need more data and case studies to answer that.

We hope this article sparks some curiosity to look at languages beyond Python for AI.


Editorial note:

This article was conceptualized by Aditya Vivek Thota and written by Dishant Parikh of The Research Nest.

Stay tuned for more such insightful content with a prime focus on artificial intelligence!
