A summary of Scala Days 2019

Criteo Labs
Jul 17, 2019 · 6 min read

The Scala language is used extensively at Criteo, and the Recommendation codebase is no exception to this rule. Our Spark jobs, which compute recommendations from catalogues of 6 billion products and logs of user actions arriving at 1 million entries per second, are written in Scala. We also use it for a number of internal APIs implemented with the Finatra framework. Finally, we’ve even managed to wedge Scala into some legacy MapReduce jobs that were originally written in Java.

Given this affinity, our interest in the Scala Days conference is obvious. Moreover, we knew that this year the conference was celebrating its 10th anniversary and taking place in the very birthplace of Scala, the prestigious EPFL in Lausanne. We just had to go! So the entire Recommendation Infrastructure team set up camp for a week amid the lovely Swiss scenery.

Here is the imaginary discussion we had on the train taking us back to Paris:

Aurel: So guys, what talks did you like the most?

Clément: I really liked the talk Run Scala Faster with GraalVM on any Platform given by Vojin Jovanovic from Oracle. The topic was GraalVM, and in particular Graal’s JIT compiler and Native Image. One of my personal takeaways was that replacing HotSpot’s C2 JIT compiler with Graal provides a direct 7% performance improvement on the scalabench benchmark. Even more impressive was Graal’s Native Image feature, which converts JVM applications into native executables using ahead-of-time (AOT) compilation. The benefits on offer are blazing-fast application startup, reduced memory usage and ease of deployment. While not fully production-ready yet, Graal is getting closer to it and I’m looking forward to trying it on our Spark jobs!
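To give an idea of what Native Image involves, here is a minimal sketch: a plain Scala entry point and, in comments, the kind of native-image invocation that would compile it ahead of time. The exact commands, flags and output names vary across GraalVM versions, so treat it as illustrative only.

```scala
// A plain Scala entry point that could be compiled ahead-of-time with
// GraalVM's native-image tool. The commands below are illustrative;
// exact flags and output names depend on your GraalVM version.
//
//   scalac HelloNative.scala
//   native-image -cp .:$SCALA_HOME/lib/scala-library.jar HelloNative
//   ./hellonative   # starts in milliseconds, no JVM warm-up
object HelloNative {
  def main(args: Array[String]): Unit =
    println(s"Hello from a native image (args: ${args.mkString(", ")})")
}
```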

Aurel: I think Graal is also used at Twitter and there was a talk about this from Chris Thalinger, right?

Baptiste: Yes, I really enjoyed that one. Chris explained how Twitter uses the Graal compiler and machine learning to improve performance. About a year ago they started looking into how they can tune their JVM parameters in order to maximize the performance of their services. It’s quite a challenging problem at Twitter because they have hundreds of services written in Scala. The solution they found was to use Bayesian optimization to find a global optimum for certain JVM parameters with just a small number of iterations. For that purpose, they use a proprietary framework called AutoTune, built on top of Spearmint, which is an open-source project for Bayesian optimization from Harvard University. The last part of the talk was about the results they got on real Twitter services. In a nutshell, they gained about -12% CPU using the Graal compiler and another 6.2% by tuning the JVM parameters of one of their main services. At Twitter scale, that’s not big, it’s huge!
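To make the idea concrete, here is an illustrative sketch of such a black-box tuning loop in Scala. This is not Twitter’s AutoTune (which is proprietary): plain random search stands in for the Bayesian optimizer, and the flag search space and objective function are made up for the example.

```scala
// Illustrative only: the general shape of a black-box tuning loop over JVM
// flags. Random search stands in for the Bayesian optimizer (Spearmint).
import scala.util.Random

object JvmFlagTuning {
  // Hypothetical search space over two HotSpot flags.
  final case class Candidate(newRatio: Int, freqInlineSize: Int) {
    def asFlags: Seq[String] =
      Seq(s"-XX:NewRatio=$newRatio", s"-XX:FreqInlineSize=$freqInlineSize")
  }

  // Placeholder objective: a real setup would deploy the service with these
  // flags and measure CPU usage under production load.
  def measuredCpu(c: Candidate): Double =
    math.abs(c.newRatio - 3) * 0.8 + math.abs(c.freqInlineSize - 325) * 0.01

  def main(args: Array[String]): Unit = {
    val rnd = new Random(42)
    val candidates = Seq.fill(20) {
      Candidate(newRatio = 1 + rnd.nextInt(8), freqInlineSize = 100 + rnd.nextInt(400))
    }
    val best = candidates.minBy(measuredCpu)
    println(s"Best flags found: ${best.asFlags.mkString(" ")}")
  }
}
```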

Aurel: On the topic of performance optimizations, did you learn about other things that we might use?

David: Indeed we did! I attended a talk about Flare given by Tiark Rompf from Purdue University. This project started a few years ago and attempts to close the performance gap between Spark and what can be obtained with native code. For a simple aggregation query, the initial observation is that Spark is far slower than hand-written C code. The idea behind Flare is therefore to generate native code from a Spark SQL execution plan. Multiple levels of optimization have been implemented. For instance, a first level generates native code from the Catalyst execution plan and executes it through the Java Native Interface (JNI), while still running within the Spark RDD framework to benefit from its fault-tolerance mechanism. A second level skips the RDD context entirely, but honestly, I’m a bit sceptical about that one: when operating on a large cluster (like we do at Criteo), failures happen all the time for a wide variety of reasons.

In any case, according to benchmarks run by the authors, Flare consistently exhibits 10x performance improvements across a variety of datasets. Moreover, a significant portion of the methods available in the Spark SQL API seems to be supported. The downside is that Flare, which has been under active development since 2016, is still in a private beta program. But I think we should contact the authors to give it a go!
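For reference, this is the kind of simple aggregation query at stake: a few lines of Spark SQL whose Catalyst plan Flare would compile down to native code. The dataset path and column names below are invented for illustration.

```scala
// A simple Spark SQL aggregation; Flare's idea is to generate native code
// from the Catalyst plan such a query produces.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum

object SimpleAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("simple-agg").getOrCreate()
    val sales = spark.read.parquet("/data/sales.parquet") // hypothetical dataset
    val totals = sales.groupBy("country").agg(sum("amount").as("total_amount"))
    totals.explain() // the Catalyst plan that a tool like Flare would compile
    totals.show()
    spark.stop()
  }
}
```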

Aurel: If I remember correctly, there was also a talk about doing some more generic optimizations on Scala code at compile time.

Raphael: Yes, that was the talk Double your performance: Scala’s missing optimizing compiler given by Li Haoyi from Databricks. In it, Li explained how they parse the code into a dataflow graph, which they use to perform a kind of coarse-grained run of the code over a type lattice. The lattice resembles a type tree following the inheritance hierarchy, but it can also distinguish, for instance, between an integer that never changes and one that varies. This lets them identify which branches can be eliminated, essentially by detecting conditions that are always true or always false in some parts of the code. Maybe “double your performance” is a bit of an oversell, and I think the actual speedup would vary a lot depending on the kind of code you run, which would have to be balanced against the increased compilation time. It was nonetheless a very interesting and accessible talk in compiler territory, rich in examples and explanations. What they do is made possible by Scala’s very strong typing, but it may be applicable to other strongly typed languages as well. That’s left as an exercise for the reader.
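Here is a toy example (my own illustration, not one from the talk) of the kind of dead branch such an analysis can remove once the dataflow graph knows a value is a constant.

```scala
// Toy example of branch elimination: `limit` is never reassigned, so the
// analysis can treat it as a constant and prove the `else` branch is dead.
object BranchElimination {
  def clamp(values: Seq[Int]): Seq[Int] = {
    val limit = 100                        // a "constant" node in the lattice
    values.map { v =>
      if (limit > 0) math.min(v, limit)    // condition is always true
      else v                               // dead branch, candidate for removal
    }
  }

  def main(args: Array[String]): Unit =
    println(clamp(Seq(5, 150, 42)))        // List(5, 100, 42)
}
```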

Aurel: It seems that you guys really enjoyed the talks! What about the other events related to the conference?

Kacem: Indeed, there were three days of talks and workshops on a wide range of topics organized across different tracks, so I think everyone was able to find an appealing talk for any given slot. The keynotes were also very interesting, with the opening one of course given by Dr. Martin Odersky, the “father” of Scala. Finally, I think a lot of attendees also appreciated the organizers’ vision of inclusiveness and the multiple talks on the topic of diversity (even if the special panel about this was a bit long).

On the organisation side, it was almost perfect. I felt the room allocation could have been better, as some very popular talks were scheduled in small rooms, but overall it was fine since this only happened for a couple of talks. A special mention goes to the Community Dinner hosted at the Olympic Museum. Everyone enjoyed the visit, and the cocktail/dinner was great, with an astonishing view of Lake Geneva and the French Alps.

Charles: The venue for the conference itself was also very nice. The SwissTech Convention Center is part of the EPFL campus, which we got to visit. This reminded me of my college years, as I did a semester at EPFL when I was studying computer science. Knowing the campus, I was also able to show the famous Scala stairs to the entire team. In case you didn’t know, the Scala logo is inspired by the staircase of one of the buildings at EPFL.

Taking some perspective on Scala

All in all, attending the Scala Days conference was an incredibly enriching experience for our team. We feel more connected with the trends and the state of the art of the Scala ecosystem, and we came away with lots of ideas to try out in our codebase. If some of them work out, we look forward to sharing them with you next year!

The Recommendation Infrastructure team at Criteo AI Lab

Aurel, Baptiste, Charles, Clément, David, Kacem and Raphaël

