Announcing Optuna 1.0

hvy
Jan 14, 2020 · 4 min read

Optuna, a hyperparameter optimization (HPO) framework for machine learning written in Python, is seeing its first major release. It aims to make HPO more accessible as well as scalable, for new and experienced practitioners alike. The GitHub project has grown to accommodate a community of developers by providing state-of-the-art algorithms, and we are proud to see the number of users increasing. We now feel that the API is stable and ready for real-world applications.

A comparison of the test error over time, measured in minutes, between hyperparameter configurations obtained by Optuna and an existing framework, both training a simplified AlexNet on the SVHN dataset [1]. The error decreases significantly faster using Optuna.

Concept

  • Eager dynamic search spaces: Pythonic search spaces built from ordinary Python conditionals and loops, with no need for a domain-specific language (DSL); see the first sketch after this list
  • Efficient sampling and pruning algorithms: State-of-the-art algorithms to efficiently explore large search spaces and prune unpromising trials to save computational resources
  • Integrations: Modules for popular machine learning libraries such as PyTorch, TensorFlow, Keras, FastAI, scikit-learn, LightGBM, and XGBoost
  • Visualizations: Customizable visualizations of optimizations with a single function call
  • Distributed optimization: Optimizations that can be parallelized across threads or processes without modifying the code; see the second sketch below
  • Lightweight: Simple installation with few requirements
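
To make the first point concrete, here is a minimal sketch of a define-by-run objective, in the spirit of the examples in the Optuna documentation. Which parameters exist depends on an earlier suggestion, so the search space is just ordinary Python; the scikit-learn dataset and models are only illustrative choices.

```python
import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.model_selection
import sklearn.svm


def objective(trial):
    iris = sklearn.datasets.load_iris()
    x, y = iris.data, iris.target

    # The search space is plain Python: this conditional decides which
    # hyperparameters are even sampled, depending on an earlier suggestion.
    classifier_name = trial.suggest_categorical("classifier", ["SVC", "RandomForest"])
    if classifier_name == "SVC":
        svc_c = trial.suggest_loguniform("svc_c", 1e-10, 1e10)
        classifier_obj = sklearn.svm.SVC(C=svc_c, gamma="auto")
    else:
        rf_max_depth = trial.suggest_int("rf_max_depth", 2, 32)
        classifier_obj = sklearn.ensemble.RandomForestClassifier(max_depth=rf_max_depth)

    # Maximize mean cross-validation accuracy.
    return sklearn.model_selection.cross_val_score(classifier_obj, x, y, cv=3).mean()


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=100)
print(study.best_params)
```

Pruning plugs into the same Trial object: an iterative training loop can report intermediate values with trial.report(value, step) and raise optuna.exceptions.TrialPruned when trial.should_prune() returns True, so unpromising trials stop early.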

Try it out on Google Colab
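
The distributed optimization bullet works similarly without code changes: pointing several workers at a shared storage backend is all that is needed. The study name and SQLite URL below are assumptions chosen for the sketch, not anything prescribed by Optuna.

```python
import optuna


def objective(trial):
    # Toy objective; in practice this would train and evaluate a model.
    x = trial.suggest_uniform("x", -10, 10)
    return (x - 2) ** 2


# Every process that runs this script with the same study name and storage
# URL joins the same study; trials are coordinated through the storage.
study = optuna.create_study(
    study_name="distributed-example",  # illustrative name
    storage="sqlite:///example.db",    # illustrative storage URL
    load_if_exists=True,
)
study.optimize(objective, n_trials=100)
```

Running the same script in several terminals or on several machines simply multiplies the workers, and a single call such as optuna.visualization.plot_optimization_history(study) can then visualize the shared study.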

Looking back

Existing frameworks were often not kept up to date with more recent algorithms. We started by implementing random search and the Tree-structured Parzen Estimator (TPE) [2], both known to be performant hyperparameter sampling algorithms. Later in 2018, we used this TPE implementation in the first Google AI Open Images challenge on Kaggle and took second place, finishing a mere 0.04% behind the winner.

The increasing popularity of machine learning underscored the need for a well-designed HPO framework, so in December 2018 we made Optuna public on GitHub.

In the following year, we participated in multiple conferences, starting in July with SciPy 2019, a conference for the scientific Python community. You can check out the video of our presentation here. In August, we presented Optuna at the Knowledge Discovery and Data Mining conference (KDD) and introduced the ability for users to define their own sampling algorithms and pruning criteria. We were proud to take 5th place out of over 160 competitors in the AutoML competition. More recently, in October, we presented Optuna at the Open Data Science Conference (ODSC) West 2019, a large data science conference.

What’s ahead

Visit GitHub for more details. For an in-depth explanation of Optuna's features, please head over to the documentation or our paper [1]. We also have Gitter chat rooms for interactive discussions, which you can join here.

Contributors

A03ki, AnesBenmerzoug, Crissman, HideakiImamura, Muragaruhae, Rishav1, RossyWhite, Y-oHr-N, c-bata, cafeal, chris-chris, crcrpar, d1vanloon, djKooks, dwiel, g-votte, harupy, henry0312, higumachan, hvy, iwiwi, momijiame, mottodora, nmasahiro, oda, okapies, okdshin, r-soga, scouvreur, sfujiwara, sile, smly, suecharo, tadan18, tanapiyo, tohmae, toshihikoyanase, upura and yutayamazaki.

Our thanks to everyone who has contributed code, issues, or feedback to Optuna!

References

[1] Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. Optuna: a next-generation hyperparameter optimization framework. In KDD, 2019.

[2] James Bergstra, Rémi Bardenet, Yoshua Bengio, and Balázs Kégl. Algorithms for hyper-parameter optimization. In NIPS, pages 2546–2554, 2011.

[3] Lisha Li, Kevin Jamieson, Giulia DeSalvo, Afshin Rostamizadeh, and Ameet Talwalkar. Hyperband: a novel bandit-based approach to hyperparameter optimization. arXiv preprint arXiv:1603.06560, 2016.
