Microsoft Releases NNI V1.3 for AutoML Algorithms and Training

Synced
Jan 7 · 4 min read

Applying traditional machine learning methods to real-world problems can be extremely time-consuming. Automated machine learning (AutoML) aims to change that — making it easier to build and use ML models by running systematic processes on raw data and selecting models that pull the most relevant information from the data.

To help users design and tune machine learning models, neural network architectures, and complex system parameters efficiently and automatically, Microsoft Research began developing its Neural Network Intelligence (NNI) AutoML toolkit in 2017, open-sourcing v1.0 in 2018.

NNI is a “lightweight but powerful” toolkit that can dispatch and run trial jobs generated by tuning algorithms to search for the best neural architecture and hyperparameters in environments such as Local Machine, Remote Servers, OpenPAI, Kubeflow, FrameworkController on K8S (AKS etc.) and other cloud options.

Microsoft recently released NNI v1.3 as well as a Chinese NNI version. The update provides more comprehensive support for the whole machine learning life cycle by applying AutoML algorithms to steps such as feature engineering, neural network architecture search (NAS), hyperparameter tuning, and model compression.

Microsoft recommends NNI for anyone who wants to try different AutoML algorithms in their training or models or run AutoML trial jobs in different environments to speed up search. The toolkit will also be appreciated by researchers and data scientists who want to easily implement and experiment on new AutoML algorithms, as well as ML Platform owners who want to support AutoML on their platform.

NNI’s GitHub page outlines the properties that make the toolkit so useful:

  • Easy-to-use: NNI can be easily installed through Python pip, and only a few lines need to be added to existing code in order to use NNI’s power. Users can work with their experiments through both a command line tool and a WebUI.
  • Scalability: Tuning hyperparameters or neural architectures often demands large amounts of computation resources, and NNI is designed to fully leverage different resources, such as remote machines and training platforms. Hundreds of trials can run in parallel, depending on the capacity of the configured training platforms.
  • Flexibility: Besides rich built-in algorithms, NNI allows users to customize hyperparameter tuning algorithms, neural architecture search algorithms, early stopping algorithms, etc. Users can also extend NNI with more training platforms, such as virtual machines or Kubernetes services on the cloud. Moreover, NNI can connect to external environments to tune special applications and models running on them.
  • Efficiency: The NNI team is constantly working on more efficient model tuning at both the system level and the algorithm level, for example by leveraging early feedback to speed up the tuning procedure.
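The “few lines” mentioned above can be illustrated with a trial-script sketch. The `train()` function, its parameters, and the returned score below are hypothetical stand-ins; only the commented `nni.get_next_parameter()` and `nni.report_final_result()` calls come from NNI’s documented trial API.

```python
def train(lr, batch_size):
    """Hypothetical stand-in for a real training loop; returns a mock accuracy."""
    return 1.0 - abs(lr - 0.1) - batch_size / 1000.0

def main(params):
    # Read hyperparameters from the configuration (with defaults for a dry run).
    lr = params.get("lr", 0.01)
    batch_size = params.get("batch_size", 32)
    return train(lr, batch_size)

if __name__ == "__main__":
    # With NNI installed, two lines replace the hard-coded dict:
    #   import nni
    #   params = nni.get_next_parameter()
    params = {"lr": 0.1, "batch_size": 32}
    accuracy = main(params)
    # ...and one line reports the result back to the tuner:
    #   nni.report_final_result(accuracy)
    print(accuracy)
```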
High-level NNI architecture

A basic NNI experiment starts when the tuner receives the search space and generates configurations. These configurations are submitted to training platforms, and their performance is reported back to the tuner so that new configurations can be generated and submitted. For each experiment, users follow an easy three-step process: Define search space, Update model code, and Define experiment.
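The loop just described can be sketched in plain Python. This is an illustration only: a toy random-search “tuner” stands in for NNI’s built-in algorithms, and all names and the mock scoring function are hypothetical.

```python
import random

# Step 1: define the search space (a toy discrete space for illustration).
search_space = {"lr": [0.001, 0.01, 0.1], "batch_size": [16, 32, 64]}

def generate_configuration(space):
    """Tuner step: sample one value per hyperparameter from the search space."""
    return {name: random.choice(values) for name, values in space.items()}

def run_trial(config):
    """Stand-in for a training job on a platform; returns a mock score."""
    return 1.0 - abs(config["lr"] - 0.01) - config["batch_size"] / 1000.0

# The experiment loop: generate a configuration, run a trial, report the
# result back to the tuner, and repeat.
best_config, best_score = None, float("-inf")
for _ in range(10):  # each iteration is one trial
    config = generate_configuration(search_space)
    score = run_trial(config)  # "performance reported back to the tuner"
    if score > best_score:
        best_config, best_score = config, score

print(best_config, best_score)
```

A real NNI tuner would use this feedback to generate more promising configurations instead of sampling at random.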

In terms of capabilities, NNI provides both a command line tool and a user-friendly WebUI to manage training experiments. With the extensible API, users can customize their own AutoML algorithms and training services.

NNI also provides a set of built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms. The team is still adding new capabilities and welcomes outside contributions.

Current NNI capabilities

NNI v1.3 is compatible with the latest versions of Linux, macOS and Windows. It also naturally supports hyperparameter tuning and neural architecture search for AI frameworks including PyTorch, Keras, TensorFlow, MXNet, and Caffe2, as well as libraries such as Scikit-learn, XGBoost, and LightGBM.

The open-sourced Neural Network Intelligence v1.3 is available for download on GitHub.


Journalism: Yuan Yuan | Editor: Michael Sarazen


