Facebook Says Developers Will Love PyTorch 1.0

At the first-ever PyTorch Developer Conference today in San Francisco, Facebook released a preview of PyTorch 1.0, its open source AI software framework designed to trailblaze “a seamless path from AI research to production.”

Since its initial release in October 2016, PyTorch’s flexibility has made it a preferred machine learning framework for AI researchers. It has more than 19.1K stars on GitHub (Google-backed TensorFlow has 111.4K stars, Amazon-backed Apache MXNet has 15.1K). PyTorch trails only TensorFlow among deep learning frameworks cited in 43,000 arXiv machine learning papers over the last six years. Facebook announced the incoming PyTorch 1.0 at its F8 Conference this May.

Facebook developers dove deeply into the PyTorch preview, speaking on the latest updates and features, cloud service provider support, integrations with chip manufacturers, and upcoming education programs on PyTorch implementation.

Facebook VP of Artificial Intelligence Jérôme Pesenti said at the event’s kickoff that PyTorch 1.0 is going to help researchers address four major challenges: extensive code reworking, time-consuming training, inflexibility of the Python programming language, and slow scale-up.

“As we said in Facebook, this journey is just one percent finished. We want to develop a system that really puts users at the center, a system developers and researchers will love.”

New features

First introduced was torch.jit, a set of compiler tools that bridge the gap between research and production. torch.jit includes a language (“Torch Script,” a subset of Python) and enables tracing and scripting models from eager mode into graph mode.
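To make the tracing/scripting distinction concrete, here is a minimal sketch of the two paths, assuming a PyTorch 1.0+ installation; the function names are illustrative, not from the announcement:

```python
import torch

# Scripting: torch.jit.script compiles the function into Torch Script,
# preserving data-dependent control flow (both branches survive).
@torch.jit.script
def scaled_add(x, y):
    if x.sum() > 0:
        return x + y
    return x - y

# Tracing: torch.jit.trace runs the function once on example inputs
# and records the operators executed, producing a graph-mode module.
def plain_add(x, y):
    return x + y

traced = torch.jit.trace(plain_add, (torch.ones(2), torch.ones(2)))
```

Tracing is the lighter-weight path for models without input-dependent branching; scripting is needed when control flow must be captured in the exported graph.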

The new C++ front end will excite researchers who want to implement high-performance, low-latency C++ applications such as video games. It is a pure C++ interface connected to the PyTorch backend, and is still in beta.

To support distributed training, PyTorch introduced a revamped torch.distributed package that allows for faster training across Python and C++ environments.

New ecosystem

While PyTorch is backed by Facebook, other tech giants are also embracing the framework in ways that promote their own services. Here are some highlights:

  • Amazon Web Services provides preconfigured environments for PyTorch 1.0, which include rich capabilities such as automatic model tuning.
  • Google announced a series of PyTorch 1.0 integrations across its software and hardware tools for AI development. These include a set of deep learning virtual machine images; Kubeflow support to enable simple end-to-end machine learning pipelines; TensorBoard integration; and, most importantly, the availability of Cloud TPUs for running PyTorch.
  • Microsoft Azure now allows developers to deploy and scale out trained PyTorch models from a local computer to the Azure Cloud.
  • Chip giants including ARM, IBM, Intel, NVIDIA and Qualcomm are adding support for PyTorch 1.0.

Get started with deep learning on PyTorch

In a push to get people PyTorch-savvy as quickly as possible, Facebook announced it has joined forces with e-learning organization Udacity and machine learning course developer Fast.ai.

Udacity is creating a free introductory online course on deep learning with PyTorch. Facebook will sponsor 300 course graduates to continue their education in Udacity’s Deep Learning Nanodegree Program.

Fast.ai meanwhile released its first open source software library, Fastai, built on top of PyTorch 1.0. The library provides a single consistent API for the most important deep learning applications and data types.

Journalist: Tony Peng | Editor: Michael Sarazen.
