PyTorch / XLA Library Reaches General Availability on Google Cloud

Synced · Published in SyncedReview · 3 min read · Sep 30, 2020

A PyTorch blog post announced yesterday that the PyTorch/XLA library, a package that lets PyTorch connect to Google Cloud TPUs and use TPU cores as ordinary PyTorch devices, is now generally available on Google Cloud, with support for a broad set of entry points for developers.
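For readers unfamiliar with the package, the minimal sketch below (not taken from the announcement) shows the basic pattern PyTorch/XLA uses to expose a TPU core as a regular PyTorch device; it assumes torch_xla is installed, for example on a Colab TPU runtime or a Cloud TPU VM.

```python
# Minimal sketch: treat one TPU core as a PyTorch device via torch_xla.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()                    # e.g. "xla:1", one TPU core
model = torch.nn.Linear(128, 10).to(device)
x = torch.randn(8, 128, device=device)
y = model(x)                                # ops are staged for the XLA compiler
xm.mark_step()                              # flush the pending XLA graph to the TPU
print(y.cpu().shape)                        # moving to CPU materializes the result
```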

In 2018, Google made its then-in-house Tensor Processing Units (TPUs) available to cloud customers. The Cloud TPUs offer an excellent solution for shortening the training time of machine learning models. However, TPU access was initially limited to TensorFlow and Keras, leaving many developers who use the open-source deep learning framework PyTorch frustrated.

PyTorch, developed primarily by Facebook AI and introduced in 2016, was designed to be clear and flexible. Expressing deep learning models in idiomatic Python makes deep learning projects easier to build.

In 2019, a collaboration of engineers and researchers from Facebook, Google, and Salesforce Research announced the PyTorch-TPU project to allow the PyTorch community to access Cloud TPUs’ high-performance capabilities. The PyTorch/XLA package uses Google’s XLA deep learning compiler to connect the PyTorch deep learning framework and Cloud TPUs.

A Google Cloud blog post notes that the Allen Institute for AI (AI2) has used PyTorch/XLA on Cloud TPUs in several recent projects. AI2 research scientist Matthew Peters, for example, is investigating ways to add visual components to state-of-the-art language models to improve their natural language understanding. “While PyTorch / XLA is still a new technology, it provides a promising new platform for organizations that have already invested in PyTorch to train their machine learning models,” Peters says.

The latest PyTorch 1.6 is now officially supported on Cloud TPUs with the following update highlights:

  • Support for Intra-Layer Model Parallelism
  • Additional XLA ops
  • Improved integration experience with Colab and Kaggle notebooks
  • Support within Deep Learning VM Images

More information on using PyTorch/XLA on Cloud TPUs is available in the Colab notebooks found in the official PyTorch/XLA repository. Open-source implementations of widely used deep learning models, including ResNet-50, Fairseq Transformer, Fairseq RoBERTa, and DLRM, are also available along with accompanying Google Cloud tutorials.
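As a rough illustration of what those tutorials cover, here is a hedged sketch of the typical multi-core training pattern used with PyTorch/XLA; the toy model and synthetic data are placeholders for this illustration, and the exact structure of the official examples may differ.

```python
# Sketch of data-parallel training across the cores of a Cloud TPU with torch_xla.
import torch
import torch_xla.core.xla_model as xm
import torch_xla.distributed.parallel_loader as pl
import torch_xla.distributed.xla_multiprocessing as xmp

def _train_fn(index):
    device = xm.xla_device()                      # this process's TPU core
    model = torch.nn.Linear(128, 10).to(device)   # toy model for illustration
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    # Synthetic data stands in for a real dataset in this sketch.
    # (Real examples shard the dataset per core, e.g. with a DistributedSampler.)
    dataset = torch.utils.data.TensorDataset(
        torch.randn(1024, 128), torch.randint(0, 10, (1024,)))
    loader = torch.utils.data.DataLoader(dataset, batch_size=32)
    device_loader = pl.MpDeviceLoader(loader, device)  # feeds batches to this core

    for inputs, targets in device_loader:
        optimizer.zero_grad()
        loss = torch.nn.functional.cross_entropy(model(inputs), targets)
        loss.backward()
        xm.optimizer_step(optimizer)  # all-reduces gradients across TPU cores

if __name__ == '__main__':
    xmp.spawn(_train_fn, nprocs=8)    # one process per Cloud TPU core
```

Because xm.optimizer_step() handles the cross-core gradient reduction before applying the update, the same loop body can scale from a single core to all eight cores of a Cloud TPU device.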

PyTorch/XLA is available on GitHub.

Reporter: Fangyu Cai | Editor: Michael Sarazen

Synced Report | A Survey of China’s Artificial Intelligence Solutions in Response to the COVID-19 Pandemic — 87 Case Studies from 700+ AI Vendors

This report offers a look at how China has leveraged artificial intelligence technologies in the battle against COVID-19. It is also available on Amazon Kindle. Along with this report, we also introduced a database covering an additional 1,428 artificial intelligence solutions across 12 pandemic scenarios.

Click here to find more reports from us.

We know you don’t want to miss any news or research breakthroughs. Subscribe to our popular newsletter Synced Global AI Weekly to get weekly AI updates.

Synced · AI Technology & Industry Review — syncedreview.com | Newsletter: http://bit.ly/2IYL6Y2 | Share My Research http://bit.ly/2TrUPMI | Twitter: @Synced_Global