Recap of the 2019 TensorFlow Dev Summit

March 8, 2019


Posted by Fred Alcober and Sandeep Gupta, on behalf of the TensorFlow team

TensorFlow held its third annual Developer Summit, its biggest yet, in Sunnyvale, CA on March 6 and 7, 2019. The event was attended by approximately 1,000 machine learning enthusiasts and watched by tens of thousands more over the livestream.

The event comprised two days of technical updates from the TensorFlow team and presentations from users showcasing amazing applications. We also held hacker rooms, breakout sessions, and workshops, where attendees interacted with the TensorFlow team, got their questions answered, and collaborated with each other.

In the three years since launch, TensorFlow has matured into an entire end-to-end machine learning (ML) ecosystem, helping to power the ML revolution we see happening around us. It has been downloaded over 41 million times and has over 1,800 contributors from around the world. As a comprehensive ML platform, TensorFlow is helping a diverse set of practitioners, researchers, and newcomers build amazing new solutions to challenging problems with AI.

At the developer summit, we announced the alpha release of TensorFlow 2.0, which marks the beginning of the TensorFlow 2.0 era and brings TensorFlow’s powerful capabilities to more developers and researchers, making it easier than ever to build and use ML. We also announced new community partnerships and education initiatives with O’Reilly Media, Udacity, deeplearning.ai on Coursera, and fast.ai to help foster open source collaboration and train the next generation of users. Further, we listened to our community, completely updated the developer documentation, and redesigned tensorflow.org to make it even easier to access resources and information.

Below are the highlights and key announcements from the event.

Easy to use

In TensorFlow 2.0, the biggest focus has been on making the API simpler, more intuitive, and more natural for all users. We are integrating the API components more closely with tf.keras, the recommended high-level API for most users, which lets developers go from data ingestion and transformation through model building, training, and saving, all the way to deployment, much more easily. We also launched TensorFlow Datasets, a collection of commonly used ML datasets prepared for easy use in TensorFlow.
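
As a rough sketch of that end-to-end flow (the dataset, architecture, and hyperparameters here are only placeholders), the following loads a dataset with TensorFlow Datasets, then trains and evaluates a small tf.keras model:

```python
import tensorflow as tf
import tensorflow_datasets as tfds

# Load MNIST from TensorFlow Datasets as tf.data pipelines.
train_ds, test_ds = tfds.load("mnist", split=["train", "test"], as_supervised=True)

def preprocess(image, label):
    # Scale pixel values to [0, 1].
    return tf.cast(image, tf.float32) / 255.0, label

train_ds = train_ds.map(preprocess).shuffle(10_000).batch(32)
test_ds = test_ds.map(preprocess).batch(32)

# Build and train a small classifier with the tf.keras high-level API.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=1, validation_data=test_ds)
model.evaluate(test_ds)
```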

Developers in Rome new to ML are using TensorFlow to help paleographers decipher medieval manuscripts

Along with the TensorFlow 2.0 alpha release, we are providing conversion and migration tools and documentation to help with the transition from 1.x code to 2.0. These and many more resources, examples, and case studies on TensorFlow can be accessed from the new tensorflow.org website.
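
For example, unconverted 1.x code can keep running under 2.0 through the tf.compat.v1 module (the tf_upgrade_v2 command-line script is the companion tool for rewriting source files); a minimal sketch:

```python
import tensorflow.compat.v1 as tf  # the 1.x-style API shipped inside TensorFlow 2.0

tf.disable_v2_behavior()  # opt out of eager execution so graphs and sessions still work

# A tiny 1.x-style graph, runnable unchanged under TensorFlow 2.0.
a = tf.placeholder(tf.float32, shape=())
b = a * 2.0
with tf.Session() as sess:
    print(sess.run(b, feed_dict={a: 3.0}))  # 6.0
```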

Power to experiment and invent

TensorFlow is accelerating state-of-the-art research. This begins with the flexibility for researchers to prototype their ideas quickly, try many experiments, and iterate. With new features in TensorFlow 2.0 such as eager execution by default, intuitive Python control flow, automatic optimization of eager code with tf.function, and greatly improved error messages, we are enhancing the development experience for researchers.
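
As a small illustration (the function below is just a toy example), plain Python control flow inside a tf.function is automatically converted into graph operations by AutoGraph:

```python
import tensorflow as tf

@tf.function  # traces the Python function into an optimized TensorFlow graph
def collatz_steps(n):
    steps = tf.constant(0)
    while n > 1:              # AutoGraph rewrites this loop as tf.while_loop
        if n % 2 == 0:        # ...and this branch as tf.cond
            n = n // 2
        else:
            n = 3 * n + 1
        steps += 1
    return steps

# Called like any Python function; eager execution returns a concrete value.
print(collatz_steps(tf.constant(27)).numpy())
```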

Researchers at NERSC at Lawrence Berkeley National Laboratory, Oak Ridge National Laboratory, and NVIDIA won the Gordon Bell Prize for successfully scaling a deep learning application on the Summit supercomputer using TensorFlow to study the effects of extreme weather

Large-scale research in machine learning also requires massive parallel computing. Since last year, we have accelerated training on 8 NVIDIA V100 GPUs by more than 2x, boosted performance on a Cloud TPU v2 by 1.6x, and improved inference speed by more than 3x with Intel MKL acceleration. Getting great out-of-the-box performance is a big focus of TensorFlow 2.0 and a core part of our progress toward the final release.

The TensorFlow ecosystem also includes a large collection of very powerful add-ons that extend TensorFlow in new and useful ways, several of which we described during the summit sessions.

Production on any platform with any language

Taking models from research to production has always been a core strength and focus for TensorFlow. Using TensorFlow, you can deploy models on a variety of platforms: servers and the cloud, mobile and other edge devices, browsers, and many other JavaScript platforms.
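
The common thread across these deployment targets is the SavedModel format; a minimal sketch (the model and export path below are placeholders):

```python
import tensorflow as tf

# A tiny placeholder model standing in for a trained one.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# Export to SavedModel, the interchange format consumed by TensorFlow Serving
# (servers/cloud), the TensorFlow Lite converter (mobile/edge devices), and
# the TensorFlow.js converter (browsers and Node.js).
export_dir = "/tmp/demo_saved_model"  # hypothetical path
tf.saved_model.save(model, export_dir)

# Reload and inspect the exported serving signatures.
reloaded = tf.saved_model.load(export_dir)
print(list(reloaded.signatures.keys()))  # typically ['serving_default']
```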

Engineers at Airbnb are using TensorFlow to accurately classify, at scale, the millions of home images uploaded every day

TensorFlow for Production

TensorFlow Extended (TFX) brings management of the entire machine learning lifecycle to our users. It has several component libraries, and we announced new features in these components. More significantly, the new orchestration support puts it all together to provide users with an integrated end-to-end platform (see example here). Whether you use the built-in orchestration or bring your own orchestrator, the TFX components integrate with a metadata store that keeps track of every component run, the artifacts that went into it, and the artifacts it produced. This enables advanced features like experiment tracking and model comparison that will greatly improve production use cases.
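
The sketch below shows roughly how TFX components are wired into a pipeline backed by a local metadata store; import paths and argument names have shifted across TFX releases, so treat it as illustrative rather than exact:

```python
from tfx import v1 as tfx  # assumes a recent TFX release; older versions differ

def create_pipeline(data_root, pipeline_root, metadata_path):
    # Ingest CSV data, compute statistics, and infer a schema.
    example_gen = tfx.components.CsvExampleGen(input_base=data_root)
    statistics_gen = tfx.components.StatisticsGen(
        examples=example_gen.outputs["examples"])
    schema_gen = tfx.components.SchemaGen(
        statistics=statistics_gen.outputs["statistics"])

    return tfx.dsl.Pipeline(
        pipeline_name="demo_pipeline",
        pipeline_root=pipeline_root,
        components=[example_gen, statistics_gen, schema_gen],
        # Every component run and artifact is recorded in this ML Metadata store,
        # which is what enables experiment tracking and model comparison.
        metadata_connection_config=(
            tfx.orchestration.metadata.sqlite_metadata_connection_config(
                metadata_path)),
    )

# Run locally; Airflow or Kubeflow Pipelines runners plug in the same way.
tfx.orchestration.LocalDagRunner().run(
    create_pipeline("data/", "pipelines/demo/", "metadata.sqlite"))
```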

TensorFlow for Mobile & IoT

TensorFlow Lite, our solution for running models on mobile and embedded systems, has exploded globally and is now running on more than 2 billion mobile devices, leading the next generation of on-device ML. It is helping users with use cases like predictive text generation, image classification, object detection, audio recognition, text-to-speech, speech-to-text, video segmentation, and edge detection, among many others.

At the event, we heard how global internet companies such as Alibaba Xianyu and NetEase are using TensorFlow Lite to provide users with better application experiences. We also talked about TensorFlow Lite’s incredible ML performance and how it powers ML in marquee Google applications such as Search, Assistant, Photos, and Pixel.

There have been major improvements to TensorFlow Lite’s general usability and model conversion features, alongside an increased focus on optimization (such as quantization) and performance (such as GPU acceleration). We also showed how TensorFlow Lite is powering machine learning on the edge and in IoT, on platforms ranging from Coral Edge TPU devices to microcontrollers (MCUs).
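
A minimal sketch of that conversion path, using a placeholder model and post-training quantization via the converter’s optimization flag:

```python
import numpy as np
import tensorflow as tf

# A placeholder Keras model; in practice you would convert your trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert to TensorFlow Lite with post-training quantization enabled.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Run the converted model with the TensorFlow Lite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

sample = np.zeros(inp["shape"], dtype=inp["dtype"])  # dummy input batch
interpreter.set_tensor(inp["index"], sample)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)  # (1, 10)
```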

TensorFlow for JavaScript

JavaScript is one of the most widely used programming languages, and TensorFlow.js brings ML to JavaScript developers. Since launch, it has seen tremendous adoption by the community, with over 300,000 downloads and 100 contributors. At the summit, we announced TensorFlow.js version 1.0. Major features of this release include significant performance improvements (MobileNet v1 inference in the browser is 9 times faster than it was last year), many new off-the-shelf models for web developers to incorporate into applications, and support for more platforms where JavaScript runs. Companies like Airbnb and Uber, among others, are using TensorFlow.js in production environments, and we are seeing amazing new use cases emerge daily in our community gallery of TensorFlow.js projects.

In addition to deployment, TensorFlow.js can be used to build and train machine learning models directly in JavaScript, either in the browser or in Node.js.

Swift for TensorFlow

We also shared our progress on the new Swift for TensorFlow package. With version 0.2 just released, users get improved usability and can try out this new ML paradigm. To make it easier to get started with Swift, we also announced a new machine learning course from fast.ai that uses Swift for TensorFlow.

Commitment to community growth

The success of TensorFlow is in large part due to its amazing and growing community of users and developers. We have developed TensorFlow 2.0 in close engagement with the community, via an open RFC process, many new Special Interest Groups, and feedback and testing from our Google Developer Expert community.

We have launched a new #PoweredByTF campaign and are discovering amazing new projects built by our users. We announced TensorFlow’s participation in Google Summer of Code, where students can apply to work with the TensorFlow engineering team on hands-on development. We are also launching a new Powered by TF Challenge, hosted on DevPost, for users to create and share their latest and greatest work with TensorFlow 2.0.

Great educational resources are key to the democratization and adoption of machine learning. We announced two new educational resources to make it easier for beginners and learners to get started with TensorFlow. The first is deeplearning.ai’s “Intro to TensorFlow for AI, ML and DL,” Course 1 of the TensorFlow: from Basics to Mastery series hosted on Coursera. The second is Udacity’s Intro to TensorFlow for Deep Learning. Both courses are designed with developers in mind, require no prior machine learning experience, and are available now.

And finally, we announced TensorFlow World, a new week-long conference dedicated to fostering open source collaboration and all things TensorFlow. The conference, co-presented by O’Reilly Media and TensorFlow, will be held in Santa Clara, CA during the week of October 28. Our vision is to bring the amazing TensorFlow community together and give people an opportunity to connect with each other. The call for proposals is open for attendees to submit interesting TensorFlow projects and for companies to showcase their solutions built with TensorFlow, and we can’t wait to see you there.

At Google, we believe AI research and applications will advance faster when all users have access to the best tools, allowing everyone to participate. TensorFlow is dedicated to helping empower all ML users. We’re committed to working with the community to make TensorFlow easy for everyone and to pursue AI for good!
