Posted by André Susano Pinto (Technical Lead for TensorFlow Hub) and Clemens Mewald (Product Manager)
In a previous post we announced TensorFlow Hub, a platform to publish, discover, and reuse reusable parts of machine learning models in TensorFlow. An important part of this platform is its web experience, which allows developers to discover TensorFlow modules for their use cases. Today we are launching a new web experience for TensorFlow Hub that allows for easier search and discovery, and paves the way to a multi-publisher platform.
Explore and discover modules
TensorFlow Hub is a platform for sharing reusable pieces of ML, and our vision is to provide a convenient way for researchers and developers to share their work with the broader community. The Universal Sentence Encoder module is a successful example of speeding up the translation from fundamental machine learning science to application in the broader developer community. The paper referenced the tfhub.dev URL of the module. When that URL is copied into a browser it leads to the detail page of the module, on which the publishers share documentation and a link to a Colab notebook to try out the module. The Universal Sentence Encoder has become one of the most popular modules on TF Hub.
Search and filter
As you would expect, you can search and filter the modules on TF Hub. The applicability of text modules for your problem depends on the data that they were trained on. For example, you can search for text embeddings and filter them by Language: Spanish to find the NNLM module trained on Spanish data.
Object detection made easy
We are continuously expanding the TensorFlow Hub inventory with new modules developed by teams at Google and DeepMind. One recent addition is the FasterRCNN module trained on Open Images v4. The module can be loaded and used to perform object detection with a single line of code:
detector = hub.Module(
    "https://tfhub.dev/google/faster_rcnn/openimages_v4/inception_resnet_v2/1")
Alongside the module, we published a Colab notebook that allows you to load it and inspect its outputs. Below is an example of an image from unsplash.com and the detected objects.
The Colab notebook walks you through downloading the module and applying it, all within a few short minutes.
Other recent additions to TensorFlow Hub include:
- The winners of the iNaturalist Kaggle Challenge 2017 published a paper describing their approach and released their model on TensorFlow Hub, showcasing advantages of transfer learning.
- Jeremiah Harmsen from the TensorFlow Hub team published a Kaggle example demonstrating how pre-trained modules from TensorFlow Hub can be leveraged to solve sentiment analysis challenges on Kaggle.
TensorFlow Hub for product teams
Aside from consuming modules that are published on https://tfhub.dev, the TensorFlow Hub libraries also allow you to publish modules to, and consume from, private storage. This allows teams to share modules and benefit from each other’s work.
Instead of referring to modules by their tfhub.dev URL, you can use a filesystem path:
m = hub.Module("/tmp/text-embedding")
embeddings = m(sentences)
To create and export your own modules, follow our "Creating a module" tutorial.
How to get started
Check out https://tfhub.dev to use our new web experience and https://www.tensorflow.org/hub/ to keep up to date on the latest guides and API docs. If you run into any bugs, you can file an issue on GitHub. To stay in touch, you can star the GitHub project.
We would like to thank Bo Fu, Andrew Gasparovic, Jiaqi Guo, Jeremiah Harmsen, Joshua Horowitz, Zicheng Huo, Elizabeth Kemp, Noé Lutz, Till Pieper, Graham Smith, Sijie Wang, and Sitong Zhou.