This New Technology Extends Bonsai Into a Universal AI Runtime

Bonsai continues to take tangible steps toward becoming your favorite runtime for artificial intelligence (AI) applications. Fresh off a funding round led by Microsoft Ventures, Bonsai just announced the release of Gears, an extension to its platform that enables the execution and management of non-Bonsai models.

For people not familiar with Bonsai, you can think of it as a new platform that provides a higher-level programming model for implementing neural networks. Bonsai uses a language called Inkling to model deep learning applications, and it is considerably simpler than other deep learning frameworks. More importantly, Bonsai provides a runtime infrastructure and toolset that enables the execution and monitoring of deep learning models. If you are not too bored, you can check out my article about Bonsai from a few weeks ago ;).

The initial release of Bonsai positioned the platform as a competitive, and somewhat more sophisticated, alternative to “low-level” deep learning frameworks such as TensorFlow, Torch or Theano. With Gears, Bonsai extends its capabilities to become a universal runtime for deep learning applications written in those frameworks.

Gears inspects models created in non-Bonsai, and preferably Python-based, deep learning frameworks and integrates them into the Bonsai platform. Specifically, Gears supports frameworks such as TensorFlow, Torch, OpenCV, Microsoft Cognitive Toolkit, Scikit-Learn and several others. Developers can implement neural networks in any of those frameworks and still leverage Gears to execute and monitor those models using the Bonsai toolset.
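
To make that pattern concrete, here is a minimal Python sketch of the idea behind Gears: a model trained in an external, Python-based framework (scikit-learn in this case) is wrapped behind a uniform prediction interface that a runtime can execute and monitor. The adapter class below is hypothetical and purely illustrative; it is not the actual Gears API.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a model in an external, Python-based framework (scikit-learn here).
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

# Hypothetical adapter: the runtime only needs a uniform predict() entry
# point, regardless of which framework was used to train the model.
class ExternalModelAdapter:
    def __init__(self, model):
        self.model = model

    def predict(self, inputs):
        return self.model.predict(inputs).tolist()

adapter = ExternalModelAdapter(model)
print(adapter.predict(X[:3].tolist()))  # e.g. [0, 0, 0]
```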

Gears is the type of technology that can make Bonsai the runtime of choice for deep learning applications. That premise is even more important in enterprise environments, in which the lack of robust runtimes is one of the factors holding back the adoption of deep learning platforms. There are other additions that, in my opinion, could strengthen Bonsai and Gears’ value proposition as a universal runtime for deep learning applications. Let’s explore a few:

5 Ideas for Bonsai’s Short-Term Roadmap

1 — API Generation: Generating REST APIs from Inkling models would streamline the interoperability between Bonsai programs and third-party applications. That capability is already present in some of Bonsai’s competitors, such as Algorithmia CODEX. A rough sketch of such an endpoint appears at the end of this post.

2 — Training Tools: Tools for training specific categories of deep learning models (object detection, sentiment analysis…) are essential to operating Bonsai applications in large environments. Again, this capability is remarkably important in enterprise settings, in which testing and training tools are a must to streamline the adoption of the platform.

3 — Testing Tools & Frameworks: Integrating testing tools and frameworks such as OpenAI Gym or DeepMind Lab would help simplify the lifecycle management of Bonsai applications. A minimal Gym evaluation loop is sketched at the end of this post.

4 — Data Source Management: Extending Bonsai’s management capabilities to support test and training data sources would be another welcome addition, enabling an end-to-end experience for deep learning applications.

5 — Platform Portability: Ensuring Bonsai’s portability across different cloud and container platforms would remove friction from the adoption of Bonsai across heterogeneous infrastructures.
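
To illustrate idea 1, here is a rough Python sketch, using Flask purely as an example, of the kind of REST endpoint that could be generated around a trained model. The route, payload shape and scikit-learn stand-in model are my assumptions for illustration, not what Bonsai or Algorithmia CODEX actually produce.

```python
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

app = Flask(__name__)

# Stand-in for a model exported from an Inkling program or an external framework.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

@app.route("/v1/models/iris/predict", methods=["POST"])
def predict():
    # Expects a JSON body such as {"instances": [[5.1, 3.5, 1.4, 0.2]]}.
    instances = request.get_json(force=True)["instances"]
    predictions = model.predict(instances).tolist()
    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    app.run(port=8080)
```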
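
And for idea 3, a minimal OpenAI Gym evaluation loop shows the kind of harness such an integration would automate. The CartPole-v1 environment and the random policy are only placeholders for a Bonsai-trained model.

```python
import gym

# Minimal evaluation harness: run a few episodes and report total reward.
# A Bonsai-trained policy would replace the random action below.
env = gym.make("CartPole-v1")

for episode in range(3):
    observation = env.reset()
    total_reward, done = 0.0, False
    while not done:
        action = env.action_space.sample()  # placeholder policy
        # Classic Gym step API: (observation, reward, done, info)
        observation, reward, done, info = env.step(action)
        total_reward += reward
    print("episode %d: total reward = %.1f" % (episode, total_reward))
```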