Dynamic Notebooks for Versatile AI

And why it might just be beneficial to generalize models

DJ
The Startup
4 min read · Nov 11, 2020

Dynamic Notebooks train an identical model pipeline for every use case, making Artificial Intelligence easier and more efficient.

Photo by Markus Spiske on Pexels

Growth of Versatility in AI

From the earliest days of Artificial Intelligence, notable importance has been given to the adaptability of models, starting with Alan Turing himself, who described a Universal Machine in the sixth section of his 1936 paper On Computable Numbers, with an Application to the Entscheidungsproblem. Since then, AI has grown tremendously, with GPT-3 the closest thing to machine versatility we've ever seen.

While model adaptability is being researched continuously, there is still a void to fill on the data and deployment front. This is where Dynamic Notebooks come in.

The making of Dynamic Notebooks

As there is a lacuna, a gap to bridge between adaptable models and stagnant data, we can distil our solution into a single defining statement:

Many datasets, same code.

Many datasets…

From tensorflow.org

Completing the first part is a breeze thanks to the excellent collection at TensorFlow Datasets (TFDS), which offers varied datasets, each ready to load in a single line of code. For these dynamic notebooks, I chose the Image Classification datasets, simply because their integration is the most straightforward.

However, not all datasets are hosted there; the approach still holds, as the tf.data.Dataset class can represent every kind of dataset efficiently.

…same code

This part was more of a challenge than the first, as Image Classification datasets differ from one another in several respects.

The major differences were:

  • Size of the input image
  • Grayscale or colour
  • Number of output classes

so the images needed preprocessing and, inevitably, some user input.
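
A minimal preprocessing sketch covering those three differences, assuming hypothetical user-supplied settings IMG_SIZE, CHANNELS and NUM_CLASSES (these names are my own placeholders, not from the original notebook):

```python
import tensorflow as tf

# Hypothetical user-supplied settings (in Colab these would come from forms):
IMG_SIZE = 224      # target height/width expected by the model
CHANNELS = 3        # 1 for grayscale models, 3 for colour
NUM_CLASSES = 10    # number of output classes in the chosen dataset

def preprocess(image, label):
    """Normalise any TFDS image-classification sample to one fixed shape."""
    image = tf.image.convert_image_dtype(image, tf.float32)   # floats in [0, 1]
    image = tf.image.resize(image, (IMG_SIZE, IMG_SIZE))      # unify image size
    if image.shape[-1] != CHANNELS:                           # unify channels
        image = (tf.image.grayscale_to_rgb(image) if CHANNELS == 3
                 else tf.image.rgb_to_grayscale(image))
    return image, tf.one_hot(label, NUM_CLASSES)              # unify labels
```

Mapped over any tf.data.Dataset (e.g. `train_ds.map(preprocess)`), this makes every dataset look identical to the model downstream.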

From Google Colab

Thus, I used a Colab IPython notebook and its forms to make it as seamless as possible for users to enter details about their chosen dataset.
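
Colab forms are plain Python with special `#@` comments, so a cell like the following (the field names here are illustrative) renders as input widgets while remaining ordinary, runnable code:

```python
#@title Dataset settings { run: "auto" }
dataset_name = "cifar10"  #@param {type:"string"}
img_size = 224            #@param {type:"integer"}
grayscale = False         #@param {type:"boolean"}
num_classes = 10          #@param {type:"integer"}
```

Outside Colab the `#@` annotations are just comments, so the same cell runs unchanged anywhere.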

This whole process of dynamizing ML pipelines (for Image Classification, for now) can thus be shown elegantly by the above diagram.

An example of such a dynamic notebook is the one below, which I made for my TFDS and tf.data tutorial.

Applications of Dynamic Notebooks

  • When showcasing new Deep Learning architectures, Dynamic Notebooks would let a viewer readily run hundreds of datasets through an architecture, evaluate or test it, and, if it proves appropriate, deploy it with ease.
  • As a wise programmer said, "to learn code, write code", and such notebooks enable exactly that, making extended hands-on learning rapid.

and many more uses for you to figure out.
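
For the first use case, the only contract a new architecture must honour is the user-chosen input shape and class count. A minimal sketch (build_model and its layers are my own placeholder, not a prescribed architecture):

```python
import tensorflow as tf

# Sketch: any architecture can be dropped into a dynamic notebook as long
# as it respects the user-chosen input shape and number of classes.
def build_model(img_size=224, channels=3, num_classes=10):
    return tf.keras.Sequential([
        tf.keras.layers.Input((img_size, img_size, channels)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Swapping in a different body between the Input and the final Dense layer is all it takes to showcase another architecture on the same datasets.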

Future work

Photo by Tomas Ryant from Pexels
  • The immediate step could be adding NLP, Object Detection, Speech Recognition and other fields to dynamic notebooks, on which I’d be more than happy to collaborate.
  • Further, such notebooks could be fitted with a progressive method of importing custom data and datasets from various sources.

Closing thoughts

While such notebooks may already be an efficient way to prototype and showcase AI, a lot of work remains on widespread adoption and on versatility across operating fields. I sincerely hope that such extensive usage would, in turn, accelerate AI and make model training easier than it is now.

Feel free to mail me about any collaborative opportunities, and have a beautiful day.


I am an undergrad at IIT Kanpur, exploring applied AI, specifically on Web and Mobile, with interests in Rust, C#, Arduino and rocketry.