A Brief View of Google's Trax Library

Pushprajmaraje
Published in Analytics Vidhya
4 min read · Aug 23, 2021

What is Trax?

Trax is an end-to-end library that offers simple, understandable code. Models in Trax are usually written with a much simpler structure than the lengthy code of other libraries such as TensorFlow and PyTorch. It is actively used and maintained by the Google Brain team.

It originally derives from many libraries, but mainly it follows the TensorFlow style.

Installation:

Fig: Installation of Trax
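The original screenshot is not reproduced here; as a sketch, Trax can typically be installed from PyPI like any other Python package:

```shell
# Install Trax from PyPI (pulls in JAX and other dependencies)
pip install trax
```

After installation, "import trax" makes the library available, much as "import tensorflow as tf" does for TensorFlow.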

As you can see above, using it feels similar to TensorFlow and NumPy.

In TensorFlow we write "import tensorflow as tf"; in this library we simply write "import trax".

It includes some of the basic models essential for NLP tasks (for example LSTM, ResNet, and Transformer). It is used as a research library for constantly developing new models and testing them on datasets, which include TensorFlow Datasets and the T2T (Tensor2Tensor) datasets.

The library can be used from Python-based notebooks to work on custom models, as well as from the shell via commands for training models, including pre-trained ones.

How does Trax work?

Trax originated from TensorFlow and uses core Python libraries such as NumPy. It ships with packages that introduce more efficient ways to write code in Trax.

  1. Trax and fastmath

Trax models operate on an array-based structure known as tensors, usually manipulated with "numpy"-style array functions. Trax speeds up these computations by making use of GPUs and TPUs to accelerate them, and it can calculate gradients on tensors automatically. This functionality is packaged in "trax.fastmath", thanks to its backends: JAX and TensorFlow NumPy.

The following is basic code using fastmath and Trax's numpy.

Fig: How it works (https://github.com/google/trax)

  2. Layers

Layers are the essential building blocks of this library. A layer computes a function from zero or more inputs to zero or more outputs. The inputs and outputs are tensors, which work as JAX or numpy arrays. A Trax layer that has no weights or sublayers can be used without initializing it.

Layers are also defined as objects that implement the "__call__" method, which lets us apply them directly to input data.

The code below is from the Trax documentation.

These layers are the same as in other frameworks such as TensorFlow or PyTorch; what sets them apart is how few lines of code they take.

We will now cut directly to the implementation of its layers. In TensorFlow a model is defined using "Sequential"; here it is done using "Serial". "Serial" is a combinator that composes sublayers based on each layer's inputs and outputs. It stacks the layers, which makes it easy to pass inputs from one layer to the next.

Example:

It looks exactly like a "tensorflow.Sequential" model, but the internal structure of the layers is what makes it run fast.

Trax also allows you to define your own layers and sublayers.

Here is how a Trax model looks with actual layers:

The following is a code block from one of the course notebooks where I learnt about Trax.

Conclusion:

As Trax is still in development, I was fortunate to get hands-on experience with it in an online course. I can say that it made NLP tasks, and implementing deep learning and neural networks such as RNNs, much easier.

References:

  1. This link is about how Trax came into existence:

https://coursera.org/share/1bdab833b3fbbee79133006f2cab236f

  2. This link follows the Trax documentation in detail:

https://trax-ml.readthedocs.io/en/latest/notebooks/trax_intro.html
