This is not a Monad tutorial

Writings, reviews and interviews about programming languages, operating systems, network protocols, artificial intelligence and machine learning


SymJAX: symbolic CPU/GPU/TPU programming

9 min read · Sep 18, 2020


SymJAX's really cool logo

What is SymJAX?

import symjax
import symjax.tensor as T
from symjax.nn.optimizers import Adam

# we create a persistent variable to be optimized
z = T.Variable(3.0, dtype="float32", trainable=True)

# the optimization is about minimizing the following loss
loss = T.power(z - 1, 2, name="loss")

# this loss is just a node in the graph, nothing is computed yet
print(loss)  # Op(name=loss, fn=power, shape=(), dtype=float32, scope=/)

# we minimize it with Adam; we can omit assigning it to a variable since the
# internal updates are automatically collected, 0.1 is the learning rate
Adam(loss, 0.1)

# we create the function (XLA compiled graph) and define the inputs
# (here none), the outputs and the persistent variable updates (from Adam)
train = symjax.function(outputs=[loss, z], updates=symjax.get_updates())

# for illustrative purposes, we perform 200 steps and reset the graph after 100 steps
for i in range(200):
    if (i + 1) % 100 == 0:
        # we can use any identifier to select what to reset ('*' is the default);
        # to reset only the variables created by Adam (the moving averages etc.)
        # one would use, for example, symjax.reset_variables('/AdamOptimizer*')
        # in our case, let's reset all variables
        symjax.reset_variables()
    # the output of this function is the current loss and value of z; when called,
    # it also internally performs the updates computed by Adam
    train()
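To make the symbolic flavor of the library concrete, here is a minimal sketch of computing a gradient explicitly instead of going through an optimizer. It assumes the symjax.gradients helper behaves as described in the SymJAX documentation (returning a list of gradient nodes for a list of variables):

import symjax
import symjax.tensor as T

# persistent variable and loss, as in the example above
z = T.Variable(3.0, dtype="float32", trainable=True)
loss = T.power(z - 1, 2, name="loss")

# symbolic gradient of the loss with respect to z; like the loss,
# this is just a graph node, nothing is computed yet
grads = symjax.gradients(loss, [z])

# compile a function with no inputs that returns both values
f = symjax.function(outputs=[loss, grads[0]])
print(f())  # [4.0, 4.0]: (3 - 1)^2 = 4 and d/dz (z - 1)^2 = 2 * (3 - 1) = 4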

The SymJAX documentation reads: “The number of libraries topping Jax/Tensorflow/Torch is large and growing by the day. What SymJAX offers as opposed to most is an all-in-one library with diverse functionalities”. What’s the main issue with having to use multiple libraries, and how does creating a single library solve it?

The documentation states that one of the goals of SymJAX is to optimize processes. How does the library enable that optimization? How does it compare to other technologies?

For example, log( exp(x) * exp(4 + x) ) simplifies to 2 * x + 4.
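As a rough illustration of how that example looks in code, the following sketch builds the expression as a graph and compiles it; the Placeholder constructor and function signature follow the SymJAX documentation, and the compiled function returns the same value as the hand-simplified form:

import symjax
import symjax.tensor as T

# symbolic scalar input
x = T.Placeholder((), "float32", name="x")

# the expression above; mathematically it reduces to 2 * x + 4
y = T.log(T.exp(x) * T.exp(4 + x))

# compile the graph; only at call time is anything actually computed
f = symjax.function(x, outputs=y)
print(f(3.0))  # 10.0, matching the simplified form 2 * 3 + 4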

Does SymJAX support all state-of-the-art neural network architectures?

What were the biggest challenges in supporting such a broad range of hardware (GPUs, TPUs)?

Is there support for dynamic computation graphs à la PyTorch? If not, are there any plans for it?

SymJAX pays homage to Theano in many aspects. What’s different from Theano and why not improve Theano to bring it up to date instead of creating a new library from scratch?

Theano is powerful, but in terms of popularity it lost the battle to the more high-level TensorFlow. Who is the user you have in mind for SymJAX? How is it better than the other options?

How many people are behind this project? Are you looking for contributors?

What is SymJAX’s current status and plans for the near future? How close is the project to its first stable release?

For our readers who might want to know more, what papers, articles, and courses do you recommend for learning about symbolic programming and deep learning?


Published in This is not a Monad tutorial


Written by Federico Carrone

A happy member of The Erlang, Rust/ML and Lisp Evangelism Strikeforce. Network Protocol’s RFC fanatic. Big Data and Machine Learning
