# Esoteric TensorFlow

TensorFlow’s graph is Turing complete.

We tend to think of TensorFlow only as a library for neural networks and deep learning. However, it also has aspects of dataflow programming (DFP).

Since TensorFlow has `tf.cond` and `tf.while_loop`, its expressive power is equivalent to that of a general-purpose programming language. Algorithms such as sorting and searching that can be written in C or Python can also be expressed as TensorFlow graphs.

I have implemented FizzBuzz, Bubble Sort, Quick Sort, Binary Search, and a Brainfuck interpreter as TensorFlow graphs. This post introduces that different side of TensorFlow.

### Fizz Buzz

```python
from __future__ import print_function
import tensorflow as tf

class FizzBuzz():
    def __init__(self, length=30):
        self.length = length
        self.array = tf.Variable([str(i) for i in range(1, length+1)], dtype=tf.string, trainable=False)
        self.graph = tf.while_loop(self.cond, self.body, [1, self.array],
                            shape_invariants=[tf.TensorShape([]), tf.TensorShape(self.length)],
                            back_prop=False)

    def run(self):
        with tf.Session() as sess:
            tf.global_variables_initializer().run()
            return sess.run(self.graph)

    def cond(self, i, _):
        return (tf.less(i, self.length+1))

    def body(self, i, _):
        r = tf.cond(
            tf.logical_and(  # tf.equal(tf.mod(i, 15), 0)
                tf.equal(tf.mod(i, 3), 0),
                tf.equal(tf.mod(i, 5), 0)),
            lambda: tf.assign(self.array[i - 1], 'FizzBuzz'),
            lambda: tf.cond(tf.equal(tf.mod(i, 3), 0),
                    lambda: tf.assign(self.array[i - 1], 'Fizz'),
                    lambda: tf.cond(tf.equal(tf.mod(i, 5), 0),
                            lambda: tf.assign(self.array[i - 1], 'Buzz'),
                            lambda: self.array
                    )
            )
        )
        return (tf.add(i, 1), r)

if __name__ == '__main__':
    fizzbuzz = FizzBuzz(length=50)
    ix, array = fizzbuzz.run()
    print(array)
```

Result :

```
['1' '2' 'Fizz' '4' 'Buzz' 'Fizz' '7' '8' 'Fizz' 'Buzz' '11' 'Fizz' '13' '14'
 'FizzBuzz' '16' '17' 'Fizz' '19' 'Buzz' 'Fizz' '22' '23' 'Fizz' 'Buzz' '26'
 'Fizz' '28' '29' 'FizzBuzz' '31' '32' 'Fizz' '34' 'Buzz' 'Fizz' '37' '38'
 'Fizz' 'Buzz' '41' 'Fizz' '43' '44' 'FizzBuzz' '46' '47' 'Fizz' '49' 'Buzz']
```
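For comparison, here is the same rule set in plain Python (a hypothetical reference function, not part of the repo); it is handy for checking the graph's output:

```python
# Hypothetical plain-Python reference for the FizzBuzz rules above.
def fizzbuzz(n):
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:        # divisible by both 3 and 5
            out.append('FizzBuzz')
        elif i % 3 == 0:
            out.append('Fizz')
        elif i % 5 == 0:
            out.append('Buzz')
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```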

### Bubble Sort

```python
from __future__ import print_function
import numpy as np
import tensorflow as tf

np.random.seed(123)

class BubbleSort():
    def __init__(self, array):
        self.i = tf.constant(0)
        self.j = tf.constant(len(array)-1)
        self.array = tf.Variable(array, trainable=False)
        self.length = len(array)
        cond = lambda i, j, _: tf.less(i-1, self.length-1)
        self.graph = tf.while_loop(cond, self.outer_loop, loop_vars=[self.i, self.j, self.array],
                shape_invariants=[self.i.get_shape(), self.j.get_shape(), tf.TensorShape(self.length)],
                parallel_iterations=1,
                back_prop=False)

    def run(self):
        with tf.Session() as sess:
            tf.global_variables_initializer().run()
            return sess.run(self.graph)

    def outer_loop(self, i, j, _):
        cond = lambda i, j, _: tf.greater(j, i)
        loop = tf.while_loop(cond, self.inner_loop, loop_vars=[i, self.length-1, self.array],
                    shape_invariants=[i.get_shape(), j.get_shape(), tf.TensorShape(self.length)],
                    parallel_iterations=1,
                    back_prop=False)
        return tf.add(i, 1), loop[1], loop[2]

    def inner_loop(self, i, j, _):
        body = tf.cond(tf.greater(self.array[j-1], self.array[j]),
                    lambda: tf.scatter_nd_update(self.array, [[j-1],[j]], [self.array[j],self.array[j-1]]),
                    lambda: self.array)
        return i, tf.subtract(j, 1), body

if __name__ == '__main__':
    x = np.array([1.,7.,3.,8.])
    _, _, sorted_array = BubbleSort(x).run()
    print(x)
    print(sorted_array)
    y = np.random.rand(20)
    print(y)
    _, _, sorted_array = BubbleSort(y).run()
    print(sorted_array)
```

Result :

```
[ 1.  7.  3.  8.]
[ 1.  3.  7.  8.]
[ 0.69646919  0.28613933  0.22685145  0.55131477  0.71946897  0.42310646
  0.9807642   0.68482974  0.4809319   0.39211752  0.34317802  0.72904971
  0.43857224  0.0596779   0.39804426  0.73799541  0.18249173  0.17545176
  0.53155137  0.53182759]
[ 0.0596779   0.17545176  0.18249173  0.22685145  0.28613933  0.34317802
  0.39211752  0.39804426  0.42310646  0.43857224  0.4809319   0.53155137
  0.53182759  0.55131477  0.68482974  0.69646919  0.71946897  0.72904971
  0.73799541  0.9807642 ]
```
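The two nested `tf.while_loop`s mirror an ordinary double loop: the inner loop walks from the right end toward `i`, swapping adjacent out-of-order elements. As a plain-Python sketch of the same logic (a hypothetical helper, not from the repo):

```python
# Hypothetical plain-Python sketch of the nested loops in the graph above.
def bubble_sort(array):
    a = list(array)
    n = len(a)
    for i in range(n - 1):             # outer tf.while_loop
        for j in range(n - 1, i, -1):  # inner tf.while_loop
            if a[j - 1] > a[j]:
                # the tf.scatter_nd_update swap
                a[j - 1], a[j] = a[j], a[j - 1]
    return a

print(bubble_sort([1., 7., 3., 8.]))
```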

### Brain F*ck

The full interpreter is in EsotericTensorFlow/brain_fuck.py; only the driver code is shown below.

```python
# Code abridged
if __name__ == '__main__':
    src_A = '++++++[> ++++++++++ < -]> +++++.'
    pc, tape, cur, jumps, output = BrainFuck(src_A).run()
    print(output)

    src_helloworld = '''+++++++++[>++++++++>+++++++++++>+++++<<<-]>.>++.+++++++..+++.>-.------------.<++++++++.--------.+++.------.--------.>+.'''
    pc, tape, cur, jumps, output = BrainFuck(src_helloworld).run()
    print(output)
```

Result :

```
A
Hello, world!
```
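Since the graph version is abridged here, the following plain-Python interpreter (a hypothetical sketch, not the repo's implementation) shows the semantics the graph has to reproduce: a byte tape, a data pointer, and precomputed bracket jumps, the same `pc`, `tape`, `cur`, and `jumps` state the driver above unpacks:

```python
# Hypothetical plain-Python interpreter for the same Brainfuck dialect.
def brainfuck(src):
    # Precompute matching bracket positions.
    stack, jumps = [], {}
    for i, c in enumerate(src):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i

    tape, cur, pc, out = [0] * 30000, 0, 0, []
    while pc < len(src):
        c = src[pc]
        if c == '>':
            cur += 1
        elif c == '<':
            cur -= 1
        elif c == '+':
            tape[cur] = (tape[cur] + 1) % 256
        elif c == '-':
            tape[cur] = (tape[cur] - 1) % 256
        elif c == '.':
            out.append(chr(tape[cur]))
        elif c == '[' and tape[cur] == 0:
            pc = jumps[pc]      # jump past the matching ]
        elif c == ']' and tape[cur] != 0:
            pc = jumps[pc]      # jump back to the matching [
        pc += 1                 # other characters are ignored
    return ''.join(out)

print(brainfuck('++++++[> ++++++++++ < -]> +++++.'))  # prints A
```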

### Quick Sort, Binary Search and so on

Implementations of the other algorithms are in the EsotericTensorFlow repository.

### TIPS

#### tf.cond(…)

`tf.cond(...)` is the graph equivalent of an if statement: it runs one of two branches depending on the boolean value of the `pred` tensor. Both `true_fn` and `false_fn` must be callables (lambdas or named functions), not tensors.

```python
tf.cond(
    pred,           # boolean condition tensor
    true_fn=None,   # callable executed when pred is True
    false_fn=None,  # callable executed when pred is False
)
```
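Ignoring graph construction, the runtime behavior can be sketched in plain Python (a hypothetical `py_cond`, for intuition only; the real op stages both branches into the graph and executes one):

```python
# Hypothetical sketch of tf.cond's runtime behavior.
def py_cond(pred, true_fn, false_fn):
    # Both branches must be zero-argument callables; only one is invoked.
    return true_fn() if pred else false_fn()

i = 9
print(py_cond(i % 3 == 0, lambda: 'Fizz', lambda: str(i)))  # prints Fizz
```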

#### tf.while_loop(…)

```python
tf.while_loop(
    cond,                   # callable returning the loop condition
    body,                   # callable executed while cond is True
    loop_vars,              # tensors passed to cond and body each iteration
    shape_invariants=None,  # shapes loop_vars must keep across iterations
)
```
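Its runtime behavior corresponds to this plain-Python sketch (a hypothetical `py_while_loop`, for intuition only; the real op builds a graph loop, which is why tensors whose shape changes between iterations need `shape_invariants`):

```python
# Hypothetical sketch of tf.while_loop's runtime behavior.
def py_while_loop(cond, body, loop_vars):
    while cond(*loop_vars):
        loop_vars = body(*loop_vars)  # body returns the next loop_vars
    return loop_vars

# Sum the integers 1..5: loop_vars is (i, total).
i, total = py_while_loop(lambda i, t: i <= 5,
                         lambda i, t: (i + 1, t + i),
                         (1, 0))
print(total)  # prints 15
```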