TensorFlow Concepts for Beginners with Examples

Mehul Gupta
Data Science in your pocket
6 min read · Aug 29, 2019


Deep Learning is everywhere. Most beginners (and even experts 😅) have been using Keras for creating Neural Networks and chilling their lives out!!

But with ease comes restriction.

Keras can be rigid at times, and that is when most Data Scientists move to the all-time great!!

TensorFlow

So, let's dive

1. Tensor: An n-dimensional array.

For example, a rank-0 tensor is a scalar, a rank-1 tensor (shape n) is a vector, a rank-2 tensor (shape n×n) is a matrix, and a rank-3 tensor (shape n×n×n) is just a three-dimensional array.
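A minimal sketch to make the ranks concrete (assuming TensorFlow 1.x, which the rest of this post uses):

```python
import tensorflow as tf

# Tensors of different ranks
scalar = tf.constant(3.0)                  # rank 0: a scalar
vector = tf.constant([1.0, 2.0, 3.0])      # rank 1: a vector
matrix = tf.constant([[1.0, 2.0],
                      [3.0, 4.0]])         # rank 2: a matrix

print(scalar.shape)   # ()
print(vector.shape)   # (3,)
print(matrix.shape)   # (2, 2)
```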

2. Nodes: A node represents an operation, possibly applied to some input, and can generate an output that is passed on to other nodes.

Example: a = tf.matmul(x, w). Here matmul() is a node.

Nodes aren’t calculated on the go but only when run through a Session object (discussed below).

3. Edges/Tensor objects: As we know, nodes aren’t calculated on the go, but the graph has to be created before calling the Session object. Hence a reference to a node’s to-be-computed result can be passed on (can "flow") to another node while creating the graph. These references are what we call edges/Tensor objects.

4. Computational Graph: A graph refers to a set of interconnected entities, commonly called nodes (discussed above) or vertices. The output of one node may flow into another node, or the graph may simply end there. Even data-generating functions are considered nodes!!

The example below explains it better.
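Here is a tiny graph, built first and only executed later inside a Session (a minimal sketch, TF 1.x):

```python
import tensorflow as tf

# Build the graph: nothing is computed here, we only wire nodes together
x = tf.constant([[1.0, 2.0]])     # a data-generating node
w = tf.constant([[3.0], [4.0]])
a = tf.matmul(x, w)               # an operation node; 'a' is its edge/Tensor object
b = a + 1.0                       # the output of one node flows into another

# Only now is the graph actually executed
with tf.Session() as sess:
    print(sess.run(b))            # [[12.]]
```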

5. Default_graph: If not mentioned separately, all variables, constants and other stuff get added to a single graph called the default_graph.

To create a different graph, create a new tf.Graph() object and set it as the default graph.
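A quick sketch of creating and using a separate graph (TF 1.x):

```python
import tensorflow as tf

g = tf.Graph()                   # a fresh graph, separate from the default one
with g.as_default():             # inside this block, 'g' is the default graph
    a = tf.constant(5.0)         # so 'a' gets added to 'g'

print(a.graph is g)                         # True
print(a.graph is tf.get_default_graph())    # False (we are outside the block again)
```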

6. Constants: They are like C/Java constant variables whose values can’t be changed.

Example: a = tf.constant(1.0, dtype=tf.float32). You can’t change the value of ‘a’ now!!

7. Variable: As the name suggests, they are the ones whose value can be changed.

Example: a = tf.Variable(1.0, dtype=tf.float32). You can change ‘a’.
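A minimal sketch of changing a variable’s value (TF 1.x); note that variables must be initialized first (see point 11 below):

```python
import tensorflow as tf

a = tf.Variable(1.0, dtype=tf.float32)   # note the capital 'V' in tf.Variable
update = tf.assign(a, 2.0)               # an op that writes a new value into 'a'

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # see point 11 below
    print(sess.run(a))      # 1.0
    sess.run(update)
    print(sess.run(a))      # 2.0
```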

8. Matmul & Multiply: matmul() performs matrix multiplication, while multiply() performs element-wise multiplication.

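A quick sketch of the difference (TF 1.x):

```python
import tensorflow as tf

a = tf.constant([[1.0, 2.0],
                 [3.0, 4.0]])
b = tf.constant([[5.0, 6.0],
                 [7.0, 8.0]])

elementwise = tf.multiply(a, b)   # [[ 5. 12.], [21. 32.]]
matrix_prod = tf.matmul(a, b)     # [[19. 22.], [43. 50.]]

with tf.Session() as sess:
    print(sess.run(elementwise))
    print(sess.run(matrix_prod))
```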

9. Reduce_mean: This is the same as np.mean, used to calculate the mean of a tensor. The ‘axis’ parameter can be used to calculate the row-wise or column-wise mean of the tensor.
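For example (TF 1.x):

```python
import tensorflow as tf

t = tf.constant([[1.0, 2.0],
                 [3.0, 4.0]])

overall  = tf.reduce_mean(t)          # mean of all elements -> 2.5
col_mean = tf.reduce_mean(t, axis=0)  # column-wise -> [2. 3.]
row_mean = tf.reduce_mean(t, axis=1)  # row-wise    -> [1.5 3.5]

with tf.Session() as sess:
    print(sess.run([overall, col_mean, row_mean]))
```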

10. Session: A Session object is like an interface between Python objects and data on our end, and the actual computational system where memory is allocated for the objects we define, intermediate variables are stored, and finally results are fetched for us. In short, to execute a neural network, we need to create a session.

Have a look at the example below to understand the Session object.
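A minimal sketch (TF 1.x):

```python
import tensorflow as tf

a = tf.constant(2.0)
b = tf.constant(3.0)
c = a * b                    # just a node in the graph, not a value yet

sess = tf.Session()          # the interface to the actual computational system
print(sess.run(c))           # 6.0 -- memory is allocated and 'c' is computed
sess.close()                 # free the session's resources
```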

11. Global_variable_initializer: In TensorFlow, all variables must be initialized before they are used inside a session (constants don’t need this). tf.global_variables_initializer() helps us initialize all variables associated with the current Graph at once.
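For example (TF 1.x):

```python
import tensorflow as tf

w = tf.Variable(tf.zeros([2, 2]))
init = tf.global_variables_initializer()   # one op that initializes every variable

with tf.Session() as sess:
    sess.run(init)       # skipping this raises an error when 'w' is evaluated
    print(sess.run(w))   # [[0. 0.] [0. 0.]]
```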

12. Fetches: It refers to the variable/list_of_variables we want to calculate, passed to session.run(). Remember that no variable will be calculated if it isn’t mentioned in ‘fetches’ or isn’t needed to calculate one of the ‘fetches’ variables.
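A small sketch (TF 1.x):

```python
import tensorflow as tf

a = tf.constant(2.0)
b = tf.constant(3.0)
c = tf.constant(10.0)   # never fetched and not needed by any fetch -> never computed

with tf.Session() as sess:
    total, product = sess.run([a + b, a * b])   # 'fetches' is the list [a + b, a * b]
    print(total, product)                       # 5.0 6.0
```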

13. Logits: This can be seriously confusing as it has many meanings on the internet. In a Neural Network using TensorFlow, it refers to the final Dense layer’s outputs before any activation function (mostly softmax) is applied.

14. Placeholders: They can be taken as containers for feeding various dynamic values to the neural network.

Example: We want to train our model with batches of data. Since each batch is different, we can’t hard-code the input. Placeholders let us achieve this: we pass each batch to the model at run time.
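A minimal sketch of declaring placeholders (TF 1.x); feeding the actual batches is covered under feed_dict in point 19:

```python
import tensorflow as tf

# 'None' lets the batch size vary from run to run
x = tf.placeholder(tf.float32, shape=[None, 4], name='features')
y = tf.placeholder(tf.float32, shape=[None, 1], name='labels')

print(x.shape)   # (?, 4) -- the batch dimension stays unknown until we feed data
```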

Now Some Tough Ones😢

15. tf.nn._: Here the ‘_’ can be replaced by various operations (conv2d, dropout, etc.). It provides low-level operations for neural network layers (conv2d, dropout, etc.) and activation functions like sigmoid. It provides some flexibility over tf.layers.

16. tf.layers._: It can be taken as a wrapper over tf.nn. Like in tf.nn, ‘_’ can take various values (dense, conv2d, etc.). It internally calls tf.nn._

tf.layers automatically handles a lot of parameters like biases and weights, while everything has to be specified manually in tf.nn (as it is low level). But as mentioned, tf.layers can be rigid for certain parameters and hence reduces flexibility.
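A side-by-side sketch (TF 1.x); the layer sizes here are just illustrative:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 28, 28, 1])

# High level: tf.layers creates and tracks the weights and bias for you
out1 = tf.layers.conv2d(x, filters=32, kernel_size=3, padding='same')

# Low level: with tf.nn you create (and can fully control) the weights yourself
w = tf.Variable(tf.truncated_normal([3, 3, 1, 32], stddev=0.1))
b = tf.Variable(tf.zeros([32]))
out2 = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding='SAME') + b
```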

17. tf.contrib: It is a module of contributed code (with experimental features) that may eventually make it into the core TensorFlow library. Code here might change in the future, and there are no guarantees.

TRY AT YOUR OWN RISK

18. softmax_cross_entropy_with_logits: You will find this term quite often. Let's break this into 3 parts.

  • Logits: the output of the final Dense layer
  • SoftMax: softmax(θ)ᵢ = exp(θᵢ) / Σⱼ exp(θⱼ). Here, θ (theta) represents the logits.
  • Cross_Entropy: H(p, q) = −Σᵢ pᵢ log(qᵢ), where p is the true (one-hot) distribution and q is the predicted one.

Hence, softmax_cross_entropy_with_logits = Cross_entropy(softmax(logits))
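Putting it together, a minimal sketch (assuming TensorFlow 1.x, like the rest of this post):

```python
import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1]])   # raw outputs of the last Dense layer
labels = tf.constant([[1.0, 0.0, 0.0]])   # one-hot true distribution

# softmax + cross-entropy in one numerically stable op
# (newer TF 1.x releases prefer tf.nn.softmax_cross_entropy_with_logits_v2)
loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

# the equivalent (but less stable) manual version
manual = -tf.reduce_sum(labels * tf.log(tf.nn.softmax(logits)), axis=1)

with tf.Session() as sess:
    print(sess.run([loss, manual]))   # both are roughly [0.417]
```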

19. feed_dict: When we are using placeholders in a tf model, we need to use feed_dict inside session.run() to pass values to those placeholders. Example:
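A minimal sketch (TF 1.x):

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 2])
doubled = x * 2.0

with tf.Session() as sess:
    batch = [[1.0, 2.0], [3.0, 4.0]]                 # one batch of data
    print(sess.run(doubled, feed_dict={x: batch}))   # [[2. 4.] [6. 8.]]
```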

20. Random_normal(shape, mean, stddev): This function returns a tensor of the given shape, drawn from a normal distribution with the given mean and standard deviation.

21. Truncated_normal(shape, mean, stddev): This function returns a tensor of the given shape, drawn from a truncated normal distribution with the given mean and standard deviation. ‘Truncated’ means that extreme values are excluded: samples more than two standard deviations from the mean are dropped and re-drawn, so unlike the plain normal distribution, the truncated one has no long tails.
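A quick comparison (TF 1.x):

```python
import tensorflow as tf

normal = tf.random_normal([3, 3], mean=0.0, stddev=1.0)
truncated = tf.truncated_normal([3, 3], mean=0.0, stddev=1.0)

with tf.Session() as sess:
    print(sess.run(normal))      # may contain values far from 0
    print(sess.run(truncated))   # every value lies within 2 standard deviations
```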

22. Variable_scope: Variable scope is amongst the most useful concepts in TensorFlow, helping us share tf variables. It creates a namespace (a pool) for all variables, constants and ops (like +, -, *, etc.), which helps us avoid name clashes in the graph and reuse predefined variables.

23. Name_scope: Quite similar to variable_scope and serving much the same purpose, but with one difference: it doesn’t affect variables created using tf.get_variable().

They both add a prefix to the variable name as shown below:

See the names of different variables used under different scopes.
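A minimal sketch (TF 1.x):

```python
import tensorflow as tf

with tf.variable_scope('layer1'):
    w = tf.get_variable('w', shape=[2, 2])   # name: layer1/w:0
    v = tf.Variable(1.0, name='v')           # name: layer1/v:0

with tf.name_scope('layer2'):
    a = tf.Variable(1.0, name='a')           # name: layer2/a:0
    b = tf.get_variable('b', shape=[2])      # name: b:0 -- name_scope is ignored!

print(w.name, v.name, a.name, b.name)
```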

24. tf.summary: This helps us create log files for our training models in TensorFlow. For logging purposes, we also need to create a FileWriter, which takes in the log directory & the computation graph.

To be specific, Summary is a special TensorBoard operation that takes in a regular tensor and outputs the summarized data to your disk (i.e. in the event file).
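A minimal logging sketch (TF 1.x); the './logs' directory is just an example path:

```python
import tensorflow as tf

loss = tf.constant(0.25)            # pretend this is your training loss
tf.summary.scalar('loss', loss)     # record a scalar value
merged = tf.summary.merge_all()     # bundle all summaries into one op

with tf.Session() as sess:
    # the FileWriter takes in the log directory & the computation graph
    writer = tf.summary.FileWriter('./logs', sess.graph)
    summary = sess.run(merged)
    writer.add_summary(summary, global_step=0)   # write into the event file
    writer.close()

# Then launch TensorBoard (point 25) from the terminal:
#   tensorboard --logdir=./logs
```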

25. Tensorboard: TensorBoard in TensorFlow provides us with visualization powers. You can use TensorBoard to visualize your TensorFlow graph, plot quantitative metrics about the execution of your graph, and show additional data like images that pass through it. It simply uses the log files we created with tf.summary.

This would be enough to keep you busy for the week!!!
