A full roadmap for tackling most machine learning problems.
For the latest updates to this roadmap, please refer to my GitHub; I will only update it there.
In this blog post, we are going to transfer the style of a famous painting onto any photo using a convolutional neural network.
Let’s try to predict handwritten digits with an RNN.
Here we’ll build a multi-layer RNN using tf.nn.rnn_cell.MultiRNNCell + tf.nn.dynamic_rnn.
We will use tf.nn.rnn_cell.BasicRNNCell + tf.nn.dynamic_rnn to build a simple RNN.
We are going to use tf.nn.rnn_cell.BasicRNNCell + tf.nn.static_rnn to build a simple…
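Whichever of these cells and wrappers you pick, the recurrence underneath is the same one BasicRNNCell implements: h_t = tanh(x_t·W_x + h_{t−1}·W_h + b). Here is a minimal plain-Python sketch of that step with made-up toy sizes and weights — an illustration of the math, not the TensorFlow API:

```python
import math

def rnn_step(x_t, h_prev, W_x, W_h, b):
    """One vanilla RNN step: h_t = tanh(x_t . W_x + h_prev . W_h + b)."""
    hidden_size = len(h_prev)
    h_t = []
    for j in range(hidden_size):
        s = b[j]
        s += sum(x_t[i] * W_x[i][j] for i in range(len(x_t)))
        s += sum(h_prev[i] * W_h[i][j] for i in range(hidden_size))
        h_t.append(math.tanh(s))
    return h_t

# Unroll over a short sequence, as static_rnn does at graph-build time.
W_x = [[0.5, -0.1]]                  # input size 1, hidden size 2 (toy values)
W_h = [[0.1, 0.0], [0.0, 0.1]]
b = [0.0, 0.0]
h = [0.0, 0.0]                       # initial hidden state
for x in ([1.0], [2.0], [3.0]):
    h = rnn_step(x, h, W_x, W_h, b)
```

The tanh squashes every hidden unit into (−1, 1), which is why the state stays bounded no matter how long the sequence is.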
Let’s get our hands dirty and play with some simple RNNs in TensorFlow.
A Support Vector Machine is a supervised learning algorithm that solves classification problems by finding the maximum-margin hyperplane.
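Once the hyperplane w·x + b = 0 is found, prediction is just checking which side a point falls on, and the margin being maximized is the signed distance score/‖w‖. A minimal sketch with hypothetical, hand-picked weights (not a trained model):

```python
import math

def svm_predict(w, b, x):
    """Classify x by which side of the hyperplane w . x + b = 0 it lies on."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

def margin(w, b, x):
    """Signed distance from x to the hyperplane (what max-margin training maximizes)."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    norm = math.sqrt(sum(wi * wi for wi in w))
    return score / norm

w, b = [2.0, 0.0], -2.0          # toy hyperplane: the vertical line x0 = 1
label = svm_predict(w, b, [3.0, 0.0])
dist = margin(w, b, [3.0, 0.0])
```

In practice you would use a library solver (e.g. scikit-learn's SVC) to learn w and b; the decision rule it applies afterwards is exactly this one.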
Let’s say we have the weights and the bias for the label “panda”. When we apply the weights and the bias to a panda…
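Applying the weights and bias amounts to one linear score per label, score[c] = W[c]·x + b[c], with the predicted label being the highest scorer. A toy sketch with invented numbers standing in for real image features:

```python
def class_scores(W, b, x):
    """Linear classifier: one score per label, score[c] = W[c] . x + b[c]."""
    return [sum(wc * xi for wc, xi in zip(W[c], x)) + b[c] for c in range(len(W))]

labels = ["panda", "gibbon"]
W = [[0.2, -0.5], [-0.1, 0.3]]   # one weight row per label (toy values)
b = [1.0, -0.5]
x = [2.0, 1.0]                   # a tiny "image" reduced to two features

scores = class_scores(W, b, x)
best = labels[scores.index(max(scores))]
```

If the weights for “panda” really capture panda-like features, a panda input should push that label's score above the others, which is what happens with these toy numbers.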
The ReLU activation function is fast to compute and doesn’t saturate.
ReLU can be written as f(x) = max(0, x). And this is what the ReLU (Rectified Linear Unit) function looks like:
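That formula is a single line of code, which is exactly why ReLU is so cheap to compute:

```python
def relu(x):
    """Rectified Linear Unit: passes positive inputs through, zeroes out the rest."""
    return max(0.0, x)

# Applied elementwise to a small batch of activations.
activations = [-2.0, -0.5, 0.0, 0.5, 2.0]
rectified = [relu(a) for a in activations]
```

Note there is no exponential or division anywhere, unlike sigmoid or tanh, and for any positive input the output grows without bound instead of flattening out.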
An activation function introduces non-linearity, letting the network map input data to outputs that a purely linear model could not separate.
There are two perspectives from which to look at an activation function: forward propagation and back propagation.