# Advanced Learning Algorithms

In this article, we will delve into the realm of advanced learning algorithms: neural networks and decision trees. Think of a neural network as a computer loosely imitating how our brains work to make clever predictions and decisions. A decision tree, on the other hand, is like a flowchart the computer follows to make choices. We’ll explore how both operate, breaking the complicated parts down to understand how they make smart decisions. By the end of the article, you’ll have a clear picture of how these advanced methods work and why they’re essential in the world of technology and smart machines.

# Neural Networks

Consider teaching a computer to identify cats. Instead of instructing it to “look for pointy ears” or “check for a furry tail,” you show it an extensive number of images of cats. The computer automatically recognizes patterns, such as the typical appearance of ears and tails.

Imagine this machine learning process as a smaller version of your brain. Layers of virtual neurons, which resemble brain cells, are present in the computer and cooperate. It may not be very good at identifying cats at first, but as it views more images of cats, it improves by making adjustments to its connections.

So, a neural network is like a computer brain that learns by looking at examples, making it great for tasks like recognizing images, understanding speech, or even playing games. It’s kind of like teaching a computer to learn from experience, without telling it exactly what to look for.

Let’s start with an example to show how neural networks function. We’ll use a demand prediction example, where you examine a product and attempt to forecast whether or not it will be a best seller.

**An activation function** determines whether or not a neuron should be activated. In other words, it applies a simple mathematical operation to decide whether the neuron’s input to the network is significant during the prediction process.

There are many types of neural network activation functions, but in this context we will only discuss the **sigmoid activation function**.
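As a quick sketch (written in plain NumPy rather than TensorFlow’s built-in version), the sigmoid function g(z) = 1 / (1 + e^(-z)) squashes any real number into the range (0, 1), which is why its output can be read as a probability:

```python
import numpy as np

def sigmoid(z):
    # Maps any real input to the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))    # 0.5: an input of zero sits exactly in the middle
print(sigmoid(10.0))   # close to 1: large positive inputs saturate toward 1
```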

Let’s take a look at how a layer of neurons works. In the demand prediction example, four input features feed a hidden layer of three neurons (suppose there are 2 hidden layers), which then sends its output to an output layer with just one neuron. Let’s zoom into hidden layers 1 and 2 and the output layer to look at their computations.
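In notation (a sketch of the standard formulation, where the superscript indexes the layer and the subscript indexes the neuron), each neuron in layer 1 computes one number from the input vector, and the three numbers together form that layer’s output:

```latex
a^{[1]}_j = g\left(\vec{w}^{[1]}_j \cdot \vec{x} + b^{[1]}_j\right), \qquad
\vec{a}^{[1]} = \left[a^{[1]}_1,\; a^{[1]}_2,\; a^{[1]}_3\right]
```

Here g is the sigmoid activation function, and the vector a⁽¹⁾ becomes the input to the next layer.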

**Tensorflow Implementation**

TensorFlow is a great option for developing powerful deep learning applications. While PyTorch is another popular framework, the focus in this specialization will be on understanding and using TensorFlow. We’ll look at how TensorFlow operates and the various scenarios in which it can be used to enhance computer intelligence.

When you’re roasting coffee, you have control over two things: the temperature at which you roast the raw coffee beans and how long you roast them. In our simplified example, we’ve made datasets with different temperatures, durations, and labels indicating whether the coffee turned out good. The positive labels (y equals 1) mean it’s good coffee, and negative labels mean it’s bad coffee. Picture it like this: if you roast at too low a temperature or for too short a time, the beans are undercooked. On the flip side, if you roast for too long or at too high a temperature, the beans get overcooked and burnt, not making good coffee. Only the points within a specific range, like a little triangle, represent the sweet spot for good-tasting coffee.

We’re going to set x (the input layer) to be an array of two numbers: the input features 200 degrees Celsius and 17 minutes.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense

x = np.array([[200.0, 17.0]])   # 1 example: 200 degrees Celsius, 17 minutes

layer_1 = Dense(units=3, activation='sigmoid')
a1 = layer_1(x)                 # activations of the hidden layer

layer_2 = Dense(units=1, activation='sigmoid')
a2 = layer_2(a1)                # output activation, a probability in (0, 1)

# Threshold the probability to get a binary prediction
if a2 >= 0.5:
    yhat = 1
else:
    yhat = 0
```

**Building a neural network**

Imagine you have a bunch of information about coffee roasting, like temperatures and durations. To make sense of it, you can organize this data into a table, where each row represents a different set of conditions. In this case, it’s like having a table with four rows and two columns, where each row tells you specific details about how long and at what temperature the coffee was roasted. The labels, which tell you whether the coffee turned out good or bad, can be arranged in another list. So, you end up with a pair: the table of details (X), and a list of labels (Y) that says if the coffee was good or not. This setup helps the computer learn patterns and make predictions about roasting coffee.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

layer_1 = Dense(units=3, activation='sigmoid')
layer_2 = Dense(units=1, activation='sigmoid')
model = Sequential([layer_1, layer_2])

# OR, equivalently, define the layers inline:
model = Sequential([Dense(units=3, activation='sigmoid'),
                    Dense(units=1, activation='sigmoid')])

x = np.array([[200.0, 17.0],
              [120.0, 5.0],
              [425.0, 20.0],
              [212.0, 18.0]])   # 4 examples, 2 features each
y = np.array([1, 0, 0, 1])     # 1 = good coffee, 0 = bad coffee

model.compile(...)  # more about this in the next article
model.fit(x, y)     # more about this in the next article
```

**Forward propagation in a single layer**

When the input data is sent across a network in a forward direction to produce an output, this process is known as forward propagation. After being accepted by a hidden layer, the data is processed according to the activation function and passed on to the next layer.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([200.0, 17.0])     # a single example as a 1-D vector

# a1_1 = g(w1_1 . x + b1_1)
w1_1 = np.array([1, 2])         # illustrative values, not learned ones
b1_1 = np.array([-1])
z1_1 = np.dot(w1_1, x) + b1_1
a1_1 = sigmoid(z1_1)

# repeat the same computation for the other two neurons in the layer
w1_2, b1_2 = np.array([-3, 4]), np.array([1])
w1_3, b1_3 = np.array([5, -6]), np.array([2])
a1_2 = sigmoid(np.dot(w1_2, x) + b1_2)
a1_3 = sigmoid(np.dot(w1_3, x) + b1_3)

a1 = np.array([a1_1, a1_2, a1_3])

# General implementation: a_in is the previous layer's output,
# W stores one column of weights per neuron, b one bias per neuron
def dense(a_in, W, b):
    units = W.shape[1]
    a_out = np.zeros(units)
    for j in range(units):
        w = W[:, j]
        z = np.dot(w, a_in) + b[j]
        a_out[j] = sigmoid(z)
    return a_out

# Forward propagation: each layer's output feeds the next layer
def sequential(x):
    a1 = dense(x, W1, b1)
    a2 = dense(a1, W2, b2)
    a3 = dense(a2, W3, b3)
    a4 = dense(a3, W4, b4)
    f_x = a4
    return f_x
```

*Next Article is about neural network training.*

*Happy Machine Learning..!*