TensorFlow AI model for RGB color recognition (Part 2).


Hi everyone! As mentioned in the previous article, TensorFlow AI model for RGB color recognition (Part 1), in this article I’ll explain how to:

  1. Build an artificial neural network (ANN) machine learning model that classifies RGB colors.
  2. Fit and train the ANN model.
  3. Evaluate the accuracy of the model.

Build an artificial neural network (ANN)

First, we need to read our input and output datasets and convert them to Tensor objects. You’ll see I have some custom functions to read the datasets.

You can read them however you like; just be sure they end up stored in ‘pandas.DataFrame’ variables.

In our case, the custom functions to read the datasets use the pandas ‘read_csv()’ function:

# Read train and test datasets
train_input_dataset: pd.DataFrame = datasets.read_colors_training_input_file()
train_output_dataset: pd.DataFrame = datasets.read_colors_training_output_file()

# Convert DataFrame objects to Tensor objects
train_input_tensor: tf.Tensor = tf.convert_to_tensor(train_input_dataset)
train_output_tensor: tf.Tensor = tf.convert_to_tensor(train_output_dataset)
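
In case you’re curious, a reading function like these could look as follows. This is only a minimal sketch: the file path and column names are assumptions, since the real helpers live in my ‘datasets’ module.

import pandas as pd

def read_colors_training_input_file() -> pd.DataFrame:
    # Hypothetical path and column names; adjust to your own CSV layout
    return pd.read_csv('data/colors_train_input.csv',
                       usecols=['red', 'green', 'blue'])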

Once we have our training data in Tensor objects, we’ll add a preprocessing layer that normalizes continuous features:

from keras.layers import Normalization

# Normalizer layer
input_normalizer = Normalization(axis=-1)

# Compute the mean and variance of the values in the dataset
input_normalizer.adapt(train_input_tensor)
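
To get a feel for what the layer does, you can adapt it to a tiny sample and inspect the output. A quick sketch (the RGB values are made up):

import numpy as np
from keras.layers import Normalization

# Three made-up RGB samples
sample = np.array([[255.0, 0.0, 0.0],
                   [0.0, 255.0, 0.0],
                   [0.0, 0.0, 255.0]], dtype='float32')

demo_normalizer = Normalization(axis=-1)
demo_normalizer.adapt(sample)

# Each feature now has (approximately) zero mean and unit variance
print(demo_normalizer(sample))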

ANN creation

For the creation of our ANN, we’ll use the ‘Sequential’ model, which is useful for stacking layers where each layer has exactly one input tensor and one output tensor.

Layers are functions with a known mathematical structure that can be reused and have trainable variables.

Most TensorFlow models are composed of layers, and ours is no exception. In our case we’ll use a ‘Normalization’ layer and some ‘Dense’ layers:

import tensorflow as tf
from keras import Sequential, layers, losses, optimizers, regularizers

# Sequential model creation with Dense layers:
ann_model: Sequential = Sequential([input_normalizer,
                                    layers.Dense(16, kernel_regularizer=regularizers.l2(0.001), activation='relu'),
                                    layers.Dense(128, kernel_regularizer=regularizers.l2(0.001), activation='relu'),
                                    layers.Dense(64, kernel_regularizer=regularizers.l2(0.001), activation='relu'),
                                    layers.Dense(16, kernel_regularizer=regularizers.l2(0.001), activation='relu'),
                                    layers.Dense(12)])
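
If you want to check the resulting architecture, Keras can print a layer-by-layer overview. Note that a Sequential model must be built first, for example by calling it on a sample:

# Build the model by running one sample through it, then inspect it
ann_model(train_input_tensor[:1])
ann_model.summary()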

Normalization: a preprocessing layer which normalizes continuous features.

Dense layer: each neuron in a layer receives input from every neuron in the previous layer, which is why it is called a densely connected layer.

In other words, a Dense layer is fully connected: all the neurons in one layer are connected to those in the next layer.

In our ANN model, each Dense layer will have:

  • A number of neurons (units).
  • A regularizer.
  • ReLU as the activation function.
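
Under the hood, a Dense layer computes output = activation(dot(input, kernel) + bias). A tiny sketch to see the shapes involved (the input values are made up):

import tensorflow as tf
from keras import layers

# One made-up input sample with 3 features (e.g. an RGB triple)
x = tf.constant([[0.5, 0.2, 0.9]])

# Every one of the 4 output units is connected to all 3 inputs
dense = layers.Dense(4, activation='relu')
print(dense(x).shape)  # (1, 4)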

But what are regularizers and activation functions?

Regularizer

Regularizers allow you to apply penalties on layer parameters or layer activity during optimization. This regularization helps reduce model overfitting.
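
For example, with ‘regularizers.l2(0.001)’ each Dense layer adds a penalty of 0.001 * sum(weights ** 2) to the total loss, which discourages large weights. A quick sketch of the idea (the weights are made up):

import tensorflow as tf

def l2_penalty(weights: tf.Tensor, factor: float = 0.001) -> tf.Tensor:
    # Same quantity the l2 regularizer adds to the loss
    return factor * tf.reduce_sum(tf.square(weights))

weights = tf.constant([0.5, -1.0, 2.0])
print(l2_penalty(weights))  # 0.001 * (0.25 + 1.0 + 4.0) = 0.00525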

ReLU as activation function.

The Rectified Linear Unit (ReLU) applies the rectified linear unit activation function, and is the most commonly used activation function in deep learning. The function returns 0 if the input is negative, but for any positive input, it returns that value back.
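
In code, ReLU is simply max(0, x) applied element-wise:

import tensorflow as tf

x = tf.constant([-2.0, -0.5, 0.0, 1.5, 3.0])
print(tf.nn.relu(x))  # [0.  0.  0.  1.5 3. ]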

Fit and train

To do that, we need to do two things: compile and fit our ANN model.

Before fitting the model, we have to compile it, specifying the training configuration. This configuration will include:

  • Optimizer.
  • Loss function.
  • Metrics.

# Adam optimizer
optimizer = optimizers.Adam(learning_rate=0.01)

# Categorical Crossentropy loss function
loss_function = losses.CategoricalCrossentropy(from_logits=True)

# Compile the model
ann_model.compile(optimizer=optimizer,
                  loss=loss_function,
                  metrics=['accuracy'])

Optimizer.

For this we’ll use ‘Adam’, which implements the Adam algorithm and lets us set a learning rate.

Adam is an adaptive learning rate algorithm designed to improve training speeds in deep neural networks and reach convergence quickly.

Loss function.

The ‘CategoricalCrossentropy’ loss function is used to compute the loss between actual and predicted class labels. We choose it because our categorical class labels are in one-hot encoding format, and we set ‘from_logits=True’ because the model’s last layer outputs raw logits rather than probabilities.
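
To see the mechanics, here is the loss computed on a single made-up example, where the label is one-hot and the prediction is raw logits (hence ‘from_logits=True’):

import tensorflow as tf
from keras import losses

# One-hot label for class 2 (out of 3 classes, made-up example)
y_true = tf.constant([[0.0, 0.0, 1.0]])
# Raw logits from the model (no softmax applied yet)
y_pred = tf.constant([[0.5, 1.2, 3.0]])

loss_fn = losses.CategoricalCrossentropy(from_logits=True)
print(loss_fn(y_true, y_pred))  # small value: the model favors class 2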

Metrics.

The accuracy metric calculates how often predictions equal labels.

Once we have compiled our model, let’s fit it with the input and output tensors:

# Model fitting
ann_model.fit(x=train_input_tensor,
              y=train_output_tensor,
              validation_split=0.2,
              batch_size=64,
              epochs=500,
              shuffle=True)

  • x: Input data.
  • y: Output (target) data.
  • validation_split: Float between 0 and 1. Fraction of the training data to be used as validation data.
  • batch_size: Number of samples per gradient update. If unspecified, it defaults to 32.
  • epochs: Number of iterations over the entire ‘x’ and ‘y’ data provided.
  • shuffle: If ‘True’, shuffle the training data before each epoch.
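
By the way, ‘fit()’ returns a History object; if you capture it, you can compare training and validation accuracy per epoch, which is handy for spotting overfitting. A short sketch reusing the call above:

# Capture the History object returned by fit()
history = ann_model.fit(x=train_input_tensor,
                        y=train_output_tensor,
                        validation_split=0.2,
                        batch_size=64,
                        epochs=500,
                        shuffle=True)

# Final training vs. validation accuracy
print(history.history['accuracy'][-1], history.history['val_accuracy'][-1])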

Evaluate the accuracy

Evaluate the model like this:

# Evaluate model training
ann_model.evaluate(train_input_tensor,
                   train_output_tensor,
                   verbose=2)

The ‘verbose’ parameter specifies how progress output is displayed while the model runs. It can be “auto”, 0, 1, or 2:

  • 0 = silent.
  • 1 = progress bar.
  • 2 = single line.
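
Since we compiled with the accuracy metric, ‘evaluate()’ returns both the loss and the accuracy, which you can capture directly:

# evaluate() returns [loss, accuracy] given our compile() configuration
loss, accuracy = ann_model.evaluate(train_input_tensor,
                                    train_output_tensor,
                                    verbose=2)
print(f'Loss: {loss:.4f} - Accuracy: {accuracy:.4f}')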

Before executing the training, it would be nice to include a bit of code that saves our model when it finishes:

# Save the entire model as a `.keras` zip archive.
ann_model.save('my_colors.keras')
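
Later, you can load the saved model back and use it for predictions. A minimal sketch (the sample RGB value is made up):

import tensorflow as tf

# Load the saved model and classify a made-up RGB sample
loaded_model = tf.keras.models.load_model('my_colors.keras')
logits = loaded_model(tf.constant([[128.0, 0.0, 255.0]]))
print(tf.argmax(logits, axis=1))  # index of the predicted color class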

And now, the moment we’ve been waiting for: execute the Python file containing the model. If you see something like the image below, congratulations, your model is training!

Output when training the model.

That’s all for this article! I hope it was useful and interesting.

Please clap if you liked the article!
