Hi, thank you very much for this, it helps me a lot in learning Keras.
Yoga Yudistira

Hi

One way to do so would be to keep the neural net with its existing weights BUT to remove the last (softmax) layer and replace it with a similar softmax layer sized for the required number of classes (one unit per face to classify). You would then freeze the weights of all layers except the last one and train the network on your dataset, so that only the last layer's weights are tuned.

This is well explained in Fast AI part 1 lesson 1 and lesson 2 (http://course.fast.ai/lessons/lesson1.html).
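The replace-and-freeze procedure described above can be sketched in Keras roughly as follows. Note that the toy "pretrained" network, the layer names and sizes, and the five-identity class count are all hypothetical stand-ins; in practice you would load a real pretrained model with its saved weights.

```python
# Minimal sketch of transfer learning in tf.keras: swap the softmax
# head and freeze everything else. All sizes/names are assumptions.
from tensorflow import keras

NUM_IDENTITIES = 5  # one softmax unit per face to classify (assumption)

# Stand-in for a pretrained network; in practice, load an existing
# model with its trained weights instead of building one from scratch.
inputs = keras.Input(shape=(128,))
x = keras.layers.Dense(64, activation="relu", name="features")(inputs)
old_out = keras.layers.Dense(1000, activation="softmax",
                             name="old_softmax")(x)
pretrained = keras.Model(inputs, old_out)

# Remove the old softmax head and attach a new one sized for our classes.
feats = pretrained.get_layer("features").output
new_out = keras.layers.Dense(NUM_IDENTITIES, activation="softmax",
                             name="new_softmax")(feats)
model = keras.Model(pretrained.input, new_out)

# Freeze every layer except the new softmax head, then compile; calling
# model.fit on the face dataset would tune only the last layer's weights.
for layer in model.layers:
    if layer.name != "new_softmax":
        layer.trainable = False

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

After freezing, only the new head's kernel and bias remain trainable, so training on even a small face dataset only has to fit that one layer.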
