In this article, we will look at generating new artistic images using a deep learning technique called Neural Style Transfer.
The idea behind Neural Style Transfer is that, instead of optimizing a cost function to learn a set of parameter values for a network, we optimize the cost function with respect to the pixel values of an image.
We proceed by merging two images, a “content” image (C) and a “style” image (S), to create a “generated” image (G). This generated image is the final artistic output.
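As a rough sketch of how the optimization is set up, the cost of the generated image is usually a weighted sum of a content cost (how far G's activations are from C's) and a style cost (how far G's Gram matrices are from S's). The toy numpy code below illustrates the cost structure only; a real implementation would compute these on CNN activations (e.g. from VGG), and the weights `alpha` and `beta` here are illustrative assumptions, not fixed values.

```python
import numpy as np

def content_cost(a_C, a_G):
    """Mean squared difference between content and generated activations."""
    return np.mean((a_C - a_G) ** 2)

def gram(a):
    """Gram matrix of activations shaped (channels, positions):
    it captures which channels fire together, i.e. the 'style'."""
    return a @ a.T

def style_cost(a_S, a_G):
    """Mean squared difference between the style and generated Gram matrices."""
    return np.mean((gram(a_S) - gram(a_G)) ** 2)

def total_cost(a_C, a_S, a_G, alpha=10.0, beta=40.0):
    """J(G) = alpha * J_content(C, G) + beta * J_style(S, G)."""
    return alpha * content_cost(a_C, a_G) + beta * style_cost(a_S, a_G)
```

In practice, G is initialized (often from noise or from C) and its pixels are updated by gradient descent on `total_cost`, while the network weights stay frozen.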
The problem this network solves is the following: suppose we have only a very limited number of images per person as training examples, and we still need to detect faces and identify each person correctly.
The idea behind this network is that, rather than learning features and identifying the person directly, we learn the differences between the features of one image and another, and train the network on those differences.
In Triplet Loss we look at three images at a time: an anchor image, a positive (+ve) image, i.e. another image of the same person, and a negative (-ve) image, i.e. …
The need for residual networks arose after AlexNet, the first CNN-based architecture to win the ImageNet 2012 competition; subsequent winning architectures added more and more layers to reduce the error rate. The problem they faced was that, once the network became very deep, beyond a certain number of layers the error rate started increasing rather than decreasing.
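Residual networks address this by adding identity "skip connections" that let each block learn a residual F(x) on top of its input, so a block can fall back to the identity and gradients can flow directly through the shortcut. A minimal numpy sketch of one residual block, using two linear layers as a stand-in for the convolutional layers of a real ResNet:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    """out = relu(F(x) + x), where F is two stacked linear layers.

    The '+ x' identity shortcut means that if F learns nothing
    (weights near zero), the block still passes x through unchanged,
    so adding more blocks cannot easily make training worse.
    """
    h = relu(W1 @ x)       # first transformation + nonlinearity
    f = W2 @ h             # second transformation (no activation yet)
    return relu(f + x)     # skip connection adds the input back
```

With zero weights the block reduces to `relu(x)`, which is exactly the fallback behavior that makes very deep stacks trainable.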
Bagging is a technique used in machine learning to make “weak classifiers” strong enough to make good predictions. We train many weak classifiers, let each make a prediction on our data, and then combine the results, either by averaging them or by picking the prediction made by the majority of these weak models. The key requirement is that the weak classifiers are independent: each predicts on its own, uninfluenced by the errors or predictions of the other classifiers.
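The independence comes from training each weak model on its own bootstrap sample (drawn with replacement). A toy numpy sketch of the idea, where `MidpointStump` is a deliberately weak, hypothetical classifier invented here for illustration; real implementations typically use decision trees (e.g. scikit-learn's `BaggingClassifier`):

```python
import numpy as np

def bootstrap_sample(X, y, rng):
    """Draw len(X) examples with replacement (a bootstrap sample)."""
    idx = rng.integers(0, len(X), size=len(X))
    return X[idx], y[idx]

class MidpointStump:
    """A deliberately weak learner: threshold halfway between class means."""
    def fit(self, X, y):
        # fall back to the overall mean if a bootstrap sample missed a class
        lo = X[y == 0, 0].mean() if (y == 0).any() else X[:, 0].mean()
        hi = X[y == 1, 0].mean() if (y == 1).any() else X[:, 0].mean()
        self.t = (lo + hi) / 2
        return self

    def predict(self, X):
        return (X[:, 0] > self.t).astype(int)

def bagging_predict(models, X):
    """Majority vote over independently trained weak classifiers."""
    votes = np.stack([m.predict(X) for m in models])
    return (votes.mean(axis=0) >= 0.5).astype(int)
```

Each stump sees a different resample of the data, so their individual errors tend to cancel out when the votes are combined.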
Boosting is an ensemble technique in which the weak…