- Roshan Nayak. "All About Activation Functions Used In Neural Networks." The linear activation function is used to make the necessary adjustments at the output layer of the network. (Aug 20)
- Roshan Nayak in Towards Data Science. "Focal Loss: A Better Alternative for Cross-Entropy." Focal loss is said to perform better than cross-entropy loss in many cases. But why cross-entropy loss fails, and how focal loss addresses… (Apr 26, 2022)
- Roshan Nayak in CodeX. "Hands-on Data Augmentation in NLP Using the NLPAUG Python Library." A sentence with the same meaning can be written in multiple ways. (Apr 7, 2022)
- Roshan Nayak. "Random Forest: An Ensembling Technique in Scikit-Learn." Teamwork has proven to produce better decisions than an individual. (Jun 29, 2020)
- Roshan Nayak in The Startup. "Word2Vec in Practice for Natural Language Processing." Before diving into Word2Vec, we have to know what word embedding actually is and why it is required. So let's get started. (May 29, 2020)
- Roshan Nayak in The Startup. "Data Augmentation in Natural Language Processing." Have you played with augmented images before? Augmenting images makes your model generalize and perform much better by exposing it to more data… (May 18, 2020)
- Roshan Nayak in deep code life. "Insights on a Memory-Efficient Linked List." In today's world, it is very much necessary to make proper use of resources, to spend less and gain more out of anything… (Mar 24, 2020)