More from Shubham Koli:

NLP Feature Extraction Techniques Every Data Scientist Should Know: Introduction on Feature Extraction (Apr 4, 2023)
What is Parametric ReLU? (in Towards AI): Rectified Linear Unit (ReLU) is an activation function in neural networks. It is a popular choice among developers and researchers because… (Mar 11, 2023)
The Dying ReLU Problem, Causes and Solutions. Part 1: Keep your neural network alive by understanding the downsides of ReLU (Mar 10, 2023)
“Activation Functions” in Deep Learning Models. How to Choose?: Sigmoid, tanh, Softmax, ReLU, Leaky ReLU EXPLAINED!!! (Mar 9, 2023)
Decision Trees: A Complete Introduction With Examples (Feb 27, 2023)
How to Evaluate the Performance of Clustering Algorithms Using the Silhouette Coefficient: Mathematical formulation, finding the optimum number of clusters, and a working example in Python (Feb 21, 2023)
L1 and L2 Regularization Methods, Explained From Scratch: L1 and L2 regularization are the best ways to manage overfitting and perform feature selection when you’ve got a large set of features. (Jan 13, 2023)