Harsh Yadav in Towards Data Science

- Dropout in Neural Networks (Pinned) · Jul 5, 2022
  Dropout layers have been the go-to method to reduce the overfitting of neural networks. It is the underworld king of regularisation in the…
- Residual Blocks in Deep Learning · Jul 11, 2022
  Residual block, first introduced in the ResNet paper, solves the neural network degradation problem.
- Computer Vision: Convolution Basics · Jul 5, 2022
  A deep dive into the basic cell of many neural networks.
- Preserving Data Privacy in Deep Learning | Part 3 · Aug 11, 2020
  Implementation of Federated Learning with a non-independent and identically distributed (non-IID) dataset.
- Preserving Data Privacy in Deep Learning | Part 2 · Jul 14, 2020
  Distribution of a balanced dataset into a non-IID/real-world dataset, further divided into clients for federated learning.
- Preserving Data Privacy in Deep Learning | Part 1 · Jul 6, 2020
  Understanding the basics of Federated Learning and its implementation using PyTorch.