Diving Into Deep Learning: The Beginning

You can call this the Hello World post of this series, or, in deep learning terms, the MNIST post.

But before I go on, let me introduce myself. My name is Ethan Tang and I am a 3rd-year Computer Engineering student at the University of Waterloo. This summer, I was given the opportunity to join the Deep Learning Infrastructure team at NVIDIA to work on the DIGITS project.

When I first arrived here, it was intimidating. I did not have much experience with deep learning; my knowledge barely extended past what a layperson knows. Deep learning is a subset of the broader field of machine learning, and it is built on networks loosely modeled on how the human brain works.
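To make the MNIST "Hello World" idea concrete, here is a minimal sketch of a tiny neural network that classifies MNIST digits. This is just an illustration using Keras; the layer sizes and hyperparameters are my own arbitrary choices and have nothing to do with how DIGITS itself is built.

```python
import tensorflow as tf

# Load the MNIST handwritten-digit dataset (60k training / 10k test images).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixel values to [0, 1]

# A tiny fully connected network: one hidden layer, ten output classes (digits 0-9).
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train for a few epochs, then check accuracy on the held-out test set.
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
```

Even a toy network like this reaches high accuracy on MNIST within a few epochs, which is exactly why the dataset has become the field's Hello World.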

It has been roughly six weeks since I started my summer internship at NVIDIA, and wow, have I learned a lot already. I wish I had started blogging earlier to keep track of what I know, but it is never too late to start writing. I feel genuinely grateful to be here, working with some of the best people in this field. At first it was intimidating to stand before the giants of the field, but I realized that to move up, you have to look up and find something to reach for.

The goal of this blog is for me to post, every week, what I have learned that week or before. I also hope that my explanations teach you, the readers, how these things work. At the same time, I want to record the challenges and pitfalls I ran into, so that you do not fall into the same traps I did.

Overall, I hope this series helps both you and me learn more about how deep learning works, because it truly is revolutionary.
