The Fundamentals of Neural Networks: A Comprehensive Tutorial Without Internet or GPUs

Ben Lahner
3 min read · Dec 18, 2023


Find the Fundamentals of Neural Networks tutorial on GitHub here!

The Motivation

A few years ago I started volunteering with The Educational Justice Institute at MIT to teach Python programming to inmates around New England. Many of my students fell in love with Python and wanted to learn more advanced material, but they faced three main challenges:

1) Advanced material often requires internet access (e.g., documentation, package downloads, Stack Exchange).

2) TEJI and the correctional facilities do not have enough resources to teach an advanced course for only a couple of students.

3) Computer hardware and computer access are limited in correctional facilities. Students must often work on outdated laptops within specific timeframes.

The Solution

I chose to make a tutorial on neural networks because it was interesting to my students, a great follow-up to an introductory Python course, and a fun introduction to the power of machine learning. However, from my own experience learning about neural networks, a student often must piece together dozens of incompatible tutorials to make sense of a single concept. I thought a single, unified tutorial on neural networks was lacking.

I made sure this self-guided tutorial does not require internet access after the initial material download and does not depend on GPUs. The tutorial is divided into five parts: (1) neural network overview, (2) neural network math, (3) coding a multi-layer perceptron (MLP) in NumPy, (4) coding an MLP in PyTorch, and (5) coding a convolutional neural network (CNN) in PyTorch. The variables used in the part 2 math explanation are kept the same in the part 3 NumPy MLP implementation. Parts 3, 4, and 5 all share the common task of MNIST classification but differ in framework (NumPy or PyTorch) and neural network architecture (MLP or CNN). In this way, new concepts are introduced while the old are being reinforced.
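To give a flavor of the NumPy portion: the core of an MLP is just matrix multiplications and element-wise nonlinearities, which run fine on any CPU. The sketch below is an illustrative forward pass through an assumed 784 → 64 → 10 network for flattened 28×28 MNIST digits; the layer sizes and variable names are my assumptions here, not necessarily the tutorial's exact code.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    # element-wise rectified linear unit
    return np.maximum(0, z)

def softmax(z):
    # subtract the row-wise max for numerical stability
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# weights for an assumed 784 -> 64 -> 10 network
W1 = rng.normal(0, 0.1, (784, 64))
b1 = np.zeros(64)
W2 = rng.normal(0, 0.1, (64, 10))
b2 = np.zeros(10)

x = rng.normal(0, 1, (32, 784))   # a fake batch of 32 flattened images
h = relu(x @ W1 + b1)             # hidden-layer activations
probs = softmax(h @ W2 + b2)      # class probabilities, shape (32, 10)
```

Because every row of `probs` sums to 1, the output can be read directly as a probability over the ten digit classes. No GPU, no internet, and nothing beyond NumPy is needed.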

While I believe this tutorial addresses many challenges that existing neural network tutorials have, it is by no means the be-all and end-all tutorial. I supplement each of the five parts with other resources that I benefited from when learning: book chapters, math refreshers, PyTorch documentation, MLP tutorials, blog posts, and more. If you know of a great tutorial or helpful resource you think I should add, please leave a comment!

What makes this tutorial different from other tutorials?

  • Free
  • No dependence on GPUs or internet
  • Consistent variables between math and NumPy code
  • Heavily commented code
  • References to additional offline resources
  • NumPy code is directly comparable with PyTorch code to give a deeper understanding of what’s going on behind the scenes in PyTorch dataloaders, backpropagation, batching and more.
  • Weight and activation extraction and visualization in NumPy and PyTorch for both MLPs and CNNs
  • The three coding portions all share a common goal of MNIST classification but differ in framework (PyTorch or NumPy) and neural network architecture (MLP or CNN).
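As a rough illustration of the NumPy-to-PyTorch comparison: the same kind of MLP can be expressed in a few lines of PyTorch, where `nn.Linear` replaces the hand-written matrix math and autograd replaces hand-derived backpropagation. The architecture below is an assumption for illustration, not necessarily the tutorial's exact model, and it runs entirely on CPU.

```python
import torch
import torch.nn as nn

# an assumed 784 -> 64 -> 10 MLP for 28x28 MNIST digits
model = nn.Sequential(
    nn.Flatten(),        # (N, 1, 28, 28) -> (N, 784)
    nn.Linear(784, 64),
    nn.ReLU(),
    nn.Linear(64, 10),   # 10 digit classes
)

x = torch.randn(32, 1, 28, 28)            # a fake batch of MNIST-sized images
labels = torch.randint(0, 10, (32,))      # fake labels for illustration
logits = model(x)                         # shape (32, 10)
loss = nn.CrossEntropyLoss()(logits, labels)
loss.backward()                           # autograd fills in all gradients
```

Seeing this side by side with the NumPy version makes it clear what PyTorch is doing behind the scenes: the `backward()` call computes exactly the gradients that the NumPy part of the tutorial derives by hand.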

How to use this tutorial

In a correctional facility setting, I think this tutorial can be most efficiently used if paired with a weekly 1-hour check-in with a volunteer tutor to check progress and answer any questions. It can also be a great aid for those looking to brush up on their neural network concepts or to supplement other academic course material.

The challenges this tutorial addresses are not unique to correctional facilities: many people have unreliable internet access or outdated computers and would benefit from a comprehensive offline resource. Now more people can develop these fundamental computer science skills on the train commuting to work, on an airplane, or simply at home offline.

You can access the entire tutorial — code, walkthrough PDF, material downloads — at this GitHub link!


Written by Ben Lahner

I enjoy tackling questions that appear impossible to answer. Current PhD student @MIT. Website: https://blahner.github.io/