# Open Source, Live, Notebook-Based Teaching & Learning

## Introducing KlassLive

Before COVID-19 made Zoom-teaching synonymous with, well, teaching, the future of education appeared to consist of (a) uploading videos online, and (b) sitting back and letting people watch them. The most prominent examples have been MOOCs: massive open online courses, designed to expand reach by orders of magnitude to thousands of new learners. While MOOCs have dramatically expanded access to all sorts of content, it’s hard to find any given learner getting more out of the MOOC format than out of more traditional, personal methods.

In that sense, Zoom and MOOCs have a few lessons to share. First, live-learning eats pre-recorded-learning’s lunch. Interacting with people matters. Second, interactive-learning eats passive-learning’s supper. Doing stuff rather than passively listening really matters. Combining the two is especially powerful. Organizations such as Lambda School and Stanford’s Code in Place have realized that being online does not necessarily mean asynchronous, pre-recorded, and passive. …

# Solving Sparse Matrix Systems in Rust

## And introducing sparse21 on crates.io

This chapter of Software Makes Hardware dives into two topics:

• Part 1 introduces sparse-matrix solvers: where they’re needed, how they work, and the popular algorithms & data structures that make them efficient.
• Part 2 provides a short intro to the Rust language, and surveys its pros & cons for the problem at hand.

All of the code shown here is available in the sparse21 crate on crates.io, Rust’s package registry.

# Part 1: Sparse Matrix Solvers

Tons of physical problems boil down to solving systems of linear equations. In a past chapter of SwMakesHw, we introduced one such problem: (analog) circuit simulation. It can be broken into two areas: (a) linearizing non-linear systems of equations, and (b) solving those linear systems. …
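The core of part (b) can be sketched in a few lines. Here is a minimal dense Gaussian-elimination solver in Rust; it is a hypothetical illustration rather than sparse21’s actual API, and it shows the elimination and back-substitution steps that a sparse solver performs while skipping the (many) zero entries.

```rust
// Solve A*x = b by Gaussian elimination (dense, no pivoting — a sketch only).
fn solve(mut a: Vec<Vec<f64>>, mut b: Vec<f64>) -> Vec<f64> {
    let n = b.len();
    // Forward elimination: zero out the entries below each pivot.
    for p in 0..n {
        for r in (p + 1)..n {
            let factor = a[r][p] / a[p][p];
            for c in p..n {
                a[r][c] -= factor * a[p][c];
            }
            b[r] -= factor * b[p];
        }
    }
    // Back substitution, from the last row upward.
    let mut x = vec![0.0; n];
    for r in (0..n).rev() {
        let sum: f64 = ((r + 1)..n).map(|c| a[r][c] * x[c]).sum();
        x[r] = (b[r] - sum) / a[r][r];
    }
    x
}

fn main() {
    // 2x + y = 5,  x + 3y = 10   =>   x = 1, y = 3
    let a = vec![vec![2.0, 1.0], vec![1.0, 3.0]];
    let b = vec![5.0, 10.0];
    println!("{:?}", solve(a, b)); // approximately [1.0, 3.0]
}
```

A real sparse solver stores only the nonzero entries and chooses pivots to limit fill-in, but the arithmetic at its core is exactly this.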

# Why Analog Lost

## Quantum Computing Probably Will Too

I’ve spent most of my career building analog stuff — primarily the innards of silicon chips. For most of the past decade that was at Apple, where I designed the analog parts of overwhelmingly digital processors which serve as the brain of every iPhone, iPad, and Apple TV.

Point being, I have a whole lot of personal incentive for analog to, well, matter. The value of my personal skill-set is pretty well tied to the value of analog electronics in general. So it would be well within my self-interest to back a growing trend, and predict a coming analog renaissance.

But I won’t. …

# Analog & Transistor-Level Simulation

Our last Software Makes Hardware chapter introduced the simulation model of most digital circuits: a combination of event-driven and reactive logic. Many circuits, including nearly everything analog and most described at the transistor level, don’t fit this paradigm. Predicting their behavior requires an entirely different view of how they work, which we’ll refer to as a different paradigm or model of computation.

This mode of circuit simulation is often referred to as SPICE, after the seminal program originally developed at UC Berkeley in the early 1970s. …
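To give a taste of that different view: SPICE-style simulators work by applying Kirchhoff’s current law at every node and solving the resulting system of equations. A minimal sketch in Rust, with hypothetical names and the simplest possible circuit (a resistor divider), looks like this:

```rust
// Nodal analysis on a resistor divider: vin --R1-- vout --R2-- ground.
// A sketch of the idea, not any real simulator's API.
fn divider_out(vin: f64, r1: f64, r2: f64) -> f64 {
    // Nodal analysis works in conductances (G = 1/R).
    let (g1, g2) = (1.0 / r1, 1.0 / r2);
    // Kirchhoff's current law at the output node:
    //   g1 * (vout - vin) + g2 * vout = 0
    // Solving for vout:
    g1 * vin / (g1 + g2)
}

fn main() {
    // 1k over 4k from a 5 V source: vout = 5 * 4000/5000 = 4 V
    println!("{} V", divider_out(5.0, 1000.0, 4000.0));
}
```

A full simulator builds and solves one such equation per circuit node (the matrix systems of the sparse-solver chapter), and iterates when the elements are nonlinear.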

# Event-Driven & Reactive Hardware

Past Software Makes Hardware chapters introduced a hardware abstraction ladder, including physical, structural, and behavioral layers. Here we’ll dig into the common patterns at the behavioral layer — which will look surprisingly familiar to users of modern, asynchronous, concurrent environments such as Node.js or Python’s asyncio.

Behavioral hardware programming — or as chip-folks call it, hardware *description* — came into vogue around the mid-1980s. The industry’s two most popular hardware description languages (HDLs), Verilog and VHDL, were both introduced in these early HDL days. While Verilog and VHDL both support the lower structural layer, their new contribution was the addition of the behavioral features described here. …
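The event-driven heart of an HDL simulator can be sketched in a few lines. Below is a hypothetical Rust illustration (not any real simulator’s internals): scheduled signal updates sit in a priority queue keyed by timestamp, and the kernel drains them in time order, much as Node.js drains its event loop.

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

// Replay (time, value) signal updates in timestamp order, the way an
// HDL simulator drains its event queue. A min-heap stands in for the scheduler.
fn replay(events: Vec<(u64, u8)>) -> Vec<(u64, u8)> {
    let mut queue: BinaryHeap<Reverse<(u64, u8)>> =
        events.into_iter().map(Reverse).collect();
    let mut trace = Vec::new();
    while let Some(Reverse(ev)) = queue.pop() {
        // In a real simulator, every process sensitive to this signal would
        // run here ("reactive" logic), possibly scheduling further events.
        trace.push(ev);
    }
    trace
}

fn main() {
    // Events scheduled out of order; the kernel visits them by time.
    println!("{:?}", replay(vec![(20, 0), (5, 1), (12, 0)]));
    // [(5, 1), (12, 0), (20, 0)]
}
```

The "reactive" half of the paradigm is the callback step: each popped event wakes the processes sensitive to that signal, which may in turn push new events onto the queue.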

# Can {Player X} Lead A Team To An NBA Title?

## A deep learning, big data analysis

Using the latest advances in machine learning and artificial intelligence, leading experts at the University of California, Berkeley* answer an age-old sports media question: Can {Player X} Lead a Team to an NBA Title?

A diagram of the neural network used to make these predictions:

Advances in AI interpretability also generated a semantic description of this model:

`Has {Player X} won the regular season MVP?`

# Yes, the AI is a Joke.

In case it didn’t land, everything about machine learning, AI, and interpretability here is a joke. (Although it may still work better than many other ML-led investigations.)

The real thesis: there is a single high-confidence predictor that {Player X} can lead a team to a championship: whether they have won the league’s regular-season MVP. …

# Models All The Way Down

In the first chapters of Software Makes Hardware, we introduced the basics of how electronics and silicon are generated (in code), and the primary languages used to design them.

One term we used regularly — with no introduction whatsoever — is model. This is a common term for hardware, which tends to mean very different things to different audiences. Here this term will be used in an atypical, and hopefully more valuable sense. (The authors of How to Read a Book would call this one of our key terms.)

Here, a model is a representation of a future piece of hardware. …

# Who Deserves the NBA’s Gold Patch?

For the 2014–15 season, the NBA introduced a gold patch to be worn on the back collar of the league’s past champions. Inspired by the World Cup, the patches are a visible reminder of just how hard it is to win an NBA championship. As of the 2019–20 season, only 18 of the league’s 30 teams wear the patch. The other 12 do not.

But League Pass aficionados will have a counterintuitive reaction to those numbers. It seems even fewer teams should wear them. Some just don’t look right:

Here we review each of the 30 teams’ claim to the patch. Based on two principles for patches, we find that six of the eighteen patches deserve to be revoked. Two teams particularly deserve this revocation. …

# The Languages of Hardware

Here’s a thought experiment: close your eyes and picture the office of a thriving software company. Say, Facebook or Google. What does it look like? You probably imagine a group of young people strewn about a bright, open office plan. Everyone has between one and ten large-screen monitors. And on those monitors, the one quantity you can be sure you’ll see: code.

Now let’s repeat this experiment, but replace the software group with a similarly successful team designing a chip. (Maybe it’s the same chip that “team A’s” code will run on.) What does this look like?

Many of us likely have a far less vivid idea. Is it the Intel clean-room? A high-tech manufacturing line? Are there robots? Maybe this looks more like a science lab, full of equipment taking measurements and readings of, well, something. …

# How Software Makes Hardware

It’s never been easier to get started in software. Creating new applications, products, companies, and even industries is easier than it has ever been. For the new programmer, information abounds on how to get started. (Even more is available to the veteran practitioner.) Software engineering has experienced a quantum leap in productivity, largely creditable to a series of modern, designer-centric programming languages and tools. A thriving open-source community provides a seemingly endless supply of production-tested, high-quality libraries and frameworks. Nearly all of our favorite software products use these free tools, to one extent or another, so much so that entire websites are dedicated to who uses what. Low-cost, easily accessible cloud-computing platforms can then deploy code worldwide, in some cases in just minutes. …