Feynman and TDD

Richard Feynman is smarter than you. And he wrote tests.

Bobby Grayson
5 min read · Mar 11, 2014

A setup

I have always been helplessly intrigued by the man known as Richard Feynman. My initial foray into this curious character’s work, like many others’, was one I will never forget. It was 8th grade. I was just getting into tech culture, my first attempts at running Linux had recently succeeded after I got Ubuntu 6.06 running on my home desktop (I requested the free discs online, which to this day is one of the best favors anyone has done me. But that’s another story.), and TechTV had not yet been ruined by Comcast. It was a wonderful time for me: fantasizing about WarSpying, WarDriving, hacking Windows components into a Mac Mini after Kevin Rose did the same, and other projects of the hacker ethos that took over my rebellious mind’s capabilities, starved as it was of stimulation from other sources. On a program known as The Screen Savers, for a reason I can’t remember, they referenced a physics problem:

A fisherman rowing his boat on a very small lake throws his anchor into the water. Does the water level of the lake rise, fall, or stay the same?

This is a classic example of a problem from Feynman’s Lectures on Physics and, as many of you know, to a teen just getting into science, technology, and the surrounding world, the answer is fascinating and, at first glance, unintuitive: the water level falls. While the anchor sits in the boat it displaces its weight in water, but once it sinks it displaces only its volume, and because iron is denser than water, the total displaced volume shrinks.
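
For anyone who wants to convince themselves, here is a rough back-of-the-envelope check in Python with made-up numbers (a hypothetical 10 kg iron anchor in fresh water); the exact figures are assumptions, and only the comparison matters:

```python
# Hypothetical numbers: a 10 kg iron anchor in fresh water.
mass = 10.0          # kg
rho_iron = 7870.0    # kg/m^3, approximate density of iron
rho_water = 1000.0   # kg/m^3, fresh water

# In the boat, the anchor is floating cargo: it displaces its weight in water.
displaced_in_boat = mass / rho_water      # ~0.0100 m^3

# On the lake bottom, it displaces only its own volume.
displaced_on_bottom = mass / rho_iron     # ~0.0013 m^3

# Less water is displaced once the anchor sinks, so the lake level falls.
print(displaced_in_boat > displaced_on_bottom)  # True
```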

After that introduction, I became a fiend for anything I could find from this guy’s lectures. I began to read every single one. So what does Feynman have to do with TDD, programming, or anything even tangentially related?

His Approach.

Big problems, simple solutions

Feynman was known for largely trusting 19th-century calculus as his tool for deducing what was happening in a physical system or describing some phenomenon. Yet his lectures and methods, especially the legendary Feynman diagram, are all extremely intuitive, simple to grasp, and easily explained. People are often surprised by the amount of complex mathematics that goes into those squiggly lines and the straight ones with half-arrows. But what let him make it so simple to grasp? Was he a genius? Of course. But he was a genius with a method.

In Danny Hillis’s essay about working with Feynman over a summer at his company, Thinking Machines, he described Feynman’s approach this way:

“For Richard, figuring out these problems was a kind of a game. He always started by asking very basic questions like, ‘What is the simplest example?’ or ‘How can you tell if the answer is right?’ He asked questions until he reduced the problem to some essential puzzle that he thought he would be able to solve.”

This method is clearly effective. The man was a Nobel laureate, solved some of the hardest problems quantum physics presented the world at the time, and used it to contribute to countless other industries, projects, and fields of science. There is something to be said for simplicity. Breaking a large problem into simple ones is exactly what any programmer does at the outset. After all, a computer lacks the context that the lovely network of neurons globbed together inside our skulls provides us.

Sometimes it isn’t fun

Sometimes TDD is a pain. You’ve got an idea. It’s sparked. You want to get into the console and experiment, play in IDLE, whatever. You get distracted, and soon a class has been authored in its entirety. Oops. Many developers go back and write the tests afterward; some don’t. But something tends to go missing when you take that approach. You may have broken many of the finer points down to their simplest pieces in your head, but had you actually gone piece by piece through a series of failing tests, you would almost always have caught more exceptions, more possibilities for failure, or gross oversights.
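
To make that rhythm concrete, here is a minimal sketch in Python’s unittest, using a deliberately tiny, hypothetical is_leap_year helper rather than anything from a real project. The century test is exactly the kind of case that tends to get caught only because a failing test demanded it:

```python
import unittest

def is_leap_year(year):
    # Grown rule by rule: each clause below was added only after a
    # failing test exposed the case it handles.
    if year % 400 == 0:
        return True
    if year % 100 == 0:
        return False
    return year % 4 == 0

class TestIsLeapYear(unittest.TestCase):
    def test_divisible_by_four_is_leap(self):
        self.assertTrue(is_leap_year(2012))

    def test_century_is_not_leap(self):
        # Written before the century rule existed in the code;
        # watching it fail is what surfaced the oversight.
        self.assertFalse(is_leap_year(1900))

    def test_divisible_by_four_hundred_is_leap(self):
        self.assertTrue(is_leap_year(2000))

if __name__ == "__main__":
    unittest.main()
```

Skip the failing-test step and the 1900 case is exactly the sort of thing that quietly ships broken.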

Feynman dealt with this too. Giving someone who understands calculus even a remote idea of the complex mathematics behind something as simple as a basic Feynman diagram, say an electron and a positron colliding to emit a photon, would take several hours of lessons at a bare minimum. We’re talking about going from this:

Some math actually written and scratched out by Dr. Feynman himself

To this:

An electron positron collision decaying into a photon, and then back to an electron positron pair

However, by breaking the process down into these “simplest ways something works or happens” throughout, he was able to create what is arguably the most intuitive method of illustrating complex physics in the modern day, accessible to any elementary-school child with a pencil, paper, and a teacher who understands the diagrams’ principles.

TDD: Do something somewhat annoying to create something beautiful

Well-tested code speaks. A new developer knows what will happen, where it will happen, and what they are getting into. Customers never see these tests, but the services built on them are generally quite reliable, and things don’t implode nearly as often. Everyone wins.

I guess my overarching point is this: some days I would rather fight a reasonably large pack of wolves than write another unit test. For the greater good, however, I find myself defeating that distaste and just doing it. Correlation may not imply causation, but I have become a much better developer the more I do this. It has forced me to fully understand what every move and change I make does, and what it impacts. Unforeseen side effects get caught, exceptions get handled, and I have far fewer “dear lord, how will I debug this” moments.

So, anyways, keep on testing, guys. It’s chicken soup for the coder’s soul.

If you enjoyed this, follow me on Twitter.
