Testing Software Using Phi Calculus

Getting beyond lambda with human-machine interaction

More math musings can be found here. For more information, please contact us at CodeCraft.AI

Mo Money Mo Problems

NYC can’t help but smile as Bay Area crypto-anarchists-turned-gazillionaires enumerate the differences between various pinot noirs and grow out the dreads so as not to offend their new Big Tech executive ‘teams’. Although banking might be new to the blockchain crowd, it isn’t new for banks, and the traps of seduction are well-known. Not long ago, the attraction of decentralization was to recognize how human frailties can corrupt even the best of intentions, much like getting into a relationship with that really hot blonde that you just know will end in disaster about 18 months out.

East vs West

This sort of distraction is only the latest example of how the aristo-kitty community should be wary of getting played, and Eastern Europe knows these sorts of games well. Silicon Valley has been fighting for control of Wall Street’s money pot for a long time, but now that central banks have added west coast tech to the exclusive TBTF club, Wall Street realizes that special treatment after 2008 can no longer be taken for granted.

Another rivalry that didn’t end well

Decentralization is only the latest in a series of existential threats: Facebook and Google cleaned out Madison Avenue. Amazon gobbled up Fifth Avenue. Twitter replaced the Gray Lady. Uber even took out the yellow cabs.

So how does Wall Street intend to respond to this threat? Apparently by building a death ray:

Banks are certainly into some strange stuff these days. Before the advent of quantum computing, discussions about space-time would raise questions from HR about drug use within the dev team. Nowadays, it is fairly common for banks to have physicists in senior management working diligently to protect grandma from rogue nation states by blasting her pension through a 16-mile ring of various hell-spawn, or making sure her ATM PIN doesn’t get entangled with nuclear launch codes from Diego Garcia. Or something like that. And you thought banking was dull.

CERN bank stress tests

Strange Duality of Code and Data

Which brings us to today’s topic. Since the days of client/server architecture, Silicon Valley has accepted the idea that code and data were divorced — they lived on separate hardware and seldom saw each other, except maybe at the country club. Life is simply cleaner that way, especially considering that painful love affair with OODBMS tech (mostly a rehash of IMS) that ended very badly for unexplained reasons — so badly in fact that said systems were largely banished to trading systems (hint) never to be heard from again. But then dynamic languages came along — particularly a hot mess called JavaScript that left clothes strewn all over the floor and was regarded as a scandalous affair.

If we accept that programming can be boiled down to executing a series of machine state transitions, we can see how the software industry has roughly aligned itself accordingly into either the red “data” or blue “code” world-views:

The Feynman view of software

While programmers and Ethereum virtual machines love code, the business/money community doesn’t trust it as far as they can throw it — they want to see the data. If only things were this easy. Einstein demonstrated that “state” is relative to the observer and even Alan Turing had to admit you might be dragging down the clock speed of the entire universe whenever you fire up your favorite debugger. At least you have a darn good excuse for not using one.

The Persistence of Memory

Curry-Howard-Lambek was destined to be a skunk at the garden party. It was only a matter of time before in-memory databases would expose the industry’s treatment of program “memory” as primitive at best — suitable for late-1960s hardware but only marginally better than banging rocks together. East coast ruffians began floating around heretical proofs that boiled modern software down to a Sophie’s Choice: either (1) write an elegant, rigid, typesafe program that literally can’t do anything, or (2) navigate a minefield of never-ending runtime exceptions.

San Francisco keeps a watchful eye on its resident bad boy

Curry-Howard armed NYC executives with proof that the preponderance of crappy brittle software throughout the industry implied junk science was to blame somewhere, even if they couldn’t put a finger on it. The Great Financial Crash taught accountants a valuable lesson — that fancy mathematics should no longer get a free pass if you want to avoid bankruptcy. Suddenly, even Haskell developers were starting to feel a bit exposed.

Pure logical thinking cannot yield us any knowledge of the empirical world; all knowledge of reality starts from experience and ends in it
— Albert Einstein

The Software Calculus Food Chain

Most of the debate can be traced to the below diagram:

The post von Neumann scene

Basically the pink line is where the industry shifts from code-orientation to data-orientation. Silicon Valley prefers to be on bottom and NYC prefers to be on top.

Lambda (𝜆) Calculus

The west coast has a bit of an Achilles heel if you know where to look (which NYC discovered the hard way). Despite the best efforts of PL/PX innovators, most thinking is still built around steam-era lambda calculus — largely unchanged since Alonzo Church in the 1930s. If you stare at enough code, the rigid influence becomes clear — whenever Silicon Valley cooks up another “new and improved” programming language, it’s usually a rehash of something else with the tabs, brackets and parentheses switched around and maybe some new Unicode characters added. It’s about squeezing more miracles out of elaborate Jacquard punchcards while at the same time acknowledging a fundamental frustration that it just isn’t enough. SQL developers will immediately recognize the issue: it’s like staring at a massive list of INSERT, UPDATE, and DELETE statements and trying to imagine what the database contains after it runs. Good luck with that.

The man from Budapest looks askance

Despite decades of pipelining, von Neumann chips still have no notion of “transaction”, only an unintelligent clock that fires while the CPU blindly advances to the next instruction. Without an undo option, the code had therefore better be correct, almost to the point of cargo cult. At the same time, letting data drift around in memory with just an internal address handle (and maybe some key/value stores) is about as flimsy as it gets. So the lambda folks understandably regard data as toxic. After all, you can’t have side effects when you have no way to back them out. As such, most monadic ‘functional programming’ essentially boils down to repurposing the call stack as a basic undo/rollback segment. Taking FP any further is just a closet affair with its alluring but psychotic sister — OO — the only difference being how the ‘this’ parameter is sneaked behind the monad when no one is looking.
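To make the rollback point concrete, here is a minimal JavaScript sketch (the ledger example and names are invented for illustration): because the function never mutates its input, an exception that unwinds the stack leaves the old state intact, which is about as much of a rollback as the lambda world gets.

    // Invented example: the call stack doubling as an undo/rollback segment.
    function transfer(ledger, from, to, amount) {
      if ((ledger[from] || 0) < amount) throw new Error("insufficient funds");
      // Never mutate the input; hand back a brand-new ledger instead.
      return { ...ledger, [from]: ledger[from] - amount, [to]: (ledger[to] || 0) + amount };
    }

    let ledger = { alice: 100, bob: 0 };
    try {
      ledger = transfer(ledger, "alice", "bob", 250); // throws
    } catch (e) {
      // The old object was never touched, so unwinding the stack *is* the rollback.
      console.log(ledger); // { alice: 100, bob: 0 }
    }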

Wall Street keeps a watchful eye on another bad boy

The database community would argue that “avoiding side effects” is really just sloppy treatment of transaction isolation, because any non-trivial program must have side effects — and you had better have a formal way of dealing with them. Nonetheless, some ‘purists’ continue to argue the exact opposite, which is why they have been largely faded east of the Hudson. To be sure, the von Neumann approach is invaluable for lower-level programming, but one must be willing to accept a glaring hole when it comes to data. Lambda fans are quick to point to benefits like strong static type validation, but this is possible only when call chains are tightly coupled to data flow, which raises all sorts of code maintenance and data authority questions, particularly in distributed situations. And the poor state of affairs when it comes to testing (particularly with deep call stacks) is daunting contrary evidence.

Pi (𝜋) Calculus

At some point, you’ll want more than one von Neumann engine on your little lambda railroad. Robin Milner designed a coordination system essentially the same way that trains operated before cellphones and GPS. Shared data was largely non-existent but at least you could trigger a semaphore or blast a loud horn before entering a critical section of track. The mail run waited patiently in the hole until the green light from the switch tower and… eventually the notion of event-driven, reactive software was formed.

.NET hasn’t changed much since the early days

Meanwhile Wall Street built its empire off an annoying little creature called Excel and everyone was happy. At least until Singh and Higgins helped Goldman gobble up the universe.

Rho (𝜌) Calculus

Despite global domination in the 1960s, IBM was not known for innovation. Unfortunately, we see the same pattern today within Silicon Valley upper management — it’s all about groupthink and avoiding confrontation. You can’t really blame them — the Bay Area social media empires were built around the notion of “popularity contest” — and justifiably so since certain technologies require mass adoption to work.

Original NYC micro aggression

Once you start getting serious about reflection and wishing you could query your program like a mini-database, you risk getting booted from the golf club (another reason Eastern Europe is a hotbed for emerging tech). If anything, dealing with “dirty” data is more of a Wall Street thing.

Flat file thinking

But maybe your team still needs to make its delivery schedule, so you decide to arrange your program at least in some sort of Hilbert space or coordinate system so you can find things. Of course, the notion of ‘file’ is lost once things are loaded into memory. For this reason, persistent memory systems are typically arranged as data trees or graphs, and not surprisingly the human brain tends to think in similar terms. Even then, you still need a way to observe this space — preferably without renting a hadron collider.
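As a toy sketch of the idea (the tree and names below are invented, not taken from any particular system): once program state has coordinates, “finding things” becomes a path lookup rather than a debugger expedition.

    // Invented example: program state arranged as a navigable tree.
    const world = {
      trading:   { books: { equities: { positions: [] }, rates: { positions: [] } } },
      reference: { calendars: {}, counterparties: {} }
    };

    // Observing the space is just walking a path of coordinates.
    const at = (root, keys) => keys.reduce((node, key) => node && node[key], root);
    console.log(at(world, ["trading", "books", "equities", "positions"])); // []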

What’s the value of x?

It’s hard to get humans to agree on anything — especially where money is involved — and it gets a lot harder once divorce attorneys figure out ‘smart contracts’. Even if the value of x is known, it’s another matter to “see” it. The trip down the rabbit hole begins when someone asks when you want to see the value of x, and suddenly everyone is trying to synchronize Lamport timestamps. At some point it dawns on you that a database isn’t some ‘unnecessary’ overhead — it’s really a global clock — and the only reliable way to get all observers to agree on the quantum state of the machine.

How software gets expensive fast

For example, Curry-Howard reminds us that a typical SQL database can contain views… and oh by the way, the observer can’t tell if something is real or a view. So perhaps x doesn’t really exist at all, because you might be staring down the barrel of a function f(a,b) that projects onto x. The problem is that the observer can never be sure which end of the barrel is being held — although you can join Wall Street in spending hundreds of millions of dollars trying to find out. The point is, either you accept the need for a database up front or end up writing a half-baked one from scratch.
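The ambiguity is easy to reproduce in plain JavaScript (the objects below are invented for illustration): one object stores x as data, the other computes it as a projection of a and b, and no observer reading x can tell which is which.

    // Invented example: is x real data, or a view projected from a and b?
    const stored = { a: 2, b: 3, x: 5 };                                 // x is stored
    const viewed = { a: 2, b: 3, get x() { return this.a + this.b; } };  // x is f(a, b)

    console.log(stored.x, viewed.x); // 5 5 (indistinguishable to the observer)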

You try to read any Wikipedia article about any math topic, it’s just bunch of incomprehensible formula of mumble jumble
— Vitalik Buterin

Phi (𝜙) Calculus

Back in the 60s, women dominated the IT department (particularly where data was concerned), and the teletype machine was how they interacted with a computer. The operative word here is interacted. Because the teletype had a pretty good idea of what was going on, operators were able to hold some semblance of human-machine conversation, which made for a much friendlier, if not downright chatty, way to work with a computer. While primitive by today’s UX standards, in some sense the teletype still remains far ahead of its time.

Is she expected to write C++, Rholang and Solidity smart contracts or not?

The main drawback with a GUI is that once the mouse starts whipping around the screen, the computer is unable to keep track of workflow, although you can pay Selenium et al a lot of money to try. It is interesting that both artificial intelligence and women started disappearing from the software industry around the time the GUI showed up. Microsoft has by now run the GUI concept to the point of exhaustion… yet pro users usually prefer to stay on the keyboard as much as possible — and texting apps and chatbots are bringing the idea of interactivity back to the UX, not to mention Apache Zeppelin and Project Jupyter.

If you really want to move human-machine interaction beyond the go-go sixties, you need full engagement of the senses:

  • The GUI connects with sight and touch
  • The (forgotten) command line acts as mouth and ears

Complexity Management

I’ve said this many times before, but NYC got serious religion after 2008 and now views clunky enterprise software — particularly the inability to change stuff quickly — as a risk item. Fixing perpetually broken software (and leaving the industry helplessly exposed to takeover) is a lucrative business for the west coast, so we should not expect them to be terribly proactive in this regard. One would think DARPA should be very concerned about the current state of the software industry, but it’s no secret the Beltway military-industrial complex is practically rewarded for budget overruns. So here we are.

In any case, the usual industry response is to “hire better programmers”… not “build better machines”.

Ariadne has some concerns about complexity

The truth is that it doesn’t really matter how ‘smart’ your dev team is, because inevitably software complexity grows beyond human ability to comprehend it all. Be wary of decentralization schemes that only a select few “high priests” can understand. Perhaps the better answer is to maximize the audience by pushing complexity as far as possible down to the infrastructure level. For example, the original creators of relational technology (IBM System R etc.) realized that the underpinning math was far too esoteric for the average enterprise user to swallow and instead placed a SQL “surface” on top. Wall Street too realized that its quants needed to publish their models via a simple interface. When you are moving millions of dollars around, you had better keep complexity on a short leash.

Create and Perceive

Which finally brings us back to Cobb in the Inception cafe scene.

The phi calculus

There is a fundamental — if not downright disruptive — shift once the programmer is able to develop inside a running system. Unlike the traditional workflow, where the programmer hopes at best to assert a minimal expected state before teardown and restart, there is genuine discovery when one can see what is really going on and take action as things unfold. In 1997, Peter Wegner stirred up trouble by arguing the value of interaction over algorithms, pointing out the time dimension inherent in smart contract language, not unlike the dilemmas we now see in the blockchain world. The phi or “philosophy” calculus is an old AI concept built around the notion of a stateful intelligent agent that understands (remembers) context and, in particular, is tolerant of “soft errors”. The command line interface was not an accident of history — Multics was famous for being an ‘interactive’ operating system. This allows incredibly complex systems to be built incrementally through discovery and experimentation.

Before Alexa, there was Eliza

This sort of flexibility may seem unnerving, but remember that the database world has operated in this paradigm just fine for decades, and NYC simply applied Curry-Howard to extend it into program space via “persistent memory”. The Oracle RDBMS does not collapse in a heap just because the user enters a typo in SQL, nor does Oracle need to be stopped and reloaded from backup with each stored procedure change. To give you an idea of the formidable mental block created by von Neumann thinking, most non-database programmers equate a “running” system with a “production” system (where changes on-the-fly are strictly taboo) — and never consider for a moment that a development system can also be “running” (even where continuous delivery is declared to be the goal). Moreover, these systems are naturally collaborative, which tends to freak out teams that otherwise would call themselves “agile”. Without regular volcano blood sacrifices, it’s a wonder that databases work at all, right?

I want to begin the programmer’s training as a full-fledged navigator in an n-dimensional data space
— Charles Bachman

If lambda and pi calculus are like building a railroad, then phi calculus is more like an open world game like Minecraft, where complexity and collaboration are relished rather than reviled.

Testing Hell

The relationship between programmers and ‘dumb’ computers has been adversarial for so long that you will just get blank stares if you try to describe any alternative, so let’s walk through a concrete example. Incidentally, TDD is actually closer to goal-driven programming, which is a topic for another day.

Inspired by the Multics audit/continuous integration process, suppose we would like to test some code. We log into our codebuilder as user “max” and see that user “kemi” has recently created two functions foo and bar:
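The transcripts that follow are illustrative sketches rather than actual codebuilder output; the prompt, command names and formatting are assumptions. The listing might look something like this:

    max> list functions
    kemi/foo    kemi/bar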

Looking at foo, we see a JS function that adds two parameters together. So we run the function foo(4,6) and see that it returns the value 10:
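Reconstructed from the description above, foo simply adds its two parameters, and running it interactively returns 10 (the prompt shown is illustrative):

    // kemi's function foo, as described: it adds its two parameters.
    function foo(x, y) {
      return x + y;
    }

    max> foo(4, 6)
    10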

Satisfied that this makes a reasonable test case, we tell the system to “remember” what we just did:
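In transcript form, that step might look something like this (the remember syntax and the generated test name foo_test_1 are illustrative):

    max> remember
    ok: recorded test case foo_test_1: expects foo(4, 6) === 10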

Because an interactive system can follow what we are doing, the remember command generates the test case. Let’s verify the test was created:
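Again, a hypothetical listing command stands in for the actual interface:

    max> list tests foo
    foo_test_1    expects foo(4, 6) === 10    (recorded by max)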

Let’s run the test case and verify that it passed:
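Something along these lines, with the output format assumed:

    max> run foo_test_1
    foo(4, 6) returned 10, expected 10 ... PASS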

Now let’s ‘break’ the function by having it return x+x instead of x+y, so foo(4,6) will return 8 instead of 10:
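In code, the change is trivial:

    // The broken version: note x + x where x + y used to be.
    function foo(x, y) {
      return x + x;
    }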

Now we re-run the test and sure enough it now fails because it returned 8 instead of 10:
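And the failing run, again in illustrative transcript form:

    max> run foo_test_1
    foo(4, 6) returned 8, expected 10 ... FAIL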

Leave your tests phizzak

I realize we *cheated* by actually having a computer help us (heaven forbid), which makes Silicon Valley flat filers a bit upset, but testing doesn’t have to be so painful, right? I mean, it’s been a half-century and California has failed to produce anything better — so maybe it’s time to move on. Long live the humble teletype.


If you are wondering about the significance of the golf video, it was a Steve Jobs thing.