The Programming Lessons of Jurassic Park

Jack Ryan
7 min read · Aug 20, 2022


It’s not just a story about terrible genetic power; it’s also a horror story about bad programming.

The Tyrannosaur blasts through the electric fence in Jurassic Park.
Pictured: Stunning software engineering malpractice

When I was a child, nothing brought me a greater thrill than watching the mighty tyrannosaur of Jurassic Park burst through the electric paddock fence, stepping onto the road and proving, inevitably, that genetic power cannot be controlled. At this particular moment, with Colossal Laboratories and Biosciences contemplating the de-extinction of the thylacine, a carnivorous Tasmanian marsupial, I imagine that the lessons of Jurassic Park will fill op-ed columns once again.

But now that I’m older, and a software engineer, I have found something else to fear in Jurassic Park. There’s a particular exchange between several of the characters that I think best describes the monster hiding in the movie. Here’s the transcript of the conversation in question:

Arnold: Vehicle headlights are on and they’re not responding, those shouldn’t be running off car batteries. Item 151 on today’s glitch list. We have all the problems of a major theme park and a major zoo and the computers aren’t even on their feet yet.

Hammond: Dennis, our lives are in your hands, and you have butterfingers?

Nedry: (Exasperated laughter) I’m totally unappreciated in my time. You can run this whole park from this one room for three days. You think that kind of automation is easy? Or cheap?

Yes, Malcolm was right: Genetic power was a responsibility too great for the park’s managers to handle, and it was bound, eventually, to end in tragedy. Michael Crichton was, in 1990, sounding a warning about a technology that only in the past decade has approached the kind of capability needed to start a project akin to Jurassic Park. But Crichton also predicted another, more pressing concern: Jurassic Park’s complete reliance on code, and the utter failure of that code to protect lives. More than a tale of rampaging dinosaurs, Jurassic Park is a warning about what happens when code, and the coders behind it, go bad.

I’m Totally Unappreciated In My Time

Let’s start with the obvious, undisputed villain of Jurassic Park.

Dennis Nedry, Jurassic Park 1993
Newman.

Newman’s position (particularly in the novel) is one that we might be able to empathize with. He’s obviously a remarkably talented systems engineer — after all, in the movie he seems to be the only coder on the island — and he’s suffering from some undisclosed financial issues. In the book, his position is even less enviable: The Jurassic Park leadership asks that he work without pay to rush the completion of the park’s systems, all without telling him what he is helping to build.

But the best villains are the ones we can sympathize with, and Nedry’s woes simply fall short of justifying his crimes. The film’s version of Nedry obviously has plenty of open tickets for bugs (151 in a single day, oof) and very little interest in actually closing them out. The technological limitations of 1993, we’re led to believe, mean that fixing glitches requires an abnormal amount of system downtime, though the downtime might have simply been a smokescreen to cover his embryo heist.

However, after Nedry’s unfortunate demise, the park’s computer systems remain inaccessible. This, not the gaping maw of a ravenous rex, is where the carnage begins. Let’s examine not how the dinosaurs killed people (the mechanics of that are pretty straightforward), but how the code failed in three key responsibilities:

  1. Code intelligibility
  2. System accessibility
  3. Failsafes & crisis mitigation

You Could Run This Whole Park from One Room for Three Days

Here follows the actual, truly consequential timeline of events that befell Jurassic Park:

  • Nedry’s plan commences. Minor security systems like cameras and door locks go offline. Phones also appear to go offline.
  • Larger systems, like perimeter and paddock fences, begin to fail.
  • Attempting to access the security system and bring it back online, lead engineer Ray Arnold runs into an undocumented authentication protocol (effectively a virus) that Nedry has introduced to lock everyone else out of the system.
  • Power begins to fail park-wide. The self-driving tour cars lose power and are stranded, undriveable without computer assistance. The tyrannosaur paddock fence loses power, the tyrannosaur escapes, and the stranded cars have no means of escape. One death, one injury.
  • Arnold discovers that Nedry’s program has disabled the system’s keystroke logging, meaning he would have to read through 2,000,000 lines of code to figure out which program Nedry ran to pull the security systems offline (see the audit-log sketch below).
  • Having exhausted options (kind of), Hammond and Arnold decide to manually reboot the computer systems by restarting the facility’s power. This appears to succeed, but it trips breakers that must be turned on again — these are on the other end of the compound. Arnold is killed trying to restore power.
  • Ellie Sattler and Robert Muldoon also attempt to restore the power. Muldoon is killed. Sattler succeeds.
  • Park systems are restored, but every Jurassic Park employee who knows how to use them is dead.
  • Because the door locks are automated, there is no way to lock a door manually. Park evaluators Grant and Sattler are nearly killed as a result.
  • The founder’s granddaughter is fortunately familiar with Unix and is able to use a complicated GUI to restore door locks, phones, and other security systems. This precipitates the rescue of all survivors.

All told, four people died and one person was critically injured. In each incident, a dinosaur was (of course) involved, but so was a misjudgment or misuse of the park’s systems code. Not to put too fine a point on it: The park could not be run from one room for three days with minimal staff. Far from being a boon, the reliance on automation proved a hindrance.
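
Before moving on, it’s worth making that two-million-line problem concrete. What Arnold needed was not the source code but a record of what had been run at the console. Here is a minimal sketch of a command audit log in TypeScript; every name in it (ParkCommand, executeCommand, the log path) is invented, and it assumes nothing about the park’s real architecture.

```typescript
// A minimal sketch of an append-only command audit log.
// All names here (ParkCommand, executeCommand, AUDIT_LOG) are hypothetical.
import { appendFileSync } from "fs";

const AUDIT_LOG = "park-commands.log";

interface ParkCommand {
  operator: string;  // who issued the command
  command: string;   // what they typed
  timestamp: string; // when they typed it
}

// Record every console command *before* it runs, so a post-incident
// review means reading a short log file, not two million lines of source.
function executeCommand(operator: string, command: string): void {
  const entry: ParkCommand = {
    operator,
    command,
    timestamp: new Date().toISOString(),
  };
  appendFileSync(AUDIT_LOG, JSON.stringify(entry) + "\n");

  dispatch(command); // hand off to the real command interpreter
}

// Stand-in for the park's actual command interpreter.
function dispatch(command: string): void {
  console.log(`running: ${command}`);
}

executeCommand("dnedry", "whte_rbt.obj"); // the log survives even if the system doesn't
```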

Easy? Or Cheap?

Take a moment to consider how present the threat portrayed in Jurassic Park is in your day-to-day life. How many of us (particularly now that Jurassic Park is basically a cinematic staple) could be compelled to beta-test a dinosaur park on an island off the coast of Costa Rica? Whether for genetic-tech limitations or insurmountable PR hurdles, I think (hope?) it’s safe to say that no real Jurassic Park will exist in our lifetimes.

So, fear not the dinosaurs. But what about fearing the total failure of the code that runs a large theme park? So much of our critical infrastructure (planes, hospitals, cars, security systems, banking) runs on software to some extent, often to a greater extent than most people realize. But these systems are built by consummate, careful professionals, right? Dennis Nedry definitely isn’t a Cambridge-affiliated developer, right?

Not only are large-scale software disasters entirely possible, many have already happened. Look to the Therac-25, where race conditions in the control software of a radiation therapy machine delivered massive overdoses to at least six patients, several of them fatal. Or look to the death of Elaine Herzberg in a 2018 collision with an Uber self-driving test car. I’m afraid the list goes on. Most of these incidents aren’t even on the scale of the gross negligence portrayed in Jurassic Park; many of them arise from small, uncaught, everyday programming problems.
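
For anyone who hasn’t met a race condition in the wild, here is a minimal sketch of the general class of bug, written in TypeScript and resembling nothing about the actual Therac-25 code: two tasks read a shared value, yield, and then write back stale results, silently losing an update.

```typescript
// A minimal sketch of a lost-update race condition.
// This illustrates the general class of bug, not the Therac-25 itself.

let dose = 0; // shared state, no locking

async function addDose(amount: number): Promise<void> {
  const current = dose;                        // 1. read
  await new Promise((r) => setTimeout(r, 10)); // 2. yield (simulated I/O)
  dose = current + amount;                     // 3. write back a stale value
}

async function main(): Promise<void> {
  await Promise.all([addDose(1), addDose(1)]);
  // Expected 2, but both tasks read dose === 0, so the result is 1.
  console.log(`dose = ${dose}`);
}

main();
```

The fix is straightforward once you see it (serialize the updates, or make the read-modify-write atomic); the hard part is noticing that the interleaving exists at all.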

This brings us, at last, to the moral of the Jurassic Park story beyond the dinosaurs: easy or not, cheap or not, critical programs have to be correct and dependable. The systems of Jurassic Park were not correct. That is in large part because of a malicious coder’s interference, but it’s hard to ignore that the system was not equipped to handle failure either.
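
What “equipped to handle failure” might mean in practice is easy to sketch, even though we know nothing about the park’s real infrastructure. Here is a hypothetical local controller, with entirely invented names, that treats losing contact with the control room as an emergency in its own right rather than silently staying in whatever state it was last told.

```typescript
// A hypothetical fail-safe watchdog sketch; FenceController and
// HEARTBEAT_TIMEOUT_MS are invented for illustration.

const HEARTBEAT_TIMEOUT_MS = 30_000;

class FenceController {
  private lastHeartbeat = Date.now();

  // Called whenever the central system checks in.
  recordHeartbeat(): void {
    this.lastHeartbeat = Date.now();
  }

  // Runs on a local timer, independent of the central system.
  checkHealth(): void {
    if (Date.now() - this.lastHeartbeat > HEARTBEAT_TIMEOUT_MS) {
      this.soundLocalAlarm();
    }
  }

  private soundLocalAlarm(): void {
    // A real controller would trip sirens and page staff;
    // the point is that silence from the control room is itself an alarm.
    console.error("Lost contact with control room: assume fences are down.");
  }
}

// Because the check runs locally, a dead control room still raises an alarm.
const paddock = new FenceController();
setInterval(() => paddock.checkHealth(), 5_000);
```

The design principle is the inverse of the park’s: instead of one room knowing everything, every subsystem knows enough to fail loudly on its own.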

It’s also the only horror story I know of where the villain is an archetypically arrogant coder, one who might be recognizable in our current workplaces. Was Nedry the most talented coder of his age, as he seems to believe? We have no reason to call him a bad coder: he is Cambridge-approved, he seems to be the only coder on the island’s staff (this is not true in the novel), and he’s certainly the only staff member who fully understands how to operate the systems he constructed, which is very much to his discredit.

But while he’s a good coder, Nedry is a terrible professional. He is not a strong communicator; he is standoffish and resistant to criticism. His system, while mostly functional, cannot be used without his help. On that fact alone, the Jurassic Park systems have to be considered unmaintainable code (a 2,000,000-line code review, any takers?).

Revolutionary automation is hailed by so many people in the tech space as the answer to any problem. I would encourage the “futurists” of tech to consider the downsides of fully automated systems, particularly as they pertain to the failure and recovery of those systems. If the system fails, how many people know how to fix it? How can we lock the doors? Restore power without leaving an emergency bunker? Drive cars out of imminent danger without the car-driving protocol online?

Ensure that off-course missiles can be remotely disarmed? Make sure that self-driving car passengers are notified of possible collisions?
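
One design answer to those questions, sketched here with invented names, is to make the automation a convenience layered on top of a manual path, never a gatekeeper in front of it. A door lock, for instance, can honor a physical switch regardless of what the central system says, or whether it says anything at all.

```typescript
// A sketch of "the manual override always wins"; DoorLock and its
// methods are invented for illustration.

type ManualState = "lock" | "unlock" | null;

class DoorLock {
  private automatedLocked = false;
  private manualOverride: ManualState = null;

  // Command from the central park system (which may be offline or compromised).
  setAutomated(locked: boolean): void {
    this.automatedLocked = locked;
  }

  // Command from the physical switch next to the door.
  setManual(state: ManualState): void {
    this.manualOverride = state;
  }

  // Automation only applies when no human has taken direct control.
  isLocked(): boolean {
    if (this.manualOverride !== null) {
      return this.manualOverride === "lock";
    }
    return this.automatedLocked;
  }
}

// With the computers down, a keeper can still throw the switch.
const controlRoomDoor = new DoorLock();
controlRoomDoor.setManual("lock");
console.log(controlRoomDoor.isLocked()); // true, no central system required
```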

Item 151 on Today’s Glitch List

Actual Unix-based file navigation GUI used in Jurassic Park
Based on fsn, Silicon Graphics’ 3D file system navigator for IRIX

Yes, I am still suitably thrilled and frightened by the prospect of having an arm severed by a (justifiably) frustrated velociraptor, but I think most people today understand why bringing dinosaurs back from the grave is an idea worth resisting.

But irresponsible programming, the kind that could well be running many facets of our day-to-day lives, stays out of the news unless a catastrophe occurs. That, more than sharp teeth and reptilian eyes, will keep me up at night.

--

Jack Ryan

CTO @ FinGoal. Node Programmer, Rust learner. Former English major. Occasional fiction writer.