France’s Lost Computing Generation

Growing up as a computer enthusiast in ’90s France.

While the mid-’80s saw the BBS subculture flourish in the US, I accidentally fell into a parallel experiment in the future of computing that was quintessentially French. By then the country had been blanketed in free Minitel terminals, which gave everyone access to a broad palette of online services, from directory assistance to online shopping. For about two decades the network defined France’s pre-Internet era, putting connectivity at everyone’s fingertips. It was expensive and slow, but very reliable and remarkably intuitive by the standards of that decade. It was the primary way my family interacted with our bank and mail-order retailers during those years.

On the family Exeltel, late ’80s.

My parents bought what could have become the ultimate upgrade to the service: a home computer called the Exeltel. It was a fully-fledged machine driven by a GUI, which interfaced seamlessly with the Teletel network that backed Minitel terminals. Imagine being able to download video games and software onto your computer in minutes over the phone network back in 1986, in a manner a 10-year-old could actually navigate and understand. It was amazing. I could spend hours teaching myself BASIC and trying to defeat Wizord (a Galaga clone). Unfortunately, the ecosystem failed to innovate and bring down costs, and was swept away by the Internet wave — which, in hindsight, is probably for the best.

By the early ’90s I had moved on to the PC platform with the IBM PS/1 and PS/2. Though they opened up the software palette, their connectivity, at least in my corner of rural France, was limited. My networking was downgraded to exchanging floppy disks with friends. Some of my first hacks involved punching holes in 720K floppies so that the system would recognize them as 1.44M disks (free storage!), slowing down my blazing-fast 80386 enough to play older games by injecting a delay loop, written in Turbo Pascal, into the interrupt handler for the real-time clock, and hunting down the password-checking subroutines of video games to disable them. Fun times. I also had my first brush with AI, trying, unsuccessfully, to write an agent to play Connect-Four in C — I have since taken my revenge and trained one using TensorFlow, just to convince myself that I had made a modicum of progress since.
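For the curious, here is the flavor of what I was attempting with Connect-Four, sketched today in Python rather than the C of the time: a toy depth-limited negamax agent. The board layout, depth, and scoring below are illustrative choices of mine, not a reconstruction of the original code.

```python
import math

# Board: 7 columns, each a bottom-up list of pieces ('X' or 'O').
ROWS, COLS = 6, 7

def legal_moves(board):
    return [c for c in range(COLS) if len(board[c]) < ROWS]

def drop(board, col, piece):
    new = [list(column) for column in board]
    new[col].append(piece)
    return new

def cell(board, r, c):
    return board[c][r] if r < len(board[c]) else None

def is_win(board, piece):
    # Scan every cell for a 4-in-a-row: horizontal, vertical, both diagonals.
    for r in range(ROWS):
        for c in range(COLS):
            for dr, dc in ((0, 1), (1, 0), (1, 1), (-1, 1)):
                if all(
                    0 <= r + i * dr < ROWS and 0 <= c + i * dc < COLS
                    and cell(board, r + i * dr, c + i * dc) == piece
                    for i in range(4)
                ):
                    return True
    return False

def negamax(board, piece, depth):
    opponent = 'O' if piece == 'X' else 'X'
    if is_win(board, opponent):
        return -1, None          # the previous move won the game
    moves = legal_moves(board)
    if depth == 0 or not moves:
        return 0, None           # crude heuristic: score unresolved positions as draws
    best, best_move = -math.inf, moves[0]
    for m in moves:
        score, _ = negamax(drop(board, m, piece), opponent, depth - 1)
        if -score > best:        # negamax: my score is the negation of yours
            best, best_move = -score, m
    return best, best_move

board = [[] for _ in range(COLS)]
# 'O' threatens a vertical four in column 3; even a shallow search blocks it.
for _ in range(3):
    board[3].append('O')
score, move = negamax(board, 'X', depth=2)
```

Even at depth 2 the agent finds the blocking move in column 3; the part my teenage self never cracked was making this search fast and its evaluation function smart enough to play a full game well.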

My next couple of years were spent in boarding school, where desktop computers were no longer an option. In that world, the amazing HP48 was the queen of computing platforms. The calculator was in many ways a unique feat of engineering. Despite its limited form factor, it was remarkably hackable, and a lot of lore had developed around it, including books that explained its entire architecture and low-level language in excruciating detail (in French!). It also used reverse polish notation, which rewired your brain in ways that let you express complex expressions seamlessly. You could control its infrared emitter and beam programs from one device to another — or use it to turn on your roommates’ stereo at midnight, undetected.
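To give a feel for why RPN was so addictive: operands go on a stack, operators consume them, and parentheses simply disappear. A few lines of Python (my own sketch of the evaluation model, not anything HP shipped) capture the whole idea:

```python
# A minimal stack-based evaluator for reverse polish notation (RPN):
# push operands, and let each operator pop its arguments and push the result.

def rpn(tokens):
    ops = {
        '+': lambda a, b: a + b,
        '-': lambda a, b: a - b,
        '*': lambda a, b: a * b,
        '/': lambda a, b: a / b,
    }
    stack = []
    for tok in tokens.split():
        if tok in ops:
            b = stack.pop()   # the second operand sits on top of the stack
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# The infix expression (3 + 4) * (5 - 2) needs no parentheses in RPN:
result = rpn('3 4 + 5 2 - *')   # 21.0
```

Once the stack model clicks, you stop translating from infix in your head and start composing expressions directly, which is exactly the rewiring the HP48 performed on its users.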

The next couple of years, in engineering school (a ‘Grande École’), were a giant cold shower for my computing enthusiasm. On the one hand, it was 1996 and something was clearly happening with the Internet. Gopher was quickly surrendering to HTTP. Our student rooms were wired with a super-fast ATM network, and our computer room was stocked with top-of-the-line machines with a fast pipe to the rest of the world via Renater. The network did everything to speed up access to a then US-centric Internet, including providing a local edge cache for all web assets. Privacy and security standards were different back then: you could actually browse the cache and see every piece of content every student on the network had pulled from the web. It was … educational.

On the other hand, a not-so-subtle war was being waged against computer science by the education establishment. The often-repeated mantra was: ‘computer science is a tool, not a career.’ Our Intro to Computing class even had a slide, complete with campy clip-art, explaining what a terrible career decision getting into computer science would be:

Average salary vs. year, from my Intro to Computing class, 1997.

Much of the rest of the class focused on the ‘networking fad,’ and how no self-respecting executive would fall for the obviously flawed cost-benefit analysis of ‘client-server’ architectures. Software engineers were branded ‘analyste programmeur’ (‘programmer-analyst’), a subtly demeaning title that emphasized that once you wrote code, you were no longer an engineer. We had one programming lab, focused on regurgitating the syntax of the C language, and one real-time computing class, where a disenchanted lecturer threw concepts like mutex and semaphore at people who, by and large, had no idea what threading or multiprocessing actually meant. Students who wanted more were left to fend for themselves: the VideoLAN project, for instance, which is still thriving to this day, was one of the few skunkworks projects that grew largely out of the student body’s unquenched appetite to explore what modern networking made possible. I am thrilled that, 20 years later, the project’s founder received a well-deserved accolade from the French government.
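In fairness to anyone who sat through that class as lost as we were, the concept is simple once shown concretely. Here is a minimal Python sketch (my illustration, not the course’s material) of what a mutex actually buys you:

```python
# A mutex in a nutshell: 'counter += 1' is a read-modify-write, so
# concurrent increments can interleave and lose updates. Holding the
# lock makes each increment atomic with respect to the other threads.

import threading

counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:            # the mutex: only one thread in here at a time
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# With the lock, counter is exactly 40_000; drop it and the total may come up short.
```

Ten minutes of this, before any theory, would have spared our lecturer a room full of blank stares.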

The stigma ran deeper than one institution’s lack of enthusiasm: real engineers were expected to build power plants, cars, or TGVs. There were no industry role models for computing or networking, but plenty of failures: Bull, just emerging from a government bailout; the moribund Minitel ecosystem; the ill-fated Bi-Bop cellular network. Hardware was absent from our curriculum, and the only software company at our recruiting events was Dassault Systèmes, whose affiliation with the defense industry made it respectable. There was a strong sense that whatever was happening with the web, it wasn’t happening here. Students themselves took the cue from their environment, singling out computer enthusiasts (‘les babasseurs’) as people who were wasting the unique opportunity they had been given.

So I, like the vast majority of my peers, turned away from computer science. Most of us were steered toward finance or consulting jobs, though I set my sights on a Signal Processing specialization as the closest respectable proxy to software engineering — respectable because it was largely math, and at that time wavelet analysis, buoyed by many theoretical advances by French-speaking mathematicians, was all the rage.

I can’t help but reflect on my generation’s missed opportunity with regard to computer science. There was a lot of grassroots enthusiasm for it, but little appetite from the supporting ecosystem to channel that energy. I can count on one hand the number of my peers who landed in the computer industry fresh out of school, and even fewer who still live in France.

Fast forward to today, and I see the same grassroots swell of enthusiasm developing around AI and machine learning. Recently, I gave a seminar at one of France’s top institutions, and was honored to be invited to meet with some of its administrators before the presentation. Walking to the main amphitheater where I was to give the lecture, I couldn’t help but overhear the ranking official berating my host for not picking a smaller room, better suited to the niche appeal of the subject matter of my talk — deep learning — right up to the second we all entered a large amphitheater packed to the brim. I want to believe that the growth and importance of machine learning and data science is being noticed by the country and its institutions, and many hopeful signs point in that direction, but my own journey is a vivid reminder that it doesn’t take much institutional inertia to derail the most enthusiastic of vocations.