Unlike seemingly every one of my CS peers, I didn’t write my first line of code until sophomore year of college. It was immediately overwhelming and exhilarating, so much so that it steered the remainder of my college career. I jumped into the underworld of the technological age, no longer protected by interface and design like the rest of the uninitiated, and began my new life as a creator. To quote Jay-Z out of context: “damned if ain’t open up Pandora’s box,” but, you know, in like a good way.
I remember the day I first overheard a classmate raving about vim and how he couldn’t possibly understand how anyone uses anything else. I didn’t know at all what that was (although I’m now so good with vim, I can save and quit), so I googled it, saw it required terminal use, and promptly dismissed it as something my future self could worry about. I was incredibly self-conscious about development back then; everyone seemed to have very strong opinions about languages and text editors and operating systems, and I just had what I knew: Java (barely), Notepad (not even ++), and Windows (7, at least).
Soon after that, I started taking notes on my eavesdropping. Programmer-English is a hard initiation; every developer has their own vernacular, and intelligibility is often asymmetric (it’s quite like trying to understand Scottish). Before I learned what organized meant, I would just list out any foreign terms in a text file; at the end of the day or week, as time allowed, I would google everything, append any interesting notes I found useful or amusing, and delete anything I didn’t care to learn more about. This system worked well in the beginning, but I was eventually drowning in poorly named, unformatted text files. I’ve since migrated to other note-taking schemes (I’m partial to OneNote), but the intent is the same.
I noticed a trend after a few months of this practice: if I found something interesting and wanted to learn how to use it, I would almost always fail at the first attempt, only to need that tool or skill for some other project weeks later. Having already failed once, the second go was usually a smoother failure, plus a bit of progress. Failing a little better each time has necessarily become my definition of success.
This seems like a good time for an example. In my first computer science class we were required to submit our programming assignments by copying our files to a shared directory on the department’s Red Hat Linux computers. Near the deadline for program one, I overheard someone explaining how to use a thing called ssh to connect to the Linux lab from home and copy the files that way instead of going into the lab to do it. I made a note.
The day our program was due, I finished writing in the evening while in the comfort of my own home. To avoid a long walk and putting on pants, I decided to try using this ssh technique. I did some brief reading and gave it a go. Windows did not natively support ssh, as I learned after a number of failed attempts and ignored errors, so I gave up and put on pants. Just before giving up, though, I had discovered a tool called PuTTY that let Windows users ssh. I made another note.
The next week, a few people complained that turning in assignments was too complicated. Our professor posted a brief tutorial on ssh and said everyone should follow it for the next submission. Having already fumbled with ssh once, I decided to jump right to PuTTY, and after a few tries and mistypes, I successfully connected to a Linux machine only to discover we needed to use something called scp to copy files. Before putting on pants to head to the lab, I made another note.
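For anyone curious, the whole dance boils down to two commands. The hostname and paths below are made up, but the shape is right:

```
# Open a remote shell on a lab machine (hostname is hypothetical)
ssh username@lab.cs.example.edu

# Copy a file into the shared submission directory without logging in;
# run this from the local machine (paths are hypothetical)
scp program1.java username@lab.cs.example.edu:/shared/submissions/
```

On Windows at the time, PuTTY played the part of ssh, and its companion tool pscp handled the scp side.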
These incremental failures eventually accumulated into success. By the end of the semester I could ssh and copy files, and had amassed enough notes to attempt to switch from Windows to Ubuntu. My entire college and professional career is the sum of these experiences and failures. I’d like to eventually fail once at everything on my lists; hopefully some things stick along the way.