Writing a Game in Assembly for an Old Computer Taught Me Lessons Relevant Today

Matt Lacey
6 min read · Jun 3, 2016

--

It’s highly likely that you’ve never heard of the Atari Falcon, and that’s for good reason. The Falcon was Atari’s last computer, released in 1992 and discontinued the following year. It was never a huge success, forming part of the final wave of 80s home computers, all of which eventually lost out to the PC. It looked almost exactly like the venerable Atari ST line (the prototype 68040-based follow-up was actually referenced in Sony’s PS2 patent, as they used a very similar design), with a few styling changes and some extra ports on the back, but it was powerful for its time. It could even record CD-quality audio straight to hard disk thanks to a Motorola digital signal processor that worked alongside the 68030 CPU, and it sported a 16-bit true colour graphics mode. I’d wanted one of these machines ever since reading about it in ST Format, and a few years ago I finally got my hands on one.

The Final Result

I love to code, and I enjoy writing games, so there was no way I wasn’t going to write one for this system. I’ve always been fascinated by assembly programming but never really practiced it in earnest; I had something of a primer in x86 at university and dabbled a little with ARM on the Raspberry Pi, but it had always remained on my ever-growing programming to-do list. Until now.

Lesson 0: Have a Goal

I started off from a blank slate, spending some time learning how to set the video mode correctly (which I never actually managed) and reading a lot about 68k assembly on the fly. After some tips from others in the community I settled on using the DHS demo system as a base and quickly started writing routines to draw sprites and tile maps, but I didn’t have a specific goal, and that’s always a killer with software.

One weekend I dismantled a Sega Megadrive (Genesis in the US) with the goal of adding S-Video support, and I tested out my work by playing Columns for a few hours too many. Once I finally turned the machine off I realised I had a goal: clone Columns. Sure, it would never result in something original, but it gave me a specification for what I needed to create, and therefore a solid goal to work towards.

For most developers well-defined goals are key, and every task should be well defined before you embark on it; otherwise the familiar fiend called scope creep will come for you while you sleep.

Lesson 1: Plan Ahead

This may be true for other developers at a similar point in their careers, but I think I’ve fallen into the trap of becoming complacent about software design. Once you’ve been working in a particular field for a while and on the same system, you become so familiar with the limits you need to work within that you often know roughly what your code structure will look like before you’ve started anything. I know this is true for me when working on the Salesforce.com platform, and I suspect it rings true for other developers on systems where CRUD is the norm. The problem is that I often fail to account for this when starting out on something less familiar.

Refactoring is often hard, not least because on the surface it looks like you’ve taken something that works and spent weeks breaking it, only to get back to where you started. People observing this process often struggle to understand why it must be done and think it’s a waste of time, but the developer working with the code can see the beautiful forms that she has crafted out of sludge and the overwrought misery of quick hacks on top of quick hacks.

The thing about programming is that a seemingly simple idea can get incredibly complicated at an alarming rate once you start writing code.

As it turns out, refactoring assembly code is really hard. Had I understood at the start just how hard, I would have broken out the pencils long before pressing any keys. Honestly, I think I’d rather start again than try to refactor the beast that I created, or rather evolved, during this project. I believe that every developer should constantly refactor on the fly to minimise technical debt, but that’s not an excuse for failing to plan in the first place.

Lesson 2: Know The Limits

More specifically, know the limits of the environment you’re developing in. I’m not a stranger to memory and performance-limited environments, but I failed to really consider the specifics of this machine when I started.

Columns involves a grid, and knowing I was working on a computer with reasonably limited RAM (14MB), I immediately decided to optimise for lower memory usage and packed each grid cell into a single byte: the three lowest bits indicate the gem type and the remaining five act as state flags. I’ve tried to make a habit of not optimising too early, but sometimes it just gets the better of me.
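As a rough sketch of the idea, translated into C for readability (the real thing was 68k assembly, and these particular flag names are made up purely for illustration), a packed cell looks something like this:

    #include <stdint.h>

    /* Sketch of a packed grid cell: the lowest three bits hold the gem type,
     * the remaining five are state flags. The flag names here are invented
     * for illustration. */
    #define CELL_TYPE_MASK  0x07u  /* bits 0-2: gem type (e.g. 0 = empty)  */
    #define CELL_FLAG_MATCH 0x08u  /* bit 3: part of a matched run         */
    #define CELL_FLAG_FALL  0x10u  /* bit 4: gem is currently falling      */

    typedef uint8_t cell_t;

    static inline unsigned cell_type(cell_t c) { return c & CELL_TYPE_MASK; }

    static inline cell_t cell_with_type(cell_t c, unsigned type)
    {
        return (cell_t)((c & ~CELL_TYPE_MASK) | (type & CELL_TYPE_MASK));
    }

    static inline int    cell_has_flag(cell_t c, uint8_t flag)   { return (c & flag) != 0; }
    static inline cell_t cell_set_flag(cell_t c, uint8_t flag)   { return (cell_t)(c | flag); }
    static inline cell_t cell_clear_flag(cell_t c, uint8_t flag) { return (cell_t)(c & ~flag); }

The irony is that a Columns-style well is only a handful of cells wide and a dozen or so tall, so even an unpacked struct per cell would have cost next to nothing.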

One Of The Prettier Bugs

This choice was another failure on my part. I already knew from various experiments that the machine was reasonably constrained in terms of performance, but for some reason I thought memory would be the real issue. Not even close. I’ve never actually measured the memory usage because I’ve never needed to; it’s never been an issue. The binary is around 1.4MB, and it creates a few buffers etc. at run time, so at a rough guess I’d say it uses around 2–2.5MB, comfortably inside the available space. The final product doesn’t really struggle in terms of CPU time either, mostly because the rendering is optimised to do very little each frame.

The upshot of this overly frugal use of memory was that a lot of my code and logic became far more complicated than it needed to be. Towards the end of the project I solved an issue with my double-buffered rendering by creating another buffer for the grid, solely to store its current render state (for instance, clearing a gem needs to be done on two consecutive frames, once for each screen buffer, so I needed to track that). Had I just used more of the available memory at the start, my life would have been much, much easier.
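A minimal sketch of that fix, again in C and with made-up names and dimensions: keep a parallel buffer of per-cell redraw counts, so that every change gets drawn into both screen buffers before it’s considered done.

    #include <stdint.h>

    /* Hypothetical dimensions and names; the real values differ. */
    #define GRID_W 6
    #define GRID_H 13

    static uint8_t grid[GRID_H][GRID_W];          /* packed cells, as above      */
    static uint8_t pending_draws[GRID_H][GRID_W]; /* redraws still owed per cell */

    /* Stub: the real routine would blit the gem (or background) for this
     * cell into the current back buffer. */
    static void draw_cell(int x, int y, uint8_t cell)
    {
        (void)x; (void)y; (void)cell;
    }

    /* Mark a cell as changed: with double buffering it must be redrawn on
     * two consecutive frames, once into each screen buffer. */
    static void cell_changed(int x, int y)
    {
        pending_draws[y][x] = 2;
    }

    /* Called once per frame, before the buffers are swapped. */
    static void draw_dirty_cells(void)
    {
        for (int y = 0; y < GRID_H; y++) {
            for (int x = 0; x < GRID_W; x++) {
                if (pending_draws[y][x] > 0) {
                    draw_cell(x, y, grid[y][x]);
                    pending_draws[y][x]--;
                }
            }
        }
    }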

Almost everything in programming is a trade-off between processing time and memory usage (be that database storage or RAM). Working out what the real constraints are likely to be, and how tight they are, at the beginning of a project can save a lot of pain further down the line.

Lesson 3: Perseverance Is Underrated

I don’t think I know a single developer who hasn’t hit a problem that just seemed insurmountable and all but thrown in the towel before suddenly having a moment of inspiration and solving it. Honestly I think that’s one of the things that gets people addicted to programming in the first place. Programming is an almost continuous process of encountering and then solving problems, and those moments of reward can make up for considerable amounts of torment; the harder the struggle the more rewarding the victory.

The thing about programming is that if you want to succeed you have to be determined enough to deal with situations such as a compiler error on line -1.

There are many developers willing to struggle on past the point where I’d throw my hands up in despair, and kudos to them. The first commit I made on this project was on August 8th, 2015, and I know that was somewhere between six and twelve months after I first started (it took that long for me to move the code from the Atari to a machine with git: https://github.com/mattlacey/columns). In short, it was a very slow burner. I nearly gave up many times, but each of those minor victories kept me going. Solving a bug I’d been stuck on for six months was a particularly shiny moment, and one I only achieved once I found some new tools to debug what I was doing.

The thing about programming is that it teaches you that you can flip-flop between feeling like a genius and an idiot considerably faster than you ever expected.

Programming is never easy; from the first code you write you will run into frustrations and heartache, and often it turns out to be your own fault. But every developer has their victories, from the first “Hello, World!” to finally working out why the seemingly-impossible-to-occur bug is in fact perfectly obvious. Revel in them.

--

Matt Lacey

Co-Founder of @SPKeasey & @ProxInsight. @Code_Coverage co-host. Code addict. Fan of science, snowboarding, beer, Atari & Sega. More at http://www.laceysnr.com