Archive Architecture for Conference Sites

Paul Bailey · Published in PyTexas · Feb 1, 2018

A problem I’ve encountered personally while working on conference sites, and confirmed with others, is moving forward with a new site without breaking the old one. Usually our goal is to provide a fresh new conference site every year. However, doing this often breaks our old sites, because we would like to be lazy and reuse at least part of the old site’s code. Over the years, and with some newer web browser features, I’ve come up with a strategy to archive our old sites so that our new sites can break any previous dependencies.

First some of the requirements I try to meet with this architecture:

  1. Data-driven backend
    - CMS to control content
  2. Once a conference is over, the site can be frozen and never changed again
  3. New sites can reuse old code but can also break old code without consequence

The web architecture I’ve found that can meet all these requirements is a Single Page App that is offline-enabled via Progressive Web App features. The Single Page App architecture forces you down the road where most of your code is just static files and you call your backend API for updates. You could also accomplish this with a static site generator, but by using an API backed by a database, you can structure your data more and work with more traditional CMS systems.
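As a rough sketch of what the frontend side looks like, the app boots from static files and pulls its content from the API. The endpoint name and data shape below are made up for illustration, not the actual PyTexas API.

```typescript
// Hypothetical shape of the conference data returned by the API.
interface SiteData {
  talks: { title: string; speaker: string; room: string; starts_at: string }[];
  sponsors: { name: string; url: string }[];
  pages: Record<string, string>; // CMS-managed page content keyed by slug
}

// Fetch everything the frontend needs in one request, so the rest of the
// app only reads from this object (and it is trivial to cache later).
export async function loadSiteData(): Promise<SiteData> {
  const response = await fetch("/api/v1/site-data");
  if (!response.ok) {
    throw new Error(`API request failed: ${response.status}`);
  }
  return (await response.json()) as SiteData;
}
```

Everything else in the app just reads from that one object, which is what makes the caching and archiving steps below straightforward.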

Now that we have roughly 80% of our code on the frontend, the only thing left to worry about is our API calls. When building a Progressive Web App (PWA), we want to think about caching all our content, and in our case even the API calls. The PWA features in themselves don’t really help you archive your site, but they get you thinking about how you would cache your API. Once you have that figured out, archiving becomes an easy task. For PyTexas.org I even went so far as to make sure all API data is retrievable from one API call. Now when I’m ready to archive, I just copy my static frontend and download a copy of my API data.
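Here is a minimal sketch of what that caching can look like in a service worker; the cache name, asset list, and API path are placeholders rather than the actual PyTexas implementation.

```typescript
// service-worker.ts — a minimal caching sketch (names and paths are placeholders).
const CACHE_NAME = "conference-site-v1";
const STATIC_ASSETS = ["/", "/index.html", "/app.js", "/app.css"];

self.addEventListener("install", (event: any) => {
  // Pre-cache the static shell of the Single Page App.
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(STATIC_ASSETS))
  );
});

self.addEventListener("fetch", (event: any) => {
  const url = new URL(event.request.url);
  if (url.pathname === "/api/v1/site-data") {
    // Network-first for the single API call: fresh data when online,
    // the cached copy when offline (or after the site is archived).
    event.respondWith(
      fetch(event.request)
        .then((response) => {
          const copy = response.clone();
          caches.open(CACHE_NAME).then((cache) => cache.put(event.request, copy));
          return response;
        })
        .catch(() => caches.match(event.request))
    );
  } else {
    // Cache-first for everything else (the static frontend).
    event.respondWith(
      caches.match(event.request).then((cached) => cached ?? fetch(event.request))
    );
  }
});
```

The key idea is that the single API response is just another cached entry, so the frontend keeps working from the cache even when the backend is slow, offline, or eventually gone.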

The Consequences

By using this architecture you also end up building a very fast conference site that is offline-enabled and easy to archive. However, there are negative consequences too. First, your site is more complex. It has lots of layers and may not be easy for one person to manage. Second, you introduce the problem of invalidating your Service Worker cache when your data updates. This is not the easiest problem to solve, and it involves newer technology with fewer libraries to help you out. On PyTexas.org, I solved this by adding a web socket that sends a notification to the frontend code whenever a new backend is deployed or data is updated. Again, this involves multiple layers: database, backend HTTP, backend web socket, frontend, and frontend service worker. Keeping all these layers in sync can be annoying when starting out.
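The frontend half of that invalidation can be as small as the sketch below; the socket URL, message shape, and cache name are assumptions for illustration, not the actual PyTexas code.

```typescript
// Frontend cache-invalidation sketch (socket URL, message format, and cache
// name are made up for this example).
const socket = new WebSocket("wss://api.example-conference.org/updates");

socket.addEventListener("message", async (event: MessageEvent) => {
  const message = JSON.parse(event.data);
  if (message.type === "data-updated" || message.type === "deployed") {
    // Drop the cached API response so the next request goes to the network,
    // then reload so the page reflects the new data.
    const cache = await caches.open("conference-site-v1");
    await cache.delete("/api/v1/site-data");
    window.location.reload();
  }
});
```

A full page reload is the bluntest option; a real app would more likely refetch the data and re-render in place.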

Archiving Your Site

Once you’re ready to archive, you can turn off many of those layers and just archive your static files and API data. I usually use GitHub Pages to host the files. To get a Single Page App to work on GitHub Pages, copy your index.html to 404.html as well. This will enable your frontend router to keep working, because GitHub Pages serves 404.html for routes it doesn’t know about.
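The archiving step itself can be a small script along these lines, assuming a Node environment (18+ for the global fetch) and the placeholder endpoint from the earlier sketches; all paths here are illustrative.

```typescript
// archive.ts — hypothetical archiving script (endpoint and paths are placeholders).
import { promises as fs } from "fs";

const API_URL = "https://api.example-conference.org/api/v1/site-data";
const BUILD_DIR = "./build";

async function archive(): Promise<void> {
  // Snapshot the API data as a static file served alongside the frontend,
  // so the archived site no longer needs a live backend.
  const response = await fetch(API_URL);
  const data = await response.text();
  await fs.mkdir(`${BUILD_DIR}/api/v1`, { recursive: true });
  await fs.writeFile(`${BUILD_DIR}/api/v1/site-data`, data);

  // GitHub Pages SPA trick: serve index.html for unknown routes via 404.html.
  await fs.copyFile(`${BUILD_DIR}/index.html`, `${BUILD_DIR}/404.html`);
}

archive().catch((err) => {
  console.error(err);
  process.exit(1);
});
```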

At the end of the day, is all the complexity worth it? I would say yes, because it enables your organization to be more nimble in the future and creates a really awesome offline site experience. Your conference attendees can then access information quickly, even in low-bandwidth situations.
