How not to build a website

It’s not nice to call out the quality of others’ work, but sometimes it has to be done.

This week saw the launch of a new UK Government website intended to educate people and small businesses on how to be more secure online.

First impressions do not give it the best start, alas. It is a Government-branded site that says it’s from the Home Office, but its web address doesn’t end in the standard government domain. Rather than pages, you’re presented with a set of “shops” on a cartoon “street”. This is not only an incongruous metaphor; it also makes the site look like one aimed at children rather than at adults and business owners:

There are also a lot of “click here” and “watch this”-type labels on features to make clear what it is they do, which is an indicator that the UX is unintuitive.

And despite all the clicking required to navigate, a lot of the content of the site is just links to other sites:

But I’m not enough of a designer, or a UX expert, or an editor, to make a qualified critique of these design decisions. Instead I’m going to look at the design of the site’s architecture under the hood.

Even in a Web 2.0 age, conventional wisdom for a website that delivers chunks of static content (rather than a fully interactive web application) is to deliver the majority of your content from your CMS as HTML pages, while JavaScript controls interactions on that content within each page.
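To illustrate, here is a minimal sketch of that conventional model; the function and page names are hypothetical, not taken from any real CMS. The key point is that every page arrives as complete HTML, and scripting is layered on top:

```javascript
// Hypothetical sketch of the conventional model: the CMS renders each
// page as complete HTML on the server, so the content is readable with
// or without JavaScript. Names here are illustrative only.
function renderPage(page) {
  return [
    '<!DOCTYPE html>',
    '<html><head><title>' + page.title + '</title></head>',
    '<body>',
    '<h1>' + page.title + '</h1>',
    '<p>' + page.body + '</p>',
    // JavaScript is added only to enhance in-page interactions.
    '<script src="/js/enhancements.js" defer></script>',
    '</body></html>'
  ].join('\n');
}

var html = renderPage({
  title: 'Protect your passwords',
  body: 'Use a different password for every account.'
});
```

Turn off JavaScript and nothing above breaks: the markup, and therefore the content, is already there.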

This site rejects that model in favour of something I’ve never seen before at this scale — it pre-loads the entire website’s content as one massive JavaScript global variable (350kB worth, shown above) in the source code of the homepage. Then there’s another very large script (over 290kB and 8,000 lines) which parses & outputs the relevant page or section back into HTML for the browser to render. Throw in the CSS and other scripts (jQuery, analytics etc.) and the weight of the text files alone, not including images or fonts, is nearly 1MB.
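To be clear about the pattern being criticised, here is a deliberately simplified, hypothetical sketch of it — the variable and function names are mine, not the site’s actual code:

```javascript
// Hypothetical sketch of the anti-pattern: the entire site's content
// ships as one giant JavaScript object in the homepage source...
var SITE_CONTENT = {
  'passwords': {
    title: 'Protect your passwords',
    body: 'Use a different password for every account.'
  },
  'updates': {
    title: 'Keep software up to date',
    body: 'Install updates as soon as they are available.'
  }
  // ...imagine hundreds more entries, ~350kB in total
};

// ...and a second script rebuilds HTML in the browser on demand.
// Without JavaScript, none of this content ever reaches the user.
function buildPage(slug) {
  var page = SITE_CONTENT[slug];
  if (!page) { return '<h1>Not found</h1>'; }
  return '<h1>' + page.title + '</h1><p>' + page.body + '</p>';
}

var markup = buildPage('passwords');
```

Every visitor downloads every page’s content up front, and the browser does the rendering work the server should have done.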

This is the kind of code you might write for a bet (“Let’s replicate a CMS and HTML tree builder in pure JavaScript!”) rather than put on a production website. And it’s substandard practice for a number of reasons:

  • Embedding all the content as JavaScript in the homepage bloats the page, especially for anyone on mobile or dial-up (a check in the iOS Simulator’s debugger shows mobile users get the same homepage, with no differentiation by user agent).
  • The processing involved slows down browser rendering, especially as it uses jQuery rather than native DOM methods. Older or lower-powered devices will struggle with this: there’s a noticeable lag in loading sections on my iPad, which is only a year or so old.
  • Putting all the content inside JavaScript makes it difficult (not impossible, but difficult) for search engines to crawl and index the entire site, and thus for users to find all relevant content.
  • 8,000+ lines of JavaScript is a lot. The more code you write, the greater the potential for bugs, and the longer it takes to test. Mixing content and presentation into JavaScript also makes for difficult maintenance in the future.

It’s an even odder design decision to make once you realise the site is built with Drupal, which is capable out of the box of serving multiple pages of HTML in the usual way.

You might be thinking at this point: “Fine Chris, so it offends your technical sensibilities and is a bit slow and you would have done it differently. So what?” But it goes beyond my taste in architecture. The problem is that the site will break the moment you have JavaScript turned off, which not only goes against the guidelines for Government websites and the principles of good Drupal, but also makes the site inaccessible to anyone who does not have JavaScript, whether by choice or not.

That can include some visually impaired people who rely on screenreaders (although even with JavaScript enabled, a screenreader is going to have a hard time navigating that layout, which may have legal repercussions). It also covers those on poor connections or mobile, where the script download can time out, as well as anyone operating a whitelist of sites permitted to run JavaScript, or sitting behind an over-zealous firewall, for security reasons.

Still, for the rest of us, it’ll be fine, right? Well, whether it’s due to the speed issues or the snazzy effects, users of IE6 and 7 don’t just get a broken site; they are shut out entirely and told to upgrade, even if JavaScript is enabled. Of course, they may not be able to upgrade, whether because they don’t know how or because of security restrictions.

You can argue that IE6 and 7 are not much used these days and that you shouldn’t be designing for them, and for most websites I would agree the experience need only be adequate. But adequate does not mean blocked entirely, and that is especially true for a site designed to educate inexperienced users about online security. The people who most need its advice are precisely those using the oldest and least secure browsers.
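Rather than sniffing the user agent and turning old browsers away, a more defensive approach is to detect the features you need and fall back to the plain HTML when they are missing. A hedged sketch of that idea, with hypothetical checks and simulated browser objects standing in for real `window` objects:

```javascript
// Hypothetical sketch: feature detection instead of user-agent blocking.
// If any check fails, we simply skip the enhancements and leave the
// server-rendered HTML alone; nobody is told to go away.
function canEnhance(win) {
  return !!(win.document &&
            win.document.querySelector &&  // DOM querying support
            win.JSON &&                    // native JSON support
            win.addEventListener);         // modern event model
}

// Simulated environments, standing in for real browser windows:
var oldBrowser = { document: {} };  // e.g. IE6: none of the above
var modernBrowser = {
  document: { querySelector: function () {} },
  JSON: JSON,
  addEventListener: function () {}
};

canEnhance(oldBrowser);    // false: baseline HTML experience, not a wall
canEnhance(modernBrowser); // true: safe to layer on the extras
```

The design choice matters: the worst case for an unsupported browser becomes “plain but usable”, not “locked out”.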

It cannot be emphasised enough how counterproductive shutting out these users is.

None of that is covered in the blog post from the agency that built it. Instead the focus there is on the artwork and look & feel (which is conflated with user experience throughout). It’s telling that barely a line is written about the architecture and code underneath. It’s web design where everything went into the “design” bit and nothing went into the “web”. We’ll be needing FOI requests to find out how those decisions were made.

It’s still amazing that a site like this saw the light of day in 2014. So let’s make a resolution — it’s still January, after all, and though it may be getting a little late in the month, better late than never. One thing we as developers should all take up is to stop making sites like this, and to stop thinking design is just about how something looks.

(Thanks btw go to @adrianshort @JoeTheDough @simonwheatley & @technicalfault for conversations on Twitter yesterday which helped me some way towards assembling these thoughts.)