
7 things to triple-check before launching a website

John Vaghi
Fuzz


Especially these days — with the frenetic rush to hurl JavaScript’s newest poster-child into production — it’s easy to put off certain tertiary facets of deploying a web app (‘functionality first’, as certain coworkers like to say).

This isn’t a comprehensive list by any means, but if you’ve been a developer long enough, you’ve likely neglected or forgotten at least a few of the following before an important launch.

1. How is your SEO?

This is always the thing the web developer dreads having to address. Worrying about SEO is tedious, rote, and often at odds with the dream application architecture. No one wants to scrap their single-page app idea just because Bing might not be able to crawl it correctly.

We recently made the decision to build a fairly large ReactJS app isomorphically (which I’ll write about in a future post), primarily because we wanted to avoid having to set up a pre-renderer later in the project. Was it worth it in the end? Probably, although given the complete opaqueness of search engine logic, it’s hard to tell what sort of payoff our efforts ultimately had.

Either way, if you’re launching a site with any kind of consumer-facing implications, it’s important to consider how your site will be interpreted by the rest of the internet and, ideally, to align on SEO expectations and implementation tactics early on in the project. At some point, the client is going to wonder why their site isn’t showing up in Google or, if it is showing up, why it doesn’t have the right description copy or why it’s not showing up with that fancy card data like their competitor.

Plus, if you’re building something pertaining to cats, which you undoubtedly are, you’ll need all the SEO help you can get.

801,000,000 people cat be wrong

2. Test your site’s social share-ability

Falling squarely into the “I’ll do this right after launch, I promise” category, social sharing is another important tenet of major web initiatives that developers usually want nothing to do with (luckily, there’s AddThis, which’ll get you pretty far).

Have you defined OG tags for every route? If you share a link to your website on Slack or Facebook, what will appear in the preview? Did you define your Twitter cards? What even is a Twitter card? Are your share images absolute paths that are publicly accessible?

A good piece of advice is to use Facebook’s Open Graph debugger constantly, since — similar to the aforementioned SEO concerns — dynamically injected meta information isn’t always read by social crawlers.

At the very least, make sure you have some kind of social boilerplate defined up front — even if it’s the same for all pages. Don’t add LOREM IPSUM to your meta tags with the intention of changing it later. You’ll inevitably forget and have your glorious post-launch humblebrag ruined by sloppy meta tags.
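One way to keep that boilerplate consistent is to generate the tags from a single function on the server. Here’s a minimal sketch — the `socialMeta` helper, its defaults, and the `trackyourcats.com` values are all hypothetical, but the tag names themselves come straight from the Open Graph and Twitter Card specs:

```javascript
// Escape content values so titles with quotes/ampersands don't break the markup.
function escapeHtml(str) {
  return String(str).replace(/[&<>"]/g, (c) =>
    ({ '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;' }[c])
  );
}

// Build a block of Open Graph + Twitter Card <meta> tags for one page.
function socialMeta({ title, description, image, url }) {
  const tags = {
    'og:title': title,
    'og:description': description,
    'og:image': image, // must be an absolute, publicly accessible URL
    'og:url': url,
    'twitter:card': 'summary_large_image',
    'twitter:title': title,
    'twitter:description': description,
    'twitter:image': image,
  };
  return Object.entries(tags)
    .filter(([, value]) => value) // skip anything left undefined
    .map(([key, value]) => {
      // og:* tags use property=, twitter:* tags use name=
      const attr = key.startsWith('og:') ? 'property' : 'name';
      return `<meta ${attr}="${key}" content="${escapeHtml(value)}">`;
    })
    .join('\n');
}

// Inject into the <head> of every server-rendered page.
const head = socialMeta({
  title: 'Track Your Cats',
  description: 'Real-time feline surveillance.',
  image: 'https://trackyourcats.com/share.png',
  url: 'https://trackyourcats.com/',
});
```

Because every route funnels through one function, a missing description shows up everywhere at once — much easier to catch than per-page copy-paste.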

Although I don’t think anyone would mind if Bill Murray were the og:image default for every site

3. Sitemaps

“Isn’t Google smart enough to do this for me?” is how I like to rationalize putting off generating a sitemap.xml file for my site. However, if your site is large, primarily dynamic (e.g. blog posts), and contains pages that aren’t directly accessible by crawlers (e.g. cat detail pages reachable only via a cat search), a sitemap becomes crucial to have.

Libraries like react-router-sitemap for ReactJS or sitemap-generator for Node are good tools for establishing an infrastructure that allows your sitemap to be generated dynamically. You don’t want to be manually hard-coding a sitemap at the last minute, as it will inevitably fall out of sync with the site’s actual pages as the site evolves over time.
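Even without a library, the core of a dynamic sitemap is tiny: a function that takes the current list of routes and renders the XML. A rough sketch, with the route list stubbed out (a real app would pull the cat slugs from its database or CMS):

```javascript
// Build a sitemap.xml string from a base URL and a list of paths.
function buildSitemap(baseUrl, paths) {
  const urls = paths
    .map((p) => `  <url><loc>${baseUrl}${p}</loc></url>`)
    .join('\n');
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    urls,
    '</urlset>',
  ].join('\n');
}

// Static routes plus dynamic ones (e.g. one page per cat).
const catSlugs = ['chairman-meow', 'bill-purray']; // would come from the DB
const paths = ['/', '/about', ...catSlugs.map((s) => `/cats/${s}`)];
const xml = buildSitemap('https://trackyourcats.com', paths);
```

Wire this to a `/sitemap.xml` route (or a build step) and the file regenerates itself as cats come and go.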

It’s also good to have a Google Webmaster Tools profile set up for your site. The service is more beneficial for the SEO / ad people of the world, but developers should check the site’s profile occasionally to make sure there are no crawl errors or indexing problems.

4. Browser Compatibility

The modern internet was built for modern browsers — I get it. Don’t go insane trying to support everything. All I’m saying is to make sure you have a good understanding of what the client expects will be supported and an idea of what COULD be supported.

Thanks to tools like caniuse.com, developers should have a good sense of which parts of their code are going to prove problematic in certain contexts (or they’ll have no idea thanks to tools like autoprefixers). As such, if the client eventually decides that supporting IE9 is suddenly super important, it would be good to know if achieving that is as easy as adding a few polyfills, or as dire as requiring a complete rewrite.
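One cheap way to keep the polyfill question concrete is runtime feature detection: check for a handful of features whose absence usually signals an older browser, and load a polyfill bundle only when something is missing. A minimal sketch — the feature list here is illustrative, not exhaustive:

```javascript
// Return the names of features missing from the given scope
// (pass `window` in the browser). Prototype methods are checked
// globally since they aren't properties of the scope object.
function missingFeatures(scope) {
  const checks = {
    fetch: typeof scope.fetch === 'function',
    promise: typeof scope.Promise === 'function',
    arrayIncludes: typeof Array.prototype.includes === 'function',
    objectAssign: typeof Object.assign === 'function',
  };
  return Object.keys(checks).filter((name) => !checks[name]);
}

// In the browser: if the list is non-empty, inject a polyfill
// script before booting the app instead of shipping it to everyone.
const missing = missingFeatures(globalThis);
```

This doesn’t replace looking things up on caniuse.com, but it turns “can we support IE9?” into a question with a short, testable answer.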

Don’t rely on QA to catch all of your compatibility issues, and just because you’re not supporting a certain browser doesn’t mean you should ignore it completely. If it comes to pass that a large subset of the trackyourcats.com audience is inexplicably using that old, terrible stock Android browser, is it fair to ignore them entirely? (…yes. In this case, yes it is.)

5. Confirm any necessary DNS updates or redirects

So trackyourcats.com works totally flawlessly. But, depending on how your DNS is managed, www.trackyourcats.com may not. Ensure all possible ways a user might access your site are working as expected. If you’re using SSL, ensure http redirects to https. If there are any vanity domains, make sure those are pointing to the correct destination.

Additionally, if you’re relaunching a site with updated routes, make sure you’ve added any appropriate 301 redirects so users who have old routes cached in their browser are redirected to the relevant new page. See what pages show up in Google and make sure you’ve accounted for those as well.
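The full set of rules — force https, collapse bare domain onto www, 301 the legacy routes — is easy to centralize in one function you can unit test before launch day. A sketch, with a hypothetical route map; in Express this would sit in middleware ahead of the router:

```javascript
// Old paths from the previous site mapped to their new homes.
const ROUTE_REDIRECTS = {
  '/cat-finder': '/search',
  '/blog/launch': '/news/launch',
};

// Given the parsed parts of an incoming request, return a 301
// target, or null if the request should be served as-is.
function resolveRedirect({ protocol, host, path }) {
  if (protocol === 'http') {
    return { status: 301, location: `https://${host}${path}` };
  }
  if (host === 'trackyourcats.com') {
    // bare domain -> canonical www host
    return { status: 301, location: `https://www.${host}${path}` };
  }
  if (ROUTE_REDIRECTS[path]) {
    return { status: 301, location: `https://${host}${ROUTE_REDIRECTS[path]}` };
  }
  return null;
}
```

Keeping the rules in one pure function means the “did we remember the old blog URLs?” question becomes a test case instead of a 3 a.m. discovery.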

Remember you can only blame the network cache for so long if you screw up the DNS at launch (but definitely blame it while you can).

6. Don’t forget to load test

For certain applications (e.g. a URL explicitly advertised during the Super Bowl), load testing is definitely not a last-minute detail. But even for smaller, relatively low-traffic sites, it’s good to have a sense of how they’ll perform amidst unexpected volume (how can trackyourcats.com not go viral?). We typically aim to support at least 10x the expected daily traffic and, when using a CDN like CloudFront or Cloudflare, the concern is really about ensuring the CDN cache doesn’t expire at the wrong time.

Tools like Locust have really helped us in the past, as load testing is sometimes cumbersome to set up. Paid services like flood.io exist as well for those who have the budget.
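For a quick smoke test before reaching for a real tool, the core loop is small enough to write yourself: fire a fixed number of requests with a capped number in flight, and look at the latency tail. A rough sketch — `makeRequest` is any async function (in a real run, a `fetch` against the staging URL):

```javascript
// Run `total` requests with `concurrency` in flight at once and
// report simple latency stats in milliseconds.
async function loadTest(makeRequest, { total, concurrency }) {
  const latencies = [];
  let started = 0;
  async function worker() {
    // started is read+incremented synchronously, so exactly
    // `total` requests are issued across all workers.
    while (started < total) {
      started += 1;
      const t0 = Date.now();
      await makeRequest();
      latencies.push(Date.now() - t0);
    }
  }
  await Promise.all(Array.from({ length: concurrency }, worker));
  latencies.sort((a, b) => a - b);
  return {
    count: latencies.length,
    p95: latencies[Math.floor(latencies.length * 0.95)],
    max: latencies[latencies.length - 1],
  };
}
```

This won’t tell you what a CDN edge will do under real geographic load — that’s what Locust or flood.io are for — but it’s enough to catch an origin that falls over at 20 concurrent requests.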

Remember, if you’re aiming to create controversy, just make sure the controversy is regarding the moral ambiguity of spying on your pets and not how easy it is to crash your site.

Thankfully, HUFFPO was there to post the video and steal all of 84 Lumber’s traffic

7. Be aware of integration rate limits

Isn’t sandbox mode great? “No credit card required.” “Free two-week trial.” There are tons of services out there that web developers love to use in modern web applications, but not all of them are going to be free once your app makes it to production.

Hopefully PETA never saw this #freebird

If you’re using the Google Maps API, for example, only your first 25,000 requests per day are free. For their geocoding API, it’s only 2,500.

Before launch, double-check all of your integrations to ensure they’ll scale post-launch. Use this time to upgrade any accounts or attach a payment method to any service that might start incurring costs. It’d sure be a bummer if your cat-tracking map went down right as Chairman Meow started veering toward that treacherous, dog-infested alley you told him never to go near.
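It can also help to track quota usage on your side of the wire, so you hear about an approaching limit before the API starts refusing you. A minimal sketch — the thresholds and limits here are illustrative (the 2,500 figure matches the geocoding tier mentioned above), and a real version would persist the counter and reset it daily:

```javascript
// Create a per-day quota tracker for one third-party integration.
// record() returns 'ok', 'warn' (past warnAt fraction), or 'over'.
function makeQuotaTracker(dailyLimit, warnAt = 0.8) {
  let used = 0;
  return {
    record() {
      used += 1;
      if (used >= dailyLimit) return 'over';
      if (used >= dailyLimit * warnAt) return 'warn'; // time to page someone
      return 'ok';
    },
    used: () => used,
  };
}

// e.g. wrap each geocoding call and alert when status flips to 'warn'.
const geocodeQuota = makeQuotaTracker(2500);
```

Even a crude counter like this turns “the map mysteriously died at 6 p.m.” into an alert that fires mid-afternoon.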
