I rebuilt my website in Node.js, and it died in Google. Here’s what I learned

Stuart Costen
Oct 27, 2020 · 11 min read
Photo by Nathana Rebouças on Unsplash

There is always a slight trepidation about putting a new website live, and for good reason: it's usually untested by anyone (or anything).

Now, I think I’m fairly OK with UX and UI. I understand a website should be easy to use, look good, and most of all, be interesting and engaging for the user.

I was fairly happy with the way my new site looked and worked.

So I pressed the Go Live button, and for a week or so everything was going great, then suddenly my website fell off a cliff.

Uh oh

Google's Search Console performance graph began to head downwards, my website visits plummeted, and the thought of turning my old site back on was playing in the back of my mind.

But, not one to take the easy route, I decided to turn it into a challenge.

Google’s Search Console became a window to a world I’d seen on many an occasion but never stopped long enough to have a proper look round.

I also started investigating the techniques being used by websites that were top of the rankings for keywords I used to rank for myself.

After a month of tweaks, harsh lessons and hours spent checking Google for my chosen keywords, the performance graph is starting to rise, but I am by no means out of the woods...yet.

I've decided to share my findings and the lessons I've learned along the way, mainly so you don't make the same mistakes I did, but also to show that a small oversight can have long-lasting, damaging effects.

But first, let's start at the beginning…

My WordPress website was clunky. It was loading slowly and failing miserably on Google’s page speed metrics.

Although still doing fairly well in Google’s SERP (Search Engine Results Pages), I was hungry for a faster website and one that was simpler to theme, so I settled on the technology I was going to use and started work. Having plenty of time during lockdown certainly helped.

Now, as a web developer by trade, I have the power to mould a website any way I choose. The usual issue with a CMS-driven website is that the various plugins you install, and the general mechanics of how CMSs are built, mean some elements are out of your control: how many stylesheets and scripts are referenced, code under the hood that's poorly written from a speed point of view, or lots of unnecessary baggage there to handle a plethora of situations that will never happen with your setup.

This means that when you run the Lighthouse report in Chrome, it hates you.

My main pages were getting below 40 on the performance score out of 100.


I'd read somewhere that Google was going to start using site speed as a ranking factor. So, to give myself a little edge over the competition, most of whom are sitting below 40 for performance themselves, I decided to rebuild the site completely in Node.js and set out employing every page-speed tactic I could find.

I yearned for a performance score over 90, and I wasn’t going to get that from my old WordPress site.

And so I did it. Built it, tested it and pushed it live.

Winner winner

And about 3 weeks in…that’s when the problems started.

I noticed a slight rise at first, as Google was probably noticing I'd made a change and showing its appreciation for updated content. But then my usual top-four keywords started falling onto page 2, and as you're probably aware, page 2 might as well be page 100 for the amount of traffic you're going to get.

Panic set in. What had I done? I'd taken my well-ranking website and shot it to pieces. And for what? Failure.

My first port of call, and quite rightly if you want to rank higher in Google, was Google Search Console.

Secondly, I had an old version of my website that I could cross-reference for data differences.

And thirdly, just doing searches in Google itself for the keywords you want to rank for can sometimes reveal a lot more than Search Console can.

So, to the lessons.

I’m no SEO guru; I’ve no SEO qualifications; I have no cold-hard Google algorithm facts.

What I do have is anecdotal findings and musings of the past month of my life.

My site is starting to rise again in the Google Search Console performance charts, so I must be doing something right.


But as with any kind of SEO 'tips and tricks', what works for me may not work for you, and I have a few things on my side already that certainly help, which I will explain later.

If you are building a fresh website with a brand new domain, you may not find anything I am writing about makes much of a difference.

Lesson 1: One URL to rule them all

I had the slug of my pages and the slug of my categories, so I originally linked to posts using the /{category_slug}/{post_slug} rule. The big problem with this is that some posts could be in two or three different categories, so two or three separate pages were being generated for each one, killing my ranking, as Google had no idea which one was the one.

I also had some links that ended with / and some that didn’t — Google actually treats these as different pages.

The site could also be reached via HTTP, HTTPS, www and non-www, and, to my horror, after doing some keyword testing I found Google had indexed my IP address and was serving two pages with the same content: one from my normal web address and one from the IP address instead. Horrified doesn't even come close.

So what did all this mean? Google was finding thousands of pages on a 400-page website, because there were many different routes it could take to each page. It was effectively crawling far more pages than the site actually had and marking me down for multiple duplicate-content issues.

How I fixed it:

Luckily, I was still using WordPress as my headless CMS, giving me access to some neat WordPress functions such as get_permalink(), which returns a post's permalink. I could also take advantage of all the good work the Yoast SEO plugin does that I hadn't known about.

I started grabbing the permalink from WordPress. This was a good start, as it meant there was one definitive path to each post: WordPress builds the link using the category as well, so if a post appears in multiple categories, you can pick the primary category to make up the path structure.

The post path = sorted.
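In a headless setup, the WordPress REST API exposes that permalink on each post object as its `link` field, so pulling the one true path is a one-liner. A minimal sketch (`permalinkPath` and the example URL are my own, for illustration):

```javascript
// Extract the path of a post's one true URL from the object the
// headless WordPress REST API returns for a post. WordPress exposes
// the full permalink (built from the primary category) as `link`.
function permalinkPath(post) {
  return new URL(post.link).pathname; // e.g. "/bars/the-dolphin/"
}
```

Whatever the front end links to, it always goes through this one path, so there is never a second route to the same post.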

I then decided to make sure all my URLs ended with a slash. This was as simple as checking the last character of the current path and, if there was no slash, issuing a 302 redirect to the version with a slash on the end.

I then made sure all my URLs were HTTPS. This involved checking whether the request came in over HTTPS and, if it didn't, redirecting to the HTTPS version with a 302.

Then I made sure the site could only be visited via www, so again I checked whether the host began with www. and, if it didn't, redirected with a 302.
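In Node.js terms, the three checks above collapse into one rule. Here's a minimal sketch; `canonicalTarget` is a made-up helper name, and in the real site it would sit in server middleware that issues `res.redirect(302, target)` whenever it returns a URL:

```javascript
// Decide whether a request needs a 302 redirect to the canonical form:
// HTTPS, a www. host, and a trailing slash on the path.
// Returns the redirect target, or null if the URL is already canonical.
function canonicalTarget(protocol, host, path) {
  let changed = false;

  if (protocol !== 'https') { protocol = 'https'; changed = true; }
  if (!host.startsWith('www.')) { host = 'www.' + host; changed = true; }
  if (!path.endsWith('/')) { path += '/'; changed = true; }

  return changed ? `${protocol}://${host}${path}` : null;
}
```

Because every non-canonical variant funnels through the same rule, one request can never arrive at two different URLs with the same content.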

Remember, I’m a programmer, so such things are easy for me to implement; however, it’s just worth noting the problem and solutions.

Magically, the HTTPS redirect also solved my IP issue, but there were other problems I started to discover.

For my categories, I found that a category's 'page 1' worked with '/page/1' as well as without: two new pages, same content.

So how did I fix it? Yep, anything with a page/1 redirected to the category’s first page using a 302.

Now everything redirects to its correct page, even if you access a post via its old path it redirects to its new one.
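The /page/1 fix is a one-line path rewrite that runs before the category route; `stripPageOne` is a made-up name for illustration:

```javascript
// Strip a redundant "/page/1" segment so that /category/page/1/ and
// /category/ can never exist as two separate URLs with the same content.
function stripPageOne(path) {
  return path.replace(/\/page\/1\/?$/, '/');
}
```

If the path changes, the server answers with a 302 to the stripped version; deeper pages like /page/2/ pass through untouched.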

Lesson 2: One URL to rule them all — Part 2

Of course, Google had indexed the GET parameters!

My above tricks wouldn't solve this, as I needed the filters on the category pages; I couldn't just redirect them.

I also noted that my IP address indexed content wouldn’t go away.

What was I missing?

It was only when searching through the 'Coverage' section of Google Search Console that I noticed an 'Excluded' tab, where my non-user-selected canonical links had gone from 0 to 3,000 in a month.

This was a bit of a breakthrough, as it highlighted in plain sight all the stuff I’d applied in Lesson 1 and confirmed the next thing I had to do.

I added <link rel="canonical" href="[the URL to rule them all]" /> to the head HTML of every page. This told Google which page it needed to focus on, and because of the redirect work I had done, it worked easily: the correct URL path was the only one Google ever saw. It further signalled to Google that everything else it had indexed on my site had been a lie. The IP-indexed content quickly dropped out of Google, and the GET parameters, which had really made a mess of my indexed links, vanished.
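Because the canonical URL deliberately drops any GET parameters, the tag can be generated from the request URL itself. A sketch using Node's built-in URL class (`canonicalTag` is a made-up helper):

```javascript
// Build the canonical <link> tag for a page, ignoring the query string,
// so /category/?filter=bars and /category/ both point Google at /category/.
function canonicalTag(fullUrl) {
  const u = new URL(fullUrl);
  return `<link rel="canonical" href="${u.origin}${u.pathname}" />`;
}
```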

You can see why the one URL to rule them all is so important for SEO. I wouldn't want to pick up two books with different names from the library and find that the content inside them was the same!

Make sure you have some sort of link 'bank', which could be a CMS, a CSV or whatever. Ensure Google knows each link, because if it gets a whiff that there might be more than one, you're in for a slippery ride down the slopes of the SERPs.

Lesson 3: Rich snippets FTW

Again, I’m a web developer; I can whip up some rich snippet code that integrates with a CMS in a matter of minutes. It might take you or your development team a little longer.

The best way to get rich snippets off the ground is to search on Google using the keywords you rank for or want to rank for.

You’ll see that when they appear in Google, they may have extra dropdowns or extra links associated with the result. These are usually rich snippets.

Rich snippets are like a coding language for, well, language. They don't just give the name of something; they describe it in a shared, meaningful way.

For my site I employed the schema definitions FAQ, LocalBusiness, Restaurant, BreadcrumbList, Organization and Event, but there are loads more, and you may get a little wrapped up trying to work out which ones are best for you.
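Taking FAQ as an example, the definition boils down to a schema.org JSON-LD object embedded in a `<script type="application/ld+json">` tag. A minimal sketch, assuming a headless setup where the page data arrives as plain objects (`faqJsonLd` and the question/answer shape are my own names):

```javascript
// Serialise a list of FAQs into the schema.org FAQPage JSON-LD
// that Google reads when deciding whether to show rich snippets.
function faqJsonLd(faqs) {
  return JSON.stringify({
    '@context': 'https://schema.org',
    '@type': 'FAQPage',
    mainEntity: faqs.map(({ question, answer }) => ({
      '@type': 'Question',
      name: question,
      acceptedAnswer: { '@type': 'Answer', text: answer },
    })),
  });
}
```

The resulting string just gets dropped into the page head inside a `<script type="application/ld+json">` tag.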

So if your content is right for Rich Snippets, my advice would be to get them set up.

Lesson 4: Google Search Coverage tab is your friend

A lot is going on in Google Search Console, some of it useful and some not so much, but the area that opened my eyes was the Coverage tab, and more specifically the 'Excluded' tab within it.

I was shocked and amazed at how many pages in there were being ignored by Google because of mistakes I had made, and it's this discovery that led me to devise the 'one URL to rule them all' rule. If Google is finding these pages and having to decide whether to keep them, it must be hurting the rankings, as you're wasting valuable crawl budget that could be spent on your actual pages.

Get to grips with Google Search Console. Seriously, it doesn't tell you everything, but if you analyse the data, you can find out some quite interesting things.

One thing I did learn, which I'll pop here: I noticed in the Performance tab that my average position was sometimes falling while my total impressions were going up. How can that be? I thought.

I narrowed it down to one reason: pages that didn't previously rank for certain keywords now did, but were starting way down the pile.

If you have a new keyword ranking at position 50, for example, it will drag your average position down quite considerably, and because it's a new page ranking for a new keyword, your impressions will of course go up, as your pages now appear in more searches.

Make sense? I wish someone had told me that at the beginning!

Lesson 5: Fix them dastardly 404s

There are two parts to this, internal and external, and each matters for different reasons.

Why fix internal 404s?

Fixing internal 404s is just good practice from a user's point of view.

If they can navigate their way through your site successfully, you’ve done something right.

But if they are met with a page not found, that is obviously going to be an issue, and because of my recent redesign and restructure, I had created a lot of 404s.

As we know, Google is obsessed with user experience, so you'd have to guess that having 404s would also hurt your ranking score. I can't tell you whether fixing them helped rebuild my ranking, but it's just good practice anyway.

Why fix external 404s?

When I was using SEMrush for a week, I noticed I had four or five backlinks from well-scoring domains that pointed to pages that no longer existed due to my recent changes.

This got me thinking: if a backlink from another website points to a 404, would Google treat it with the same respect? Probably not.

So I went back to my favourite tactic and created 302 redirects from all these broken backlinks to pages of a similar vein.
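The redirect map itself can be as dumb as a lookup table. A sketch, with made-up paths standing in for the real broken backlinks:

```javascript
// Map old backlinked paths to their nearest live equivalents.
// These entries are illustrative; the real list came out of SEMrush.
const legacyRedirects = {
  '/old-bars-guide/': '/category/bars/',
  '/derby-events-2019/': '/category/events/',
};

// Returns the redirect target for a dead path, or null if it's not
// one of the known broken backlinks (let those fall through to the 404).
function legacyTarget(path) {
  return legacyRedirects[path] || null;
}
```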

Again, only time will tell if this will help my ranking, but again it is good from a user point of view, so well worth the time to resolve.

That’s all folks!

There you have my five most important lessons when rebuilding or launching a new site.

Just get your link structure down from the get-go; honestly, it's the most important thing you can do.

Try and aim for a nice link structure like this.

And then everything else will hang nicely off it.

I was a little surprised how much the Yoast SEO plugin did for my WordPress site, and actually how much SEO WordPress itself handled.

However, with tools like this, you don't really understand what's going on, and I found that Yoast was actually doing something that hurt my SEO because it didn't understand my site's content. Now that I understand, I have fixed an issue that had been left unchecked for years.

But with some hard work and a bit of time, you can easily emulate what these tools do for your site. You just need to know what you're doing, and why, to make it effective.

So, after all of this, what is the best SEO technique I hear you ask?


To view the website in question head to www.lovederby.com (if you do test it, please ignore the Lighthouse scores, as Google Ads drops them by at least 30).

The Digital

Digital Discussions for the Digital Age.
