Image Deferring vs Lazy Loading vs SEO?

Philip Badilla
6 min read · Oct 19, 2018

Before we get started, a disclaimer: I’d like to say that I’m not in any way an SEO expert. In fact, I have very little knowledge of the field. However, I’d like to share my experience during a critical performance update our team undertook as part of our search engine optimization routine.

The challenge: Bring the page load time of our WordPress multi-site down from 4–5 seconds to 1–2 seconds!

SPOILER ALERT: We did manage to get it down to an average of 1.5 seconds with browser caching enabled, and around 1.8 seconds on initial load. I know that’s still relatively slow and we’d love to get it loading in under a second (I’ll probably make another glorified street talk out of that someday), but that was all we could achieve within the limited time frame.

There were obviously lots of other improvements we had to implement to shave those seconds off the page load time: server caching, code refactoring, database query optimization, minifying scripts and styles, image compression and, you guessed it, image loading optimization (we haven’t implemented a CDN yet).

I thought to myself, “How hard could it be? There are tons of ready-made JavaScript libraries that can do this, right?” Well, what do you know? There are indeed lots of them! Unfortunately, I’m not gonna waste my time listing them here.

While working on this image loading optimization, I stumbled upon two concepts: lazy loading and deferring. So what’s the difference?

I like to dumb things down whenever I’m having difficulty understanding concepts and ideas. Think of your website as a restaurant: your pages are tables, your browser is the waiter, your kitchen staff are your servers, and the food items on the menu are your images.

Default (Fast Food)

Imagine you only have 5 items on your menu (burger, fries, diet coke, sundae, chicken nuggets).

A model that fits this specification would be a fast food restaurant. The kitchen staff start preparing your order as soon as you relay it to the cashier (the waiter), but you only get to sit down and enjoy your meal once everything is served and paid for. It works: the food is fairly easy to prepare, so the customer won’t have to wait long before they can eat.

Defer (Fine Dining)

However, not all dishes are created equal. If you have a slightly more complex set of dishes on your menu which take time to prepare, a fine dining restaurant might be a better model.

You sit down, order, and get to enjoy your appetizers while the main course is being prepared. Although the total time spent on food preparation is significantly longer compared to fast food, you won’t really notice because of how it is served.

Fine dining obviously isn’t a bulletproof approach. If you don’t believe me, try dining on Valentine’s Day at your favorite restaurant. Believe it or not, waiters can only handle so many customers at once, and a chef can only cook so many dishes at a time. It’s the same with browsers: different browsers have different persistent connection limits.

Enter — lazy loading.

Lazy Loading (All you can eat buffet!)

As you can already tell, this model is best for infinite scrolling (all you can eat). Some of you may suggest that you could also use lazy loading for your fine dining type of business. But if you hate having to remind your waiter about an order you made an hour ago, then you shouldn’t use lazy loading for your fine dining model.

Back to the Task

As I’ve already explained, our website has a fine dining setup, so we ended up using defer to optimize image loading. And with the mindset of keeping it simple, I implemented the easiest way I know to defer images, which I assume most of you already know: a browser will only load an image if the src path is set to the path of the image. By adding a data attribute to temporarily hold the actual image path and putting a placeholder image in the src attribute, we’re able to control when the actual image loads.

<img src="placeholder.png" data-defer-src="actual-image.png" />

Or better yet, you can use a base64-encoded version of your placeholder image.
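The swap itself can be sketched in a few lines of vanilla JavaScript. This is just a sketch of the technique, not the exact code we shipped; the attribute name data-defer-src matches the markup above, and the function name is my own.

```javascript
// Copy each image's real source from the data attribute into src,
// which triggers the browser to download and display it.
function loadDeferredImages(images) {
  for (const img of images) {
    const realSrc = img.getAttribute('data-defer-src');
    if (realSrc) {
      img.setAttribute('src', realSrc);
      img.removeAttribute('data-defer-src');
    }
  }
}

// In the browser, you would run it once the page has finished loading:
// window.addEventListener('load', function () {
//   loadDeferredImages(document.querySelectorAll('img[data-defer-src]'));
// });
```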

The Dilemma

There are a few fundamental issues with this approach.

  • If the script fails to run or the browser has scripts disabled, you’re screwed.
  • Search engine crawlers may not be able to index your images.
  • If you’re designing HTML pages using a WYSIWYG editor in your IDE, this approach won’t make your life easy.

The first two issues may have a workaround according to this: if you’re willing to create a noscript version for every image you want to defer, or if you don’t mind creating a sitemap for all your images, and if you’re confident that search engines will be able to crawl your images and you’re willing to take the risk on your ranking, then sure, why not. Here’s one that says otherwise. However, I still can’t find a definitive article that says Google can crawl lazy-loaded images.
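For reference, the noscript workaround looks something like this; a sketch of the general pattern, not markup from our site:

```html
<img src="placeholder.png" data-defer-src="actual-image.png" />
<noscript>
  <!-- Crawlers and script-less browsers see the real image -->
  <img src="actual-image.png" />
</noscript>
```

You would have to duplicate every deferred image this way, which is exactly the maintenance burden being weighed here.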

Here’s a recent Twitter discussion about this.

Now, to confirm whether my defer implementation would affect image indexing, I used Fetch as Google in Google Webmaster Tools. We couldn’t afford the risk, so we had to make sure. And indeed, the images appeared in the DOM as

<img src="placeholder.png" data-defer-src="actual-image.png" />

Now I’m screwed! We can’t push these to production (or can we?). No, we can’t (or can I?). No, I can’t. I asked (and begged).

Now I was really backed into a corner. Luckily, I thrive working under pressure and strict time constraints (or maybe I was just afraid to lose my job; my family needs me!). So I buckled down and wrote a library that aims to defer images without the negative impact on SEO. And I think I might have made something worth sharing.

Safe Defer

You can find the repository here.

The concept is fairly simple (otherwise I wouldn’t be able to do it).

I need the DOM to be in its correct form during design, and even if scripts don’t run. With that requirement, I had no choice: it has to look something like this.

<img src="actual-image.png" />

Since the plan is to defer all defer-able images once the DOM is ready, I needed to tag every image that needs to be deferred.

<img src="actual-image.png" data-safe-defer-src />

So I have to wait for the DOM to be ready before I start the defer process. I did this like so:

  <div>My last DOM element</div>
  <script>
    (function() {
      // remove all the sources!
    })();
  </script>
</body>
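The body of that IIFE could look something like this. This is a hypothetical sketch, not the actual code from the safe-defer repo; in particular, the backing attribute name data-safe-defer-backup is my own invention for illustration.

```javascript
// A 1x1 transparent GIF data URI, so the browser has nothing left to download.
const PLACEHOLDER =
  'data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7';

// Move each tagged image's real source into a backing attribute and point
// src at the placeholder, which cancels any in-flight image download.
function cancelDeferredImages(images) {
  for (const img of images) {
    img.setAttribute('data-safe-defer-backup', img.getAttribute('src'));
    img.setAttribute('src', PLACEHOLDER);
  }
}

// In the browser, inside the end-of-body IIFE:
// cancelDeferredImages(document.querySelectorAll('img[data-safe-defer-src]'));
```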

Here’s what it looks like in the background during the process:

Images getting cancelled during defer process
Images loaded after the page has finished loading

To make sure my hypothesis was correct, all we had to do was cross-check with Fetch as Google.

Now, this approach does not in any way give you the best performance (but it’s pretty good for what I need), because by the time the defer process starts, some of the images may have already started loading. I then have to iterate over every element that has the defer attribute, replace its src with a placeholder, and save the actual source in a backing attribute.

Removing the image paths from the src attributes cancels the image downloads and allows the browser to complete the page load. Once the browser is done with the page load event, we can put the original images back in the src attribute.

We can do that by running the process in the window load event.

window.addEventListener("load", function() {
  // return all the sources!
}, false);
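The restore step can be sketched like this, again assuming a hypothetical backing attribute named data-safe-defer-backup (the real attribute name in the repo may differ):

```javascript
// Once the window load event fires, put the original sources back so the
// browser downloads the real images after the page has finished loading.
function restoreDeferredImages(images) {
  for (const img of images) {
    const original = img.getAttribute('data-safe-defer-backup');
    if (original) {
      img.setAttribute('src', original);
      img.removeAttribute('data-safe-defer-backup');
    }
  }
}

// In the browser:
// window.addEventListener('load', function () {
//   restoreDeferredImages(document.querySelectorAll('img[data-safe-defer-src]'));
// }, false);
```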

This approach may sound dumb, but it works. You can check for yourself using the test page in the repo. It’s also very easy to implement on your website and shouldn’t have any negative effect on SEO, which is fairly easy to confirm via Fetch as Google.

The Good

  • DOM is intact during design
  • DOM correctness is not dependent on scripts
  • Designing HTML using a WYSIWYG editor still works (who does this, though?)
  • Performance improvement
  • Image indexing is not affected; crawler-friendly

The Bad

  • Performance gain is not as good as my initial defer approach

I guess that’s it. Let me know in the comments how I can improve the library; I’ve only worked on that repo for a few days and might not have tested every edge case yet. Keep coding in the free world!
