Are slow pages silently killing your website? We’re going to explore the rules of optimising page speed, and by the end I hope you’ll have understood three things:
- why page speed matters,
- how to diagnose the problem,
- how to solve it.
But before all that: why should we care?
Your website’s speed impacts both how much money it makes, and how much it costs. Page speed directly impacts traffic, bounce rate, user experience and conversion rate. Any goal you might have for your webpage is directly linked to those factors. Let’s take a look at that.
In a 2019 survey by Unbounce, 70% of consumers said page speed impacts their willingness to buy from an online retailer. And other studies confirm that. On one website, the conversion rate was 0.6% when the loading speed was over 5 seconds. But when the loading time was under 2.4 s the conversion rate rose to 1.9%. Half the time spent waiting, three times as much money made.
A large part of that is what is called the bounce rate: the percentage of people reaching the page who then leave it. Let’s face it, people are impatient. The longer they wait, the more of them leave. But even for those who don’t leave right away, slow loading leaves a bad impression. And Google has stated that this is why page speed is a big factor in how they rank pages.
In short, page speed impacts:
- how many people come to the page,
- how many people stay on the page,
- and how many people carry out the action you’d like them to carry out.
To put things in simpler terms: whatever goal you have for a webpage, that goal will be hindered by slow pages and helped by fast pages.
And I’ve explained why page speed might be important for you, but allow me to explain why it is important for me, David. You see, in my day job, I work at a company called EcoTree. EcoTree is not sponsoring this… but they do pay my salary, so I can’t claim to be an impartial observer!
And at EcoTree we focus on planting and restoring forests, and helping people and companies have a lasting impact on the environment. Part of the way that works is that people can buy trees that are in our sustainably managed forests. And they also make wonderful gifts. Incidentally, while we’re on the subject, you can head to
ecotree.green and use the promo code KODAPS to get 10% off.
All that to explain why for me, it’s also essential to manage the website’s carbon footprint.
And it just so happens that making pages faster can also lower a website’s carbon footprint and costs. Optimising for page speed really can be a win-win situation.
I say that like it is easy to do. It isn’t. We’re going to see how to attack the problem — and how I’ve been attacking the problem with my teammates.
But first let’s look at how to identify it. Let’s talk about…
Identifying which pages are slow
The first thing to always keep in mind is that about 60% of internet traffic now comes from mobile devices. In 2015, that figure was only 30%. And users browsing on mobile devices generally have less bandwidth than desktop users.
So throughout this, we will be focusing primarily on the mobile experience. And you know what: if it loads fast on mobile, it will load even faster on desktop.
**And that is tip number 1: Focus on mobile speed.**
Now, how do we identify the pages that need to be optimised?
For a global view, I use Google Search Console.
There’s an “Experience” section on the left, and in it we have the “Core Web Vitals”. This tells us how users experience our webpages on both mobile and desktop, with data reported back from users’ browsers. So this shows us the pages with the most problems.
Now, how do we go about identifying issues on specific pages?
Let’s open up the mobile report. We see a list of page issues. This example mentions a “CLS” issue and an “LCP” issue. What are those?
CLS stands for “Cumulative Layout Shift”. It measures how much your page layout moves around while it loads.
LCP stands for “Largest Contentful Paint”. This is basically the time it takes for the largest element to be painted on screen.
Now, how do we identify the problems on a given page?
For that, we’ll use another tool: “Page Speed Insights”.
In the Search Console, If we click on the group, here, then click on one URL here, we get a popup where the last element is “Page Speed Insights”. This opens the tool for that URL.
We can, of course, run the Page Speed Insights tool directly for a given page.
What can the Page Speed Insights page teach us?
The first thing we see is a set of six metrics.
These are the values our users are experiencing in practice. Their browsers send the information as to what these “core metrics” were like when they loaded the page.
Our friends the CLS and LCP are here. What are the four others?
The tool provides a detailed explanation of each of these metrics. Let’s take a look at the TTFB.
TTFB or Time To First Byte
The TTFB, or Time To First Byte, is crucial to page speed. It is basically the time between the browser requesting the page and receiving the first byte of the server’s response.
This metric is critical because it impacts all the other time-based ones. Basically, everything except for the Cumulative Layout Shift.
And the reason for that is simple: The browser can’t start reacting until it gets the information back from the server. There is nothing you can do client side if the server takes too long to respond.
So tip number 2 is: Make sure your server responds quickly.
How can we do that?
It is a subject in its own right, and the answer depends heavily on your server technology and on what you are building. A lot of the time the answer is threefold:
- removing useless logic and queries
- optimising queries to be more efficient
- storing data in the cache
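To make the third point concrete, here is a minimal sketch of an in-memory query cache in Node-style JavaScript. The cache key, TTL value and `fetchFn` callback are all hypothetical stand-ins for your own data layer; a real setup might use Redis or your framework’s caching tools instead:

```javascript
// Minimal in-memory cache: keep query results for a short TTL so
// repeated requests skip the expensive database round trip.
const cache = new Map();
const TTL_MS = 60_000; // hypothetical: keep entries for one minute

async function getCached(key, fetchFn) {
  const hit = cache.get(key);
  if (hit && Date.now() - hit.time < TTL_MS) {
    return hit.value; // served from memory, no query run
  }
  const value = await fetchFn(); // fetchFn stands in for your real query
  cache.set(key, { value, time: Date.now() });
  return value;
}

// Usage: wrap any slow lookup, e.g.
// getCached(`product:${id}`, () => fetchProductFromDb(id));
```

The second request for the same key comes straight from memory, without touching the database at all.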
The key takeaway here is: if your server is slow, there is nothing you can do on the client side to make the page load fast.
LCP or Largest Contentful Paint
The LCP or Largest Contentful Paint is the time it takes to render the biggest element on screen. Google uses it as a proxy for when the user feels like the page is mostly loaded.
What determines the LCP?
Well, once the browser receives the page content (that’s the TTFB), it needs to parse the HTML. Then it downloads the images and scripts. And finally, it evaluates the scripts and runs them.
For a simple HTML page, the LCP happens when the biggest image is downloaded and displayed.
This brings me to tip 3: Use client-side applications sparingly. With a client-side application, nothing meaningful can be painted until the scripts have been downloaded, parsed and run, which pushes the LCP back.
Now…. How can we tell what to improve?
Well, the good news is the work has already been done for us. The Page Speed Insights gives you a whole number of indications on how to solve the issues.
The exact indications will, of course, depend on the page and how it has been coded. But here are some of the things we set up based on the problems we saw.
What can we do?
There are several things to do. I’ll be going from the most general advice to the most specific.
And the first thing you can do, and that is my **tip number 4: be minimalistic**. The most basic advice I can give is: remove useless stuff. Limit the number of fonts you use. Don’t load a whole icon font just to display a few icons.
Incidentally, the minimalism rule goes a long way.
We’ve removed code that is no longer called, and database fields that have become outdated. It’s worth going over old pages and making sure we don’t have bits of code included that are no longer used.
Only include what serves your user’s needs. It might soothe your ego to have exciting content, but if it hurts the loading speed and turns people away… Our goals are more important than our egos. Remove content that is fluff. Don’t make the user download useless stuff. And that brings me to our next point.
**Tip number 5: Tell users’ browsers what they don’t need to download again.** When you’re serving content to users, you can set up your servers to add information to the files saying: “don’t bother downloading this again if you already have it in memory; you can keep it for another year”. This browser memory is called the cache, and the header that sets the amount of time is the “Cache-Control” field.
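For example, a response header telling the browser it can reuse a file for a full year (31,536,000 seconds) looks like this:

```http
Cache-Control: public, max-age=31536000, immutable
```

A long max-age like this is best reserved for files whose names change when their content changes, such as fingerprinted assets.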
Tip 6: Use lazy loading
The other way to limit what users download is to tell the browser to only load content when the user is about to see it. Because you see, most users only look at the top third of the page. There really isn’t any point in loading an image all the way down at the bottom of the page straight away. This is called “lazy loading”.
So we just need to add loading="lazy" to images that won’t be shown to a user who doesn’t scroll down. The browser then only loads the image when the user scrolls towards it.
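In practice it’s a single attribute; the filename and dimensions here are made up:

```html
<!-- Only downloaded once the user scrolls near it -->
<img src="forest-footer.jpg" alt="Our forest" width="800" height="600" loading="lazy">
```

Setting explicit width and height alongside it also helps the browser reserve the space, which keeps the CLS down.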
Now, on 74% of browsers this also works on iframes. Iframes are a way of embedding content from other people’s websites into our own. When you copy the embed code for a YouTube video, that’s actually an iframe, and it loads about 250 kilobytes of data, which is more than a lot of images.
A fast solution is to add loading="lazy" to the iframe code. But there is an even better way: embedding the iframe dynamically, so the heavy third-party content only loads when the user actually interacts with it.
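One common way to do this (a sketch of the general “facade” pattern, not our exact code) is to show a lightweight thumbnail first and only inject the real iframe when the user clicks. `VIDEO_ID` is a placeholder:

```html
<!-- Placeholder: a lazily loaded thumbnail stands in for the heavy iframe -->
<div class="video-facade" data-video-id="VIDEO_ID">
  <img src="https://i.ytimg.com/vi/VIDEO_ID/hqdefault.jpg" alt="Play video" loading="lazy">
</div>

<script>
  // On click, swap the placeholder for the real YouTube iframe, so the
  // ~250 KB embed is only downloaded by users who want the video.
  document.querySelectorAll(".video-facade").forEach((facade) => {
    facade.addEventListener("click", () => {
      const iframe = document.createElement("iframe");
      iframe.src = `https://www.youtube.com/embed/${facade.dataset.videoId}?autoplay=1`;
      iframe.allow = "autoplay";
      facade.replaceWith(iframe);
    });
  });
</script>
```

Users who never click pay only the cost of one small thumbnail image.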
Now, images are another very common culprit of slow pages. And in my experience, graphic artists don’t always aim to keep file sizes as low as possible. You want to be relentless in tracking down excessively heavy images.
Here there are several things you can do.
Tip 7: Prefer SVG files
First: for illustrations and icons, use SVG files instead of bitmap formats like JPG or PNG. SVG is a vector format: instead of storing the value for every single pixel, it stores instructions, for example that a line needs to be drawn from point A to point B. Less information means lighter files and more speed.
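To illustrate, this is a complete SVG file that draws a line from point A to point B, in a couple of hundred bytes:

```html
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">
  <!-- One drawing instruction instead of thousands of stored pixel values -->
  <line x1="0" y1="0" x2="100" y2="100" stroke="black" stroke-width="2"/>
</svg>
```

And because it’s instructions rather than pixels, it stays crisp at any screen size.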
Tip 8: Use responsive images
The second point is to keep your users’ screen sizes in mind. Only 1% of desktop users have screens wider than 1920 pixels… and most users are on mobile, which is between 360 and 440 pixels wide.
If an image is only going to fill half the screen width to the user there is no point in providing an image that is 4000 pixels wide. And yes that’s a real-life example.
But this raises an important question: how do we cater both to users with tiny mobile screens and to users with very large screens?
Well, thankfully there is a solution within HTML itself. When we display an image we use a tag called <img>, and set the image source using the “src” attribute. But we can also provide a set of sources, using the “srcset” attribute. We basically tell the browser: this source is 400 pixels wide, this source is 1000 pixels wide, this source is 2000 pixels wide. Then, based on the size the browser thinks the image will be shown at, it can load the relevant file.
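Here’s what that looks like, with hypothetical filenames; the “sizes” attribute tells the browser how wide the image will render, so it can pick the best source:

```html
<img
  src="hero-1000.jpg"
  srcset="hero-400.jpg 400w, hero-1000.jpg 1000w, hero-2000.jpg 2000w"
  sizes="(max-width: 440px) 100vw, 50vw"
  alt="Forest canopy">
```

Here the browser assumes the image fills the full viewport on narrow screens and half of it otherwise, and downloads only one of the three files.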
In fact we’ve even taken this a step further. You see the <img> tag is intended for images with a fixed ratio. So, for example, rectangular images. On our website, the main image at the top of the screen is rectangular. But on mobile most of that image would be wasted space, so for us it makes more sense to load a square image on mobile.
And for that we can use the <picture> HTML element. This allows us to define a group of different images. We can then state under which conditions the browser should display which image. On mobile, show a square version. On desktop show a rectangular one.
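A sketch of that setup, with hypothetical filenames and a 768-pixel breakpoint standing in for our real one:

```html
<picture>
  <!-- Square crop for narrow (mobile) viewports -->
  <source media="(max-width: 768px)" srcset="hero-square.jpg">
  <!-- Rectangular original for wider screens -->
  <img src="hero-wide.jpg" alt="Our forest">
</picture>
```

The browser goes through the sources in order and displays the first one whose media condition matches.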
And we’ve actually even taken things one step further.
Tip 9: Use modern image formats
You see, most websites serve images in JPEG or PNG format. But there are two newer image formats that compress even better than JPEG does: WebP, which is now a few years old and supported by 97% of browsers, and AVIF, which is more recent, more efficient, and supported by 75% of browsers.
We use a content delivery network (or CDN) to serve our image files. (That also allows us to serve the files faster.) We’ve added code to the CDN that reads, from the browser’s request, which image formats it accepts. So for example the browser might say: “I would like an image based on this one, I want it 128 pixels wide, and I can read JPEG and WebP files but not AVIF”. On the CDN we check whether the reformatted file already exists; if it doesn’t, we create it, save it and send it back.
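The format-picking part of that logic can be sketched in a few lines of JavaScript. This is a simplification of the idea, not our production CDN code:

```javascript
// Choose the best image format from the browser's Accept request header,
// e.g. "image/avif,image/webp,image/apng,*/*;q=0.8".
function pickImageFormat(acceptHeader) {
  if (acceptHeader.includes("image/avif")) return "avif"; // best compression
  if (acceptHeader.includes("image/webp")) return "webp"; // widely supported
  return "jpeg"; // safe fallback every browser can read
}
```

The chosen format then becomes part of the cache key for the resized file, so each variant is only generated once.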
I think Cloudflare’s CDN services let you pay for something similar, but… we coded our own. I’ve linked the blog article we used as a basis for our code in the description.
Like SEO, this work can feel discouraging at times. But improving page speed is a win-win situation: it helps us achieve our goals, reduces our costs, and limits our carbon footprint.
And if you want to do even more for the planet… well, you have the promo code in the description!