It’s widely accepted that minimising page load time is good practice for ecommerce, ensuring visitors get a fast service and search engines don’t mark webpages down for poor usability.
But what does this mean? What are the techniques you need to use to achieve this? And if you’re not technical, how can you make sense of the jargon to make sensible decisions?
In this blog, I look at the most common causes of slow page load speeds, which will hopefully give ecommerce managers a useful starting place.
When I started out in ecommerce, I didn't get this at all, but I've spent time learning over the years to ensure that I know what questions to ask and what to look for.
You don't have to grasp every technical detail to understand the essentials, and those essentials are something all ecommerce managers must master.
It’s important that ecommerce managers understand the different elements of webpage design that can impact page load speed and website performance.
When working with development teams, it’s essential that someone is there to ask the basic questions that ensure obvious answers aren’t overlooked when problem solving takes place.
In my experience, responses to site performance problems can often become over-complicated and solutions appear that mask rather than resolve the underlying problems.
By understanding the basics, you ensure these are covered before any complex diagnostics are run and costs incurred to address problems.
I’ve known clients where the systems integrator (SI) has run up significant costs doing advanced diagnostics on a website with intermittent performance issues, when the core problem could have been resolved by checking the basics.
I’ve also seen people sign off investment in new servers where misguided advice recommended throwing hardware at what was a configuration or software problem.
1. Optimise images
This is always my first check. I like to use page load tools such as Pingdom or Webpagetest.org and then use the waterfall view to check file sizes. Anything over 20KB gets my attention, and I’ll ask designers if there is any way to reduce it without compromising quality.
Image sizes have a significant impact on the download speed of the website. However, you have to agree on an optimum balance between the size and quality of an image – whilst some images can be >20KB, they are often the key brand/category/product merchandising tools, and replacing them with low-quality images may impede sales in other ways.
A quick win is addressing issues with thumbnail images in carousels. I often find these aren’t optimised for the web, or images are being scaled to size when the page is called. Getting the dimensions correct and providing the right size image really helps.
You should also evaluate the potential for compressing images to reduce their file size. The Google Developers diagnostic tool is useful for running checks but be careful when implementing the recommendations, always get them validated by a technical specialist.
One option to further reduce page size is to switch from baseline to progressive jpeg images but there is a potential trade-off: progressive jpeg images load the full image straight away but with only some of the pixel data, meaning the image briefly looks pixelated before sharpening.
Ecommerce teams may well decide the slightly larger file size of baseline images is preferable to a marginally diminished UX.
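As a rough sketch of that first check, here's a small Python script that walks a local copy of your image assets and flags anything over the 20KB rule of thumb mentioned above (the threshold and extension list are my assumptions, not fixed standards – adjust them for your own site):

```python
import os

# Rule-of-thumb threshold from the discussion above; tune to taste.
SIZE_LIMIT = 20 * 1024
IMAGE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".webp")

def oversized_images(root):
    """Walk a local asset directory and list image files above the size limit."""
    flagged = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(IMAGE_EXTENSIONS):
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size > SIZE_LIMIT:
                    flagged.append((path, size))
    # Largest offenders first, so designers see the biggest wins at the top.
    return sorted(flagged, key=lambda item: -item[1])
```

The output gives you a concrete list to take to your designers, rather than a vague "the images feel big".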
2. Browser caching
Setting an expiry date or a maximum age in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk rather than over the network.
This needs to be applied across all pages with static resources and is an important way to reduce the number of HTTP requests – the more requests, the slower the page load will be (unless you’re using multiple domains for loading assets, such as a CDN – see below).
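To make the header mechanics concrete, here's a minimal Python sketch of the values involved. In practice these headers are set in your web server or CMS configuration, not hand-rolled like this; the one-year default is just a common choice for fingerprinted static assets:

```python
import time
from email.utils import formatdate

ONE_YEAR = 365 * 24 * 60 * 60  # a common max-age for versioned static assets

def static_cache_headers(max_age=ONE_YEAR):
    """Headers telling the browser to reuse a static asset from local disk."""
    return {
        "Cache-Control": "public, max-age=%d" % max_age,
        # Expires is the older HTTP/1.0 equivalent; harmless to send both.
        "Expires": formatdate(time.time() + max_age, usegmt=True),
    }
```

Any response carrying these headers lets a returning visitor skip the network round trip entirely for that asset.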
3. Enable keep alive
Enabling HTTP Keep-Alive, or HTTP persistent connections, allows the same TCP connection to send and receive multiple HTTP requests, reducing the latency for subsequent requests.
The host domain should enable Keep-Alive for resources wherever possible.
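On Apache, for example, this is a couple of configuration directives (the exact values below are illustrative defaults, not recommendations – ask your development team what suits your traffic profile):

```apache
KeepAlive On
MaxKeepAliveRequests 100   # requests allowed per persistent connection
KeepAliveTimeout 5         # seconds to hold an idle connection open
```

Other servers (nginx, IIS) have equivalent settings, and many enable persistent connections by default.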
4. Defer loading of JavaScript
Deferring JavaScript means downloading script files without blocking the initial render and only executing them once the page has been parsed. Be careful though: don’t defer JS files that will affect how the page performs if they’re needed upfront. Review the implications of deferment with the developers.
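In modern HTML this is a single attribute (the file name here is made up for illustration):

```html
<!-- "defer" downloads the script in parallel with parsing but only runs it
     after the document has been parsed, so it doesn't block rendering.
     Only use it for scripts the initial render doesn't depend on. -->
<script src="/js/analytics.js" defer></script>
```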
5. Minimising scripts
6. Remove unnecessary script
Script builds up over time, even with good housekeeping. It’s sensible to undertake regular reviews to identify ‘dead’ code, such as redundant HTML, and remove it from the page.
Removing or deferring style rules that are not used by a document avoids downloading unnecessary bytes and allows the browser to start rendering sooner. Unused CSS can easily be identified by running simple diagnostics with tools like the Google Developers diagnostic tool.
Where there are disjointed inline scripts, they should be consolidated into a single external file wherever possible and minified. If an external file isn’t feasible, put them under one <script> tag and move it to the bottom of the page.
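The consolidation step can be as simple as the sketch below; proper minification (stripping whitespace, shortening identifiers) is best left to a dedicated tool such as UglifyJS rather than done by hand, so this only shows the safe concatenation part:

```python
def combine_scripts(sources):
    """Concatenate JavaScript source strings into one external-file body.

    Joining with a semicolon plus newline guards against the last statement
    of one file running into the first statement of the next.
    """
    return ";\n".join(s.strip() for s in sources)
```

One combined, minified file means one HTTP request instead of many small ones.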
7. Minimise HTTP requests
The more requests made by scripts, the longer it takes to load page content. One way to reduce requests is to convert image assets into sprites, cutting the total number of individual HTTP requests – e.g. arrows in scrollers and common background icons like the basket.
Assets that are reused across multiple pages should be consolidated into sprites and cached.
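A sprite works by putting several small icons into one image file and using CSS background positioning to show the right slice. The file name and dimensions below are hypothetical:

```css
/* sprite.png holds both 16x16 arrow icons side by side:
   one HTTP request serves both icons. */
.arrow-left, .arrow-right {
  background-image: url("/img/sprite.png");
  width: 16px;
  height: 16px;
}
.arrow-left  { background-position: 0 0; }
.arrow-right { background-position: -16px 0; }
```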
8. Optimise favicon
The favicon should be less than 1KB in size, and versioning should not be used – a small issue I’ve noticed on a few sites. Versioning results in the asset being downloaded again after every deployment, and it’s unlikely that the favicon will change that regularly.
9. Enable compression
Compression reduces response times by reducing the size of the HTTP response. Gzip is commonly used to compress resources, but it is worth asking your development team to confirm which compression technique is being used on the servers.
Using the Google Developers diagnostic tool, you can see resources flagged as compressible to reduce their transfer size.
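To see why this matters so much for HTML, here's a quick Python sketch using the standard library's gzip module. Markup is highly repetitive – think of a long product listing – so it compresses dramatically:

```python
import gzip

def gzip_saving(body):
    """Return (raw_bytes, compressed_bytes) for a text response body."""
    raw = body.encode("utf-8")
    return len(raw), len(gzip.compress(raw))

# Repetitive markup, such as a long product listing, compresses very well.
listing = "<li class='product'>item</li>" * 200
```

Running `gzip_saving(listing)` shows the compressed payload is a small fraction of the original – which is exactly the transfer-size saving the diagnostic tools are flagging.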
10. Remove error code and duplicate scripts
You can go to http://validator.w3.org and enter any URL to validate a webpage. You should validate all the main pages of the website and ensure that there are no errors, although this is time-intensive.
Taking the homepage as an example, there are three errors being generated that need to be addressed.
Validation should be a general practice, followed on a regular basis.
11. Remove 404 errors
(Updated following feedback from Frederic).
You need to minimise the number of 404 errors, as these are dead-ends. Google Analytics can be used to identify these pages. It’s important to benchmark this problem using the data available in Webmaster Tools or other web-based software.
A 301 can be used when a page has a logical replacement – e.g. the catalogue structure has been updated and a product has moved from its old URL to a new one. The 301 is an important link between old and new whilst the new URL becomes established.
However, just because a page returns a 404 doesn't mean it should be given a 301 to another page – the most common error is to redirect a 404 URL to the homepage when the original page is no longer relevant, e.g. a product has been permanently removed from the catalogue.
In such cases a 301 is not good practice because you’re sending people to an irrelevant substitute page and as these erroneous 301s build-up, it can adversely affect your SEO. Further reading at https://support.google.com/webmasters/answer/181708?hl=en&ref_topic=1724951
It’s good practice to ensure your 404 error page explains that the page is no longer available and suggest alternatives, providing links back to key pages on the website including the homepage.
Some websites provide a site search tool on their 404 to let the customer browse for relevant information.
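The decision logic above boils down to something like this sketch (the redirect map entries are hypothetical; in practice this lives in your web server or platform's redirect configuration):

```python
# Hypothetical map of retired URLs that have a genuine, relevant replacement.
REDIRECTS = {
    "/category/old-winter-coats": "/category/coats",
    "/product/1234-old-name": "/product/1234-new-name",
}

def resolve(path):
    """Return (status, location) for a retired URL.

    Only issue a 301 when there is a relevant replacement; otherwise serve
    a helpful 404 page rather than redirecting everything to the homepage.
    """
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None
```

Keeping the rule explicit like this avoids the erroneous catch-all 301s to the homepage described above.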
12. Reduce DOM elements
Every DOM element adds bytes to the page and work for the browser, so leaner markup is generally preferable. However, a webpage with more DOM elements will not necessarily load slower, provided the other optimisation techniques are being used effectively.
For example, for one client I benchmarked leading ecommerce brands and found that sites like John Lewis had up to 20% more DOM elements yet faster load speeds.
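If you want a rough element count for your own benchmarking, Python's standard-library HTML parser is enough – each start tag (or self-closing tag) counts as one element. This is a crude approximation that assumes reasonably well-formed markup:

```python
from html.parser import HTMLParser

class DomCounter(HTMLParser):
    """Rough DOM size estimate: count start tags and self-closing tags."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        self.count += 1

    def handle_startendtag(self, tag, attrs):
        self.count += 1  # self-closing tags like <img .../> are elements too

def count_dom_elements(html):
    counter = DomCounter()
    counter.feed(html)
    return counter.count
```

Run it over your own pages and a few competitors' to put numbers behind the comparison.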
13. Content Delivery Network (CDN)
A CDN is most useful when the website’s user base is spread across geographically distant locations – e.g. if you experience a continued increase in traffic from non-UK locations. It’s also useful for video- and image-rich websites.
Using a CDN can benefit your page load speed as the resources are called from external servers, reducing the load on the origin server. However, there can often be significant costs associated.
As an example, New Look appears to be using Amplience as their CDN provider for product images and videos.
14. Other general tips
The following are good coding standards to be followed in general, as advised by some developer friends of mine.
- Put stylesheets at the top.
- Avoid CSS Expressions.
- Minimize the number of iframes.
- Choose <link> over @import in CSS files.
- Don’t scale images in HTML.
- Use GET for AJAX requests.
- Use smart event handlers.
- Reduce cookie size wherever possible.
What do you look for when evaluating page load speeds?
I’d be interested to hear from other people and understand what you look at when trying to evaluate page load speed issues, especially from the tech obsessives amongst you who can add to my explanations and probably challenge me on some of the points!
If you don’t agree with something I’ve written, please let me know, I’m always open to suggestions and advice.