Fifteen constantly monitors the performance of the websites we build. We’ve written integrations with Google’s API to keep an eye on the scores our websites are hitting; we receive daily notifications for all of our websites and note any potential improvements we can make.
This morning, our email notification was substantially larger than the norm. This created curiosity within our dev team, which very quickly turned into extreme amounts of geeky excitement: Google has quietly released a major update to its PageSpeed scoring mechanism, introducing Google Lighthouse as the default.
If you’re that way inclined and aren’t terrified of modern tech, it’s well worth having a read of the official release notes on Google’s Developer Portal. If you’re looking to find out what impact this might have on your website and how Fifteen will cater to the changes, carry on reading.
Fifteen’s Approach to Performance Optimisation
Fifteen optimise our websites for performance when we launch them. Our internal target is a score of 90 (out of 100) for both desktop and mobile. We have to take a common-sense approach to this, as too much performance optimisation can be detrimental to the user experience (UX). Generally, as the head of the development team, I’m content with a score of 85+, which is ‘Good’ in Google’s eyes. Performance optimisation does need ongoing maintenance, so we have baked that maintenance into several of our Service Level Agreement offerings. As mentioned above, our bespoke monitoring platform checks PageSpeed scores automatically every five minutes; we pick up daily summaries but, more importantly, we’re notified should a website fall below the thresholds we have set. This allows us to be super proactive and maintain an incredibly fast experience for your users.
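As an illustration of how a monitoring integration like this can work, here’s a minimal Python sketch that builds a PageSpeed Insights v5 request URL and reads the performance score out of a response. The endpoint and response shape follow Google’s published v5 format; the helper names and sample payload are our own illustration, not Fifteen’s actual platform code.

```python
from urllib.parse import urlencode

# Google's public PageSpeed Insights v5 endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_request_url(url, strategy="mobile", api_key=None):
    """Build the PSI v5 request URL for a page (an API key is optional
    for occasional checks, required for automated polling)."""
    params = {"url": url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return PSI_ENDPOINT + "?" + urlencode(params)

def extract_performance_score(psi_response):
    """Pull the 0-100 performance score out of a PSI v5 JSON response.
    PSI reports the Lighthouse score as a 0-1 float, so scale it up."""
    raw = psi_response["lighthouseResult"]["categories"]["performance"]["score"]
    return round(raw * 100)

# Illustrative response fragment in the v5 shape (not a real API result):
sample = {"lighthouseResult": {"categories": {"performance": {"score": 0.92}}}}
print(build_request_url("https://example.com"))
print(extract_performance_score(sample))  # 92
```

A real monitor would fetch `build_request_url(...)` on a schedule and alert when the extracted score drops below a threshold.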
Earlier Versions of Google PageSpeed
Prior to version 5 of Google PageSpeed, the analysis engine looked at around 20 metrics. The most important were as follows:
Server Response Time
My favourite – Google requires a server response time of 0.2 seconds or less to pass this rule. You’re not going to get that with cheap shared hosting; you’d be lucky to hit 1 second. Fifteen “don’t do cheap hosting”. We operate on the incredibly fast Amazon Web Services platform, utilising several of their products. Combining this with a server software stack of Varnish caching, OPcache code pre-compilation, NGINX, the CloudFront CDN and RDS database servers, we laugh in the face of 0.2 seconds. My personal best is 0.06 seconds and I’m determined to beat it.
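If you want to check server response time yourself, time to first byte (TTFB) is straightforward to measure. This Python sketch times how long the first byte of a response takes to arrive; the demo points at a throwaway local server just so it’s self-contained, but a real check would point at your own site.

```python
import http.client
import http.server
import threading
import time

def time_to_first_byte(host, port, path="/"):
    """Open a connection, send a GET, and time how long it takes
    for the first byte of the response to arrive."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()
    resp.read(1)  # the first byte of the body arrives here
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

# Demo against a throwaway local server bound to a random free port.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()
ttfb = time_to_first_byte("127.0.0.1", server.server_port)
print(f"TTFB: {ttfb:.3f}s")
server.shutdown()
```

Against a local server the number is near-instant; against a shared host you’ll often see the 1-second figure mentioned above.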
Leverage Browser Caching
This is the process of keeping local copies of images, stylesheets and other assets in the end user’s browser. It means users aren’t repeatedly downloading the same file, saving load time and, in turn, bandwidth.
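In practice, browser caching is controlled with the Cache-Control response header, usually set in NGINX, Apache or CDN configuration. As a self-contained illustration, this Python sketch serves static assets with a long-lived cache header; the handler class and the one-year max-age are illustrative choices, not Fifteen’s production config.

```python
import functools
import http.server
import os
import tempfile
import threading
import urllib.request

class CachingHandler(http.server.SimpleHTTPRequestHandler):
    """Serve static assets with a long-lived Cache-Control header."""
    CACHEABLE = (".css", ".js", ".png", ".jpg", ".woff2")

    def end_headers(self):
        if self.path.endswith(self.CACHEABLE):
            # The browser keeps a local copy for up to a year instead of
            # re-downloading the same file on every page view.
            self.send_header("Cache-Control", "public, max-age=31536000, immutable")
        else:
            # HTML should be revalidated so content changes show up promptly.
            self.send_header("Cache-Control", "no-cache")
        super().end_headers()

# Demo: serve a stylesheet from a temp directory and inspect the header.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "style.css"), "w") as f:
    f.write("body{margin:0}")

handler = functools.partial(CachingHandler, directory=tmpdir)
server = http.server.HTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

resp = urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/style.css")
print(resp.headers["Cache-Control"])  # public, max-age=31536000, immutable
server.shutdown()
```

The `immutable` hint tells the browser not to even revalidate, which works well when filenames change whenever their contents do.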
Optimise Images
We quite often see clients uploading images through their content management system at 10 or 20 times the size needed. This is one of the worst things you can do: if you waste your users’ time and bandwidth with unnecessarily large images, Google will penalise you. I’ve seen improvements of 50 points just by optimising a website’s images. This has led us to automate the process on all of our websites: if you upload an image through one of our websites, we automatically resize it to the required sizes before serving it. We also compress images to the standard defined by Google’s analysis engine.
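The core of automatic resizing is simple aspect-ratio arithmetic: work out the largest dimensions that fit the target slot without distorting or upscaling the image. Here’s a sketch of that calculation; the actual resampling and compression would be done with an imaging library such as Pillow or ImageMagick, and the numbers below are illustrative.

```python
def fit_within(width, height, max_width, max_height):
    """Compute the largest dimensions that fit inside max_width x max_height
    while preserving the image's aspect ratio (and never upscaling)."""
    scale = min(max_width / width, max_height / height, 1.0)
    return round(width * scale), round(height * scale)

# A 4000x3000 camera photo destined for a 1200px-wide content column:
print(fit_within(4000, 3000, 1200, 1200))  # (1200, 900)

# An image already smaller than the slot is left alone:
print(fit_within(800, 600, 1200, 1200))  # (800, 600)
```

Serving the 1200x900 version instead of the 4000x3000 original cuts the pixel count (and roughly the file size) by over 90%.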
Minify HTML, CSS &amp; JavaScript
You might not even know what these are, but they are the scripts that generate what your end user sees and does. It’s all technical stuff that goes on in the background that the end user will never see (unless they go snooping around). When we write these scripts, we use all manner of new lines, tabs, whitespace and variable naming conventions to keep the code readable and maintainable (we follow the PSR-2 standard, if you’re interested). All of these spaces and lines use valuable bytes and bandwidth, so the version served to end users is completely stripped out. This makes it difficult for a human to read, but it’s no problem for a machine.
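To make the stripping concrete, here’s a toy CSS minifier in Python. Real minifiers (cssnano, terser and friends) handle far more edge cases; this just shows the comments and whitespace that get removed before code is served.

```python
import re

def minify_css(css):
    """Strip comments and collapse whitespace from a CSS string.
    A deliberately simple sketch of what a real minifier does."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse all whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # no spaces around punctuation
    return css.strip().replace(";}", "}")            # drop redundant final semicolon

source = """
/* Main navigation */
.nav {
    display: flex;
    gap: 1rem;
}
"""
print(minify_css(source))  # .nav{display:flex;gap:1rem}
```

The readable version stays in our repository; only the stripped version travels over the wire.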
This is just a small handful of the metrics that require a “pass” to hit the high scores we crave as a technical team. Over the past few years this has become second nature to us, and although there are always challenges to overcome, good scores, for us, are moderately easy to come by.
Version 5 – a MAJOR Update – Fifteen’s Approach
Around a year or so ago, our SEO team made us aware of something called Google Lighthouse. You may remember an article one of our developers, Charles, wrote on the subject, ‘An Introduction to Google Lighthouse’. We were one of (if not the) first digital marketing agencies to adopt this approach. We researched, we ‘devved’, we tested, we failed, we researched and tested some more, and eventually we triumphed. With some really modern technologies, our geek juices were flowing. We rolled out Google Lighthouse and the improvements were vast. Not only were these websites faster, but they were also more accessible and search engine friendly. Since then we have been offering Google Lighthouse optimisation as an optional service.
Google has released version 5 of PageSpeed Insights, and Lighthouse has become the default in its analysis engine. For some older websites, this means their PageSpeed score has decreased. We can, however, now make Lighthouse the default in our service offering, making all of our websites super fast, super accessible and super SEO-friendly. Our hope is that all digital agencies follow suit, which would make for a far better internet browsing experience!
Over the next few weeks, we’ll be applying Google Lighthouse methodologies to all of our websites under SEO Retainer or an SLA that encompasses performance optimisation. We’ll also be including Google Lighthouse in pricing for all new websites we launch.
What is Google Lighthouse and how do we accomplish compliance?
As I mentioned previously, PageSpeed Insights used to comprise around 20 metrics. Version 5 has introduced around 20 new rules. Whilst I’d encourage you to hit the Google Developer Portal to read about them in depth, I’ve picked out a few of my favourites below and explained what they mean.
- Only send the code that your users need.
- Minify your code.
- Compress your code.
- Remove unused code.
- Cache your code to reduce network trips.
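The “compress your code” rule above usually means gzip (or Brotli) encoding applied automatically by the web server or CDN. This small Python sketch shows how much a typical repetitive text asset shrinks under gzip; the payload is an invented example.

```python
import gzip

# A repetitive text asset, the kind of content web servers compress well.
payload = ("function greet(name) { return 'Hello, ' + name; }\n" * 200).encode()

# What a gzip-enabled server (NGINX, CloudFront, etc.) actually sends.
compressed = gzip.compress(payload)

print(len(payload), "->", len(compressed), "bytes")
print(f"ratio: {len(compressed) / len(payload):.1%}")
```

Text-based assets (HTML, CSS, JavaScript, SVG) routinely compress to a small fraction of their original size; already-compressed formats like JPEG gain almost nothing.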
Defer unused CSS
Whilst maintaining, improving and scaling up a website, it’s very easy for a developer to neglect their CSS. A common issue is failing to remove the CSS for sections of a website that have themselves been removed. I’m absolutely in love with the fact that Google now marks down websites that do this: it stops developers from being lazy, keeps websites trim (saving time and bandwidth) and keeps the source code as maintainable as possible.
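Tools that hunt for unused CSS (PurgeCSS, Chrome’s Coverage tab and the like) essentially cross-reference stylesheet selectors against the page markup. Here’s a deliberately naive Python sketch of that idea, using string matching rather than a real parser, so treat it as an illustration only.

```python
import re

def unused_selectors(css, html):
    """Flag class selectors whose class name never appears in the markup.
    A string-level sketch; real tools parse CSS and HTML properly."""
    selectors = re.findall(r"\.([\w-]+)\s*\{", css)
    class_attrs = re.findall(r'class="([^"]*)"', html)
    used = {cls for attr in class_attrs for cls in attr.split()}
    return [s for s in selectors if s not in used]

css = ".hero{color:#fff} .old-banner{display:none}"
html = '<div class="hero">Welcome</div>'
print(unused_selectors(css, html))  # ['old-banner']
```

Here `.old-banner` styles a section that no longer exists in the markup, so it’s dead weight every visitor still downloads.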
Defer Offscreen Image loading
When you load a page, you will typically load every image contained within it. Consider a long page that might take 5 minutes to read: why would you load all the images at the bottom of that page within your initial page load? Now consider the number of people who never make it to the bottom of the page. It doesn’t take much thought to realise how much bandwidth and page load time can be saved by ‘lazy loading’ images. ‘Lazy loading’ is a methodology of loading each image just before it becomes visible.
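Modern browsers support native lazy loading via the loading="lazy" attribute on &lt;img&gt; tags, with JavaScript approaches such as IntersectionObserver covering older browsers. As a sketch of the kind of automatic rewriting a CMS might do, this Python snippet adds the attribute to image tags that don’t already declare one; a real implementation would also skip above-the-fold images so they still load immediately.

```python
import re

def add_lazy_loading(html):
    """Add loading="lazy" to <img> tags that don't already declare a
    loading attribute. A sketch using regex rewriting; a production
    version would use a proper HTML parser."""
    def rewrite(match):
        tag = match.group(0)
        if "loading=" in tag:
            return tag  # already declared; leave it alone
        return tag[:-1] + ' loading="lazy">'
    return re.sub(r"<img\b[^>]*>", rewrite, html)

page = '<img src="/hero.jpg"><img src="/footer-map.png">'
print(add_lazy_loading(page))
```

With the attribute in place, the browser itself defers fetching each image until the user scrolls near it.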
We love it. We love speed. We love optimisation and we love Google for keeping us, as developers, on our toes and continuing to challenge us. If we continue to follow the trend of Google’s page speed rules the internet will be a friendlier, more accessible and faster place. Sadly, not everyone shares our passion and we often come across websites that have been neglected in terms of performance – we’d love to help!
Feel free to give me a call, drop me an email or arrange to come in for a coffee – nothing would make my day better than having a chat about performance optimisation and sharing our passion for speed.