Over the last few months Google has rolled out the Core Web Vitals report in Google Search Console, effectively replacing the previous Speed report. Even though the Speed report has been removed, many of its metrics have been carried through to this new report, which is powered by Google’s new Web Vitals metrics.
As with most releases from Google, they’ve kept their cards close to their chest and have yet to reveal exactly what has changed. However, the report does surface various new metrics that website owners need to be familiar with going forward, split between mobile performance and desktop performance.
As we’ve seen over the last 18 months, Google has been pushing through its mobile first update, so it’s not surprising to see that user experience on handheld devices is at the forefront of this new update.
With such a focus on user experience – including speed, mobile-friendliness algorithm shifts and search experience improvements such as featured snippets, structured data and ‘zero search results’ pages (where only one definitive answer to a query is provided) – it shouldn’t come as a surprise that Google is now shifting the limelight back to websites themselves.
Google’s focus now seems to be on providing a top quality user experience, and Web Vitals is their attempt to provide a set of guidelines and quality signals to allow website owners to ensure their site meets the modern needs of users in 2020.
Of course, this is all common sense – by providing the best experience for your users on your website you can reap the rewards of more visitors and actions on your site, whether that’s through sales, leads or enquiries.
The criticism in the past has been that many of the suggested metrics and changes that site owners should be making are too technical and require too much development knowledge. Whilst that may still be the case, Core Web Vitals aims to break this down into more manageable sections.
The current set of metrics for 2020 splits user experience into three facets – loading (Largest Contentful Paint), interactivity (First Input Delay) and visual stability (Cumulative Layout Shift) – as well as a set of thresholds for acceptable levels of delay for each.
For each of the above metrics, a good way to check that you’re hitting the recommended target for most of your users is to measure the 75th percentile of page loads, segmented across mobile and desktop devices.
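To make the 75th-percentile idea concrete, here’s a minimal sketch of how you might compute it from a set of collected page-load samples. The function name and the sample LCP timings below are illustrative, not part of any Google tooling:

```typescript
// Sketch: rating a metric at the 75th percentile of page loads.
// Sample values are hypothetical LCP timings in milliseconds.
function percentile(values: number[], p: number): number {
  const sorted = [...values].sort((a, b) => a - b);
  // Index of the smallest value that at least p% of samples fall at or below.
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, index)];
}

const lcpSamples = [1800, 2100, 2300, 2600, 4200, 1900, 2400, 2200];
const p75 = percentile(lcpSamples, 75);
console.log(p75); // the LCP that 75% of these page loads meet or beat
```

In practice tools like the Chrome User Experience Report do this aggregation for you, but the principle is the same: judge your site by what most visitors actually experience, not by a single lab run.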
The easiest way of seeing your Core Web Vitals is to log into your Google Search Console account and navigate to the property in question.
Otherwise, you can use the newly released Web Vitals Chrome Extension to see your own website’s performance and, crucially, see how you compare with your competitors’ websites and other sites.
A lot of the optimisations and improvements suggested for your site will vary depending on your platform, server capacity, hardware and many other variables, but we’ve cherry-picked some key optimisations below to help you get started.
Unlike First Contentful Paint (FCP), which measures how long it takes for the first piece of content to render, Largest Contentful Paint (LCP) measures when the largest content element on your webpage becomes visible.
A good score would be an LCP of under 2.5 seconds, whilst anything over 4.0 seconds is considered poor.
The most common causes of a poor LCP score are:
A faster response time from your server improves the LCP metric and just about every other. You should look to optimise your server where possible, which may include implementing a CDN, caching assets, serving HTML pages cache-first or establishing third-party connections (such as payment gateways or marketing integrations) early.
Image and video elements, as well as other media, can affect load speed. Optimising, resizing and compressing your images is one of the most effective – and most commonly overlooked – ways of improving performance. You could also implement a CDN to serve media, preload important resources where possible, use modern image formats such as JPEG 2000 or WebP, or simply drop images that aren’t relevant to the content.
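One common way to serve WebP with a safe fallback is the HTML `<picture>` element, with explicit dimensions so the browser can reserve space. As a sketch (the helper function, file names and sizes here are purely illustrative), the markup can be generated like this:

```typescript
// Illustrative helper: build a <picture> element that serves a WebP
// image with a JPEG fallback, plus explicit width/height attributes.
function pictureMarkup(base: string, alt: string, width: number, height: number): string {
  return [
    "<picture>",
    `  <source srcset="${base}.webp" type="image/webp">`,
    `  <img src="${base}.jpg" alt="${alt}" width="${width}" height="${height}" loading="lazy">`,
    "</picture>",
  ].join("\n");
}

console.log(pictureMarkup("hero", "Homepage hero image", 1200, 600));
```

Browsers that understand WebP use the `<source>`; older browsers fall back to the JPEG in the `<img>` tag, so nobody gets a broken image.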
First Input Delay (FID) measures a webpage’s responsiveness: the time from when a user first interacts with a page – clicking or tapping a link, or typing into a field – to when the browser is actually able to respond to that interaction.
A good time would be an FID of less than 100ms, whilst anything over 300ms would indicate a poor score.
Problems with FID occur when the browser’s main thread is busy parsing, compiling and executing scripts, leaving it unable to respond to the user’s input straight away.
There are several ways of improving First Input Delay, though they may require substantial scoping and development to implement.
A piece of code which blocks the main thread for over 50ms is considered a Long Task, which can hamper page load and cause unresponsive pages. By breaking these tasks up into shorter, more manageable chunks you may be able to reduce the delay.
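One widely used pattern for breaking up long tasks is to process work in small batches and yield back to the event loop between batches, so the browser can handle user input in the gaps. The helper below is a simplified sketch (the function name and batch size are illustrative):

```typescript
// Sketch: break a long task into smaller batches, yielding to the
// event loop between batches so user input can be handled in between.
async function processInChunks<T, R>(
  items: T[],
  worker: (item: T) => R,
  batchSize = 50
): Promise<R[]> {
  const results: R[] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    for (const item of items.slice(i, i + batchSize)) {
      results.push(worker(item));
    }
    // Yield before the next batch rather than blocking the main thread.
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
  return results;
}
```

Each batch should stay comfortably under the 50ms Long Task threshold; the `setTimeout` yield gives pending clicks and keypresses a chance to run between batches.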
Both first- and third-party scripts can cause issues and delays with interaction readiness, including long execution times and inefficient fetches. Reduce waterfalls of chained fetches to improve latency and, where possible, lazy load third-party code until it is needed so it doesn’t impact the load time of the critical elements your users expect to see first.
Layout shifts are the bane of web users on just about every device. Think about that time you were looking for a quick recipe, only to be thrown down the page as a huge image loaded, or if you’re reading a news article only to be interrupted by an intrusive advert.
Not only are these inconvenient to users, they can also push them away from completing an important action on site – if your Add to Cart or Subscribe button jumps away from your user at the last moment, it can cause all kinds of frustration.
Cumulative Layout Shift (CLS) aims to measure the visual stability of a website by scoring unexpected movement of elements on a page; shifts that occur within 500ms of a click, scroll or other input from the user are treated as expected and excluded.
A good score would be a CLS of less than 0.1, whilst anything over 0.25 would be an indicator of a poor score.
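Taking the three thresholds in this article together (LCP 2.5s/4.0s, FID 100ms/300ms, CLS 0.1/0.25), you can sketch a tiny helper that rates a measured value the way Search Console buckets pages – the middle band is labelled “needs improvement” in Google’s reporting. The code below is illustrative, not an official API:

```typescript
// Good / poor boundaries per metric, as listed in this article.
// LCP and FID are in milliseconds; CLS is a unitless score.
type Rating = "good" | "needs improvement" | "poor";

const thresholds: Record<string, [number, number]> = {
  LCP: [2500, 4000],
  FID: [100, 300],
  CLS: [0.1, 0.25],
};

function rate(metric: "LCP" | "FID" | "CLS", value: number): Rating {
  const [good, poor] = thresholds[metric];
  if (value <= good) return "good";
  if (value <= poor) return "needs improvement";
  return "poor";
}

console.log(rate("LCP", 2400)); // "good"
console.log(rate("CLS", 0.2));  // "needs improvement"
```

Remember to apply these ratings to the 75th percentile of your real-user data, not to a single lab measurement.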
A poor CLS score is most commonly caused by on-page or dynamically injected media content, which has a knock-on effect on other elements of the website, causing the viewport to move around unexpectedly until the page is fully loaded and rendered.
This does not just mean ensuring that your media have explicit height and width attributes in your code, but also managing the required aspect ratio within the CSS so that sufficient space is reserved on initial load to serve the image without disrupting the viewport. The same applies to externally served content from iframes or embeds – think Google Maps, YouTube videos, Twitter post embeds and so on.
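A common way to reserve that space in CSS is the “padding-top” aspect-ratio box: a container’s vertical padding is set to the media’s height-to-width ratio as a percentage, so the slot exists before the media arrives. The small helper below (illustrative only) computes that value:

```typescript
// Sketch: compute the padding-top percentage that reserves space
// for media of a given aspect ratio (the classic aspect-ratio box).
function aspectRatioPadding(width: number, height: number): string {
  return `${((height / width) * 100).toFixed(2)}%`;
}

// A 16:9 video embed needs 56.25% of vertical padding reserved:
console.log(aspectRatioPadding(16, 9)); // "56.25%"
```

Applied as `padding-top: 56.25%` on a wrapper (with the iframe or image positioned absolutely inside it), the layout no longer jumps when the embed finishes loading.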
To explain this element easily, think of the GDPR consent pop-up. You land on a website, only to be shunted down the page by an intrusive pop-up which disrupts the rest of the page’s layout. Again, ensuring that space has been reserved for it within the CSS can reduce the risk of the page jumping around.
Web fonts are becoming increasingly popular, but if your fonts are not loaded correctly your users risk seeing either unstyled text or invisible text whilst the font downloads and renders. The ‘font-display’ CSS property lets you reduce the risk of either occurring, but ideally you should aim to preload your web font and serve it at first paint.
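As a sketch, a typical fix is an `@font-face` rule using `font-display: swap` (fallback text shows immediately, then swaps to the web font once it loads). The helper below generates such a rule; the font family and file path are illustrative:

```typescript
// Illustrative: build an @font-face rule using font-display: swap,
// which shows fallback text immediately while the web font loads.
function fontFaceRule(family: string, url: string): string {
  return [
    "@font-face {",
    `  font-family: "${family}";`,
    `  src: url("${url}") format("woff2");`,
    "  font-display: swap;",
    "}",
  ].join("\n");
}

// Pair this with a preload hint in the HTML <head> so the font file
// is fetched early: <link rel="preload" href="..." as="font" crossorigin>
console.log(fontFaceRule("Open Sans", "/fonts/open-sans.woff2"));
```

`swap` trades a brief flash of unstyled text for readable content straight away; combined with preloading, most users should never notice the swap at all.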
It’s inevitable that more updates and tweaks to these signals will occur over the next few months, and as more data is collected on website performance there’s no telling how many metrics will be added to the Core Web Vitals set in future.
Similarly, it has not yet been confirmed whether these metrics will become a significant ranking factor for websites, although with such a focus on customer experience over the last few years it surely seems inevitable.
As such, by taking action now you can start to work through the most pressing issues that cause slow loading and lag on site, and improve your overall user experience.
Not only will this help you produce a more stable, effective and pleasant web experience for your customers, which in turn should lead to more conversions, but you will be able to ensure that you are meeting some of the clearest performance criteria we’ve seen from Google in years.
Although Core Web Vitals is a relatively new addition to Google Search Console, the signs suggest that something like this has been in the pipeline for a number of years.
It will be interesting to see how older websites with clunky UI, and sites such as local news services that rely on large, intrusive third-party ads to supplement their revenue, will fare as the creases are ironed out over the next few months.
Ultimately, this update comes down to providing web users with a quality experience – something that website owners should be striving for in the first place.
What’s different is that these new metrics come with a clear threshold of what Google deems good, acceptable and poor performance – and if you don’t comply, there’s every chance you may lose traction in search results.
If the everyday user becomes more aware of these changes, even subconsciously, and more demanding of a quick, high-quality online experience, this should raise everyone’s game and produce better-quality websites across the board.
If you’re unsure how your website is affected, get in touch. We can provide an in-depth performance report with a clear audit on which variables should be addressed first and how you can prepare yourself should there be any changes in Google’s search algorithms over the next months as a result of this new tool.
Whilst a lot of the work required to improve your website’s speed may require development, the framework surrounding Core Web Vitals provides a checklist of must-have, quick-win and ongoing performance improvements that can provide instant results.