On 12th April, the highly anticipated BrightonSEO returned, and we spent a full day at the UK’s biggest search engine conference alongside 4,000 fellow SEO, PPC and digital marketing enthusiasts.
In this blog, we look back on the overarching themes from the conference. Although Google’s Webmaster Trends Analyst John Mueller closed the event, we’ll start with some of the key insights we gained from his talk, before running through some takeaways from a couple of the day’s best talks.
Grilling Google: The John Mueller Keynote
Mueller had barely sat down before being prodded to give the audience a little insight into his own role and what’s important to him personally. For most of us trying to stay at the front of the SEO race, Mueller’s Twitter updates are a key way of keeping on top of industry news. The reality, however, is that his responses are sometimes slightly vague, especially on the more urgent questions around algorithm updates or features lost from tools such as Search Console. Below are some of the key takeaways from the session.
The Google mission statement
“Organise the world’s information and make it universally accessible and useful.”
Mueller alluded to the continued challenge of sticking to this mission, which underpins a lot of the developments in SEO. He has noticed a significant shift from content SEO towards technical SEO, with Google representatives finding themselves talking more and more to front-end developers – which suggests that small mistakes in a page’s code are often what prevents Google from accessing or reading it. What became apparent in this discussion is that, as the structure and sophistication of web pages continue to evolve, Googlebot is challenged to keep up – a core reason why technical SEO has been breaking away into its own independent profession.
Google controversies were covered
Mueller was challenged to justify Google’s continued movement toward holding more traffic on the search engine results pages (SERPs).
It’s a common complaint amongst SEOs and publishers alike that when Google shows quick answer boxes or featured snippets, users don’t click through to the site. This means diminishing returns for publishers, as the ads placed on their sites go unviewed, while Google retains the user on the SERP, showing its own ads and using the publisher’s content to do so.
Is this fair?
Mueller’s response questioned why publishers create long-form content to answer a short, objective query, using ‘How to Boil an Egg’ as an example. What we may not consider is that Google uses these snippets to help its ever-evolving algorithm recognise variations in search intent. When the answer to a question isn’t objective, the snippet provides only a small piece of the answer, helping users judge whether the site has the information they are looking for.
If users do proceed and follow the snippet’s web link to obtain the whole answer, then Google knows it has retrieved the correct result for the query.
Links are still a ranking factor
The most direct answer we had from Mueller was that incoming links from other sites are still very much a ranking factor. However, Google is finding it much harder to use them. It was pointed out that whilst the web is somewhat infinite, the number of highly authoritative sites is very small. Even more problematic is when those high-authority publishers, such as The Guardian, Telegraph or Daily Mail, start applying ‘nofollow’ to all outgoing links, meaning far fewer valuable ranking signals pass to the beneficiary sites. Mueller felt this was a short-sighted solution for publishers who are frequently pestered to link out to the sites they reference. For Google, this creates several significant challenges: it becomes harder to find fresh content for the index, and harder to identify which content the publishers are referring to – ultimately, the original content that users may want to see. And if links are one of the strongest ranking signals, the major publishers can end up with a monopoly on the information search engines surface.
My thoughts? Why can’t Google reward sites that implement followed links to the original content, treating that as a ranking signal? Should publishers not have a duty to credit the original source of information, showing Google that it comes from a credible source?
So, let’s dissect a couple of the day’s other talks, keeping John Mueller’s keynote in mind…
Speed & performance optimisation: How to meet users’ high expectations
Rachel Costello from DeepCrawl introduced BrightonSEO’s page speed sessions by setting the user’s experience as the focus of optimisation efforts. It’s clear that Google places great importance on page speed; after all, they want to provide users with quick and efficient answers, thus keeping the user satisfied and always returning to the same search engine. Google promotes a range of tools to help webmasters stay on top of their site speed, including the Lighthouse audit, PageSpeed Insights and Test My Site.
In line with keeping users at the forefront of optimisation considerations, the focus of this talk was that personalisation is the future of page speed performance. We must remember that not all devices have the same performance capability, so it’s important to review the most common devices and browsers through which users interact with a website; the ‘Audience’ tab in Google Analytics is the place to do this analysis. With these insights in mind, there is a range of tools for testing site speed performance tailored to your own users’ behaviour. For example, Uptrends has released a website speed test that lets you test your pages with custom variables including location, browser and device, meaning you can completely tailor the test to your audience.
Progressive enhancement
There are many ways a site experience can go wrong for users, for example when certain scripts or stylesheets fail to load. This is where progressive enhancement comes in: a web design strategy that prioritises the page’s most important content, so that even when parts of the page fail to load, users can still get the information they want rather than associating your site with a broken experience. When working with developers, some of the core considerations are listed below, followed by a minimal sketch:
- Ensure all web browsers can access the most basic content
- Externally link CSS for enhanced layout
- Utilise unobtrusive, externally linked JavaScript, which means axing all inline event handlers
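To make those points concrete, here is a minimal sketch of a progressively enhanced page (the file names and content are purely illustrative): the core information is plain HTML, the layout lives in an external stylesheet, and behaviour comes from an externally linked script rather than inline event handlers.

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>How to boil an egg</title>
    <!-- Enhanced layout comes from an external stylesheet -->
    <link rel="stylesheet" href="styles.css">
  </head>
  <body>
    <!-- The most basic content is plain HTML, readable in any browser -->
    <h1>How to boil an egg</h1>
    <p>Bring a pan of water to the boil, lower the egg in and simmer for six minutes.</p>
    <p id="tips">Tip: add a pinch of salt to help stop the shell cracking.</p>

    <!-- Behaviour is unobtrusive: no inline onclick handlers -->
    <script src="enhance.js" defer></script>
  </body>
</html>
```

In a hypothetical enhance.js, any interactivity (say, collapsing the tips into a toggle) would be attached with addEventListener, so if the stylesheet or script fails to load, the page still presents all of its information.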
Getting a 100% Lighthouse performance score
The following talk took a far more metric- and benchmark-based approach to optimising your site for page speed. Polly Pospelova from Delete provided insight into achieving the highest possible score in Google’s Lighthouse audit, confirming that a 100% performance score is in fact possible – achieved through rigorous testing and changes on the agency’s own website.
Some of the more critical optimisations were the following…
HTTP/2
For users, a website moving from HTTP/1.1 to HTTP/2 is not a visible change, but it will improve the site’s load speed across all devices. What is also great about the newer protocol is that all modern web browsers support it, and where older browsers don’t, the connection falls back to HTTP/1.1 automatically.
Why is it so fast? HTTP/2 was designed to reduce website latency: it multiplexes all of a page’s requests to the server over a single connection, whereas browsers typically limit HTTP/1.1 to around six concurrent connections per host.
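Enabling HTTP/2 is a server or CDN setting rather than anything in your page code, but as a rough illustration of what “serving over HTTP/2” looks like, here is a minimal sketch using Node’s built-in http2 module (the certificate paths are placeholders; browsers will only negotiate HTTP/2 over TLS):

```javascript
// Minimal sketch of an HTTP/2 server using Node's built-in http2 module.
const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  // Placeholder paths: browsers only speak HTTP/2 over TLS,
  // so a key and certificate are required.
  key: fs.readFileSync('privkey.pem'),
  cert: fs.readFileSync('cert.pem'),
});

server.on('stream', (stream, headers) => {
  // Each request arrives as a stream multiplexed over one connection,
  // rather than queuing behind the handful of connections HTTP/1.1 allows.
  stream.respond({ ':status': 200, 'content-type': 'text/html; charset=utf-8' });
  stream.end('<h1>Served over HTTP/2</h1>');
});

server.listen(443);
```

In practice, most sites get this for free by switching HTTP/2 on in their web server or CDN configuration.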
Optimising images
It is common knowledge that large images slow down web pages, and our efforts to optimise them usually involve decreasing their file size without harming image quality. However, deferring offscreen images – essentially lazy-loading – was also discussed. As Google states in its developer documentation, there is no reason to download images that users cannot see on the initial page load. To determine when to load these off-screen, below-the-fold images, you can use an IntersectionObserver.
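As a rough sketch of that approach (the data-src attribute name is just a convention used for illustration), below-the-fold images can hold their real URL in a data attribute, with an IntersectionObserver swapping it in only as the image approaches the viewport:

```javascript
// Lazy-load images marked up as <img data-src="real-image.jpg" alt="...">
const lazyImages = document.querySelectorAll('img[data-src]');

const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (!entry.isIntersecting) return;   // not near the viewport yet
    const img = entry.target;
    img.src = img.dataset.src;           // swap in the real source
    img.removeAttribute('data-src');
    obs.unobserve(img);                  // each image only needs loading once
  });
}, { rootMargin: '200px' });             // start loading just before it scrolls into view

lazyImages.forEach((img) => observer.observe(img));
```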
Continuing the theme of personalisation and recognising the variety of devices users may access your pages from, we can use the srcset attribute on the <img> tag. By implementing srcset on your images, you give the user’s browser the power to select the most appropriate image source, avoiding the problem of users being served giant, slow-loading images regardless of the device they are using. This does mean creating multiple sizes of each image, but it also gives you full control over quality, and general practice seems to settle on providing around four different sizes.
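As an illustration (the file names, widths and breakpoints are made up), a responsive image with four candidate sizes might look like this, with the browser picking the most appropriate source for the device and viewport:

```html
<img
  src="product-800.jpg"
  srcset="product-400.jpg   400w,
          product-800.jpg   800w,
          product-1200.jpg 1200w,
          product-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 800px"
  alt="Product photo">
```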
In the name of reducing image file sizes, WebP is Google’s recommended format here. This newer format offers both lossless and lossy compression: Google reports that WebP lossless images are around 26% smaller than PNGs, and WebP lossy images 25-34% smaller than comparable JPEGs. To start converting your images, there are a number of free tools available, one being webp-converter.com.
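Once converted, a common way to serve WebP safely is the <picture> element, with a JPEG or PNG fallback for browsers that don’t yet support the format (file names here are illustrative):

```html
<picture>
  <!-- Served to browsers that understand WebP -->
  <source srcset="banner.webp" type="image/webp">
  <!-- Fallback for everything else -->
  <img src="banner.jpg" alt="Homepage banner">
</picture>
```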
That covers the speed aspect of satisfying a user’s experience, but what about helping them click through to the correct organic search result for their query, as touched upon in Mueller’s keynote?
Featured snippets – The achievable SERP feature?
Despite the controversies around featured snippets discussed in John Mueller’s keynote, many SEOs consider the featured snippet to be the holy grail of rankings. Emily Potter from Distilled questioned just what it takes to satisfy Google’s search algorithm enough for it to award your site one.
They are so popular with SEOs firstly because of the prospect of position ‘0’ and the increased brand awareness it brings, and secondly because of those all-important clicks.
Are these snippets a little overrated? Emily conducted her own research into winning featured snippets and the resulting effect, taking a very competitor-based approach: looking at what the current owner of a featured snippet was doing better than her own site, in terms of general format and headings. After a week of relentless snippet hunting, one trusted method came down to simply stealing the competitor’s <h1> text.
Emily’s results confirmed what we all suspected about the relationship between a site’s click-through rate (CTR) and the featured snippet. The graph below plots average organic position on the X axis against CTR on the Y axis; the blue circles signify queries where the featured snippet is owned, whilst the red circles show where it is not.
What Emily concluded, in line with a well-known HubSpot study into featured snippets, was that for the top five results, backlinks and other authority signals matter much less for winning a featured snippet. Again, this ties in with Mueller’s comment about Google learning from user behaviour when a snippet is shown.
Another very interesting point Emily made was around the confirmation bias these snippets can create. Take the following two examples:
The way in which the searcher phrases their query can drastically alter the information that they receive. This is an important consideration when researching the snippets that you want to appear for.
The recommended actions in achieving featured snippets for your site were as follows:
- Find keywords that you rank for where a competitor holds the featured snippet (SEMrush can help with this)
- Highlight the keywords where you rank higher than the current owner of the snippet
- Look for low-hanging fruit
- Steal headings from the content that currently owns the snippet
- Add on-page content to well-performing pages
  - As Mueller points out, many snippets provide only a partial answer, so you can offer your users more value
- Re-format your content to match that of the current owner (see the sketch below)
  - Turn paragraphs into lists, or lists into tables
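As a simple illustration of that last point, a step-by-step paragraph re-marked-up as an ordered list gives Google a much cleaner structure to lift into a list-style snippet (the content below is just an example):

```html
<h2>How to boil an egg</h2>
<ol>
  <li>Bring a pan of water to a rolling boil.</li>
  <li>Lower the egg in gently with a spoon.</li>
  <li>Simmer for six minutes, then transfer the egg to cold water.</li>
</ol>
```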
The day’s conclusion
As mentioned in Mueller’s keynote, technical SEO has become an independent beast, an industry still growing as users and search engines demand accessibility and speed. A common observation from those in the industry concerned the gap between SEOs focused on content production and technical SEOs or developers. To bridge this gap, developers and SEOs must work far more closely together. It is much more time-consuming to retrofit performance and speed optimisations onto a web page once it has been built. However, it can also be hard work to convince web developers to create pages with SEO in mind; it may be easier to convince them to create pages with user experience and page speed in mind.
To find out more, or to speak to Croud about your SEO strategy, contact us.