Investigating the benefits of progressive enhancement with regards to search engine optimisation (SEO).
Source: Google Search Central
Despite these benefits, this approach can cause a whole host of issues, the most prominent being user-agent sniffing. Serving a different version of a page depending on the user-agent looks great on paper, but search engines crawl with many different user-agents – and some carry far more weight for indexing than others – so this approach can lead to severe indexing issues.
For example, when one version of a page was served to Googlebot Desktop and another to Googlebot Smartphone, in most cases we saw that while Googlebot Desktop received the correct version of the page with all SEO tags and signals, Googlebot Smartphone received either something completely different or nothing at all. As Googlebot Smartphone is the favoured indexing crawler, this has a massive impact on indexability and overall visibility.
Using this setup can therefore be risky, especially for SEO. However, there is a solution: progressive enhancement.
What is progressive enhancement?
The strategy is based on creating a simple foundation which provides all the key parts, before building on this with extras and add-ons.
The progressive enhancement strategy consists of the following core principles:
- Basic content should be accessible to all web browsers
- Basic functionality should be accessible to all web browsers
- Sparse, semantic markup contains all content
- Enhanced layout is provided by externally linked CSS
- End-user web browser preferences are respected
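Taken together, these principles can be sketched as a minimal page. The filenames, URLs and content below are illustrative, not taken from any real site:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Product range</title>
  <!-- Enhanced layout is provided by an externally linked stylesheet -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <!-- Sparse, semantic markup carries all of the content -->
  <header>
    <nav>
      <ul>
        <li><a href="/products/">Products</a></li>
        <li><a href="/support/">Support</a></li>
      </ul>
    </nav>
  </header>
  <main>
    <h1>Our product range</h1>
    <p>Every page is reachable through plain hyperlinks,
       so basic content works in any browser.</p>
  </main>
  <!-- Script is an enhancement, not a requirement -->
  <script src="enhancements.js" defer></script>
</body>
</html>
```

Even with the stylesheet and script removed, the content and links above remain fully accessible.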
How does it work?
This approach can be compared to the way a car is built. Every car needs the same core elements – an engine, tyres and an exhaust – and no matter which manufacturer builds it, those basics remain the same. This is just like HTML.
The CSS of a car is its shape and look – something that can be customised. While newer cars tend to have more sophisticated designs, the design doesn’t determine or affect how the car functions.
Progressive enhancement, then, is a straightforward strategy that can lead to many benefits.
Clean and semantic HTML builds the foundation for websites and allows any user-agent (especially crawling bots) to access and interpret a website since it’s purely text based and doesn’t require additional functionality to render.
In the case of the Apple website, the main navigation and internal linking are served via basic HTML: the pre-rendered source code contains plain hyperlinks, which means crawlers are able to access and crawl the website effectively.
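Apple’s actual markup isn’t reproduced here, but crawlable, plain-HTML navigation of this kind looks roughly like the following hypothetical sketch:

```html
<nav aria-label="Main">
  <ul>
    <li><a href="/mac/">Mac</a></li>
    <li><a href="/ipad/">iPad</a></li>
    <li><a href="/iphone/">iPhone</a></li>
  </ul>
</nav>
```

Because every destination is an ordinary `<a href>` link in the source, a crawler can discover these pages without executing any script.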
CSS is how a website is styled. Not all CSS is supported by every browser, so it’s vital to provide fallback styles that work in all browsers in addition to more advanced CSS for newer ones.
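As a sketch of this fallback pattern, a grid layout can be layered on top of an older float-based layout using `@supports` (the class names are illustrative):

```css
/* Fallback: floats work in virtually every browser */
.gallery {
  overflow: hidden;
}
.gallery .item {
  float: left;
  width: 25%;
}

/* Enhancement: grid layout only in browsers that support it */
@supports (display: grid) {
  .gallery {
    display: grid;
    grid-template-columns: repeat(4, 1fr);
  }
  .gallery .item {
    float: none;
    width: auto;
  }
}
```

Older browsers simply ignore the `@supports` block and keep the float layout, so no visitor is left with an unstyled page.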
How does this affect SEO?
HTML is the preferred format for user-agents to crawl because it is completely text based. Crawlers can understand and digest it directly, so finding hyperlinks, content and SEO tagging signals becomes faster and more straightforward. By contrast, certain patterns hide content from crawlers:
- Content hidden behind a click action – common with pagination controls, such as content behind a ‘load more’ button
- Buttons which don’t contain hyperlinks within the code
- Content rendered solely client-side, whether with vanilla JavaScript or with frameworks such as Angular and React
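The first two problems can be illustrated with a hypothetical ‘load more’ control. The first version hides the destination URL inside a script, while the second exposes it as a real hyperlink that a script can later enhance:

```html
<!-- Not crawlable: no hyperlink in the markup,
     the URL only exists inside the click handler -->
<button onclick="location.href='/products/page-2/'">Load more</button>

<!-- Crawlable: a real hyperlink that JavaScript can
     progressively enhance into a 'load more' button -->
<a href="/products/page-2/" class="load-more">Load more</a>
```

With the second version, users and crawlers without JavaScript still reach page two, while script-capable browsers get the enhanced behaviour.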
Relying on Googlebot’s capabilities
While Googlebot can render JavaScript, relying on that rendering is risky: it can delay indexing, and heavy scripts also slow the page down for users. If a website takes a long time to load, customers are less likely to complete a purchase.
The benefits of progressive enhancement
There are a multitude of benefits when it comes to progressive enhancement. Some of these include:
- Ease of access for search engines to crawl and pick up key content
- Improved page speed for users
- Optimised crawl budget
- Improved accessibility for users on a weaker signal or older browser/device
In a world where websites are constantly evolving, there are more and more ways to make a website perform well for users. However, relying on scripts for even the basic functionality of a website can cause serious problems.
In order to maintain the best organic health, websites must make crawling and indexing as easy as possible; progressive enhancement enables this.