Investigating the benefits of progressive enhancement with regard to search engine optimisation (SEO).
On 10 August 2022, Google sent shockwaves through the interwebs when it changed its recommendation on dynamic rendering (DR). DR is a setup where what is served depends on the requesting user-agent: client-side rendered JavaScript content is served to users, whereas server-side rendered content (without the JavaScript) is served to search engines.
In this article we’ll demonstrate why you should be considering progressive enhancement as an alternative approach to creating SEO-friendly, JavaScript-heavy websites.
Google now advises that “Dynamic rendering is a workaround and not a long-term solution for problems with JavaScript-generated content in search engines. Instead, we recommend that you use server-side rendering, static rendering, or hydration as a solution.”
Source: Google Search Central
This setup provides a short-term solution for JavaScript-heavy websites by creating a static HTML (server-side rendered) version of the pages for search engine bots. This means that search engines can crawl the website without the need for JavaScript, as only users are served the client-side rendered content.
Despite these benefits, dynamic rendering can cause a whole host of issues, the most prominent of which stems from user-agent sniffing. Serving a different version of a page depending on the user-agent looks great on paper, but search engines use a whole host of different user-agents, and some matter more for indexing than others, so this approach can lead to severe indexing issues.
For example, when serving one version of a page to Googlebot Desktop and another to Googlebot Smartphone, in most cases we saw that while Googlebot Desktop was seeing the correct version of the page with all SEO tags and signals, Googlebot Smartphone was either seeing something completely different, or nothing at all. As Googlebot Smartphone is the favoured indexing crawler, this has a massive impact on indexability and overall visibility.
Using this setup can be risky, especially for SEO. However, there is a solution. And that solution is progressive enhancement.
What is progressive enhancement?
Progressive enhancement is a strategy which looks at the overall build of a website and determines how it should be developed. In short, the core functionality of the website should work with HTML alone, with extra functionality and richer experiences delivered through CSS and JavaScript.
The strategy is based on creating a simple foundation which provides all the key parts, before building on this with extras and add-ons.
There are many advantages to this strategy, but the most important benefit is that it allows all users, including crawling bots, to access the full content of a website without being blocked by functionality that requires JavaScript. It also means that users on different or older browsers will still be able to navigate the website without experiencing issues.
The progressive enhancement strategy consists of the following core principles:
- Basic content should be accessible to all web browsers
- Basic functionality should be accessible to all web browsers
- Sparse, semantic markup contains all content
- Enhanced layout is provided by externally linked CSS
- Enhanced behaviour is provided by externally linked JavaScript
- End-user web browser preferences are respected
How does it work?
It’s simple. Start with clean HTML and then build on it with CSS and JavaScript.
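As a minimal sketch of these layers (the file names, URLs and content here are illustrative, not taken from any real site), a progressively enhanced page might start like this:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Product listing</title>
  <!-- Layer 2: enhanced layout via externally linked CSS -->
  <link rel="stylesheet" href="/css/styles.css">
</head>
<body>
  <!-- Layer 1: sparse, semantic markup carries all of the core
       content, so it works (and can be crawled) with HTML alone -->
  <main>
    <h1>Our products</h1>
    <ul>
      <li><a href="/products/widget">Widget</a></li>
      <li><a href="/products/gadget">Gadget</a></li>
    </ul>
  </main>
  <!-- Layer 3: enhanced behaviour via externally linked JavaScript;
       the page remains fully usable if this script never loads -->
  <script src="/js/enhancements.js" defer></script>
</body>
</html>
```

Each layer is additive: remove the script or the stylesheet and the page still delivers its content.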

This approach can be compared to the way a car is built. Every car needs the same core parts to function: an engine, tyres and an exhaust. No matter which manufacturer you choose, the basics always remain the same. This is just like HTML.
The CSS of a car is its shape and look – something that can be customised. While newer cars are likely to have more sophisticated designs, this doesn’t affect the functionality of the car.
Now it’s time for all the fancy upgrades and gadgets – this is JavaScript. You can add a sat nav or a fancy radio; you can even upgrade the exhaust so that it’s louder. Again, this doesn’t affect the functionality of the car. What it can do, however, is improve the user experience.
So, as you can see, progressive enhancement is a straightforward strategy that can lead to many benefits.
HTML
Clean and semantic HTML builds the foundation for websites and allows any user-agent (especially crawling bots) to access and interpret a website, since it’s purely text-based and doesn’t require additional processing to render.

In the case of the Apple website, the main navigation and internal linking are served via basic HTML, which means crawlers are able to access and crawl the website effectively.
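As a simplified, hypothetical sketch (this isn’t Apple’s actual markup; the URLs are illustrative), navigation served as plain HTML looks something like this:

```html
<!-- Plain HTML anchors: a crawler can follow every link in the
     navigation without executing any JavaScript -->
<nav aria-label="Main">
  <ul>
    <li><a href="/mac/">Mac</a></li>
    <li><a href="/ipad/">iPad</a></li>
    <li><a href="/iphone/">iPhone</a></li>
  </ul>
</nav>
```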

Because the navigation sits in the pre-rendered source code as plain HTML anchors, crawlers can discover every link before any script runs.
CSS
This is how a website is styled. Not every CSS feature is supported by every browser, so it’s vital to include fallback CSS that works on all browsers in addition to more advanced CSS for newer browsers.
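As a brief sketch (the class names are illustrative), fallbacks can be layered by declaring widely supported styles first and feature-testing newer ones with @supports:

```html
<style>
  /* Fallback: older browsers use a float-based layout and simply
     ignore the grid declarations they don't understand */
  .listing { overflow: hidden; }
  .listing .item { float: left; width: 33%; }

  /* Enhancement: browsers that understand @supports (and grid)
     upgrade to a grid layout */
  @supports (display: grid) {
    .listing { display: grid; grid-template-columns: repeat(3, 1fr); }
    .listing .item { float: none; width: auto; }
  }
</style>
```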
JavaScript
JavaScript can be used to enhance the usability of a website and provide extra functionality. Search engines are able to render JavaScript; however, it can limit what is picked up and make crawling more difficult.
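As a hypothetical sketch of JavaScript as an enhancement layer (it assumes a /search endpoint that can also return an HTML fragment), the form below works as a normal server-handled form on its own, and the script, where it runs, upgrades it to load results in place:

```html
<!-- Works without JavaScript: submitting performs a normal page load -->
<form action="/search" method="get" id="search">
  <input type="search" name="q" required>
  <button type="submit">Search</button>
</form>
<div id="results"></div>

<script>
  // Enhancement only: intercept the submit and fetch results in place.
  // If this script never runs, the form above still works.
  document.getElementById('search').addEventListener('submit', async (event) => {
    event.preventDefault();
    const query = new FormData(event.target).get('q');
    const response = await fetch('/search?q=' + encodeURIComponent(query));
    document.getElementById('results').innerHTML = await response.text();
  });
</script>
```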
How does this affect SEO?
At its foundation, SEO works to ensure that search engines can access, index and understand websites in order to improve their overall visibility. A website could have perfectly optimised content that provides a great user experience, but if the architecture relies too heavily on JavaScript and lacks basic HTML functionality, search engines may not see that content at all.
HTML is the preferred format for user-agents to crawl, because it is completely text-based. Crawlers can understand and digest it more easily, which makes crawling a website to find hyperlinks, content and SEO tagging signals faster and more straightforward.
JavaScript reliance and crawling
Many developers rely on JavaScript and skip over the basics of HTML when creating a website, focusing on user experience at the expense of search engines. While search engines are getting better at rendering JavaScript, there are still areas they cannot render and/or have difficulty understanding:
- Content hidden behind a click action, such as pagination controls where content sits behind a ‘load more’ button (see the sketch after this list)
  - Examples include buttons which don’t contain hyperlinks within the code
- Content which relies on JavaScript to appear on a website (post-render content)
  - Examples include single-page apps rendered entirely client-side, whether in vanilla JavaScript or frameworks such as Angular and React
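Here’s a minimal sketch of the ‘load more’ case handled with progressive enhancement (the URLs are illustrative, and it assumes the server can return a listing fragment at the paginated URL). The control is a real hyperlink first, so crawlers and JavaScript-free users can still reach page two; JavaScript then upgrades it:

```html
<!-- Crawlable fallback: 'load more' is a real link to the next page -->
<a id="load-more" href="/products?page=2">Load more</a>

<script>
  // Enhancement: fetch the next page in place instead of navigating.
  const link = document.getElementById('load-more');
  link.addEventListener('click', async (event) => {
    event.preventDefault();
    const response = await fetch(link.href);
    link.insertAdjacentHTML('beforebegin', await response.text());
    // A fuller implementation would also advance link.href to the
    // following page, or hide the link when no pages remain
  });
</script>
```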
Relying on JavaScript can have a severe impact on what a search engine can see. If a search engine cannot see the content, it technically doesn’t exist in a crawler’s eyes. This can include valuable information which would help boost a website’s rankings.
In the example below, we can see that Apple has implemented progressive enhancement, since both the HTML and JavaScript versions of the website show all the important content. This means that search engines can see all the main content without the need for JavaScript; JavaScript is only used to aid the user experience and doesn’t render any important content.
[Screenshots: the Apple homepage with JavaScript enabled and with JavaScript disabled, both showing the same core content]
Crawl budget
Reliance on JavaScript also has a substantial impact on a website’s crawl budget. Crawl budget is the amount of time and resources a crawler allocates to crawling a website; once it has been used up, the crawler stops, regardless of whether it has picked up all the pages and content.
JavaScript rendering uses up a considerable amount of this budget, as it requires far more resources for a search engine to fetch, execute and understand than basic HTML.
Relying on Googlebot’s capabilities
Google and the Chromium teams have invested a lot of effort into optimising and improving the technology used by Googlebot. However, Googlebot is still not perfect and relies heavily on additional processes to fully render JavaScript.
Our tests have shown that the time Googlebot allocates to executing JavaScript is finite: anything that completes after this threshold will be ignored. Since execution time can be influenced by external factors such as connectivity, server load and volume of content, relying on search engines to render all JavaScript-driven content can result in unexpected outcomes.
Page speed
Another reason why JavaScript requires more resources to render is that JavaScript files are typically much larger than the equivalent HTML, and must be downloaded, parsed and executed before the content they generate can appear. Not only does this impact the crawl budget, it also impacts overall page speed performance, which is a ranking factor and affects user experience.
Websites that rely solely on a JavaScript front-end framework to display content force the browser to make multiple connections and transfers before it can show anything. If the content is already within the HTML body of the requested document, the browser can construct the page while the JavaScript framework is still loading.
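As a sketch of the difference (the file name is illustrative), when the content is already in the requested document the browser can render it immediately, while a deferred framework bundle downloads in parallel and enhances the page rather than creating it:

```html
<!-- Content is in the initial HTML, so the browser can paint it
     straight away, before any JavaScript arrives -->
<article>
  <h1>Blue widget</h1>
  <p>In stock. Free delivery on orders over £50.</p>
</article>

<!-- Deferred: downloads in parallel, executes after parsing, and
     enhances the existing content instead of generating it -->
<script src="/js/app.bundle.js" defer></script>
```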
If a website takes a long time to load, customers are less likely to complete a purchase.

The benefits of progressive enhancement
There are a multitude of benefits when it comes to progressive enhancement. Some of these include:
- Ease of access for search engines to crawl and pick up key content
- Improved page speed for users
- Optimised crawl budget
- Improved accessibility for users on a weaker signal or older browser/device
Ensuring that a website is as crawler friendly as possible is essential for organic performance. All content, internal linking, tagging and keywords, to name a few, need to be crawled and seen by search engines otherwise there may be an impact on the website’s performance. Progressive enhancement ensures that search engines are not inhibited by more complex resources, while providing the freedom to use JavaScript to enhance usability for users.
Considerations
While progressive enhancement follows a straightforward approach to the build of a website, there are limitations to the areas it can be applied to. For instance, if an area of a website is JavaScript heavy and provides an interactive user interface, it may be difficult or impossible to convert this into semantic HTML with JavaScript layered on top.
In the example below, a calculator which requires user interaction to produce a result needs JavaScript as its main functionality. This setup is fine for this type of page, as it doesn’t carry much SEO value and is purely there for the user journey.
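A minimal, purely illustrative sketch of such a page (the 20% VAT rate is just an example figure): the result only exists once the script computes it, so there’s no meaningful HTML-only version to serve.

```html
<form id="vat-calculator">
  <label>Net price <input type="number" name="net" step="0.01"></label>
  <output name="gross">–</output>
</form>

<script>
  // Here JavaScript IS the core functionality: the page exists to
  // compute a result interactively, so there is no HTML fallback
  const form = document.getElementById('vat-calculator');
  form.addEventListener('input', () => {
    const net = parseFloat(form.net.value);
    form.gross.value = isNaN(net) ? '–' : '£' + (net * 1.2).toFixed(2);
  });
</script>
```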

To summarise
In a world where websites are constantly evolving, there are more and more ways to make a website perform well for users. However, relying on scripts to provide even the basic functionality of a website can cause a whole host of problems.
Progressive enhancement takes things back to basics. You start with clean, semantic HTML as the foundation of the website, providing all the key structure and information to aid SEO and page speed. You then add layers of CSS and JavaScript to enhance the user experience, but the core functionality doesn’t rely on these layers.
In order to maintain the best organic health, websites must make crawling and indexing as easy as possible; progressive enhancement enables this.
It’s important to note, however, that JavaScript-heavy user interfaces may not convert neatly to a progressive enhancement strategy, so alternative setups may need to be considered if these areas contain vital information for search engines and/or user journeys.

If you would like to know more, or would like to discuss how any of the above can be applied to your website, please get in touch.