A guide to understanding defensive SEO

Frank Calabrese

SEO Account Manager

8th September 2022

~ 7 min read

Picture this – you’ve spent the better part of a few months working on a new site migration. You’ve created well-optimized content for the site’s blog, found new ways to improve the website’s page speed, and poured your heart and soul into creating a user-friendly experience. How exciting! But in a momentary lapse of judgment, you accidentally disallowed your homepage from being crawled by Googlebot. 

Normally, this would be a disaster for any search marketer. However, because you employ defensive search engine optimization (SEO) best practices, you’re aware of the issue before it can do any damage. In this article, Croud’s Associate Director, Web Experience, Josh Sciortino, and SEO Associate, Frank Calabrese, explore some key ways to protect the health of your website.

What is defensive SEO?

Defensive SEO isn’t exactly what its name might imply. Rather than defending a site from ‘attacks’ by competitors, defensive SEO is all about proactively anticipating issues before they arise and constantly monitoring the health of a website. If the website is in poor technical health or experiences unexpected downtime, you want to be aware of these issues as soon as possible so that the site’s rankings aren’t impacted. 

By following a few key defensive SEO tips, you’ll be alerted to any changes to the status of your website soon after they happen. One of the best and most common ways to accurately monitor the health of a website is to run daily crawls of the site using web crawlers such as Screaming Frog, Botify, or Deepcrawl. From unexpected changes in HTTP status codes to recursive 404 generation, these crawlers can send daily email reports containing valuable insights that will alert you to any critical health issues your client’s website might be experiencing.  

Running these daily crawls is the best way to stay on top of these technical issues. While you could log into Google Search Console (GSC) every day and manually check on the health of a website, the data in GSC is not real-time (it is often two to three days behind). By the time GSC makes you aware of any technical issues, the website’s performance may already have been negatively impacted. By running daily website crawls, you won’t have to wait to discover any significant issues with your client’s website. 
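
The dedicated crawlers above do this at scale, but the underlying idea is simple. As a rough illustration only, here is a minimal daily health check in Python that fetches a handful of key URLs and emails a summary; the URLs, SMTP host, and email addresses are placeholders, not a real configuration.

    # A minimal daily health check: fetch a handful of key URLs and email a
    # summary. The URLs, SMTP host, and email addresses are placeholders.
    import smtplib
    from email.message import EmailMessage

    import requests  # third-party: pip install requests

    KEY_URLS = [
        "https://www.example.com/",
        "https://www.example.com/blog/",
        "https://www.example.com/robots.txt",
    ]

    def build_report() -> str:
        lines = []
        for url in KEY_URLS:
            try:
                status = requests.get(url, timeout=10, allow_redirects=False).status_code
            except requests.RequestException as exc:
                status = f"unreachable ({exc.__class__.__name__})"
            lines.append(f"{url} -> {status}")
        return "\n".join(lines)

    def send_report(body: str) -> None:
        msg = EmailMessage()
        msg["Subject"] = "Daily site health report"
        msg["From"] = "monitor@example.com"
        msg["To"] = "seo-team@example.com"
        msg.set_content(body)
        with smtplib.SMTP("smtp.example.com") as smtp:  # assumed internal SMTP relay
            smtp.send_message(msg)

    if __name__ == "__main__":
        send_report(build_report())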

Why is defensive SEO important?

Implementing defensive SEO practices will save you time. It’s always better to be proactive than reactive, and that’s especially true when it comes to SEO. Rather than waiting for GSC to present accurate data, you can get the same picture by quickly glancing at an all-encompassing web health report generated by a web crawler. Most importantly, catching website issues the moment they start happening gives you ample time to implement the necessary changes and address the problem before Google takes notice. 

Defensive vs. ‘offensive’ SEO

While defensive SEO pertains mainly to website maintenance and health, ‘offensive’ SEO aims to actively increase the quality of a website. Offensive SEO tactics include creating well-optimized content, performing keyword research, and building a site’s backlink profile. 

Offensive SEO is important, but the content you optimize won’t boost any rankings if you somehow accidentally apply a noindex tag to every page of the website or if a broken link gets added to the navigation on every page of the site. 

What kind of issues does defensive SEO check for?

You can gain valuable, real-time insights on all things related to a website’s health by implementing site monitoring tools such as Little Warden and Uptime Robot. Here are some of the standard defensive monitoring checks you can perform on your client’s website using these tools, as well as some of the technical issues that can be discovered in this process:  

1. Website security and server issues

Before you start worrying about the content of a website, you need to worry about its safety and security first. You can’t expect Google to crawl your website if it isn’t even live over a secure protocol in the first place, so insights into these issues are crucial. Here are some of the standard security and server checks performed by site monitoring tools like Little Warden and Uptime Robot.

  • SSL certificate validation and expiry
    • Monitoring that the SSL certificate covers the correct domain, plus proactive notifications ahead of the certificate’s expiration, so you can give your engineers enough notice to renew it (see the sketch after this list)
  • Domain name/nameservers expiry
    • Safeguarding the ownership of your domain and nameservers in perpetuity
  • Malware detection and malicious code injection vulnerabilities
    • Some internal site search functionality, which often relies on search parameters, may be indexable and allow malicious actors to inject harmful strings or attempt to hack visitors’ computers without your knowledge 
  • Uptime monitoring
    • Proactive monitoring of any downtime users may be experiencing, however long or short. Downtime may occur in the middle of the night, when your marketing teams wouldn’t otherwise catch these blips, and it often isn’t noticeable in Search Console
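
As a rough illustration of the SSL check, the sketch below uses Python’s standard library to report how many days remain on a site’s certificate; www.example.com and the 30-day threshold are placeholders, and a dedicated monitoring tool would handle this (along with domain expiry and uptime) for you.

    # A rough sketch that reports how many days remain on a site's SSL
    # certificate, using only the standard library. www.example.com and the
    # 30-day threshold are placeholders.
    import socket
    import ssl
    from datetime import datetime, timezone

    def days_until_cert_expiry(hostname: str, port: int = 443) -> int:
        """Return the number of days until the certificate for hostname expires."""
        context = ssl.create_default_context()
        with socket.create_connection((hostname, port), timeout=10) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                cert = tls.getpeercert()
        expires = datetime.fromtimestamp(
            ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
        )
        return (expires - datetime.now(timezone.utc)).days

    remaining = days_until_cert_expiry("www.example.com")
    if remaining < 30:  # alert threshold is an arbitrary choice
        print(f"ALERT: SSL certificate expires in {remaining} days")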

2. Crawlability issues

Anything that hinders Google’s ability to crawl and discover pages on a website will seriously impact rankings, especially if Google can’t crawl a crucial page on the site. If any changes are made to the website’s robots.txt file or sitemap (whether intentional or unintentional) that affect the way Google crawls your client’s website, defensive SEO monitoring will alert you to these changes as they happen. 

Development teams are often busy making site updates, which can include adding allow or disallow rules to robots.txt that haven’t been cleared by the other teams who need to be aware of these changes for search bot visibility. Sometimes the crawlability (or deliberate blocking) of certain site sections can even have legal ramifications, making it all the more important to be aware of these changes.
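
As a simple illustration, the sketch below uses Python’s built-in robots.txt parser to confirm that Googlebot is still allowed to fetch a handful of critical URLs; the domain and URL list are placeholders, and dedicated monitoring tools perform this kind of check automatically.

    # A quick robots.txt safety net using the standard library parser.
    # The domain and the list of critical URLs are placeholders.
    from urllib.robotparser import RobotFileParser

    CRITICAL_URLS = [
        "https://www.example.com/",
        "https://www.example.com/category/",
    ]

    parser = RobotFileParser("https://www.example.com/robots.txt")
    parser.read()  # fetches and parses the live robots.txt

    for url in CRITICAL_URLS:
        if not parser.can_fetch("Googlebot", url):
            print(f"ALERT: robots.txt now blocks Googlebot from {url}")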

3. Rendering issues

Google spends a limited amount of time rendering the pages of a website. If the pages of your website are bloated with unused JavaScript or render-blocking resources, it can seriously impact the website’s ranking ability. These are just a couple of common issues that also affect page speed and Core Web Vitals.

Another common JavaScript rendering consideration that defensive monitoring can surface is when content components or even internal links are hydrated with JavaScript. This is less of an issue for Googlebot than it used to be; however, not all search engine bots are equal, and it puts you at risk of critical elements of the site relying on JavaScript to populate. You therefore want to be aware of this rendered vs. non-rendered disparity so you can make informed decisions about the fast rendering and visibility of all your key content. To learn more about dynamic rendering, explore Croud’s recent blog here.
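
One rough way to quantify that disparity is to compare the raw HTML with the JavaScript-rendered DOM. The sketch below does this with the requests library and a headless browser via Playwright (both third-party installs); the URL is a placeholder and the link-counting regex is deliberately crude.

    # Compares internal links in the raw HTML with links in the JavaScript-
    # rendered DOM. Requires `pip install requests playwright` followed by
    # `playwright install chromium`; the URL is a placeholder and the regex
    # is a deliberately crude way of counting links.
    import re

    import requests
    from playwright.sync_api import sync_playwright

    URL = "https://www.example.com/"
    LINK_PATTERN = re.compile(r"<a\s[^>]*href=", re.IGNORECASE)

    raw_links = len(LINK_PATTERN.findall(requests.get(URL, timeout=10).text))

    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(URL, wait_until="networkidle")
        rendered_links = len(LINK_PATTERN.findall(page.content()))
        browser.close()

    print(f"Links in raw HTML: {raw_links}; links after rendering: {rendered_links}")
    if rendered_links > raw_links:
        print("Some links only appear after JavaScript runs; other bots may miss them.")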

4. Indexability issues

Poorly implemented redirects or a misplaced noindex tag are examples of site issues that can impact both user experience and the way Google indexes your website. Below are some of the indexability issues that defensive SEO monitoring will alert you to; a simple monitoring sketch follows the list:

  • Status code change monitoring
    • We often see unexpected spikes in 301/302 or 404 status codes. These can cause a spike in URLs crawled, especially if they are generated recursively (e.g. website.com/404/404/404/404/…), creating crawl traps full of non-indexable pages. This can also eat into your crawl budget, decreasing the crawl frequency of the healthy pages where you want search bots, like Googlebot, spending their time.
  • Noindex monitoring
    • Sections of the site that carry a noindex tag as a safeguard in the staging environment can get pushed live to production without the tag being removed, which can cause sudden deindexing and a major loss of traffic.
  • Canonicalization monitoring
    • Whether it’s the same scenario as above, or a marketing team changing canonical tags without informing the SEO team, there can be serious indexability repercussions depending on which URL the canonical now points to.
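
As a simple illustration of these checks, the sketch below fetches a list of important URLs and flags unexpected status codes, noindex tags, and canonical changes; the URLs and expected canonicals are placeholders, and it uses the third-party requests and BeautifulSoup libraries.

    # A rough indexability spot check for a list of important URLs. The URLs
    # and expected canonicals are placeholders; requires the third-party
    # requests and beautifulsoup4 packages.
    import requests
    from bs4 import BeautifulSoup

    EXPECTED_CANONICALS = {
        "https://www.example.com/": "https://www.example.com/",
        "https://www.example.com/blog/": "https://www.example.com/blog/",
    }

    for url, expected in EXPECTED_CANONICALS.items():
        response = requests.get(url, timeout=10, allow_redirects=False)
        if response.status_code != 200:
            print(f"ALERT: {url} returned {response.status_code}")
            continue
        soup = BeautifulSoup(response.text, "html.parser")
        robots_meta = soup.find("meta", attrs={"name": "robots"})
        if robots_meta and "noindex" in robots_meta.get("content", "").lower():
            print(f"ALERT: {url} carries a noindex tag")
        canonical = soup.find("link", attrs={"rel": "canonical"})
        if canonical and canonical.get("href") != expected:
            print(f"ALERT: canonical on {url} now points to {canonical.get('href')}")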

Summary

Defensive SEO should be thought of as standard practice for SEOs. The benefits of having daily crawls and defensive monitoring tools integrated into your website far outweigh the cost of setting them up – a very minor cost at that. These tools are ‘set it and forget it’: once they’re up and running, they provide constant health monitoring with little interaction needed. All you have to do is investigate as soon as you get an alert. 

Regardless of the effort it takes to integrate these tools, you should always be in a position to discover any technical health issues your website is currently experiencing sooner rather than later. Your website, and your stress levels, will thank you. 

If you’d like to learn more about defensive SEO, or would like to learn how to successfully implement this into your digital marketing strategy, please get in touch with our team.
