
Technical SEO Best Practice – Page Two

Page two covers: Mobile SEO, Site Speed, Robots.txt, Subdomains, International SEO and JavaScript.


Mobile SEO

It’s official: we now spend more time browsing the internet on our mobiles than on our laptops. But are you optimizing your site to catch all the online traffic that comes from phones and tablets? Is your site successfully adapting to different screen sizes and user types?

If you can’t give a big “yes” to those questions, then you need to take a closer look at your mobile SEO. Of course, as part of our work, the Croud SEO team will help to fully optimize your site for mobiles, but here are some considerations for making your site a smooth browsing experience on mobile devices.


1. Responsive Design

This set-up enables you to serve the same content to multiple devices from the same domain, with pages stretching or condensing to fit the resolution of the browser window. Google prefers this design because URLs remain the same, although it will support the other configurations if you set them up properly.


Pros:

  • Simple to manage: one admin panel and all pages exist only once
  • Consolidated authority: links all point to the same page
  • Redirects are unnecessary


Cons:

  • No unique mobile content
  • Not necessarily optimized for UX
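A minimal sketch of the responsive approach: a viewport meta tag plus a CSS media query that reflows a hypothetical sidebar on narrow screens (class names and breakpoint are illustrative, not prescriptive).

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }
  /* below 640px, stack the sidebar under the main content */
  @media (max-width: 640px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```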

2. Dynamic Serving

This system enables your website to serve different sets of HTML and CSS depending on the type of device your visitor is using. This is achieved using the Vary HTTP response header.

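A dynamically served response typically carries a header along these lines (a sketch of the relevant headers only, not a full response):

```http
HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent
```

The Vary: User-Agent line tells caches and crawlers that the HTML differs by device, so they should fetch the page with both mobile and desktop user agents.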


Pros:

  • Retain the same URL
  • Consolidated authority: links all point to the same page
  • Enables unique mobile content


Cons:

  • Hard to implement and maintain
  • Expensive to develop

3. Separate URL

Some site owners choose to create a second, parallel site for mobile users, which allows them to serve custom content to mobile visitors. Most parallel mobile sites use an “m” subdomain, e.g. m.example.com.


Pros:

  • Easy to implement
  • Can serve up unique mobile content


Cons:

  • Content duplication
  • Splits the site’s authority
  • Expensive to develop and maintain
  • Redirects are common
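Where separate mobile URLs are used, the commonly recommended annotations help search engines connect the two versions and consolidate signals onto the desktop page (the URLs here are placeholders):

```html
<!-- on the desktop page, e.g. www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- on the mobile page, e.g. m.example.com/page -->
<link rel="canonical" href="https://www.example.com/page">
```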

Considerations For Mobile SEO

  • How does your site look on a variety of screen sizes? Tools like Mobile Phone Emulator allow you to check.
  • Consider the bounce rate: annoying pop-ups and difficult-to-click buttons risk mobile users abandoning your site.
  • Google doesn’t like full screen overlays or app download interstitials and can reduce the rankings of sites or pages that use them.
  • Don’t hide your content. Blocking CSS, JavaScript and images won’t help Google categorise your site.
  • Avoid Flash. It may not be available on all users’ phones.


Site Speed

Google’s search algorithm places more emphasis on site speed than ever. Users want their pages to load quickly, especially on mobile devices. Simply put, faster loading times mean higher conversion rates and more revenue for your business, so it’s very important to get this right. Even the smallest technical changes to your site can lead to huge gains in loading speed.

There are some excellent free-to-use tools available online to help identify site speed issues and solutions:


  • Test My Site – Your first stop should be this tool from Google, which offers a mobile page loading speed test which can benchmark your loading speed against the industry average. The tool also estimates how many seconds of loading speed could be saved through ‘quick wins’ and what these would be.
  • Page Speed Insights – this tool, again from Google, analyses your site across mobile and desktop devices and gives insights into how you might improve your speed.
  • Pingdom – This tool is run by an independent company and offers more detailed insights than Google’s PageSpeed tool. It offers a waterfall view of file requests, giving a clear visual of server-side delays to page loading speed.

  • Lighthouse – This open-source automated tool is available as a Google Chrome extension and helps you quickly audit a site. Not only do you get informative page-speed reports, but the SEO analysis feature shows how well optimized your site is. It’s more lightweight than the other tools but gives quality insights beyond load speed alone (albeit an important ranking factor).
  • Web Page Test – This tool provides downloadable videos of your page loading in real time, which is useful for benchmarking against competitors or providing visual evidence of improvements. Web Page Test is the best all-rounder; key features include:
    • Page speeds tests for new and returning users
    • An API which can include Lighthouse reports
    • Downloadable videos enabling side-by-side comparison of client/competitor loading speeds


Some simple things to look for in Web Page Test

  • In the waterfall view, the narrower the waterfall, the better
  • Yellow lines are redirects, red lines are errors
  • Scores lower than A or B in the top right should be investigated
  • The Test History tab allows you to benchmark against previous tests

Site Speed Considerations

  • Image reduction and optimization can make a big difference. Consider reducing the number of images on a page and ensure images are compressed as much as possible without compromising image quality.
  • Accelerated Mobile Pages (AMP) is an open source project aiming to create a library which provides a subset of HTML5, allowing you to create pages that load instantaneously. There are numerous brands reporting significant revenue gains using AMP.
  • Server response time. Many factors can affect response times, which makes issues harder to pinpoint than other aspects of site speed. Culprits include slow application logic, slow database queries, and CPU or memory starvation.
  • Progressive rendering is a set of techniques used to render content for display as quickly as possible, loading visible content asynchronously, ahead of content below the fold.
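The image and progressive-rendering points above can be sketched in markup (file names and dimensions are hypothetical; `loading="lazy"` defers offscreen images in modern browsers):

```html
<!-- above the fold: load immediately, with explicit dimensions to avoid layout shift -->
<img src="hero-800w.jpg" width="800" height="450" alt="Hero image">

<!-- below the fold: deferred until the user scrolls near it -->
<img src="chart.png" width="600" height="400" loading="lazy" alt="Quarterly chart">
```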


Robots.txt

The robots.txt file is a plain text file that tells search engines which pages or directories on your site should not be crawled. For SEO purposes, it’s important to communicate clearly which pages you want Google to avoid.

If you have a lot of pages or a complex internal linking structure, it will take a bot a long time to crawl them, which can have a negative effect on your rankings.

Examples of Robots.txt


Blocking all web crawlers from all content:
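The standard directives for this are:

```
User-agent: *
Disallow: /
```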



This robots.txt file tells all web crawlers not to crawl any pages on the site, including the homepage.


Allowing all web crawlers access to all content
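The conventional form is an empty Disallow rule:

```
User-agent: *
Disallow:
```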



This robots.txt file tells web crawlers they may crawl all pages on the site, including the homepage.

Specific Folder & User-Agent

Blocking a specific web crawler from a specific folder
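A sketch with a hypothetical folder name:

```
User-agent: Googlebot
Disallow: /example-subfolder/
```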



This robots.txt file tells only Google’s crawler (user-agent name Googlebot) not to crawl any pages within the specified folder.
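To sanity-check rules like these before deploying them, Python’s standard-library robotparser can evaluate a robots.txt against candidate URLs (the domain and folder names here are hypothetical):

```python
from urllib import robotparser

rules = """
User-agent: Googlebot
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
# parse() accepts the file's lines directly, so no network fetch is needed
rp.parse(rules.splitlines())

# Googlebot is blocked from /private/ but nothing else;
# other crawlers are unaffected because no rule matches them
print(rp.can_fetch("Googlebot", "https://www.example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/other/page"))    # True
print(rp.can_fetch("Bingbot", "https://www.example.com/private/page"))    # True
```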

Robots.txt Considerations

  • Your robots.txt file must be placed in your website’s top-level directory.
  • The robots.txt file is publicly available. This means anyone can see which pages you do or don’t want crawled, so bear this in mind.
  • Each subdomain on a root domain is served by its own robots.txt file, so every subdomain should have one.
  • It is best practice to indicate the location of the sitemap associated with the domain at the bottom of the robots.txt file.
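Putting these considerations together, a complete robots.txt placed in the top-level directory might look like this (the blocked path and domain are hypothetical):

```
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```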


Subdomains

As discussed in the URL Structure section, having a clear and coherent URL structure is vital for SEO.

One approach is to use subdomains on your website. You can create a subdomain under any root domain name that you own.

An example of a subdomain is blog.example.com, where “blog” is the subdomain and “example.com” is the root domain.

Subdomain Considerations

  • It is often recommended that sites use subfolders as opposed to subdomains, as these consolidate link equity into one domain so that inbound links benefit the entire domain.
  • Subdomains can be a good way to insert a keyword into your URL, so consider carefully the terms you choose.
  • Subdomains can be used for language-specific sites, e.g. en.example.com could serve the English-language version of a site where the local top-level domain isn’t available.
  • Subdomains act like separate sites and can have unique content. This can be convenient if you want to start a blog on a platform like WordPress without changing how the rest of your site is set up. It is also possible to run WordPress on a subdomain but make it appear within your main domain using a reverse proxy such as nginx.
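A sketch of that reverse-proxy approach in nginx (the domains and path are hypothetical, and a real deployment would also need WordPress configured for the proxied URL):

```nginx
# Serve the WordPress blog hosted at blog.example.com
# under example.com/blog/ so link equity stays on the main domain
location /blog/ {
    proxy_pass https://blog.example.com/;
    proxy_set_header Host blog.example.com;
}
```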


International SEO

International SEO is the process of helping search engines identify which countries your content is most relevant for and which languages your site uses.

If you have visitors coming from many different countries, or speaking different languages, you’ll want to investigate how best to optimize your site structure and content for local, regional and language specific search.

URL Structures

There are a few different ways to structure your URL to target particular countries:

  • ccTLD – A country-code top-level domain indicating the country a site targets, e.g. example.de for Germany
  • Subdomain – International content is placed on a subdomain, e.g. de.example.com
  • Subdirectory – International content is placed in a subfolder of the root domain, e.g. example.com/de/
  • gTLD with language parameters – A generic top-level domain with a URL parameter for a specific language, e.g. example.com/?lang=de
  • Different domain – International content is placed on an entirely separate root domain

International SEO Considerations

  • Whichever URL structure you choose, Google recommends you organise your hierarchy similarly in each section of your site so that it’s intuitive and easily crawlable.
  • Annotations such as hreflang can be used to indicate which pages are localised and for which territory and language.
  • Clearly labelling the language that your content has been created in using HTML tags (e.g. html lang) or server headers helps search engines rank those pages locally.
  • Regional languages often use local characters, from the umlaut in German through to the non-Latin Cyrillic characters used in Russia and the Asian character sets used in China and Japan. You can specify the character encoding within the HTML using the correct Unicode encoding in the meta charset tag, e.g. UTF-8.
  • Hosting your site on a local IP and linking to local content can help indicate you are serving a certain local market.
  • Building links from local resources can help increase your authority in a certain locality.
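The hreflang, html lang and charset points above can be combined in a single page head. A sketch for a hypothetical German page (the domains and locales are placeholders):

```html
<!DOCTYPE html>
<html lang="de">
<head>
  <meta charset="UTF-8">
  <!-- tell search engines where the localised alternatives live -->
  <link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/">
  <link rel="alternate" hreflang="de-de" href="https://example.com/de-de/">
  <!-- fallback for users matching no listed locale -->
  <link rel="alternate" hreflang="x-default" href="https://example.com/">
</head>
<body></body>
</html>
```

Each localised page should list itself and all its alternatives, so the annotations are reciprocal across the set.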


JavaScript

JavaScript is a programming language that allows you to implement complex features on websites. If a page displays interactive maps, animated graphics and so on, JavaScript is probably involved.



The first consideration is the ability for bots to crawl your site.

Webmasters have sometimes blocked search engines from crawling JavaScript, believing it to be the best strategy. Third-party JavaScript libraries are also often blocked by their creators to stop search engines overloading their servers with requests. However, this means search engines do not see what the end user sees, which reduces the site’s appeal to them.

Find out if you’ve inadvertently blocked the crawlers using a tool like Fetch as Google.


The second consideration is the ability for bots to access information from your content.

It can be hard for search engine bots to understand JavaScript. If users have to do something to fully experience your site, such as logging in or clicking a button, search engines may not see that content.

Site latency/Critical Rendering Path – the sequence of steps a browser goes through to display pages on your site. JavaScript can seriously lengthen this process.

To reduce site latency caused by JavaScript, try:

  • Inlining critical JavaScript directly in the HTML document
  • Adding the “async” attribute to script tags
  • Placing JavaScript lower in the HTML where possible
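As an illustration of the async suggestion (the script names are hypothetical, and `defer` is a closely related alternative not covered above):

```html
<!-- render-blocking: parsing pauses while the script downloads and executes -->
<script src="app.js"></script>

<!-- async: downloads in parallel, executes as soon as it arrives -->
<script src="analytics.js" async></script>

<!-- defer: downloads in parallel, executes in order after the document is parsed -->
<script src="ui.js" defer></script>
```

async suits scripts with no dependencies on the page or on each other, such as analytics; defer is the safer choice when execution order matters.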

JavaScript Considerations

  • Always aim for the speediest site load time available to avoid crawlability issues.
  • Testing is key: use Fetch as Google when you want to see how Google interprets your site.
  • Turn off JavaScript in your browser, using an extension such as Disable JavaScript for Chrome or the Disable JavaScript option in Safari’s Develop menu, and browse your pages to see how much content is no longer visible.
  • Manually search for your JavaScript-dependent pages, see how Google presents the results, and take action if required.
