
How to Audit Your Website’s SEO With Google Search Console


Google Search Console offers a whole host of functionality that can help determine issues with your site’s condition and offer guidance on how to fix them. Heavily focused on areas of search optimisation including content issues, site structure and overall performance, Search Console sheds light on what improvements need to be made to your website’s SEO.

In this blog, we’ll cover how you can use its features to conduct a basic SEO audit on your website.

What is Google Search Console?

Google Search Console is an administration and reporting interface created for webmasters. The platform allows you to check your site’s index status, review its configuration and optimise the visibility of your content.

Why is Google Search Console important for SEO?

One of the most important SEO tools, this platform acts as the main method of communication between Google and SEO managers. Google Search Console allows you to discover errors, access best-practice guidance and receive notifications of issues that can affect your website’s SEO.

How do you set up Google Search Console?

  • Go to Google Search Console
  • Create a Google Account or log in using your existing account information
  • Select “Add A Property” at the top of the page
  • Enter your website’s URL
  • Pick your verification method from the 4 options presented:
  1. Upload an HTML file to the root folder of your website.
  2. Verify using your Hosting provider.
  3. Verify using Google Tag Manager.
  4. Verify using Google Analytics – we recommend using Google Analytics if this is already set up on your website.
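As an illustration of the first method, the verification file is a small plain-text file whose name and single line of content both include a unique token issued by Google. Using a made-up placeholder token, a file named google1234abcd.html would contain:

    google-site-verification: google1234abcd.html

Upload it so that it is reachable at https://www.mysite.com/google1234abcd.html, then click ‘Verify’.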

You can also add a domain level property which includes all subdomains (m, www, and so on) and multiple protocols (http, https).
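Domain-level properties are verified with a DNS TXT record rather than a file. As a sketch, again using a placeholder token, the record you would add through your DNS provider looks like:

    Type:  TXT
    Host:  @   (i.e. the root of mysite.com)
    Value: google-site-verification=abc123placeholdertoken

DNS changes can take a few hours to propagate before verification succeeds.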

Repeat this process for the secure (https), non-secure (http), www and non-www versions of your domain name, such as:

  • https://www.mysite.com
  • https://mysite.com
  • http://www.mysite.com
  • http://mysite.com

How to conduct a basic configuration audit

If you are new to Google Search Console, you can fix many basic SEO issues directly without the need to change your website’s coding. The following guide takes you through the initial configuration of your website.

Checking for Important Messages

Messages

Search Console is the primary platform Google uses to inform SEO managers about website issues. By accessing the ‘Messages’ section of the console, you can see a log of important changes to your configuration as well as any recommendations or required actions. These are typically linked to guides or resources to help you resolve the individual issue.

Manual actions

Found within the ‘Search Traffic’ menu item, the Manual Actions page provides details of high priority technical issues discovered by a manual inspection of your website. More information on Manual Actions can be found in Google’s official documentation.

Security issues

The ‘Security Issues’ page provides insights into known problems with your website’s security. This can include notification of hacks, malware and other issues that will potentially affect your website’s organic rankings. More information on the Security Issues section can be found in Google’s official documentation.

Setting the preferred domain

To avoid duplicate-content issues caused by Google indexing both the www and non-www versions of your website, you can tell Google which version you want it to display in the search results.

To do this, click the cog icon at the top right of the page and select ‘Site Settings’. You should now see a section titled ‘Preferred Domain’. Simply select your preference; it is saved automatically. Repeat this process for both the secure (https) and non-secure (http) versions of your website. You will notice that by configuring the www version, the non-www version is automatically configured for you.

Please note this requires the www and non-www versions of the website to already have been set up in Google Search Console as per our previous recommendation.

At present, there is no way to specify a preference between the secure and non-secure versions of your domain; however, Google have stated that where a secure (https) version is present, they will default to it.

International targeting

If you own a gTLD (generic top-level domain) such as .com, .net or .org, by default your website targets a global audience. If your business is unable or unwilling to sell to or service customers overseas, you can use Google Search Console to specify your target country.

To do this, access the ‘International Targeting’ page located under the heading ‘Legacy Tools and Reports’. Here you can see the details of your international traffic. To geo-target your website, select the ‘Country’ tab at the top of the graph, check the checkbox titled ‘Target Users In’ and pick your preferred country from the drop-down list. Again, this is automatically saved once your selection has been made.

Handling URL Parameters

URL parameters (also known as query strings) are often used by developers to provide filtering and search functions on a website, such as selecting a colour preference, navigating through multiple pages of products in a category, or sorting content on a page by name, price or popularity. If improperly implemented, these filters can be indexed by Google, resulting in potentially thousands of duplicate or thin-content pages that match the criteria. This means your best content could be demoted in the organic search results in favour of a pre-filtered page.
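As a hypothetical illustration, a single category page can be reachable under many parameter combinations, each of which Google may treat as a separate URL:

    https://www.mysite.com/shoes
    https://www.mysite.com/shoes?colour=red
    https://www.mysite.com/shoes?colour=red&sort=price_asc
    https://www.mysite.com/shoes?colour=red&sort=price_asc&page=2

All four serve variations of the same product list, so without parameter handling the filtered versions can compete with, or dilute, the main category page.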

Google Search Console gives you the ability to specify and manage which parameters you want to keep, and provides insight into how they are used within your website. To find this, go to ‘Legacy Tools and Reports’, open the page titled ‘URL Parameters’ and select ‘Configure URL Parameters’.

For new websites, it can take time for Google to populate this page with known data. However, if you have had Search Console running for a few months you should be able to see a list of those being used on your website.

Obvious candidates for control (parameters Google does not need to crawl):

  • UTM codes used for goal tracking in Google Analytics and Tag Manager
  • ‘Thank you’ query strings triggered by form submissions

Obvious entries to keep:

  • Pagination
  • If you don’t have SEO-friendly URLs (for example, all of your content is accessed through query strings), make sure you keep those related to page ID, product ID and category ID

The remainder should be downloaded and discussed with your web developer to work out what delivers unique content and what results in an unwanted filter. At this point, you can revisit Google Search Console and apply the relevant configuration. The configuration possibilities are grouped into two types:

  1. No: Doesn’t affect page content (i.e. it tracks usage). Selecting this tells Google the parameter does not change what the page displays, so URLs containing it can be treated as duplicates of the clean URL.
  2. Yes: Changes, reorders or narrows page content. Selecting this provides additional configuration options:

How does this parameter affect page content?

  • Sorts (e.g. orders items from high to low)
  • Narrows (e.g. shows only red items)
  • Specifies (e.g. determines which product or service page is shown)
  • Translates
  • Paginates (e.g. selects a page within a blog or product category)

Which URLs with this parameter should Googlebot crawl?

  • Let Googlebot decide (default)
  • Every URL: do not block these pages
  • Only URLs with value [option to specify values]
  • No URLs: instructs Google to ignore this content
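Putting this together, a hypothetical configuration covering the parameters mentioned above might look like the following (the parameter names are illustrative; yours will depend on your platform):

    utm_source  →  No: Doesn’t affect page content
    thankyou    →  No: Doesn’t affect page content
    sort        →  Yes: Sorts; Crawl: No URLs
    colour      →  Yes: Narrows; Crawl: Let Googlebot decide
    page        →  Yes: Paginates; Crawl: Every URL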

Sitemaps

Found under ‘Coverage’ > ‘Sitemaps’, you can submit an XML file containing a list of the web pages you want Google to index. This can dramatically reduce the amount of time it takes Google to find and index new content.

Many popular content management systems such as WordPress, Magento, Venda, Umbraco and Kentico dynamically create this file by default. In this case, you can simply ‘Add a sitemap’ and input the location of the file on the server. If you don’t have an automatically created sitemap you should speak to your developer who should be able to quickly build this for you.
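If you do need to create one manually, the sitemap follows the standard XML format defined by the sitemaps.org protocol. A minimal sketch with two illustrative URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.mysite.com/</loc>
        <lastmod>2019-06-01</lastmod>
      </url>
      <url>
        <loc>https://www.mysite.com/shoes</loc>
        <lastmod>2019-05-20</lastmod>
      </url>
    </urlset>

The <lastmod> dates here are placeholders; an automatically generated sitemap will keep them in step with your content.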

There are many common issues with sitemap files that can reduce the efficiency of indexation. These include:

  • Out of date information: For example, the sitemap file is static and does not automatically update itself to reflect a website’s content
  • Listing dead, obsolete or redirected pages: When removing content from your website you may have chosen to remove a link to the content instead of deleting and redirecting the now defunct URL. This would leave these pages in the sitemap potentially allowing out of date content to be indexed.
  • Listing filtered content: In most cases you do not want Google to find and index filtered content which could compete with your commercial product, category and/or landing pages.

Ensuring your sitemap file is robust will help new and updated content to be indexed faster, potentially improving your organic ranking.

Robots.txt

Robots.txt is a simple text file where you can provide instructions for search engines and other web technologies. On a basic level, it’s typically used to block a search engine’s access to non-client facing content such as CMS login pages, document libraries, private data and in some cases landing pages used exclusively for paid campaigns which you do not want Google to index.
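For illustration, a simple robots.txt that allows everything except a few private areas might look like this (the paths are placeholders; yours will depend on your CMS):

    User-agent: *
    Disallow: /admin/
    Disallow: /private-documents/
    Disallow: /ppc-landing-pages/
    Sitemap: https://www.mysite.com/sitemap.xml

The optional Sitemap line points search engines at your XML sitemap, tying the two files together.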

The robots.txt tester is no longer directly linked within Search Console but can be found directly online. Using this tool, you can quickly check your website’s robots.txt file and identify blocked pages, as these will be listed with the prefix “Disallow:”. If folders relating to live or critical content have been disallowed, you should contact your web developer to have these entries removed from the file.

Another common error occurs when a new site is launched. The in-development (demo) version of a site is typically blocked by default to stop it from being displayed in the search results, and when the site is published this block can be carried over, effectively de-indexing all of your content. To identify this issue, look for the line “Disallow: /”, where the forward slash matches every URL on the site (you may also see the wildcard form “Disallow: *”).
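The carried-over block looks like this; removing the Disallow line (or leaving its value empty) restores crawler access:

    User-agent: *
    Disallow: /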

Key takeaways

With the basics covered, you can successfully safeguard your website against common SEO issues and create a foundation on which to build successful content creation, ongoing optimisation and outreach campaigns.

However, your job is not done just yet! There’s still a wealth of opportunities hidden in Google Search Console, including but not limited to:

  • Historic content that can be 301 redirected to your live URLs
  • Optimisation opportunities highlighted as duplicate title tags
  • Enhancing your visibility in the search results through schema and microformats
  • Avoiding mobile penalties from banners and popups with Search Console’s fetch and render tool
  • Indexing new content with their submission tool
  • Identifying keywords that not only rank but generate clicks to your landing pages

Our savvy SEO specialists know all about Google Search Console and use it on a daily basis to explore our clients’ websites, finding new potential and optimisation opportunities. If you want to know more about how we work and what our SEO services can offer your business, contact us today.