Audit your website’s SEO with Google Search Console


What is Google Search Console?

Google Search Console (formerly known as Google Webmaster Tools) is an administration and reporting interface created for webmasters by Google. Google Search Console allows you to check your site’s index status, adjust its configuration and optimise the visibility of your content.

Why is Google Search Console important for SEO?

Google Search Console is the main method of communication between Google and SEO Managers. With Google Search Console you can discover errors, receive best practice guidance and be notified of issues that can affect your website’s SEO. This makes Google Search Console one of the most important tools for SEO.

How Do You Set Up Google Search Console?

  • Go to Google Search Console
  • Create and/or log in using your Google Account information.
  • Select “Add A Property” at the top of the page.
  • Enter your website’s URL.
  • Pick your verification method from the 4 options presented:
  1. Upload an HTML file to the root folder of your website.
  2. Verify using your Hosting provider.
  3. Verify using Google Tag Manager.
  4. Verify using Google Analytics.
  • We recommend using Google Analytics if this is already set up on your website.
  • Repeat this process for the secure (https), non-secure (http), www and non-www versions of your domain name, i.e. all four combinations, as shown below.
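For example, if your domain name were example.com (a placeholder used purely for illustration), you would add all four of the following properties:

  • https://www.example.com
  • https://example.com
  • http://www.example.com
  • http://example.com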

Basic Configuration Audit

If you are new to Google Search Console then you can fix many basic SEO issues directly without the need to change your website’s code. The following guide takes you through the initial configuration of your website.

Checking for Important Messages


Google Search Console is the main way in which Google informs SEO Managers about issues with a website. By accessing the “Messages” section of the console you can see a log of any important changes to your configuration as well as any recommendations or required actions. These typically link to guides or resources to help you resolve each issue.

Manual Actions

Found within the “Search Traffic” menu item, the Manual Actions page provides details of any high-priority issues identified during a manual review of your website by Google. More information on Manual Actions can be found in Google’s official documentation.

Security Issues

The “Security Issues” page provides insights into any known problems with your website’s security. This can include notification of hacks, malware and other issues that will potentially affect your website’s Organic rankings. More information on the Security Issues section can be found in Google’s official documentation.

Setting the Preferred Domain

To avoid duplicate content issues caused by both the www and non-www versions of your website being indexed, you can tell Google which version you want it to display. To do this, click the cog icon at the top right and select “Site Settings”. You will see a section titled “Preferred Domain”; simply select your preference and it is saved automatically. Repeat this process for both the secure (https) and non-secure (http) versions of your website. You will notice that configuring the www version automatically configures the non-www version for you.

Please note this requires the www and non-www versions of the website to have already been set up in Google Search Console, as per our previous recommendation.

At present there is no way to specify a preference for the secure or non-secure version of your domain; however, Google has stated that where a secure version is present it will default to that.

International Targeting

If you own a gTLD (generic top-level domain) such as .com, .net or .org, then by default your website targets a global audience. If your business is unable or unwilling to sell to or service customers overseas, you can use Google Search Console to specify your target country. To do this, open the “Search Traffic” tab and access the “International Targeting” page, where you can see details of your international traffic. To geo-target your website, select the “Country” tab at the top of the graph, tick the checkbox titled “Target Users In” and pick your preferred country from the select box. Again, this is saved automatically once your selection has been made.

Handling URL Parameters

URL Parameters (AKA query strings) are often used by developers to provide filtering and search functionality on a website, e.g. selecting a colour preference, navigating through multiple pages of products in a category, or sorting content on a page by name, price or popularity. If improperly implemented, these filters can be indexed by Google, resulting in potentially thousands of duplicated or thin content pages that match the filter criteria. This means your best content could be demoted in the Organic search results in favour of a pre-filtered page.
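As a purely illustrative example (the domain, path and parameter names below are hypothetical), a single category page might be reachable at several parameterised URLs, each of which Google could treat as a separate page:

    https://www.example.com/shoes
    https://www.example.com/shoes?colour=red
    https://www.example.com/shoes?colour=red&sort=price-asc
    https://www.example.com/shoes?colour=red&sort=price-asc&page=2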

Google Search Console gives you the ability to specify how each parameter should be handled and provides insight into how parameters are used within your website. This can be found under “Crawl”, on the page titled “URL Parameters”, by selecting “Configure URL Parameters”.

For new websites, it can take time for Google to populate this page with known data, however if you have had Search Console running for a few months you should be able to see a list of those being used on your website.

Obvious choices for control include:

  • UTM codes used for goal tracking in Google Analytics and Tag Manager (see the example below)
  • “Thank You” query strings triggered by form submissions
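For instance, a newsletter link tagged with UTM parameters might look like the following; the domain and values are placeholders used purely for illustration:

    https://www.example.com/?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale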

Obvious entries to keep include:

  • Pagination
  • If you don’t have SEO-friendly URLs, i.e. all of your content is accessed through query strings, then make sure you keep those related to page ID, product ID and category ID

The remainder should be downloaded and discussed with your web developer to work out which parameters deliver unique content and which result in an unwanted filter, at which point you can revisit Google Search Console and apply the relevant configuration.

The parameter configuration options are grouped into two types:

  1. No: Doesn’t affect page content (i.e. tracks usage) – selecting this tells Google the parameter can safely be ignored, so it crawls a single representative URL rather than every variation
  2. Yes: Changes, reorders or narrows page content – selecting this provides additional configuration options:

How does this parameter affect page content?

  • Sorts – i.e. high to low
  • Narrows – i.e. only red items
  • Specifies – i.e. a product or service page
  • Translates
  • Paginates – i.e. the category page of a blog or product category

Which URLs with this parameter should Googlebot Crawl?

  • Let Googlebot decide (Default)
  • Every URL – Do not block these pages
  • Only URLs with value [option to specify values]
  • No URLs – Instructs Google to ignore this content

Sitemaps

Found under “Crawl”, “Sitemaps”, you have the ability to provide an XML file containing a list of the web pages that you want Google to index, which can dramatically reduce the amount of time it takes Google to find and index new content. Many popular content management systems such as WordPress, Magento, Venda, Umbraco and Kentico create this file dynamically by default, in which case you can simply “Add a sitemap” and input the location of the file on the server. If you don’t have an automatically created sitemap, speak to your developer, who should be able to build one for you quickly.
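As a rough sketch of the format (the URLs and dates are placeholders), a minimal XML sitemap looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2017-06-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/products/red-shoes</loc>
        <lastmod>2017-05-28</lastmod>
      </url>
    </urlset>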

There are many common issues with sitemap files that can reduce the efficiency of indexation. These include:

  • Out of date information – i.e. the sitemap file is static and does not automatically update itself to reflect the website’s content
  • Listing dead, obsolete or redirected pages – when removing content from your website you may have chosen to remove the link to it rather than deleting and redirecting the now-defunct URL, leaving these pages in the sitemap and potentially allowing out-of-date content to be indexed
  • Listing filtered content – In most cases you do not want Google to find and index filtered content which could compete with your commercial product, category and/or landing pages.

Ensuring your sitemap file is robust will help new and updated content to be indexed faster, potentially improving your organic rank.

Robots.txt

Robots.txt is a simple text file where you can provide instructions for search engines and other web technologies. On a basic level, it is typically used to block a search engine’s access to non-client facing content such as CMS login pages, document libraries, private data and in some cases landing pages used exclusively for paid campaigns which you do not want Google to index.
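As a simple, hypothetical example (the paths below are placeholders rather than recommendations for your site), a robots.txt file blocking a CMS login area and a private documents folder might look like this:

    User-agent: *
    Disallow: /admin/
    Disallow: /private-documents/
    Sitemap: https://www.example.com/sitemap.xml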

Found under “Crawl”, “Robots.txt Tester”, you can quickly check your website’s robots.txt file and identify any blocked pages, as these will be listed with the “Disallow:” directive.

If folders relating to live or critical content have been disallowed, you should contact your web developer to have these entries removed from the file.

Another common error occurs when a new site is launched, such as after a re-design or sprint update. The in-development version of the site is typically blocked by default to stop it from being displayed in the search results, and when it is published this block can be carried over to the live site, effectively de-indexing all of your content.

To identify this issue, look for the line “Disallow: /”, which blocks every page on the site (you may also see wildcard variants such as “Disallow: /*”).
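The carried-over block from a development site typically looks like this, instructing every crawler to ignore every page:

    User-agent: *
    Disallow: /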

 

Having covered the basics, you have safeguarded your website against common SEO issues and laid a great foundation on which to build a content creation, optimisation and outreach campaign. That’s not to say your job is done, as there is still a wealth of opportunity hidden in Google Search Console, including but not limited to:

  • Historic content that can be 301 redirected to your live URLs (see the sketch after this list)
  • Optimisation opportunities highlighted as duplicate title tags
  • Enhancing your visibility in the search results through schema and microformats
  • Avoiding mobile penalties from banners and popups with Search Console’s Fetch and Render tool
  • Indexing new content with the submission tool
  • Identifying keywords that not only rank but generate clicks to your landing pages
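On the first point, as a minimal sketch only (the paths are hypothetical and your developer may prefer a different approach), a historic URL can be 301 redirected to its live equivalent on an Apache server with a rule like this in the .htaccess file:

    Redirect 301 /old-page https://www.example.com/new-page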

If you would like to speak to someone about how SEO can help your business grow online then contact us today.

 
