Google Search Console - Crawl

Welcome to the last step in my guide to Webmaster Tools/Search Console.
In this final step-by-step blog I will be running through the “crawl” section, giving you a breakdown of each subsection so you get a better understanding of what to expect.

Crawl Errors

All SEOs know that having anything broken on your website is a bad thing, so this section will apply to you.

The first thing you see when you go into the crawl errors tab is an overview. From here you can see what needs to be fixed on your website, which could include the following...

Desktop

  • 404 errors
  • Soft 404 errors (pages that show a “not found” message but don’t return a proper 404 status, often via a 302 redirect)
  • Server errors

Smartphone & Feature phones

  • 404 errors
  • Not followed

So from this dashboard you can download these reports as CSV files or open them in Google Docs, then go in and fix these errors.
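
Once you have that export, you can quickly re-check which of the flagged URLs still return an error before you start fixing them. Below is a minimal sketch of that idea in Python; the crawl-errors.csv filename and the “URL” column name are assumptions about how your export is saved, so adjust them to match your file.

```python
import csv
import requests  # third-party library: pip install requests

# Hypothetical filename for the exported crawl errors report.
CSV_FILE = "crawl-errors.csv"

with open(CSV_FILE, newline="") as f:
    for row in csv.DictReader(f):
        url = row["URL"]  # assumes the export has a "URL" column
        try:
            # Follow redirects so we see the final status code for each URL.
            response = requests.head(url, allow_redirects=True, timeout=10)
            status = response.status_code
        except requests.RequestException as exc:
            status = f"request failed ({exc})"
        print(f"{status}\t{url}")
```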

Google Search Console Crawl Stats

This section will give you more in-depth information about the Googlebot that comes to your site. The Googlebot’s job is to identify anything that is right or wrong with your website.

So when the bot comes to crawl your website it will have to crawl all different types of content, including CSS, JavaScript and PDF files. However, Google will only provide you with the last 90 days of data, so if you want to check whether everything is going to plan, you have 90 days to do it in!

This page includes:

  • Pages crawled per day
  • Kilobytes downloaded per day
  • Time spent downloading a page (in milliseconds)
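
To make those numbers a bit more tangible: dividing kilobytes downloaded per day by pages crawled per day gives you a rough average page weight as Googlebot sees it. The figures below are purely hypothetical, just to show the arithmetic.

```python
# Hypothetical figures read off the Crawl Stats charts.
pages_crawled_per_day = 1_250
kilobytes_downloaded_per_day = 43_750
avg_download_time_ms = 320

# Rough average page weight as Googlebot sees it.
avg_page_weight_kb = kilobytes_downloaded_per_day / pages_crawled_per_day
print(f"Average page weight: {avg_page_weight_kb:.1f} KB")      # 35.0 KB
print(f"Average download time: {avg_download_time_ms} ms per page")
```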

Fetch as Google

From here you can test your pages to see whether Googlebot can access the page that you want to check.

There are two options to choose from:

Fetch

This is where you check a specific URL on your website. However, if you choose this option Google will not crawl any images or scripts, so only use this method if you need to check something quickly.

Fetch and Render

Again, Google will crawl a specific URL on your website but will also render that page. This will crawl all images and scripts on that page.

“You have a weekly quota of 500 fetches. When you are approaching your limit, you will see a notification on the page.”

Robots.txt tester

For you advanced SEOs, you can use the robots.txt tester to check any URLs that you have disallowed, simply by inputting the URL in the correct field and seeing whether it can still be crawled or not.
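
If you want to double-check the same thing outside of Search Console, Python’s built-in urllib.robotparser answers the same question locally. The example.com URLs below are placeholders for your own site and rules.

```python
from urllib.robotparser import RobotFileParser

# Point the parser at your live robots.txt (example.com is a placeholder).
rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

# Check individual URLs the same way the tester does: can Googlebot fetch them?
for url in [
    "https://www.example.com/blog/",
    "https://www.example.com/private/report.pdf",
]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'allowed' if allowed else 'blocked'}\t{url}")
```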

Google Search Console - Sitemaps

Sitemaps

This field is about your site’s sitemap. From here you can see how many pages in your sitemap have been submitted and indexed, and you can add and test sitemaps and download every URL in your sitemap as a CSV file or to Google Docs.
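
As a quick reminder of what you are actually submitting here: a sitemap is just an XML file listing the URLs you want crawled, following the sitemaps.org protocol. The sketch below builds a minimal one; the example.com URLs and dates are placeholders for your own pages.

```python
import xml.etree.ElementTree as ET

# Placeholder URLs and last-modified dates; swap in your own pages.
pages = [
    ("https://www.example.com/", "2016-01-15"),
    ("https://www.example.com/blog/", "2016-01-20"),
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = loc
    ET.SubElement(url_el, "lastmod").text = lastmod

# Write the file you would then submit in the Sitemaps section.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```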

URL Parameters

This feature is designed for people who want to help Google crawl their site more efficiently, by telling Googlebot how to treat URL parameters (such as sorting, filtering or session IDs) that can create lots of URLs for the same content. If Googlebot detects any problems with your parameters it will flag up these errors here as well.
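
To see why this matters, here is a small illustration of the kind of duplication URL parameters cause: several parameterised URLs that all serve the same content. The URLs and parameter names below are made up for the example; the tool lets you tell Google which parameters to ignore in much the same way.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical product URLs that all show the same content.
urls = [
    "https://example.com/shoes?sort=price&sessionid=abc123",
    "https://example.com/shoes?sessionid=xyz789&sort=price",
    "https://example.com/shoes",
]

# Parameters we (hypothetically) tell Google not to treat as distinct pages.
IGNORED_PARAMS = {"sessionid", "sort"}

def canonical(url: str) -> str:
    """Strip the ignored parameters so duplicate URLs collapse to one."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

for u in urls:
    print(u, "->", canonical(u))
# All three print the same canonical URL, which is the kind of duplication
# the URL Parameters tool lets you describe to Google.
```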

Security Issues

Always keep an eye on this field: if Google finds any viruses, bugs or malware, it will flag it up in this section.

Other Resources

This field has nothing to do with Webmaster Tools, but it does provide you with other resources that will help you with your website, which is always a plus in my book!

And there we have it, my guide on Google Search Console, written out by yours truly. Keep your eyes peeled for what I’m going to blog about next! Remember, you can see my previous blogs on the Essentials of Google Webmaster Tools, Search Appearance and Search Traffic here!

If you have any questions, just give me a tweet @ISD_richard.
