We’ve been working our way through Google Webmaster Tools, learning how to set up and configure a website, and we took a detailed look at the site stats Google provides. This post covers the Diagnostics section of Webmaster Tools, which helps you assess common issues and problems with your website.
The diagnostic tools provided by Webmaster Tools are pretty basic, but they can provide you with critical site information:
Malware is malicious software, such as a virus or spyware, that has somehow attached itself to your website. Generally these things don’t get attached to your site without deliberate intent on someone’s part, but the site owner may be unaware it has happened. Google lets you know if it finds any malware on your site.
I’ve never worked on a site that had malware so the only thing I’ve seen in this screen is “Google has not detected any malware on this site.” Hopefully that’s all you’ll ever see too.
The Crawl errors section of Google Webmaster Tools gives you a number of menu options for assessing site crawl problems. At the top are tabs that let you view crawl errors found by four different crawlers: Web, Mobile CHTML, Mobile WML/XHTML, and News.
Within each of these crawler tabs, a sub-menu allows you to view errors by type: HTTP, In Sitemaps, Not followed, Not found, Restricted by robots.txt, Timed out, Unreachable, and News specific. Clicking through each of these sections and sub-sections will show you all of your crawl errors (if any) for each type.
What makes this information most helpful is the link provided with each URL, which lets you find the source of the offense. You can see which URLs (both internal and external) contain broken links.
If the broken links are internal, you can correct the problem yourself. If they are external, you may have to contact a few other sites and ask them to correct the problem.
Unfortunately, Google seems to pick up a lot of links that are no longer there. The image above shows a discovery date of 2006 for links that have not been in place for quite some time. I’m finding this type of thing pretty regularly in Webmaster Tools, which means this feature isn’t as great a tool as you would hope. I still recommend broken-link checks using Xenu, if possible.
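If you want to verify a reported crawl error yourself, a quick script can confirm whether a URL really is broken. Here’s a minimal sketch using only Python’s standard library — the URLs are placeholders, and this is not an official Google tool:

```python
# Minimal broken-link check sketch (placeholder URLs, not the
# Webmaster Tools API). Uses only the Python standard library.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_link(url, timeout=10):
    """Return the HTTP status code for url, or None if unreachable."""
    req = Request(url, method="HEAD",
                  headers={"User-Agent": "link-check-sketch/0.1"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status          # urlopen follows redirects
    except HTTPError as err:
        return err.code                 # e.g. 404 Not Found
    except URLError:
        return None                     # DNS failure, timeout, etc.

def is_broken(status):
    """Treat 4xx/5xx responses and unreachable URLs as broken."""
    return status is None or status >= 400

if __name__ == "__main__":
    for url in ["https://example.com/", "https://example.com/missing-page"]:
        status = check_link(url)
        print(url, status, "BROKEN" if is_broken(status) else "ok")
```

This only confirms a link is dead right now; it can’t tell you when Google last saw it, which is exactly the information the discovery dates above make unreliable.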
The Crawl Stats section gives you a very quick at-a-glance view of your site’s crawl history over the past 90 days. The graphs show pages crawled, kilobytes downloaded and time spent downloading on a daily basis. The numbers on the right show the 90-day high, average and low.
Look for anomalies, spikes and valleys that you can go back to assess. See what you were doing that may have caused these changes. If they are positive changes you can look to duplicate them. If negative, look to avoid them in the future.
This is an extremely useful area of Webmaster Tools, as it provides a few clues about which on-page optimization issues Google finds important. The page is divided into three sections: Meta description, Title tag, and Non-indexable content. Notice that there is nothing here about meta keywords. Hmmm. I wonder why that is?
At the top of the page, Google tells us that the following issues won’t prevent your site from being found in the Google index, but they will affect the user’s experience. It’s also possible they will affect your rankings, though that isn’t noted here.
The first section, Meta descriptions, provides three areas of analysis: duplicate descriptions, long descriptions, and short descriptions. Off to the right is the number of pages registering each “problem.”
Below this is the Title tag section, which provides analysis of missing, duplicate, long, short, and non-informative titles. The title tag is the single most important piece of real estate on your site pages, so this is a section you really want to pay attention to, correcting problems as soon as possible.
The last section shows errors for non-indexable content. I’ve not seen it show any results on the sites I’ve worked on, but presumably it flags content hidden behind passwords or locked up in images or Flash animations. This is only a guess.
If any errors are found, a link is provided that opens a page containing more information.
Here you’ll be able to locate the offending pages and content. In the sections that note duplication, you can click the + to see the actual URLs displaying the duplicate content. Google also lets you download the information into a spreadsheet for easy reference and use.
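You can run similar checks on your own pages before Google flags them. The sketch below, using only Python’s standard library, parses a page’s HTML and flags missing or overly long titles and meta descriptions. The length limits are my own rough approximations of what fits in a search result, not figures published by Webmaster Tools:

```python
# Sketch of a local title/meta-description audit (not the Webmaster
# Tools feature itself). Length thresholds are assumptions.
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collect the <title> text and meta description from one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return a list of issue strings for one page's HTML."""
    p = HeadAudit()
    p.feed(html)
    issues = []
    if not p.title.strip():
        issues.append("missing title")
    elif len(p.title) > 65:            # assumed visible-title limit
        issues.append("long title")
    if p.description is None:
        issues.append("missing meta description")
    elif len(p.description) > 160:     # assumed snippet limit
        issues.append("long meta description")
    return issues
```

Run `audit()` over each page of your site and you get a homegrown version of the duplicate/long/short reports above — useful for catching problems before Google’s next crawl does.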
Learn more about these sections of Google Webmaster Tools
- Part I: Setting Up a Site
- Part II: Site configuration
- Part III: Your site on the web
- Part IV: Your site on the web (continued)
- Part V: Diagnostics