Google Webmaster Tools is one of the most popular tool suites in SEO and makes a webmaster's work much easier. It lets a webmaster keep track of a Web site's traffic, analyze its robots.txt file, add sitemaps, and more. This free service from Google has become an indispensable part of SEO and the easiest way to track a Web site's details. Google Webmaster Tools comprises several tools that serve different purposes. The sections below discuss how to set up these tools, their features and uses, and how to master their different capabilities.
Google Webmaster Tools is free; you don't pay anything to use it for your Web site. All you need is a Google account. If you don't yet have a Google account that will work for your site, open a new one to access the webmaster tools. Here are the steps required to start using Google Webmaster Tools.
Step 1: Sign in to Google Webmaster Tools with your Google account.
Step 2: Add your site to Google Webmaster Tools.
Step 3: Validate your ownership of the site.
Validating your site is necessary because Google asks you to prove that you really own it. There are two methods of validating your site: the default method, in which you add an HTML meta tag to your site's home page, or uploading to your server a text file with the file name Google provides.
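As a rough illustration of the meta-tag method, the verification tag goes in the head section of your home page; the content value below is only a placeholder, since Google generates a unique token for your account:

```
<head>
  <title>Example Home Page</title>
  <!-- Placeholder token; paste the exact tag Google generates for you -->
  <meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN" />
</head>
```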
Once you have completed these three steps, your site is added to Google Webmaster Tools and you can use any of its tools thereafter. However, it takes some time for Google to collect data from your Web site, so you will not see any data immediately after validating your site. After Google has collected data about your site, you will be able to use all the tools according to your requirements.
As soon as you sign in to Google Webmaster Tools, it shows your Web site or your list of Web sites. Clicking a site takes you to that site's dashboard, which links to all the other Google Webmaster tools. The center of the dashboard shows the following details:
- Search queries
- Crawl errors
- Links to your site
These are the details site owners look for most often. You can click on them directly, or reach the same data through the options on the left side of the dashboard. You can click each option to check its details, and you can export the data to Excel sheets for further use.
On the left of the dashboard are other available tools, and below is a list of these tools and their functions.
The site configuration option allows you to configure your Web site's settings. It provides additional options that independently control the configuration of each section; these options are listed below.
You can use the sitemaps option to add new sitemaps, check the status of existing sitemaps, delete sitemaps, and resubmit them. To submit a sitemap, click “Submit a Sitemap” and follow the steps. For existing sitemaps, the status, type, number of URLs submitted, and number of URLs in the Web index are shown. A download option is also available, so you can export the details of your existing sitemaps to an Excel sheet for further use. Any errors in the sitemaps are highlighted in the sitemap area.
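For reference, a sitemap is an XML file following the sitemaps.org protocol; a minimal sketch looks like the following, where the URL and values are placeholders for your own pages:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```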
Google Webmaster Tools provides a Crawler access option that allows you to manage crawler access to your Web site; that is, it lets you manage the robots.txt file. With this option you can specify how a search engine's crawler crawls your site; if there is any content you don't want the search engine to crawl, you can specify that here. Three options are offered at the top of the Crawler access section: Test robots.txt, Generate robots.txt, and Remove URL.
Test robots.txt allows you to test your robots.txt file and is the default page of this section. You can specify the URLs and user agents to test against. User agents are the different spiders that crawl your pages; the tester offers several agents, including Googlebot-Mobile, Googlebot-Image, Mediapartners-Google, and AdsBot-Google.
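To make the testing step concrete, here is a small sample robots.txt you might paste into the tester; the directory names are placeholders. Testing a URL such as /photos/logo.png against the Googlebot-Image agent would report it as blocked:

```
# Allow every crawler everywhere except /private/
User-agent: *
Disallow: /private/

# Keep Google's image crawler out of /photos/
User-agent: Googlebot-Image
Disallow: /photos/
```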
Generate robots.txt allows you to build a new robots.txt file, which you can then upload to your server. Four options are available in this section, and a combined example follows the list:
- Choose default crawler access: This option offers two default policies for managing crawler access. One is to allow all crawlers, which makes your site available to be crawled by the crawlers of all search engines. The other is to block all crawlers, which hides your site from crawlers so that it will not appear in search results. Beyond these two defaults, you can customize the rules that apply to crawler access with the next option.
- Specify additional rules: This option lets you choose the action (allow or block) and the user agent it applies to. You can also specify rules for individual directories and files.
- Download your robots.txt file: This option lets you download the generated robots.txt file by clicking the download button.
- Save your robots.txt file and upload it to your site's top-level directory: Save the generated robots.txt file, then upload it to your site's top-level (root) directory, since crawlers look for the file only at that location.
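Putting the four options together, a generated file that allows all crawlers by default but adds one blocking rule might look like the following sketch; the agent and directory here are illustrative choices, not output copied from the tool:

```
# Default access: allow all crawlers (an empty Disallow blocks nothing)
User-agent: *
Disallow:

# Additional rule: block Googlebot-Mobile from one directory
User-agent: Googlebot-Mobile
Disallow: /desktop-only/
```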
The “Remove URL” option allows you to request the removal of any URL, and it lists the removal type and status of URLs you requested for removal earlier, along with the URL itself. Thus, whenever any private, useless, or outdated content appears in Google search results, you can use the Remove URL tool to request removal of that particular URL so that it no longer appears in the search results.
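Note that Google generally expects the content to already be gone or blocked before it honors a removal request; for example, the page should return a 404, be disallowed in robots.txt, or carry a noindex meta tag like this hedged example:

```
<!-- Ask all crawlers not to index this page -->
<meta name="robots" content="noindex" />
```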