Google Webmaster Tools is one of the most popular tools in SEO and makes a webmaster's work much easier. It allows a webmaster to keep track of a Web site's traffic, analyze its robots.txt file, add sitemaps, and more. This free tool from Google has become an indispensable part of SEO and is the easiest way to track Web site details. Google Webmaster Tools comprises several tools that serve different purposes. The sections below discuss how to set up and master these capabilities.
Google Webmaster Tools is free; you don't need to pay anything to use it for your Web site. All you need is a Google account. If you don't yet have a Google account that will work for your site, open a new one to access Webmaster Tools. Here are the steps required to get started.
Step 1: Sign in at Google Webmaster Tools with your Google account.
Step 2: Add your site to Google Webmaster tools.
Step 3: Verify ownership of your site.
Verifying your site is necessary because Google will ask you to prove that you really own it. There are two verification methods: the default method, in which you add an HTML meta tag to your home page, or uploading a text file with the name Google provides.
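For the default method, the verification tag Google supplies looks like the snippet below. The content value here is a placeholder; Google generates a unique token for your account, and the tag only proves ownership while it remains on the page.

```html
<!-- Place inside the <head> section of your site's home page. -->
<!-- "YOUR-UNIQUE-TOKEN" is a placeholder; use the exact tag Google gives you. -->
<meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN" />
```

After adding the tag (or uploading the named text file to your site's root), return to Webmaster Tools and click Verify.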
Once you have completed these three steps, your site is added to Google Webmaster Tools and you can use any of its tools thereafter. However, it takes some time for Google to collect data about your Web site, so you will not see any data immediately after verification. After Google has collected data about your site, you will be able to use all the tools according to your requirements.
As soon as you sign in to Google Webmaster Tools, it shows your Web site or list of Web sites. Click a site to open its dashboard, which links to all the other Webmaster tools. The center of the dashboard shows the following details:
- Search queries
- Crawl errors
- Links to your site
These are the details site owners look up most often. You can click each of them directly, or drill into the same data through the options on the left side of the dashboard. You can also export the data to Excel sheets for further use.
On the left of the dashboard are other available tools, and below is a list of these tools and their functions.
The site configuration option allows you to configure the settings of your Web site. It provides additional options that let you control each section's configuration independently. The additional options are listed below.
You can use the sitemaps option to add new sitemaps, check the status of existing sitemaps, delete sitemaps, and resubmit them. To submit a sitemap, click "Submit a Sitemap" and follow the steps. For existing sitemaps, the status, type, URLs submitted, and URLs in the Web index are shown. A download option is also available, so you can export the details of your existing sitemaps to an Excel sheet for further use. Any errors in the sitemaps are highlighted in the sitemap area.
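A sitemap submitted this way follows the standard sitemaps.org XML format. A minimal example, using a placeholder domain and illustrative values, might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; example.com and the dates are placeholders. -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-11-20</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Save a file like this as sitemap.xml in your site's root directory, then submit its URL through the sitemaps option.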
Google Webmaster Tools provides a Crawler access option that allows you to manage crawler access to your Web site; that is, it lets you manage the robots.txt file. With this option you can specify how a search engine's crawler crawls your site. If there is any content you don't want a search engine to crawl, you can specify it here. Three additional options are offered at the top of Crawler access: Test robots.txt, Generate robots.txt, and Remove URL.
Test robots.txt allows you to test your robots.txt file; it is the default page of this section. You can specify the URLs and user agents to test against. User agents are the different spiders that crawl your pages; the tester offers several agents, including Googlebot-Mobile, Googlebot-Image, Mediapartners-Google, and AdsBot-Google.
Generate robots.txt allows you to generate a new robots.txt file, which you can then add to your server. Four options are available in this section:
- Choose default crawler access: This option gives you two default ways to manage crawler access. One is to allow all crawlers, which makes your site available to be crawled by the crawlers of all search engines. The other is to block all crawlers, which hides your site from crawlers so it will not appear in search results. Beyond these two defaults, you can also customize the rules that apply to crawler access.
- Specify additional rules: This option lets you choose the action (allow or block) and the user agents it applies to. You can also specify rules for individual directories and files.
- Download your robots.txt file: This option lets you download the generated robots.txt file by clicking the download button.
- Save your robots.txt file and upload it to your site's top-level directory: This option lets you save the generated robots.txt file, which you then upload to the root (top-level) directory of your site so crawlers can find it.
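As a sketch of the kind of file these options produce, a robots.txt that allows all crawlers but keeps them out of one directory, and blocks one specific user agent entirely, could look like this (the directory name is hypothetical):

```
# Allow all crawlers, but keep them out of a hypothetical /private/ directory
User-agent: *
Disallow: /private/

# Block Googlebot-Image from the entire site
User-agent: Googlebot-Image
Disallow: /
```

Rules are grouped by User-agent line, and an empty Disallow value would mean nothing is blocked for that agent.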
The "Remove URL" option allows you to request the removal of any URL. It also shows the status and removal type of URLs you requested for removal earlier. Thus, whenever private, useless, or outdated content appears in Google search results, you can use the Remove URL tool so that the URL will not appear in the search results again.
I am currently working as an Application & Software Engineer at Huawei Technologies Bangladesh Ltd. My core skills include extensive knowledge and experience of HTML/XHTML and CSS, as well as experience in PHP, MySQL, WordPress, and digital graphic design. I build web solutions that evolve with the changing needs of your business. I am also experienced in the administration of UNIX/Linux operating systems as well as the Windows family of operating systems.