Basic Google Webmaster Guidelines
Google Webmaster Tools is one of the most popular tools in SEO and makes a webmaster's work much easier. It allows a webmaster to keep track of a Web site's traffic, analyze its robots.txt file, add sitemaps, and so on. This free tool provided by Google has become an indispensable part of SEO and is the easiest way to track a Web site's details. Google Webmaster Tools offers different tools that serve different purposes. This article discusses the features of these tools, their uses, and how to master them. The sections below explain how to set up and use the different capabilities of these webmaster tools.
Google Webmaster Tools is free; you don't need to pay anything to use it for your Web site. All you need is a Google account. If you don't yet have a Google account that will work for your site, open a new one to access the webmaster tools. Here are the steps required to start using Google Webmaster Tools.
Step 1: Sign in at Google Webmaster Tools with your Google account.
Step 2: Add your site to Google Webmaster tools.
Step 3: Validate the authority of your site.
Validating your site is necessary because Google asks you to prove that you really own the site. There are two methods of validation: the default method, in which you add an HTML meta tag to your home page, or uploading a verification file with the name Google provides.
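The meta-tag method places a short verification tag inside the `<head>` of your home page. A minimal sketch is shown below; the `content` token is a placeholder here, as Google generates a unique value for each account, and the same applies to the file name used in the file-upload method.

```html
<!-- Added inside <head> of your home page.
     The content value is a placeholder; Google supplies the real token. -->
<meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN" />
```

After adding the tag (or uploading the file Google names for you), click the verify button in Webmaster Tools and Google checks that the tag or file is present.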
Once you have completed these three steps, your site is added to Google Webmaster Tools and you can use any of its tools thereafter. However, it will take some time for Google to collect all the data from your Web site, so you will not see any data immediately after validating your site. After Google Webmaster Tools has collected data about your site, you will be able to use all the tools according to your requirements.
As soon as you sign in at Google Webmaster Tools, it shows your Web site or list of Web sites. You can view the details of any of your sites by clicking the respective site. Clicking a site takes you to that site's dashboard, which links you to all the other Google Webmaster tools. The dashboard displays the following details in the center:
- Search queries
- Crawl errors
- Links to your site
These are the details site owners look for and use most often. You can click on them directly, or drill into the details using the options available on the left side of the dashboard. You can click each option and check its details there. You can also export the data to Excel sheets for further use.
On the left of the dashboard are other available tools, and below is a list of these tools and their functions.
The site configuration option allows you to configure the settings of your Web site. It provides additional options that independently control the configuration of each section. These additional options are listed below.
You can use the sitemaps option to add new sitemaps, check the status of existing sitemaps, delete a sitemap, and resubmit sitemaps. You can submit a sitemap by clicking the option “submit a sitemap” and following the steps. For existing sitemaps, data on status, type, URLs submitted, and URLs in the Web index is displayed. A download option is also available, so you can export the details of your existing sitemaps to an Excel sheet for further use. Any errors in the sitemaps are highlighted in the sitemap area.
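A sitemap you submit here is an XML file following the sitemaps.org protocol. A minimal sketch is shown below; the domain and dates are made-up examples, but the element names and namespace come from the protocol itself.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>` and `<changefreq>` are optional hints to the crawler.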
Google Webmaster Tools provides a crawler access option that allows you to manage crawler access to your Web site; that is, it lets you manage the robots.txt file. With this option you can specify how the search engine's crawler crawls your site, and if there is any content you don't want the search engine to crawl, you can say so here. Three additional options are offered at the top of the Crawler access section: Test robots.txt, Generate robots.txt, and Remove URL.
Test robots.txt allows you to test your robots.txt file; it is the default page of this section. You can specify the URLs and user agents to test against. User agents are the different spiders that crawl your pages. The tester offers you several agents, including Googlebot-Mobile, Googlebot-Image, Mediapartners-Google, and AdsBot-Google.
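You can reproduce this kind of check locally with Python's standard `urllib.robotparser`. The sketch below is a rough local equivalent of the “Test robots.txt” tool; the rules, user agents, and URLs are illustrative examples, not taken from any real site.

```python
# Rough local equivalent of the "Test robots.txt" tool, using the
# standard library's urllib.robotparser. Rules and URLs are made up.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /private/

User-agent: Googlebot-Image
Disallow: /photos/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ask whether a given user agent may fetch a given URL.
print(parser.can_fetch("Googlebot", "http://example.com/page.html"))       # allowed by *
print(parser.can_fetch("Googlebot", "http://example.com/private/x.html"))  # blocked by *
print(parser.can_fetch("Googlebot-Image", "http://example.com/photos/a.jpg"))  # blocked by its own group
```

Note that when a specific user agent has its own group (as Googlebot-Image does here), the wildcard `*` group no longer applies to it; this matches how Google's tester evaluates rules.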
Generate robots.txt allows you to build a new robots.txt file, which you can then add to your server. Four options are available in this section:
- Choose default crawler access: This option offers two default ways to manage crawler access. One is to allow all crawlers to access your site, making it available to be crawled by every search engine crawler. The other is to block all crawlers, which hides your site from crawlers so it will not appear in search results. Beyond these two defaults, further options let you customize the rules that apply to crawler access.
- Specify additional rules: This option lets you choose the action (allow or block) and the user agents it applies to. You can also specify rules for individual directories and files.
- Download your robots.txt file: This option lets you download your robots.txt file by clicking the download button.
- Save your robots.txt file and upload it to your site's top-level directory: You can save your robots.txt file with this option and then upload it to your site's top-level (root) directory, which is where crawlers look for it.
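The generated file is plain text. As a sketch, a robots.txt combining a default-allow policy with one additional blocking rule, plus a pointer to the sitemap, might look like this (the paths and domain are made-up examples):

```text
User-agent: *
Disallow: /admin/

User-agent: Googlebot-Image
Disallow: /photos/

Sitemap: http://www.example.com/sitemap.xml
```

The `Sitemap:` line is optional but lets crawlers discover your sitemap without you submitting it separately.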
The “Remove URL” option allows you to request the removal of any URL. It also shows the status and removal type of URLs for which removal was requested earlier. Thus, whenever any private, useless, or outdated content appears in Google's search results, you can use the Remove URL tool so that the URL no longer appears in the results.
Google sitelinks is a feature that enhances a site's search listing with additional relevant links. Sitelinks, which appear under the search result for your site, are links to important pages within your site. Google generates these sitelinks automatically, and not every site has them. A site has to earn such links, which are very beneficial because they appear among the top results in the SERP. However, only quality, user-friendly content invites sitelinks from Google. If your Web site is very new, don't expect to get sitelinks immediately.
Every site strives to obtain Google sitelinks because they give you top rankings and more space on the results page, and they push your competitors further down in the SERP. Thus, they help you achieve your SEO goals. According to Google, sitelinks are generated automatically.
After you succeed in gaining Google sitelinks, you can manage the list. If you don't want a page to appear as a sitelink, you can demote it. Google will take your request into account when generating sitelinks, but there is no guarantee that the particular page will be demoted. You can demote up to 100 URLs, and demotions are effective for 90 days. Getting Google sitelinks is a privilege and very beneficial for any Web site, and you can manage your sitelinks listing using the sitelinks option Webmaster Tools provides.
Change of address
Use this option whenever you want to change your site's address. This tool makes it easier to move your site to another domain, helps the process of indexing pages, and facilitates a smooth transition for your users. For many reasons, you might be required to move your site to a different domain, and this tool makes the process simpler. Google Webmaster Tools gives you step-by-step instructions for changing the address. Here are the steps:
- Step 1: Set up the new site and manage all its content and internal links. Make sure all your internal links point to your new domain.
- Step 2: Redirect all traffic coming to your old site to the new domain so you don't lose any traffic or hamper your business. You can do so with a 301 (permanent) redirect. Make sure all your traffic is redirected to your new domain.
- Step 3: Submit and validate your new site in Webmaster Tools.
- Step 4: Use the change of address option and submit the new domain of your site.
After this process your site will be moved to the new domain. Check the updates in Webmaster Tools to monitor indexing and crawling for the new domain.
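The 301 redirect in Step 2 is typically set up on the server. As one common sketch, assuming your old site runs on Apache with mod_rewrite enabled (the domain names here are placeholders), an `.htaccess` file on the old domain could redirect every path to the same path on the new domain:

```apache
# .htaccess on the OLD domain (assumes Apache + mod_rewrite).
# Sends every request to the same path on the new domain with a
# permanent (301) redirect, so search engines transfer the indexing.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

Other servers (nginx, IIS) have equivalent mechanisms; the key point is that the redirect must be permanent (301), not temporary (302).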
The settings option allows you to configure your site's geographic target, preferred domain, and crawl rate. The settings tool provides three options:
- Geographical target: Use this option to target a specific audience for your Web site. Geographical targeting is off by default; check the checkbox to enable it, then choose your preferred geographical location and save it. You can change your target later.
- Preferred domain: With this option you can tell Google your preferred domain (for example, the www or non-www version of your site). This helps if you are worried about canonicalization.
- Crawl rate: This option allows you to influence the crawl rate for your site. The search engine crawls on its own schedule, with no fixed time for crawling pages, so you can set your own crawl rate for convenience. You can either let Google determine the crawl rate or check the “set custom crawl rate” option to set your own. With this option you tell Google at what rate it may come and crawl your pages.
URL parameters
This option allows you to tell Googlebot how to handle particular URL parameters. Using URL parameters helps Googlebot crawl your site more efficiently: it saves bandwidth and lets the crawler spend its time on unique pages rather than on duplicate URLs that differ only in their parameters.
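To see why this matters, consider URLs that point at the same content but differ only in tracking or session parameters. The sketch below (the parameter names `sessionid` and `sort` are assumed examples) shows how stripping ignorable parameters collapses several URLs to one canonical address, which is essentially what telling Googlebot to ignore a parameter achieves:

```python
# Several URLs, one page: stripping ignorable parameters collapses
# them to a single canonical URL. Parameter names are made-up examples.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

IGNORABLE = {"sessionid", "sort"}

def canonicalize(url):
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORABLE]
    return urlunparse(parts._replace(query=urlencode(query)))

urls = [
    "http://example.com/shoes?sessionid=123",
    "http://example.com/shoes?sort=price",
    "http://example.com/shoes",
]
print({canonicalize(u) for u in urls})  # all three collapse to one URL
```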
Your site on the Web
This section gives you details of your site's presence on the Web. It covers search queries, links to your site, keywords, and internal links, which help you understand how your site is doing on the Web. Each option is discussed below.
Search queries
Clicking this option shows the search queries for your site: the keywords for which your site is being searched and ranked, and your position in Google's search results. You can see the top queries and top pages in this section, as well as the change in impressions, clicks, CTR, and average position. You can filter by geographical location and set a time range to see the changes within that period. Impression change is given as a percentage, and keyword position as a number. This section helps you see whether your site is growing or declining in popularity, and its keyword details help you make decisions about your keywords.
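The CTR figure in this report is simply the ratio of clicks to impressions. As a sketch with made-up sample numbers (not real report data):

```python
# CTR = clicks / impressions, expressed as a percentage.
# The figures below are invented sample numbers.
def ctr(clicks, impressions):
    return 100.0 * clicks / impressions

this_week = ctr(50, 1000)   # 5.0%
last_week = ctr(30, 1000)   # 3.0%
print(round(this_week - last_week, 1))  # CTR change in percentage points
```

A rising CTR at the same average position usually means your title and description are becoming more attractive to searchers.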
Links to your site
This section is very important because you can see the internal and external links for your site's URLs. You can see who is linking to your pages, and by drilling into your internal pages you can get details of the backlinks earned by each page, along with the anchor text of each backlink. This section thus gives you all the linking details for your site. Earning backlinks is like getting votes in favor of your site, so you can review the sites linking to you and judge their quality. You can also decide to pursue the removal of backlinks from bad sites that might harm your site.
Keywords
The keywords section provides a list of the keywords for which your site is being searched, and shows each keyword's position for your site, so you can see which keywords are working for you. Generally, you will get a list of the important keywords for your site. You might be surprised by the improvement some keywords bring, and you can work on those for better results. If you don't see your targeted keywords, check the crawl errors and take the necessary steps. If you see irrelevant keywords, your site may have been hacked, and you can decide how to respond. Thus, this section helps you build a better and more useful site.
Internal links
This section is similar to the “links to your site” option, but it details the internal linking within your site. You can see which internal links point to which pages. The unique value of this section is that it shows how link juice is being passed; because the flow of link juice matters a great deal in SEO, this internal links report is very helpful.
Subscriber stats
This section lists your existing feeds and gives you subscriber statistics for them, including Google subscribers; you can click on Google subscribers to see the respective data. You can also add a feed as a sitemap with the “submit feed as sitemap” option. A download option lets you export the data for all your sites to an Excel spreadsheet.
+1 metrics
This option gives you additional information on search impact, activity, and audience for +1s. It shows annotated figures for impressions, clicks, and CTR: you get data for +1 annotated impressions, +1 annotated clicks, and the overall search impact. You can sort pages by +1 annotated impressions, +1 annotated clicks, all impressions, or all clicks, and you can compare clicks, impressions, and CTR.
Diagnostics
As the term suggests, this section details the errors and issues on your site. The Diagnostics section consists of the following parts.
Malware
If your site has been compromised by a hacker or has malware issues, this section gives you the details so you can make the required corrections. All sorts of malicious and infected software that might harm your Web site can be detected here.
Crawl errors
The Crawl errors section lists the crawling errors for your pages: the URLs, their details, and the errors detected for each. This helps you work through the errors and make the required changes. There are three main tabs in this section: Web, Mobile CHTML, and Mobile WML/XHTML. You can review the issues Google faced while crawling your pages and correct them.
Crawl stats
This section gives statistical data on Googlebot's crawling of your site. Three sub-sections are offered: pages crawled per day, kilobytes downloaded per day, and time spent downloading a page. Each gives a detailed graph of its respective area, along with the high, average, and low values. There is also a PageRank section at the bottom, which details the PageRank and status of your pages.
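The high/average/low summary under each graph is just the maximum, mean, and minimum of the daily series. A sketch with made-up daily figures:

```python
# High / average / low over a daily crawl series (invented sample data),
# mirroring the summary shown under each crawl stats graph.
pages_per_day = [120, 95, 140, 80, 110, 130, 100]

high = max(pages_per_day)
low = min(pages_per_day)
average = sum(pages_per_day) / len(pages_per_day)

print(high, round(average), low)  # prints: 140 111 80
```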
Fetch as Googlebot
With this option you can see how a page appears to Google. Enter your URL in the space provided and click the Fetch button. If a URL is fetched successfully, you can submit it for Google indexing.
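Conceptually, the tool requests your page while identifying itself as Googlebot, so you see the response the crawler would get. A rough sketch with Python's standard library is below; the URL is a placeholder, and the user-agent string is the one Google documents for its main crawler.

```python
# Sketch of "fetch as Googlebot": build a request that identifies as
# Googlebot's documented user agent. The URL is a placeholder.
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(
    "http://example.com/",
    headers={"User-Agent": GOOGLEBOT_UA},
)
print(req.get_header("User-agent"))

# To actually fetch the page (requires network access):
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read()[:200])
```

Note that sites serving different content to Googlebot than to users (cloaking) is against Google's guidelines, which is exactly the kind of discrepancy this tool helps you spot.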
HTML suggestions
This section highlights the issues Googlebot confronted while crawling your pages. It shows issues with meta descriptions, title tags, and non-indexable content. Each sub-section details issues such as duplicate, long, or missing meta descriptions, which helps you correct them.
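The checks behind this report are straightforward to sketch. The example below flags missing, overly long, and duplicate meta descriptions across a set of pages; the page data and the 160-character limit are assumptions for illustration, not Google's exact thresholds.

```python
# Flag missing, too-long, and duplicate meta descriptions.
# The page data and MAX_LEN threshold are illustrative assumptions.
from collections import Counter

MAX_LEN = 160  # a common guideline, not an official Google limit

pages = {
    "/": "Welcome to our store.",
    "/shoes": "Welcome to our store.",  # duplicate of "/"
    "/about": "",                       # missing
    "/blog": "x" * 200,                 # too long
}

missing = [u for u, d in pages.items() if not d]
too_long = [u for u, d in pages.items() if len(d) > MAX_LEN]
counts = Counter(d for d in pages.values() if d)
duplicates = [u for u, d in pages.items() if d and counts[d] > 1]

print("missing:", missing)      # ['/about']
print("too long:", too_long)    # ['/blog']
print("duplicate:", duplicates) # ['/', '/shoes']
```

The same pattern extends naturally to title tags, the other field this report checks.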
Labs
This is Google Webmaster Tools' final section. It offers custom search, instant previews, and site performance. Custom search lets you restrict a search to your own site or extend it to other sites, so you can see results your own way. Instant previews give you snapshots of pages as they appear in search results. Finally, site performance shows your site's performance statistics, which you can use to improve speed and the user experience on your site.
These tools provided by Google Webmaster Tools make the webmaster's task easier and simpler. By using them, you can improve users' experience with your site and make all the corrections required for SEO. Understanding how to use these webmaster tools will help you improve your Web site in every way.