Google’s new Sitemaps

Google has just made some changes to its Sitemaps. A few things there are interesting to note:

1. When you log in, if you're already signed into another Google service, it asks whether you want to use a separate account for Sitemaps – and warns that you can only be logged into one account at a time. Nice enough of G to tell us 😉

2. Now you don’t even need to create an XML sitemap to use the service – all you do is upload to your server a file named with a unique code Google gives you; that’s how they verify you are the true owner of the domain. OK, this is all fine and reasonably secure – but it shows how badly they want to know who owns what!
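Just to illustrate the idea (this is a sketch, not Google's actual implementation – the token scheme, filename format, and function names here are all my own assumptions): the service hands you a unique token, you upload an empty file named after it, and the service then fetches that URL to prove you control the server.

```python
import hashlib

def verification_filename(account_id: str, domain: str) -> str:
    """Derive a per-account, per-domain token (hypothetical scheme)."""
    token = hashlib.sha256(f"{account_id}:{domain}".encode()).hexdigest()[:16]
    return f"google{token}.html"

def is_owner(domain: str, filename: str, fetch) -> bool:
    """fetch(url) should return True if the URL answers with HTTP 200.

    Passing the fetcher in makes the check easy to simulate without a
    live server.
    """
    return fetch(f"http://{domain}/{filename}")
```

Since only someone who can write files to the domain's web root can make that URL resolve, a successful fetch is taken as proof of ownership.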

The new information provided for the domains submitted to Google Sitemaps includes page-crawling problems and other Googlebot stats. That’s not so new, really, as all these stats were already available through the old queries; only now they are listed like this:

Query types:
- Indexed pages in your site
- Pages that refer to your site’s URL
- Pages that link to your site
- The current cache of your site
- Information we have about your site
- Pages that are similar to your site

Considering the bad reputation the site: query has had lately… Hmm…

Another question that pops up in my head is whether adding the Google-coded file to the server will prove as lethal for black-hat sites as submitting a Google Sitemap did. In other words, I’m in no hurry to try it out until I hear other people’s reports on this.
