Google Sitemaps (Beta)

To let webmasters help Google index their sites better, there is now Google Sitemaps. Sounds like a good idea.

So how does it work?
First, you need a Google account (having a GMail account is probably enough).
Second, you need to create a Sitemap file in the root of your site. This is an XML file that lists all your indexable pages. Google even provides a generator for this file.
Third, you have to tell Google where your sitemap file can be found.
Last, wait and see what Google does with the sitemap file.
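For reference, a minimal sitemap file looks roughly like this. The URL and date are placeholders, and the namespace shown is the 0.84 schema from the beta documentation:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2005-06-05</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Only `loc` is required; `lastmod` and `changefreq` are optional hints.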

I’m still stuck at the second stage. The generator from Google requires Python to run, and unfortunately I can’t run it. I don’t like updating the file manually, so I’d like this to be automated. If anyone knows of a good solution to generate sitemaps automatically, I’d love to hear about it.
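In the meantime, rolling your own is not much work. Here is a minimal sketch in Python (the base URL and paths are invented, and the 0.84 namespace is the one from the beta documentation); in practice the list of paths could come from `os.walk` over your document root or from your CMS:

```python
from xml.sax.saxutils import escape

# Namespace used by the Google Sitemaps beta documentation.
SITEMAP_NS = "http://www.google.com/schemas/sitemap/0.84"

def generate_sitemap(base_url, paths):
    """Return a minimal Sitemap XML document listing base_url + path for each path."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="%s">' % SITEMAP_NS]
    for path in paths:
        lines.append('  <url>')
        # Escape the URL so characters like & stay valid XML.
        lines.append('    <loc>%s</loc>' % escape(base_url + path))
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

if __name__ == '__main__':
    # Example values only; replace with your own site and page list.
    print(generate_sitemap('http://example.com', ['/', '/about.html']))
```

Write the result to `sitemap.xml` in the site root and re-run it whenever pages change.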

Update: It seems that Google Sitemaps will accept RSS 2.0 and Atom 0.3 feeds as well, so for now I’ve added those. And looking at my logs, I see Google visiting some links.

4 responses to “Google Sitemaps (Beta)”

  1. I wonder if it’s going to be as effective as promised, or whether it will turn into just another version of submitting a site to Google: a long wait before it’s even indexed, with no guarantee in terms of search results or the site’s visibility to the wider online public. Also, I didn’t realize I had to do anything other than let it generate my sitemap file automatically. I’m also stuck at step 2, and I only submitted a site for experimentation purposes, to see how it goes. I’d rather be careful about using it with any serious site, since its impact isn’t known yet and I don’t want to hurt myself. And I wonder how it would go for, say, blogs that are updated more often than one could imagine updating the sitemap file manually; I believe RSS would serve much better in that case. But we’ll see…

  2. Hello Pet (I’m sorry but I had to remove your original URL, because it hurt my eyes),

    First, before Google Sitemaps you only had robots.txt. That file told (search)bots that they weren’t allowed in certain subdirectories. Well, that is like a honeypot to a… well, spammer/hacker/cracker.

    Now you have the option to tell a search engine where it can look, instead of only where it may not look.
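    To see why that matters, consider a robots.txt like the following (paths invented for illustration): it advertises exactly the directories you would rather keep quiet.

    ```
    User-agent: *
    Disallow: /admin/
    Disallow: /private/
    ```

    A sitemap only lists what you *want* crawled, so it leaks nothing extra.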

    Furthermore, Google (and any other search engine) is always slow to fetch and index your content. Take Technorati: you have to ping them about an update yourself, but it indexes you almost immediately, and even then it takes another ten minutes before you are visible.

    Kind regards,
