Soon after I started using
Google Webmaster Tools, one day I saw something like this:
URLs restricted by robots.txt: 24 URLs
When I looked at the details, I found that most of the restricted URLs were nothing but the "labels" I had created on my blog. I did a lot of Googling and finally learnt that we can't change the robots.txt file in Blogger (unless we host the blog on our own server).
Then I looked at my robots.txt (http://all-mixed.blogspot.com/robots.txt), which looks like this:
-----------------------------
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Sitemap: http://all-mixed.blogspot.com/feeds/posts/default?orderby=updated
-----------------------------
So,
"Disallow: /search" seems to be fine, as it just keeps the search result pages (label pages too) out of the index. Those pages are kind of duplicate content anyway, since each individual post URL gets indexed on its own (if there is no error).
** In the above default robots.txt, the line
"User-agent: *" means that the rules following it apply to all user agents (crawlers); so every crawler is allowed to index the blog content, except for the /search pages.
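If you want to double-check what these rules actually block, Python's standard-library `urllib.robotparser` can parse the same rules and answer "can this crawler fetch this URL?" questions. This is just a sketch to verify the behaviour described above; the example URLs under my blog are made up for illustration:

```python
import urllib.robotparser

# Feed the same rules shown above directly into the parser
# (no need to fetch the live robots.txt over the network).
rp = urllib.robotparser.RobotFileParser()
rp.parse("""
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
""".splitlines())

# Label/search pages are blocked for ordinary crawlers like Googlebot...
print(rp.can_fetch("Googlebot", "http://all-mixed.blogspot.com/search/label/tips"))

# ...but individual post URLs are still crawlable.
print(rp.can_fetch("Googlebot", "http://all-mixed.blogspot.com/2008/01/some-post.html"))

# The AdSense crawler (Mediapartners-Google) has an empty Disallow,
# which means it is allowed everywhere, including /search pages.
print(rp.can_fetch("Mediapartners-Google", "http://all-mixed.blogspot.com/search/label/tips"))
```

Running this prints False for the label page under Googlebot and True for the other two checks, which matches what the robots.txt is supposed to do.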
If any of the information above seems wrong, please correct me. Thanks.