A robots.txt file is read by search engine robots when they crawl your blog. You can use it to tell them which pages should be indexed. There are a couple of reasons why using a robots.txt file is good for SEO. First, Google and other search engines recommend that you use one, and it is generally a good idea to do what they say. Second, it can help you cut down on duplicate content.
Search engines do not like duplicated content (that is, the same content appearing at two different URLs within a website) because they suspect it might be spam. One minor drawback with WordPress is that it can create a lot of duplicate content. For example, http://blog.wpbizguru.com/category/tutorials points to exactly the same content as http://blog.wpbizguru.com/tutorials. Also, the same content is repeated...
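A minimal robots.txt along these lines might look like the following sketch. It blocks crawlers from the duplicate category archives while leaving the rest of the site crawlable; the exact paths to disallow are illustrative and depend on your own permalink structure, so check which archive URLs your WordPress install actually generates before copying this:

```
# Apply these rules to all crawlers
User-agent: *
# Block the /category/ archives, which duplicate the post URLs
Disallow: /category/
# Everything else remains crawlable by default
```

The file must live at the root of the domain (e.g. http://blog.wpbizguru.com/robots.txt) for crawlers to find it. Note that Disallow only discourages crawling, not indexing, so it is a complement to, not a replacement for, canonical URLs.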