Multiple Robots.txt Files for Single Domain

A HighRankings Forum thread asks why some people use more than one robots.txt file to control and instruct search spiders on how to crawl and access their content. That is a good question. Typically, spiders will only obey the robots.txt file found at the root level. So technically, if you place a robots.txt file in a subdirectory, the search engine will likely ignore it. I do not believe the same applies to subdomains, since each subdomain has its own root level.
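To make that scoping concrete, here is a minimal sketch using Python's standard urllib.robotparser; the hostnames and paths are hypothetical, and the point is simply that each host resolves its own robots.txt at its own root:

    from urllib.robotparser import RobotFileParser

    # Crawlers fetch robots.txt per host: www.example.com and
    # blog.example.com each serve their own file at their own root.
    for host in ("www.example.com", "blog.example.com"):
        rp = RobotFileParser()
        rp.set_url(f"https://{host}/robots.txt")
        rp.read()  # fetch and parse the file over HTTP
        # Ask whether any user agent may crawl a sample URL on this host
        print(host, rp.can_fetch("*", f"https://{host}/private/page.html"))

A file uploaded to a subdirectory such as https://www.example.com/private/robots.txt never enters this lookup, which is why spiders ignore it.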

HighRankings administrator Randy said:

I love what Ron Carnell added:

I believe Google often uses individual sitemaps per subdomain to control their content.
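For illustration, each subdomain's robots.txt can point to that subdomain's own sitemap through the standard Sitemap directive; the hostname and paths below are hypothetical:

    # https://blog.example.com/robots.txt
    User-agent: *
    Disallow: /drafts/
    Sitemap: https://blog.example.com/sitemap.xml

The main domain's robots.txt would carry its own Sitemap line, so each host's crawl rules and sitemap stay self-contained.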

Forum discussion at HighRankings Forum.
