A Google Blogoscoped Forums thread notes that Google recently disallowed the URL google.com/compressiontest in its robots.txt file. The question is, what is this all about?
Google does host a project at code.google.com/p/compressiontest/, described as “a small test framework that performs benchmark comparisons between a variety of open source compression libraries.” But how does this impact searchers or SEOs?
On February 17th, a few searchers noticed a browser bug associated with this file. There are threads at Google Web Search Help and Google Custom Search Help asking about it. Here they are:
A few months later, Google blocked this file in its robots.txt. For what reason? Just cleanup?
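For those curious what such a disallow rule actually does, here is a minimal sketch using Python's `urllib.robotparser`; the robots.txt snippet below is illustrative, not Google's actual file:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content mirroring the rule discussed above
robots_txt = """User-agent: *
Disallow: /compressiontest
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Crawlers that honor robots.txt will skip the disallowed path
print(parser.can_fetch("*", "https://www.google.com/compressiontest"))

# Other paths remain crawlable
print(parser.can_fetch("*", "https://www.google.com/search"))
```

In other words, the rule simply tells well-behaved crawlers to stay away from that URL; it says nothing about why Google added it.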
Ionut Alex. Chitu from the Google Operating System blog said in the thread:
Is Google just that obsessed with speed? Does it really mean nothing?
Forum discussion at Google Blogoscoped Forums.