Almost all SEOs know that a busy or slow server will result in GoogleBot slowing how it crawls your web site, and we also know that extremely slow sites and pages can be negatively impacted when it comes to ranking well in the Google search results.
But how slow is too slow?
Google’s John Mueller called out a specific load time as being too slow for GoogleBot to crawl a site at its normal rate. Responding in a Google Webmaster Help thread, he specifically called out “over 2 seconds” to load a single URL as the reason GoogleBot is “severely limiting the number of URLs” it will crawl on that site.
Note that John is not saying anything about the PageSpeed algorithm here, just about how Google crawls the site.
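If you want a rough sense of where your own pages fall relative to that 2-second figure, here is a minimal sketch in Python that times a single URL’s response. This is just an illustration, not how GoogleBot measures anything: the URL is a placeholder, it uses the third-party requests library, and it measures total response time from one location.

```python
import time
import requests  # third-party: pip install requests

# Placeholder URL -- substitute a page from your own site.
URL = "https://example.com/"
THRESHOLD = 2.0  # seconds; the load time Mueller flagged as too slow

start = time.monotonic()
response = requests.get(URL, timeout=10)
elapsed = time.monotonic() - start

print(f"{URL} returned {response.status_code} in {elapsed:.2f}s")
if elapsed > THRESHOLD:
    print("Over 2 seconds -- slow enough that GoogleBot may limit crawling.")
```

A single measurement like this is noisy; if you are investigating crawl rate, run it repeatedly and compare against the crawl stats report in Google Search Console.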
It is rare for Google to share specific numbers like this.
Forum discussion at Google Webmaster Help.