A Google Webmaster Help thread has a webmaster complaining that his site isn’t being crawled by Google and isn’t showing up in the search results. The reason: his site can’t handle Googlebot crawling it.
The site is fairly static and basic, but it sits on a cheap or free host that can’t handle much traffic. Googlebot can’t crawl it without taking the site down, so it stays away until it can get through without negatively impacting the site.
The interesting thing is that when this is the case, the Fetch As Googlebot feature will fail as well. So you can actually use Fetch as Googlebot to help diagnose a major site speed issue.
John Mueller from Google said:
Forum discussion at Google Webmaster Help.