There is an interesting conversation on LinkedIn around what happens when a robots.txt file serves a 503 for two months while the rest of the site is available. Gary Illyes from Google wrote that when the other pages on the site are reachable and available, that makes a big difference, but when those other pages are not, then “you’re out of luck.”
Note, he specified the home page and other “important” pages as needing to be available…
The thread was posted by Carlos Sánchez Donate on LinkedIn, where he asked, “what would happened if the robots.txt is 503 for 2 months and the rest of the site is available?”
Gary Illyes from Google responded in the thread.
The question then became whether Google needs to add more clarification to its documentation on how robots.txt 5xx errors are handled in this situation.
This is a super interesting thread, so I recommend you scan through it if the topic interests you. Of course, most of you would say, just fix the 5xx errors and don’t worry about this. But many SEOs love to wonder about these what-if situations.
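If you want to quickly confirm what status code your own robots.txt is returning before worrying about any of this, a single request is enough. Here is a minimal Python sketch (not from the thread); the example.com URL is a placeholder for your own domain:

```python
# Minimal sketch: check what HTTP status code a robots.txt file returns.
# The URL is a placeholder; swap in your own domain.
import urllib.request
import urllib.error

ROBOTS_URL = "https://www.example.com/robots.txt"

try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
        print(f"{ROBOTS_URL} returned {resp.status}")
except urllib.error.HTTPError as err:
    # 4xx/5xx responses raise HTTPError; a 503 here is the scenario the thread debates
    print(f"{ROBOTS_URL} returned {err.code}")
except urllib.error.URLError as err:
    print(f"Could not reach {ROBOTS_URL}: {err.reason}")
```

A 200 with your expected rules means you are fine; a persistent 503 is the case being discussed above.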
Here is a screenshot of this conversation, but again, there is a lot more there, so check it out:
Forum discussion at LinkedIn.