Believe it or not, I am not a huge fan of placing robots.txt files on sites unless you specifically want to block content or sections from Google or other search engines. It has always felt redundant to tell a search engine it can crawl your site when it will do so anyway unless you tell it not to.
Google’s JohnMu confirmed this in a Google Webmaster Help thread and even recommended that one webmaster remove their robots.txt file “completely.”
John said:
I know many SEOs feel it is mandatory to have a robots.txt file and just have it say User-agent: * Allow: /. Why bother, when they will eat up your content anyway?
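For reference, that allow-all robots.txt amounts to nothing more than these two lines, which, as far as crawlers are concerned, is the same as having no file at all:

User-agent: *
Allow: /

The older convention of an empty Disallow: line (User-agent: * followed by Disallow: with nothing after it) means the same thing: crawl everything.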
Anyway, it is nice to see a Googler confirming this, at least in this case.
Forum discussion at Google Webmaster Help.