There is a short but informative post by Google’s John Mueller in the Google Webmaster Help forum on the topic of blocking Google from seeing content on a specific page.
John explains that there are several ways to block Google from indexing a page, and that the right technique depends on how long you want the content blocked. Webmasters are often confused about how to handle this, so here is John Mueller's advice:
If you just don’t want the content indexed (maybe you’re trying something out on the page), then using the robots.txt is a good approach
If it’s very temporary, maybe even a 503 HTTP response code
If you want the page actively removed from search, then I’d definitely recommend using a noindex over the robots.txt
If you’re using a staging server and don’t want that indexed, limiting access to just the testers’ IP address ranges or using server-side authentication would be good approaches too.
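To make the last three options more concrete, here is a rough sketch of what each approach looks like at the HTTP level. This is my own illustration, not anything from John's post; the function name and scenario labels are hypothetical, and a real server would set these statuses and headers in its own framework or configuration.

```python
# Illustrative only: maps each blocking scenario John describes to the
# HTTP-level response a crawler like Googlebot would see.
# The function and scenario names are hypothetical.

def crawler_response(scenario):
    """Return (status_code, headers) for a given blocking approach."""
    if scenario == "temporary_outage":
        # A 503 tells Googlebot the page is temporarily unavailable;
        # Retry-After (seconds) hints when it should come back.
        return 503, {"Retry-After": "3600"}
    if scenario == "remove_from_search":
        # noindex via the X-Robots-Tag header (or an equivalent meta tag)
        # actively removes the page from search. Note the page must stay
        # crawlable -- i.e. NOT blocked in robots.txt -- or Googlebot
        # never sees the noindex.
        return 200, {"X-Robots-Tag": "noindex"}
    if scenario == "staging_server":
        # Server-side authentication: unauthenticated requests,
        # including Googlebot's, are challenged and get no content.
        return 401, {"WWW-Authenticate": 'Basic realm="staging"'}
    return 200, {}

status, headers = crawler_response("remove_from_search")
print(status, headers["X-Robots-Tag"])
```

The robots.txt approach from the first bullet is simpler still: a `Disallow: /some-page` line under a `User-agent: *` record stops the page from being crawled, though, as John notes, that is a different thing from actively removing it from search.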
John added that you shouldn't flip-flop back and forth between these techniques on a single page, because it will confuse Googlebot.
Forum discussion at Google Webmaster Help.