Google posted a public service announcement saying you should disallow Googlebot from crawling your action URLs. Gary Illyes from Google posted on LinkedIn, “You should really disallow crawling of your action URLs. Crawlers will not buy that organic non-GMO scented candle, nor do they care for a wishlist.”
I mean, this is not new advice. Why let a spider crawl pages where it cannot really take any action? Googlebot cannot make purchases, cannot sign up for your newsletter, and so on.
How should you block Googlebot? Gary wrote, “You should probably add a disallow rule for them in your robots.txt file. Converting them to HTTP POST method also works, though many crawlers can and will make POST requests, so keep that in mind.”
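Gary did not share a sample file, but a disallow rule for action URLs might look something like this, assuming hypothetical paths like /cart and /wishlist (adjust to whatever your site actually uses):

```
# robots.txt - block crawlers from action URLs that serve no search purpose
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /wishlist
Disallow: /newsletter/signup
```

A `User-agent: *` rule covers Googlebot along with other well-behaved crawlers; you could scope it to `User-agent: Googlebot` if you only want to block Google.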
Now, a few years ago, we reported that Googlebot can add products to your cart to verify your pricing is correct. It seems to be part of the merchant shopping experience score feature – so I’d be a tad careful with all of this.
Forum discussion at LinkedIn.
Note: This was pre-written and scheduled to be posted today, I am currently offline for Shavuot.