PSA: Bing Says They Ignore Default Robots Directives If There Is A Bingbot Section


Frédéric Dubut from Bing’s search team said on Twitter that if you create a specific robots.txt section for Bingbot, Bing’s crawler, then Bing will only look at that section and ignore the default directives. So if you do that, make sure to copy every directive you want Bing to comply with from the default section into the Bingbot section.
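To see what this means in practice, here is a minimal sketch using Python’s standard-library `urllib.robotparser`, which applies the same rule: a matching user-agent section wins outright, and the `*` defaults are ignored for that bot. The domain and paths are made-up examples, not from the tweet.

```python
import urllib.robotparser

# Hypothetical robots.txt: a default section plus a Bingbot-specific one.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: Bingbot
Disallow: /drafts/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Bingbot obeys only its own section, so /private/ is NOT blocked for it:
print(rp.can_fetch("Bingbot", "http://example.com/private/page"))   # True
print(rp.can_fetch("Bingbot", "http://example.com/drafts/page"))    # False

# A bot with no dedicated section falls back to the * defaults:
print(rp.can_fetch("OtherBot", "http://example.com/private/page"))  # False
```

If you want Bingbot to also respect `Disallow: /private/`, you would have to repeat that line inside the `User-agent: Bingbot` section.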

He said:

Google works a bit differently and, when not told otherwise, goes with the strictest directive it can find:

Forum discussion at Twitter.

Update: John Mueller said it works the same way for Google; he said on Reddit, “This is standard for any user agent section in the robots.txt :)”
