I know computers and algorithms can be funny, but when you search for [wank] in Google Images with the SafeSearch filter set to moderate, little to no pornographic or nude content comes up. Yet when you raise the SafeSearch filter to the strict setting, plenty of nude and pornographic content appears.
If the more lenient SafeSearch setting is blocking this content, what is wrong with the stricter version of the filter?
Here is a picture of the current results under moderate SafeSearch:
Here is a picture of the same results after changing the filter from moderate to strict:
Of course, I blurred out the nude images. Keep in mind, there are plenty more as you scroll down.
Like I said, I assume this is just a weird quirk in the Google Images SafeSearch algorithm and Google will fix it. But it is weird nevertheless.
A system administrator who runs 700 computers said he noticed this while doing some tests, and he posted the issue in the Google Web Search Help forums.
Here are some similar stories we have covered in the past:
Porn on Google Image Search with Strict Search On
Google Search By Image Thinks I’m A Porn Star
Google Recommends Reporting Mass Porn In Forums
Google’s Porn Issue With Children Related Keywords
Pirelli Tires? Nope, Pirelli Porn In Google
Google Background Image Of Naked Women
Very Explicit Porn Hits Google Universal Search
Forum discussion at Google Web Search Help.