Google Suggest was supposed to help users type a query by providing useful suggestions. Unfortunately, some of those suggestions are offensive, and Google had to filter suggestions related to pornography, violence, and hate speech.
Google's over-protective algorithms now filter all the suggestions that include "is evil", "I hate", "[ethnic group] are" (for example, "chinese are"). Google Suggest also filters "Smells Like Teen Spirit", the name of a popular Nirvana song.
"Queries in autocomplete are algorithmically determined based on a number of objective factors (including search term popularity) without manual intervention," explains Google. Google Suggest's filtering flaws are more obvious, now that Google Instant previews the results without having to press Enter. If you type [google is e], Google no longer previews the results and suggests to "press Enter to search".
Google Blacklist (not safe for work and potentially offensive) lists some of the rules used by Google to censor the list of suggestions. "Like everything these days, great care must be taken to ensure that as few people as possible are offended by anything. Google Instant is no exception. Somewhere within Google there exists a master list of "bad words" and evil concepts that Google Instant is programmed to not act upon, lest someone see something offensive in the instant results... even if that's exactly what they typed into the search bar."
{ via waxy.org }
It would be good if such filtering could be disabled...
The filtering is applied even if I have SafeSearch turned off.
Oh come on, if you're typing those words in, you *know* what you're searching for.
How quickly we've come around from being skeptical of Google Instant to demanding it.
Google is, however, not very good at it! Putting in 'Blair is evi' comes up with results and even suggests: Did you mean: 'Blair is evil'. If you put in 'Blair is evil', it hides the results until you press Enter...
Their methodology for filtering offensive suggestions seems off to me.
Instead of having it filter suggestions based on offensive phrases in the query itself, it should filter based on the aggregate "OffensiveRank" of the pages shown when a given suggestion is searched for. They must have something like this already in place to support SafeSearch.
As for disabling such filtering when a searcher has SafeSearch off, I'm on the fence about it. On the one hand, if I've disabled it I ought to be able to find the naughty stuff. On the other, I don't really want nasty/racist/etc search results and suggestions popping up when I'm doing a search for something more benign. And some of the naughty terms are also the most searched for on Google.
A nice compromise might be to just downrank offensive suggestions when SafeSearch is off.
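For what it's worth, here is a minimal sketch of the approach this comment describes, assuming a 0.0-1.0 per-page offensiveness score and placeholder lookup functions; "OffensiveRank" is the commenter's hypothetical metric, not a real Google signal.

```python
# Sketch: filter suggestions by the aggregate "OffensiveRank" of their result
# pages rather than by phrases in the query. fetch_top_results() and
# page_offensiveness() are placeholders the caller must supply; the names,
# the 0.0-1.0 scale, and the threshold are assumptions for illustration.

def offensive_rank(suggestion, fetch_top_results, page_offensiveness, k=10):
    """Average offensiveness (0.0-1.0) of the top-k result pages for a suggestion."""
    pages = fetch_top_results(suggestion, k)
    if not pages:
        return 0.0
    return sum(page_offensiveness(page) for page in pages) / len(pages)

def rank_suggestions(suggestions, fetch_top_results, page_offensiveness,
                     safesearch_on, threshold=0.5):
    """Drop offensive suggestions under SafeSearch, merely downrank them otherwise."""
    scored = [(s, offensive_rank(s, fetch_top_results, page_offensiveness))
              for s in suggestions]
    if safesearch_on:
        # SafeSearch on: remove offensive suggestions entirely.
        return [s for s, score in scored if score < threshold]
    # SafeSearch off: keep the original relevance order, but move offensive
    # suggestions to the bottom (a stable sort on a boolean key).
    return [s for s, score in sorted(scored, key=lambda item: item[1] >= threshold)]
```

Downranking instead of deleting keeps explicitly searched-for terms reachable while benign prefixes stay clean, which is the compromise suggested above.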
"Oh come on, if you're typing those words in, you *know* what you're searching for."
I don't think you understand how Google Suggest/Instant works. It starts displaying suggestions and search results before you've finished typing the word. So if the word starts with the same letters as offensive queries, those offensive suggestions and results will be shown.
For example, if they didn't filter the suggestions and you were searching for Portugal, by the time you typed "por" you would probably get some undesirable results.
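A tiny sketch of prefix-based suggestion lookup illustrates the point: every popular completion sharing the typed prefix gets surfaced, so "por" would match "portugal" alongside any popular explicit query in an unfiltered index. The popularity table below is made up for illustration.

```python
# Minimal sketch of prefix-based suggestion lookup, to show why a partial query
# like "por" matches every popular completion sharing that prefix.
# The popularity numbers are invented for this example.

QUERY_POPULARITY = {
    "portugal": 95,
    "porsche 911": 70,
    "portland weather": 60,
    # ...an unfiltered index would also contain popular explicit queries here.
}

def suggest(prefix, n=5):
    """Return the n most popular completions that start with the typed prefix."""
    prefix = prefix.lower()
    matches = [(q, pop) for q, pop in QUERY_POPULARITY.items() if q.startswith(prefix)]
    matches.sort(key=lambda item: item[1], reverse=True)
    return [q for q, _ in matches[:n]]

print(suggest("por"))  # every entry beginning with "por", most popular first
```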