Google would like to remind you that the autocomplete entries you see as you type a search are predictions, not suggestions, and they are based on real searches that are happening. They surface the most common or trending queries relevant to the characters you type, where you’re typing from, and your previous searches. But this also means some “inappropriate” predictions make it into the search box, so Google is now rolling out an expanded policy covering these kinds of results.
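That ranking behavior, matching the typed prefix against popular queries and filtering out anything disallowed, can be sketched in a few lines of Python. This is a toy illustration only: the search log, popularity counts, blocklist, and function names below are all invented, and nothing here reflects Google’s actual implementation.

```python
# Toy sketch of a prefix-based autocomplete ranker.
# Illustrative only; not Google's real system or data.
from typing import List, Tuple

# (query, popularity) pairs standing in for aggregated real searches
SEARCH_LOG: List[Tuple[str, int]] = [
    ("weather today", 950),
    ("weather tomorrow", 610),
    ("weather badword", 500),   # would trip the toy policy filter
    ("weather radar", 430),
    ("web browser", 300),
]

# Hypothetical policy blocklist standing in for removals
BLOCKED_TERMS = {"badword"}

def predictions(prefix: str, limit: int = 3) -> List[str]:
    """Return the most popular logged queries matching the typed
    prefix, skipping any that contain a blocked term."""
    matches = [
        (q, n) for q, n in SEARCH_LOG
        if q.startswith(prefix.lower())
        and not any(t in q for t in BLOCKED_TERMS)
    ]
    # Rank surviving matches by popularity, highest first
    matches.sort(key=lambda qn: qn[1], reverse=True)
    return [q for q, _ in matches[:limit]]
```

A real system would also weight by locale, recency, and the user’s own history; this sketch only shows the prefix-plus-popularity core and the policy filter sitting in front of it.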

Google already has an autocomplete policy under which it removes predictions that are sexually explicit (unless connected to medical, scientific, or sex education topics), hateful toward groups or individuals based on race, religion, sexual orientation, etc., violent, or dangerous or related to harmful activity. It also removes predictions it determines to be spam or related to piracy, and basically anything that would “shock users with unexpected or unwanted predictions”. Of course, plenty still get past these filters, given the millions of people who search through Google every day.

Google is now trying to improve how it flags inappropriate predictions. Last year it launched a feedback tool, and based on what it has learned, it is expanding the criteria for policy removals. The new policy also covers predictions perceived as hateful or prejudiced toward individuals or groups, even if the targets don’t belong to a specific demographic. Predictions with a clear “attribution of source” will be retained: for example, if a song lyric or book title could be construed as sensitive, appending “lyrics” or “book” keeps the prediction in place.

As always, Google needs feedback from users to make this work. There is already a “Report inappropriate predictions” link just under the search box on desktop; on mobile, you can long-press a prediction to get the reporting option. Here’s to better and more “appropriate” predictions on Search.

SOURCE: Google