Search “is never a solved problem and there are always new challenges that we face as the web and the world change”: these are the words of Pandu Nayak, Google Fellow and Vice President of Search, which put the spotlight back on a very delicate issue, that of harassment and defamation gaining online visibility also through Google’s results. In response, the search engine has decided to crack down, using its algorithms no longer just to remove such content and penalize predatory sites that try to exploit the situation financially, but to protect users more decisively.
The fight against content that damages people’s reputation
Nayak’s post, published on the company’s official blog, follows a series of New York Times articles on the phenomenon of online slander, which highlighted a serious problem in how cases of harassment, defamation and attacks on people’s reputations (well known or not) are handled when they appear on sites that rank in the search engine’s results.
Recognizing the seriousness of the problem, Google is planning to change its algorithm to better identify and penalize websites that publish unverified or slanderous claims about other people, building real businesses around exploiting the victims of this situation.
The authoritative newspaper has described how this mechanism works, a “vicious circle” that has been running for years: the websites publish “resounding unverified complaints about alleged cheaters, sexual predators, idlers and swindlers”; people slander their enemies; the anonymous posts appear high in Google’s results for the victims’ names; and then the websites that set the process in motion ask victims for “thousands of dollars to remove the posts”.
Google’s commitment to fighting these harassment cases
Nayak first recalls the work done by Google “in the last two decades of Search development”, aimed at “improving our ability to deliver the highest quality results for the billions of queries we see every day”, continuing to constantly update its systems to make them work better for users.
In particular, one area of this commitment has been the effort to “balance the optimization of access to information with the responsibility to protect people from online harassment”.
Although Google’s ranking systems are designed “to bring out high-quality results for as many queries as possible”, in fact, some types of queries are “more susceptible to bad actors and require specialized solutions”. One example is websites that engage in exploitative removal practices, such as those described by the NYT, demanding payment to take content down: since 2018, Google has had “a policy that allows people to request the removal of pages with information about them from our results”.
The first effect of such a request is the removal of those pages from Google Search, but the algorithm also uses it as a demotion signal for the entire site, so that sites guilty of exploitative practices are placed lower in the results. It is, according to Nayak, an “industry-leading solution”, effective in helping people who are victims of harassment by these sites.
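To make the idea concrete, here is a minimal, purely illustrative sketch of how a removal request could be turned into a site-level demotion signal. It is not Google’s actual implementation: every name, weight and data structure in it (such as DEMOTION_WEIGHT or removal_requests_per_site) is a hypothetical assumption.

```python
from dataclasses import dataclass

# Hypothetical demotion weight: how strongly approved removal requests
# against a site pull down every page of that site. Purely illustrative.
DEMOTION_WEIGHT = 0.15


@dataclass
class Page:
    url: str
    site: str          # e.g. "badgirlreport.date"
    base_score: float  # relevance score from the core ranking (assumed given)


def demoted_score(page: Page, removal_requests_per_site: dict) -> float:
    """Ranking score where each approved removal request against the page's
    site lowers the score of *all* pages on that site."""
    requests = removal_requests_per_site.get(page.site, 0)
    # Simple multiplicative penalty; a real system would be far more nuanced.
    penalty = 1.0 / (1.0 + DEMOTION_WEIGHT * requests)
    return page.base_score * penalty


def rank(pages: list, removal_requests_per_site: dict) -> list:
    """Order results by the demoted score, so exploitative sites sink."""
    return sorted(
        pages,
        key=lambda p: demoted_score(p, removal_requests_per_site),
        reverse=True,
    )


# Usage example with made-up data: a site with many removal requests drops
# below a clean site even though its raw relevance score was higher.
results = rank(
    [Page("https://badgirlreport.date/post/123", "badgirlreport.date", 0.92),
     Page("https://example-news.com/profile", "example-news.com", 0.85)],
    removal_requests_per_site={"badgirlreport.date": 12},
)
print([p.url for p in results])
```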
Google’s algorithm against predatory sites
However, this intervention is not enough to solve the problem, as the NY Times articles show: even when pushed further down, these slander-peddling sites often continue to appear in Search, so people unjustly accused of being drug addicts or pedophiles still see their reputation damaged and are often forced to give in to the “blackmail” and pay for the removal of such content.
In addition, the newspaper also brought to light cases of repeated harassment that continued even after the content had been removed as requested.
Admitting that Google was unaware of such situations, Nayak nonetheless announced that the company is working to overcome “some limits of our approach”, implementing changes to further protect “known victims”.
The protection of known victims
Now, Search’s VP explains, “when someone has requested a removal from a site with predatory practices, we will automatically apply ranking protections to prevent content from other similar sites of low quality from being displayed in search results for people’s names”, and Google is “also looking to further expand these protections as part of our ongoing work in this space”.
It is an approach inspired by the one already adopted for victims of non-consensual explicit content, commonly known as revenge porn. Although no solution is perfect, says Nayak, “our assessments show that these changes significantly improve the quality of our results”.
In particular, the system will make it harder for sites to “get traction on Google through one of their favorite methods: copying and republishing defamatory content from other sites”, the New York Times analyzes, summarizing: “The company plans to change its search algorithm to prevent websites, operating under domains such as Badgirlreport.date and Predatorsalert.us, from appearing in the list of results when someone searches for a person’s name”.
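As a purely illustrative sketch (again, not Google’s real system), the protection described above can be imagined as a filter keyed on the person’s name: once a victim has obtained a removal from one predatory site, results from other sites flagged as similar, low-quality actors are suppressed for queries on that name. All the names, lists and heuristics below are assumptions made up for the example.

```python
# Hypothetical data, not real signals: people who already obtained a removal
# from a predatory site, and sites classified as similar low-quality actors.
PROTECTED_NAMES = {"jane doe", "john smith"}
LOW_QUALITY_LOOKALIKE_SITES = {"cheaterarchives.com", "predatorsalert.us",
                               "badgirlreport.date"}


def looks_like_name_query(query: str) -> bool:
    """Crude stand-in for name detection: two alphabetic words, no operators."""
    words = query.strip().split()
    return len(words) == 2 and all(w.isalpha() for w in words)


def apply_known_victim_protection(query: str, results: list) -> list:
    """Drop results from flagged look-alike sites when the query is the name
    of someone who already requested a removal from a predatory site."""
    if not (looks_like_name_query(query) and query.lower() in PROTECTED_NAMES):
        return results
    return [r for r in results if r["site"] not in LOW_QUALITY_LOOKALIKE_SITES]


# Usage example with made-up results for a protected name.
filtered = apply_known_victim_protection(
    "Jane Doe",
    [{"site": "cheaterarchives.com",
      "url": "https://cheaterarchives.com/jane-doe"},
     {"site": "example-news.com",
      "url": "https://example-news.com/jane-doe-interview"}],
)
print([r["url"] for r in filtered])  # only the non-flagged site remains
```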
Effects of the new approach
Google has already started testing the changes, and a side-by-side comparison of new and old search results shows that the effects are visible.
Chris Silver Smith, an expert who has worked for years in the field of reputation management, has shared examples of sites such as Ripoff Report, Pissed Consumer and Complaints Board losing visibility in Google Search.
The Times itself had “previously compiled a list of 47,000 people who were cited on libel sites”, the American newspaper’s article continues, and in searches for a handful of people whose results had previously been flooded by smear posts, the changes made by Google were already detectable. In particular, for some “posts had disappeared from their first page of results and their image results”, while for others “posts were mostly missing – except one from a new defamation site, Cheaterarchives.com”.
This site “can prove the limits of the new Google protections: being fairly new, it is unlikely to have generated complaints from victims”, and it also “does not explicitly advertise the removal of posts as a service, making it potentially more difficult for victims to have it removed from their results”.
An algorithm change not to be feared
This time, then, Google is not announcing an algorithm change to be feared in terms of ranking, because it only affects harmful and dangerous sites that publish defamatory content for the sole purpose of financial exploitation; on the contrary, it will be an important help for those who work in online reputation management, because it will support their work with clients struggling with web reputation problems in Google Search.
The new, more robust protective measures will prevent reputation-damaging content from emerging for searches of people’s names, and they follow a general, systemic approach (rather than addressing the issue case by case as new sites appear); as Nayak says, “we do not fix individual queries, as they are often a symptom of a class of problems affecting many different queries”, but rather address a specific class of queries, those related to names.
This means that we should see fewer predatory or exploitative websites emerge in Google Search results for name queries. As in any other area of Search, such as spam, it is easy to predict that some sites will look for ways to circumvent the algorithms, prompting Google to work on alternative solutions with new and improved preventive search algorithms.
This awareness does not worry the American giant, which, as Nayak concludes, will continue to apply the same approach it has used from the start: taking “examples of queries where we are not doing our best in delivering high quality results and looking for ways to make improvements to our algorithms”. This ability to tackle problems “continues to drive the industry and we have implemented advanced technology, tools and quality signals over the past two decades, making Search work better every day” thanks to feedback (including criticism from the press).