Why Google is increasingly sorting results

Posted on August 20, 2022


Who is the King of the United States? Until recently, the first English-language search results answered "Barack Obama". While some Internet users amuse themselves by hijacking the results Google puts forward, the search engine has decided to remove these eccentric answers. What lies behind this phenomenon?

Manipulation of search results: what is it?

The development of Web 2.0, also known as the participatory web, has profoundly changed not only how we use the internet but also our relationship to information. Unfortunately, some ill-intentioned people manipulate search results and navigation, for example by creating misleading articles built around highly searched keywords, or by using other methods likely to deceive Internet users.

This method has a name: astroturfing. The technique can take many forms, from simply concealing one's affiliation with a company while claiming to provide independent testimony, to more complex schemes using software that multiplies fake identities on the Internet. Astroturfing was used, for example, in December 2012 in South Korea as part of a defamation campaign aimed at sidelining a candidate, involving some 24 million tweets.

Manipulating results is, in theory, prohibited

On November 20, 2018, France adopted a law against the manipulation of information, which aims to better protect democracy against various forms of intentional dissemination of false news.

In commercial matters, publishing a fake review is certainly not comparable to attempting to manipulate public policy, but the practice is nonetheless illegal.

As such, posting a fake review is punishable by a fine of up to 300,000 euros and two years' imprisonment. The company itself risks a fine of up to 10% of its turnover.

There is no neutral result in a search engine

Is there a search engine that wouldn’t censor its content?

Many search engines claim to offer unbiased and unfiltered search results. These include StartPage, Swisscows, Qwant, Infinity Search, Gibiru, BoardReader, Searx and DuckDuckGo.

Unfortunately, it is very difficult to verify whether this is really true, because there are always technological biases and strategies that ultimately influence search results. For example, since the Russian invasion of Ukraine, DuckDuckGo has downgraded sites believed to be associated with Russian disinformation. Some Internet users liken this change to censorship.

Algorithmic transparency: not such a good idea?

Finally, as algorithms become increasingly involved in decisions that affect our lives, those decisions can be biased.

This is why some politicians call for algorithmic transparency. It is in this context that a bill in the United States, the "Algorithmic Accountability Act of 2022", aims to force the state and large companies to submit their algorithms to a battery of checks to detect possible biases.

However, beware of overly radical solutions, because such transparency can also backfire. First, transparent algorithms can be copied, which runs counter to the principles of ownership and innovation; second, it would also become even easier to manipulate their results.

Education: an essential solution in the fight against manipulation

As researcher Aurélie Jean points out, it may never be possible to completely eradicate biases in algorithms (or their consequences).

On the other hand, we can go further in the field of explainability, which aims to justify as precisely as possible a result produced by a model. It is also a matter of behavior and collective responsibility. It should be remembered that freedom of expression is a fundamental freedom in our society, and in this respect any restrictions imposed on it must be justified, appropriate and proportionate.

In a democracy, it is up to neither the state nor search engines to decide whether information is true or false.

A virtuous circle of vice

As journalist Thomas Huchon, interviewed by Radio France, notes:

“It is these fake news that make us react the most”.

Thanks to the algorithms that push them ever further forward, "they are caught in a kind of virtuous circle of vice".

We should not expect everything from the platforms. On the other hand, access to a judge should be made easier for anyone seeking to stop the dissemination of false information. From the point of view of freedom of expression, the most lasting and sustainable solution for spotting manipulation and false information involves educating young and old alike. This means taking a step back, thinking critically and analyzing information objectively, rather than sharing everything that can be read on the Internet. We are not algorithms!