
New EU Lawsuit Claims Google Failed To Forget ‘Sensitive’ Information, Such As ‘Political Affiliation’

For years, we’ve pointed out that the “Right to be Forgotten” (RTBF) in Europe is a dangerous tool that has been, and will continue to be, abused to censor freedom of expression while hiding behind claims of protecting “privacy.” While the concept has been around for a while, it really took off online with an EU Court of Justice (CJEU) ruling from three years ago, which held that Google’s search results index counted as a data repository on a person, and thus an individual could force Google to “delink” certain results from searches on their name. But the court left Google some leeway to decide whether the requests were valid: basically, if the information is no longer relevant for the public to know about the person, then Google should delink it. That is, obviously, a horribly subjective standard, and Google has had to staff up with people to determine whether any requested delinking qualifies.

Part of the problem with all of this is that it creates tremendous liability. Fail to get a delinking request “right” and Google is right back in court, which is exactly where we are today. Google has rejected just under 60% of requests to delink information in Europe, and four individuals in France were so upset by this that they complained their rights were being violated. The French data protection regulator, CNIL, actually agreed with Google that the information shouldn’t be “forgotten.” However, the four have appealed, and the case has been referred up to the CJEU. They are claiming that the information is “sensitive data” and that being “sensitive data” alone is enough to require forgetting, no matter what the “public interest” in that information may be.

As Google has noted in a blog post, there are serious questions here about whether people should be able to hide information from their past that may still be relevant:

The CJEU now has to decide whether “sensitive personal data”—such as the political allegiance of an individual, or a past criminal conviction reported in the press—should always outweigh the public interest.

The tricky thing with this kind of information is that it is often important for people to know and it is frequently reported in newspapers and elsewhere. Requiring automatic delisting from search engines, without any public interest balancing test, risks creating a dangerous loophole. Such a loophole would enable anyone to demand removal of links that should remain up in the public interest, simply by claiming they contain some element of sensitive personal data.

That is an important point, but equally important is how massively damaging this can be for basically any other company, one without Google’s legal team and resources to fight back. If anyone who disagrees with your decision on an RTBF request can take your company to court, repeatedly, for failing to delete something, then most companies are going to default to deleting.

At least with something like the DMCA’s notice-and-takedown provision (which is already widely abused to censor content) there are fairly clear and strict rules about how a takedown notice works and what it requires. With the RTBF, nothing is nearly as clear, and the process invites significant and repeated litigation. As it stands, the system is a recipe for widespread censorship of often important information.

Author: Mike Masnick
