
Content regulation: what legal obligations for the GAFAM?


op-ed by Arthur Messaud

7 November 2018 - Last week, we explained that decentralising the Web offers hope of organising our online exchanges democratically, countering the hegemony of the online attention economy. Indeed, the GAFAM distort our exchanges for economic reasons, promoting hateful, caricatural, violent or even paid statements... to the detriment of others. This must be fixed.

To promote a decentralised alternative, we proposed that libre/open hosting providers (who do not impose a hierarchy of content) should no longer be subject to the same legal obligations as the giants' platforms (which do force their hierarchy of content on us).

These legal obligations, which require ever faster censorship of "manifestly illegal" content notified to hosting providers, are hampering the development of the decentralised Web. We therefore proposed that these hosting providers no longer bear these heavy obligations: only a judge should be able to require them to censor content.

Today's question is not about these hosting providers but about the others: what obligations should apply to the giants?

The power of the Giants

This story is unfortunately all too common: you registered on Facebook, Youtube or Twitter many years ago and, since then, you have made many connections (you have added "friends" on Facebook, you have gathered an attentive audience on Youtube, you are following relevant people on Twitter). Over the years, these platforms have adopted increasingly constraining rules for ranking and hierarchising content.

Facebook and Instagram have begun to censor every image that contradicts their puritan vision of society, to hide media websites they consider contrary to their idea of a "proper debate", and to arrange our "news feeds" to make us more receptive to their advertising. Youtube favours ever more anxiety-provoking videos, to keep us on its website as long as possible, to the detriment of other videos whose audience is increasingly governed by the company alone. Twitter arranges our exchanges to make them ever more heated and captivating, without worrying about the consequences for public debate.

More generally, "a perverted connection between hate speech and advertising" has been created. "People who write offensive or extremist remarks are the 'money makers', because one such remark can provoke fifty or a hundred others. From this perspective, it is valuable for these networks to host and disseminate this kind of speech" ("Strengthening the fight against racism and antisemitism on the Internet", report to the French Prime Minister, p. 36, in French).

However, many of us stay on these platforms. We cannot sever all the links we have built at once. We suffer from these new constraints that we did not choose and cannot really escape. This is what we need to correct.

A new legal status

If we wish to use the law to correct the giants, we first need an objective criterion to identify them. Around this criterion, the law can then set specific obligations forming a "legal status" that fits their situation. This new legal status would sit halfway between hosting provider and editor, less flexible than the first and less strict than the second. The giants' "power of constraint" could be the criterion that defines this new status. This "power" manifests itself when the users of a platform cannot leave it without suffering "negative consequences", which means the platform can impose whatever rules it chooses.

In our previous example, these "negative consequences" were the loss of the human links woven on the platform. To measure the severity of the loss, these links can be assessed according to the size of the platform, its life span and its functioning, for example.
The "negative consequences" taken into account to assess a platform's "power of constraint" must not be limited to this one example. Indeed, this criterion was inspired by the "freedom of consent" principle at the heart of the GDPR. Today, we propose to extend this criterion to questions of freedom of information, with all the flexibility it offers to adapt to different situations.

Once this criterion is established comes the question of the content of the new status (in the following text, we will call "giants" any service holding the "power of constraint" defined above - not only the GAFAM).

There are several ways to prevent the giants from abusing their power and trapping us in their toxic rules, which distort our exchanges for their own gain. In the upcoming debates we will have to identify the best ones. Here are some first leads.

Beforehand, it may help to return to the basics, to the fundamental principle of Net neutrality: if you cannot really choose a service to communicate, this service must not be allowed to prioritise (promote, slow down or censor) the information you publish and receive, unless the law requires it.

Therefore we must force the giants to choose between becoming neutral or giving up their power of constraint (and thus going back to the genuine hosting provider status).

Open up to decentralisation...

In practice, if we do not want to lose the connections we have made on the platforms run by the giants, we have no choice but to keep using them. This can be corrected if the giants become "interoperable" with other services: if they let us communicate with our "Facebook friends" without remaining on Facebook ourselves.

Technically, this interoperability relies on implementing "communication standards": a shared language that allows several services to communicate with each other. For example, the ActivityPub standard defines such a language for "decentralised social networks", giving us concrete hope that a decentralised Web can rise. Moreover, enforcing such standards would finally make effective the "data portability right" created by the GDPR (article 20). Without interoperability between platforms, this right struggles to prove its usefulness.
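
To make this concrete, here is a minimal sketch of how one server locates an account hosted by another service that speaks these standards: a WebFinger lookup (the discovery mechanism used alongside ActivityPub) resolves a handle to an "actor" document that tells other servers where to deliver messages. The handle alice@mamot.fr is hypothetical, and real servers may additionally require authenticated requests.

```python
import json
import urllib.parse
import urllib.request

def resolve_actor(handle: str) -> dict:
    """Resolve a handle like 'alice@mamot.fr' to its ActivityPub
    actor document, using the WebFinger discovery standard (RFC 7033)."""
    user, domain = handle.lstrip("@").split("@")
    # Step 1: ask the account's home server who this handle is.
    query = urllib.parse.urlencode({"resource": f"acct:{user}@{domain}"})
    with urllib.request.urlopen(
            f"https://{domain}/.well-known/webfinger?{query}") as resp:
        webfinger = json.load(resp)
    # Step 2: find the link pointing to the ActivityPub actor document.
    actor_url = next(
        link["href"] for link in webfinger["links"]
        if link.get("rel") == "self"
        and link.get("type") == "application/activity+json")
    # Step 3: fetch the actor document, which lists the account's
    # "inbox" and "outbox" - the addresses other servers talk to.
    request = urllib.request.Request(
        actor_url, headers={"Accept": "application/activity+json"})
    with urllib.request.urlopen(request) as resp:
        return json.load(resp)

actor = resolve_actor("alice@mamot.fr")  # hypothetical account
print(actor["inbox"])  # where other servers deliver activities
```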

We could leave a giant (for example Twitter) for a different service (such as Mamot.fr, the decentralised Mastodon micro-blogging service run by La Quadrature). From this new service, we would still be able to receive and send messages to the people who stayed on the giant (Twitter), without breaking ties with them. The sketch below illustrates what such a cross-service tie looks like under the standard.
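
Under ActivityPub, "following" someone on another server is just a small standardised document delivered to their inbox. This sketch builds such a "Follow" activity; all URLs are hypothetical, and bird.example stands in for a giant that would have adopted the standard. In practice, servers such as Mastodon also sign these deliveries with HTTP Signatures, which this sketch omits.

```python
import json

# A "Follow" activity from the ActivityStreams vocabulary on which
# ActivityPub is built. Our server would POST this document to the
# target actor's inbox (discovered as in the previous sketch).
# All URLs are hypothetical; "bird.example" stands in for a giant
# that has adopted the standard.
follow = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": "https://mamot.fr/activities/0001",
    "type": "Follow",
    "actor": "https://mamot.fr/users/alice",     # our new account
    "object": "https://bird.example/users/bob",  # the friend who stayed
}
print(json.dumps(follow, indent=2))
```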

This way, as soon as a giant gave up their power of constraint, we could freely escape the destructive framework built on the capitalisation of our attention. With the virtuous circle of decentralisation set in motion, their power of constraint might diminish so much that they might, eventually, go back to the more flexible status of genuine hosting provider.

In any case, they would be bound, like any other hosting provider, to clearly display their moderation rules, so that we could use them fully informed. Likewise, this more flexible hosting provider status does not make the obligations regarding personal data protection any lighter: content must never be prioritised based on personal data without the explicit consent of the person concerned.

...or become neutral

In the event that a giant refused to open up to decentralisation (by refusing to implement the standards that make it possible), they should be forbidden from prioritising our exchanges according to their own rules, since we would not be able to escape them.

This prohibition could take many forms. For example, it could be a prohibition in principle monitored by an independent authority (as is the case with the protection of Net neutrality). It could also be the possibility for anyone to take a giant to court if it censored lawful content. Finally, in case of repeated censorship of lawful content, these legal actions could become collective complaints (the same way the GDPR allowed us to act [in French], 12,000 people strong, against the GAFAM to protect our personal data). Let us not forget that, as of today, it is practically impossible to defend ourselves against excessive private censorship.

In summary: if the giants keep trapping us (by refusing to open up to decentralisation), they must become neutral (and neither censor lawful content nor promote some at the expense of others). This absence of prioritisation would be pointless unless it were complete: it must prevent the censorship of nudity on Facebook as well as the promotion of content based on the attention economy or on payment (advertising). It is the same discipline that lies at the heart of Net neutrality, and we must apply it thoroughly. Otherwise none of the abuses created by the attention economy will ever be corrected.

No automated censorship

In any case, the giants, even neutral ones, must not be subject to an obligation to monitor the content they broadcast so as to automatically remove "manifestly illegal" information. Yet this is what can be expected from the Copyright Directive and the new proposal for an EU regulation against the dissemination of terrorist content. These obligations are unacceptable, for at least three reasons.

First, article 15 of the 2000 EU directive that regulates the activity of Internet intermediaries states that "Member States shall not impose a general obligation on providers […] to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity". This protection covers all Internet intermediaries, even those who are not neutral and therefore not subject to the hosting provider status strictly speaking.

Secondly, beyond this directive, the Court of Justice of the European Union (CJEU) has held since 2012 that such obligations violate our rights to privacy and freedom of information as defined by the Charter of Fundamental Rights of the European Union. Specifically, the CJEU states that forcing a platform to "actively monitor almost all the data relating to all of its service users in order to prevent any future infringement of intellectual-property rights" (the case dealt with counterfeiting) "would involve the identification, systematic analysis and processing of information connected with the profiles created on the social network by its users" and "could potentially undermine freedom of information, since that system might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications". Consequently, such an obligation "would not be respecting the requirement that a fair balance be struck between the right to intellectual property, on the one hand, and the freedom to conduct business, the right to protection of personal data and the freedom to receive or impart information, on the other" (SABAM v. Netlog, CJEU, 16 February 2012, C-360/10, points 38, 49, 50, 51).

Finally, demanding that the giants monitor every piece of content in real time would only amplify the already troubling phenomenon of outsourcing content moderation [in French] to swarms of low-wage employees working in stressful environments, in order to "keep up" with machines that are unavoidably flawed. Most of them already suffer from mental trauma as a result of "staring for hours at a computer screen, watching images and videos of torn and shredded bodies, drowned or abused children, not to mention the never ending stream of insults and calls for murder".

Imposing stronger obligations of censorship on the Web giants is only a way to avoid facing the problem.

This response relies on the dangerous and nonsensical idea that technological solutions might fix problems that are first and foremost social and political. Above all, it does not address the root of the problem, only its symptoms (the same lack of ambition displayed in the draft law against the spread of "false news" that we denounced).

The attention economy, now hegemonic on the Internet, is at the heart of the problem governments aim to solve, since it leads to the dissemination of stressful speech and our over-exposure to unwanted interactions. The only way to stem these abuses is to develop open or neutral services, which would allow judges to focus on the most serious offences.

There are many ways to help these alternatives develop. Our current legal proposals are only a beginning, and we invite you to discuss them by contacting us or by coming to our next public meetings :)

