
How Facebook encourages people to "like" genocide

The social network Facebook celebrated its 15th birthday on February 4th. The platform's role in spreading hate speech has been the subject of much debate recently. At the heart of the controversy is Myanmar, where Facebook was used to help drive hundreds of thousands of Rohingya out of their country during a campaign that the United Nations has labelled genocide. Should Facebook be held accountable in the future? Will it be? An investigation by Janet Anderson for Justiceinfo.net.
 
Facebook, an American company worth several hundred billion dollars, was pilloried throughout 2018 as the social network through which false news is spread and amplified, at the risk of altering the outcome of elections or referendums around the world.
Many hashtags express indignation about this; far fewer address the company's mistakes in attracting millions of new users living in fragile states to its network. In Myanmar, for example, some 700,000 Muslim Rohingya were driven from their homes following a brutal and "massive" campaign of hate speech, in what the United Nations has described as genocide.
 
Will Facebook ever really be held accountable for its (in)actions in situations like Myanmar? Emma Irving, a senior lecturer at Leiden University Law School, thinks it highly unlikely. "They have monopolized the public space, they control it, without any accountability," she says. "I think maybe things are changing," considers, for her part, Alexa Koenig, Director of the Human Rights Center (HRC) at the University of California, Berkeley. "I would not be surprised if, in a relatively short time, some regulatory frameworks are put in place to help clarify this undefined world of platforms' duties to the people they serve in different ways."
 

The Myanmar campaign  

But before we know whether these frameworks can lead to accountability for crimes committed, let us look at what happened in Myanmar.
Facebook is, in essence, accused of having been used to promote hate speech. There appears to have been "an orchestrated, concerted campaign under the control of the military to use the platform to misinform the public," says Félim McMahon, Director of Technology at HRC. A journalist and investigator by training, McMahon is currently on leave from his work at the International Criminal Court, studying in particular the link between social networks and war crimes. An extensive Reuters investigation, to which the HRC contributed, describes a context marked by the rapid expansion of smartphones following the return of democracy to Myanmar after 2011. "In 2016, almost half of the population had a mobile phone subscription. (...) Most smartphones purchased had Internet access," the report notes. And with these low-cost offers, the Facebook application went viral.
An investigation by the New York Times revealed exactly how "the propaganda campaign, concealed behind false names and fictitious accounts, went undetected. The campaign, described by five people who, fearing for their safety, requested anonymity, involved hundreds of military personnel who created troll accounts, news pages and pages of well-known Facebook personalities, before flooding them with inflammatory comments and messages scheduled for peak hours."
"We studied hate speech in Myanmar over two years. Our teams gathered the information. Anyone who is willing to open their eyes can see that a great deal of hostility was channelled through the platform," confirms McMahon.
 

The limits of the algorithm

That Facebook was overwhelmed is obvious. The company has tried to moderate Burmese content, and offensive messages have been removed. But these efforts stumbled both on the sheer volume of material and on a lack of deep cultural knowledge. One example, provided in the Reuters report, concerns a racial insult, "kalar", which can be a highly pejorative term for Muslims but can also have a more innocent meaning: chickpea. Banning the word on Facebook, in itself, makes no sense. Koenig is concerned about the tendency to try to solve the problem through a purely technological and automated response. She acknowledges that the sheer "scale of content shared on platforms" calls for a technological approach. But as long as social networks do not carry out "more sociological analysis", automated algorithms "will continue to be a rudimentary tool for dealing with very sensitive subjects".
 
Finally, in August 2018, Facebook closed 18 accounts and 52 pages linked to Burmese officials. The company announced that it had "found evidence that these individuals and organizations have committed or authorized serious human rights violations in the country".
Is this a serious response to an alleged genocide?
 

A very theoretical responsibility  

"The Facebook press release about the closure of these accounts indicates to me that they at least want the world to believe that they're taking the problem seriously," observes Irving, who also studies digital data and responsibility for mass atrocities. Koenig is more positive: "I really think they want to find the solution. I think we're seeing a movement right now that comes from the highest levels of management at all of these platforms and is trying to figure out how to translate that into practice in the digital world."
 
The company also commissioned an independent human rights report based, it says, on the United Nations Guiding Principles on Business and Human Rights, which it made public in November 2018. The report reprimands Facebook for failing to prevent its platform from being used to "foment division and incite offline violence" in Myanmar, as Facebook executive Alex Warofka acknowledged. The report presents the extraordinary image of a company that apparently had no awareness, at the time, of its own potential to cause harm. It shows that Facebook management did little to verify the facts on the ground.
 
Irving describes this type of commissioned report as part of the "rhetorical accountability" that she has seen increasingly adopted by social networks. Priya Pillai, a consultant and researcher in international law and a blogger on Opinio Juris, had also hoped for more substance: "Facebook needs to really acknowledge its role and potential impact in Myanmar in a more comprehensive way, which the report has not done at all. It's basically about avoiding a difficult discussion."
 

Proving intent

Can Facebook ever be - and will it be - held responsible for the way its platform is abused? That seems unlikely.
As a company, Facebook had a duty to conduct the necessary due diligence, Irving says, and to "find out if [its] business conduct was going to cause a human rights problem in the region". But in the end, not all of these human rights principles are binding. "So, to be honest, we end up at a dead end."
 
Pillai suggests looking at how individuals linked to media outlets were prosecuted for incitement to genocide at the United Nations tribunal for Rwanda, including members of the board of directors of the infamous Radio Télévision Libre des Mille Collines (RTLM). "I do think there's a parallel," she says, arguing that even though "it may not be exactly the same situation", it could be argued that, to the extent that Facebook was the medium for the dissemination of hate speech, it "in all likelihood created the conditions for the commission of mass atrocities".
 
Irving points out, however, that Facebook is a platform, not a media production company, a broadcaster or an author. Unlike RTLM, "it is not Facebook itself that generates hateful and inciting content". But potentially, she continues, "if you build algorithms in such a way that they promote hate or incitement content at the top of a person's news feed, you're more than just a neutral hosting platform".
"There's a lot of discussion these days about different legal theories of how to hold platforms accountable," Koenig admits. "But one of the questions I ask myself as a lawyer is whether we are dealing with something of the 'knew or had reason to know' type, as in the principle of superior responsibility."
 
Under international humanitarian law, commanders can be held responsible for war crimes committed by their subordinates, in some instances even when they did not commit the crimes themselves, if they knew or had reason to know that crimes were being committed and did nothing to prevent or punish them. But while Koenig admits that social networks have "enormous control over what comes into their community space," she does not easily see in which jurisdiction they could be held accountable.
 
Irving responds more bluntly that she does not think they can be, because Facebook is a company, not an individual. "I've looked at every conceivable possibility and, from a strictly legal point of view, Facebook is a private company." And "if we want to try to prosecute Mark Zuckerberg [founder and president of Facebook] for aiding and abetting genocide and crimes against humanity... I think it would be very difficult to prove intent."
 

"It's kind of the Wild West"

So what's waiting for Facebook?
"For Myanmar, it's too late," says McMahon. "You can't go back in time," says Irving, "but you can try to see how to avoid doing the same kind of damage in the future." The social networking giant is clearly seeking to reach out to civil society and academics in an effort to improve. And the pressure for reform, fuelled by critical journalistic investigation, is not waning.
Fragile states such as Myanmar, often deeply divided and lacking media literacy, have become places where a social networking platform such as Facebook is, for a lot of people, effectively the Internet itself. Irving compares Facebook to a public service: "If Facebook is equated with the Internet, as a company it brings with it a set of responsibilities."
"It's kind of the Wild West," Koenig believes. "The next step will really be to have the equivalent of a Bill of Rights or a Constitution."
"We see that the company is starting to shift from denying responsibility and avoiding regulation to addressing the problem instead of denying and confusing the issue," says McMahon. "The question is, are they ready to turn the corner?" presses Koenig. "In any institution or field of work, there is a learning curve, a period when people are fully prepared to trade responsibility for innovation," she adds. "But now that social networks like Facebook have matured, I think there's less tolerance for the idea that anything goes."
 
New approaches are needed that "include civil society, states and platforms," says McMahon. "Leaders will have to decide whether they are willing to listen and learn from communities that do not seem to have obvious technological expertise," says Koenig, "but certainly have social and cultural expertise. I think this is entirely possible. But listening is a very difficult task."
 

The danger of self-regulation

Another danger is that of going too far in the other direction and regulating away a space of free expression, banning everything, to the detriment of everyone, including journalists and human rights investigators. Criticism from "civil society has pushed [Facebook] in a direction where it's becoming Big Brother, and we don't want that," says McMahon. Irving agrees that this risk is real: "Basically, private companies are being given the task of deciding what is hate speech, what is acceptable speech, what constitutes incitement... They're being given the role of deciding what we as a society consider to be freedom of expression, what should be censored, and the rules they follow to do that are very non-transparent and unaccountable."
 
At the age of 15, Facebook can still behave either as a platform for hate speech that acts with impunity or as a self-regulator free of any oversight. Unless it grows into a more responsible adult.
 
Janet ANDERSON, Correspondent in The Hague for Justiceinfo.net
 
Header illustration: ©Justiceinfo.net
 
