Every minute of every day, 500 hours of video are uploaded to YouTube, 2.5 million posts are made on Facebook and 450,000 tweets appear on Twitter. These are the numbers behind the social platforms where more than 3 billion people spend their time. So many of us are ‘connected’. THE CLEANERS, which premiered at the latest Sundance Festival, is remarkable, well-balanced and hard-hitting in laying out the vast and controversial dimensions of social platforms – privacy, freedom of expression, accountability.
Moritz Riesewieck and Hans Block tell us a great deal, maybe more than we can actually absorb in 80 minutes. First, social platforms are a far cry from being social, places where people, as Mark Zuckerberg puts it, “can share anything with anyone.”
In theory (but only in theory) anything can end up on a social network without Facebook, Twitter or Google being held accountable, since they continue to present themselves as technology platforms, mere providers of content created by others. After Trump and Brexit, this line no longer holds: the demarcation between tech companies and publishers has become too blurred.
The content that spreads on the platforms is not the best, but “the most viral”.
This brings us to the second issue. The business model of social platforms is clearly to make a profit by generating interest and engagement. Algorithms do the rest, selecting targets and content. Not everything uploaded stays online. It is the content moderators, the cleaners, who police the Web, making it safer (but less democratic) than we could ever imagine. Thousands of young people clean the ‘social space’ 10-12 hours a day, scrubbing it of neo-Nazi rallies, child pornography and beheadings, but not of live suicides, because, as one cleaner explains, a real-time video can only be deleted after it has ended.
Who checks the cleaners? Which editorial guidelines are they subject to? To whom are they accountable for their work? These are only some of the many questions raised as the documentary unfolds.
Some of these cleaners have bravely decided to show their faces and talk about their job in front of Moritz Riesewieck and Hans Block’s cameras.
Their daily target is 25,000 images, and team leaders examine only 3% of each moderator’s work. The Cleaners are not employees of the big Silicon Valley companies; they work in the squalid outskirts of Manila.
They are outsourced workers, increasingly relied on by the tech companies. The Cleaners are moderately educated and poorly paid, earning much less than their American colleagues. All around them, like a bizarre paradox, stand heaps of rubbish through which other young people rake to make a living.
“It is the only job that pays us a decent wage”, one cleaner says. “The world should know that we are here, that there is somebody checking the social media. We are doing our best to make this platform safe for all, for millions of people, to protect them. We are like policemen.”
He feels a bit like Rodrigo Duterte, the President of Terror and of the war on drugs, who has had almost 7,000 people, drug addicts and pushers, killed, tortured or disappeared. But the job of content moderator is not the same for everybody. Some of them want to give up; the human bestiality they have to watch on the Web is often too hideous to stand.
From Manila to the United States, and to the biggest issue, the one that encompasses all the others – the political impact the platforms have on global dynamics.
From the election of Donald Trump in 2016 to the crackdown by Recep Erdogan, from the massacre of the Rohingya in Myanmar to the IS videos. What is (or is not) on the Web determines whether what happens in the world registers politically at all, whether it is the bombings in Syria, the brutality of the Islamic Caliphate or even something comic. One example among many: a nude male body with the head of Donald Trump, the penis too small for someone who wants to make America ‘Great Again’.
Too insulting for the personality of a president. The drawing, by the young artist Illma Gore, collected 50 million shares on social media but has now been removed from the Web.
However, much more than the genitals of the President of the United States and pornography comes under the Cleaners’ axe. After an awkward hesitation, Colin Stretch, Vice-President and General Counsel of Facebook, admitted before the Senate committee investigating Russian interference in the 2016 election that the company has developed software to geo-block content that must not be visible in a specific country because its government considers it illegal. In most cases, Stretch continues, blushing, “[…] this content has nothing to do with political matters.”
Yaman Akdeniz, professor of law in Istanbul, disagrees with Stretch’s statement. “Facebook removes everything the Turkish government asks it to delete, mostly political criticism.”
In Turkey, censorship has become self-censorship. “I don’t like this solution, but the alternative was to be completely blacked out”, Nicole Wong, former policy maker at Google and Twitter, admits with embarrassment. In 2004, she had declared: “[…] we must decide what cannot be viewed on the platforms. There is a choice in that. And the choice is context-based […].”
In this case, the context has the face of Erdogan and the body of Turkey, a market you cannot afford to be shut out of, even at the cost of deleting “terrorist propaganda”, which inevitably means turning the focus away from the victims, as American Senator Lindsey Graham points out.
Abdulwahab Tahan of the NGO Airwars confirms this conclusion.
“My job is to collect and archive videos of the air strikes in Syria before they are taken off the Web. Activists in Syria upload lots of videos to YouTube. After archiving the videos we geo-localize the data […]. Without our work the regime would have an even freer hand, nobody would challenge them, and more victims would be killed. These videos are part of the war; they provide information, evidence for the future. The problem is that they are often classified as IS videos and deleted from YouTube. This censorship is affecting a lot of organizations working on Syria.”
In any case, the algorithms are far more influential than the journalists. David Kaye, UN Special Rapporteur on Freedom of Expression, is not optimistic:
“[…] social platforms will have more and more power to decide what remains and what is removed. Over time we will have less information available, and people shouldn’t be surprised if we become a poorer society.”
The testimonies of the Facebook, Google and Twitter bigwigs before the US Senate Subcommittee on Crime and Terrorism in October 2017 have not reassured us. They all claimed to be doing their best to protect the social platforms from terrorists, jihadists and extremists of all kinds. They all guaranteed that they have thousands of people reviewing thousands and thousands of pages to stop the spread of terrorist content and foreign interference in American democracy: “We have ten thousand people working on security and we are committed to investing more by the end of 2018.” (Colin Stretch)
Tristan Harris, the well-known former Google design ethicist and “the closest thing Silicon Valley has to a conscience”, as The Atlantic called him, argues that the platforms have undoubtedly been designed to provoke extreme reactions and behavior.
“There is an underlying misconception: it is to think that human nature is human nature and technology is a neutral tool. This is not true, because technology does have a bias; it has a goal. The goal is to draw the attention of as many people as possible. Outrage, offence, violence, all the evils of the world, are really good at doing that, and Facebook benefits from it all.”
The closing scenes of THE CLEANERS take us back to where we started our journey, to the outskirts of Manila among mountains of trash. Some of the Cleaners will give up, confessing that they are no longer the person they were when they started the work (“it is as if a virus had taken possession of my brain”); others grit their teeth and go on.
One, instead, couldn’t handle it and, inexplicably, hanged himself. He specialized in censoring extreme self-mutilation videos.