
The misery of Facebook

To the outside world, the inner workings of Facebook must remain a secret. Meanwhile, inexperienced employees are overwhelmed by horrifying messages and images.

In early February, a month before Facebook comes under fire worldwide because the data of 87 million of its users was stolen, team leader Sebastian sends an e-mail to 1,100 employees of the Horizon team in Berlin. He announces a visit by the BBC and the New York Times. 'The aim of this visit', Sebastian writes, 'is the same as it was last time: to create more public understanding of what we do here.'

The media visit is meticulously prepared. Selected team members are designated to tell Facebook's story. The company has long struggled with several problems: the spread of misinformation, terrorist groups abusing the platform to gain political influence, and users falling victim to hate campaigns.

Facebook wants to transform its image. In the eyes of the public, the company invariably responds to criticism with hesitation. For a long time, it held the position that it is merely a 'platform' and that it is up to users to behave with decency. In the run-up to the 2016 U.S. elections, Facebook turns a deaf ear to allegations that the medium is being manipulated, although the company later admits that eighty thousand messages were posted from Russia in order to influence the elections. Meanwhile, by means of a so-called 'internet law', Germany is forcing companies like Facebook to remove messages containing hate speech within 48 hours.

Openness

Facebook wants to show the BBC and the New York Times that it means business. In recent years, the company has built up an army of cheap laborers in different places across the world - Manila, Dublin, Berlin - to keep the platform clean. From the outside, the buildings where they work look perfectly ordinary. Even the renovated Siemens factory complex in Spandau, on the west side of Berlin: four gargantuan buildings, including an impressive monumental belfry that survived World War II, where 650 Facebook employees were stationed in early 2017. Now, almost a year later, that number has risen to 1,100.

Every morning, they amble over in droves to the complex, which also houses eBay and Airbnb. What exactly they do there is a secret. 'The aim of this visit is explicitly not to discuss details or confidential information', the e-mail about the media visit states.

The openness Facebook projects to the media is staged. The people who work here aren't allowed to tell anyone about their duties, says Dutchman Erik. So they are told at the start of a two-week training course. Not their friends, not even their families. They are not allowed to take personal possessions inside. Phones have to be put in a safe. Pens and paper are prohibited. The bathrooms are outfitted with expensive Dyson hand dryers, because paper towels can be written on, he says. Staff check under the desks for bags.

Facebook does not want it to be known how it determines what is and isn't allowed, how it filters, censors, blocks and deletes messages. In recent months, two moderators spoke to De Volkskrant about this, on condition of anonymity. Over several hours-long conversations, Erik told his story. A second moderator verified the facts of his account. They want to speak out against Facebook because they are worried. About the company and about themselves.

'I want to remind you', team leader Sebastian emphasizes in his e-mail, 'to behave in accordance with your terms of employment and never to speak to journalists or other interested parties. This is mostly for your own safety!'

Dutchman Erik winds up at Facebook by coincidence, just as his co-workers did. He responds to a simple want ad for 'customer care agent - Dutch speaker' on the employment website indeed.com. Berlin is home to a large group of international creatives, drawn to the German capital by its low rents and pleasant quality of life. Berlin is enticing to international companies as well: wages are low and there are many young people with diverse backgrounds. That is why companies like Zalando and PlayStation operate many call centers in and around the city.

Erik had worked for several of these call centers, but did not want to do that anymore. He's invited for an interview with GI Gruppe, a large German employment agency. The interview takes less than 10 minutes. During that time, the employer's name is not mentioned once. They only refer to 'the project'. The GI Gruppe recruiter is not picky: there are basically no job requirements.

The most important question put to Erik is whether he can stomach horrifying images. Terrorism, torture and the like. Erik thinks it will be alright; he spent years working with refugees, so his tolerance for misery is high. And that concludes the interview. The next day, he walks into the office of Arvato Bertelsmann, a large German media company.

There, he is briefly interviewed again. Erik is told that he is applying for a job as a content moderator: he will be reviewing videos, images and text. All of this will be part of 'the project' for 'the client'. The name 'Facebook' still isn't mentioned. Erik accepts; he is to report to a building on the east side of Berlin three days later.

Artificial intelligence

He passes through the security gates and waits, along with others, at reception. There are Brazilians, Portuguese, Swedes, Greeks and Syrians: 22 people in total. First, they sign a nondisclosure agreement. Only then is Erik told that he will be working for Facebook. Then the group moves to a cellar with tinted windows, where a young Argentinian gives presentations.

As a test, the trainees are shown several messages on their screens. One of the posts Erik has to evaluate is a picture of a dying African boy with a Pizza Hut box photoshopped into it. He finds it abhorrent. Sadistic. He feels it should at least be marked as sensitive and cruel. Wrong, he is told. Let it slide.

Facebook, Erik learns, uses protected categories (PC) and quasi-protected categories (QPC). A PC is a vulnerable group, such as transgender people, the severely chronically ill, Jews, Ghanaian women, or women in general. A message may only be deleted when it refers to an entire group, e.g. 'all blacks are lazy'. If a picture is captioned 'lazy black man', that's alright. Benign, according to Facebook.

QPCs include migrants, refugees and asylum seekers, the trainees are told. They are barely protected. Anything can be said about them. 'Murderers, thieves, moochers.' Facebook draws the line at inciting violence: 'Blow that boat with asylum seekers to bits.'

The course leader explains the method. The soon-to-be content moderators - Erik and his fellow trainees - will be shown reported messages. These have either been reported by users or flagged by Facebook's computers, by means of artificial intelligence. The latter applies to about one in ten messages, a share that will rise. The moderators are to make a decision within 12 seconds. They have five options: ignore, remove, check with a superior, escalate, or mark as sensitive/cruel.

When Erik removes a message, he has to assign it to a category. The higher the category, the more severe the potential punishment for the offender. The penalty is determined by higher-ranking employees. Facebook's ranking makes clear what it considers most harmful: 'spam', which ranks above even child pornography, terrorism or hate. Offenses may lead to a temporary 'block' or permanent suspension.

'Take care of yourself', reads a slide in a presentation that includes utterly gruesome videos and images: teenagers throwing boiling water on others, real executions hidden in children's videos, livestreamed suicides. By the end of the two-week training program, three of the 22 trainees have quit.

Erik has not quit and, driven by curiosity, starts work as a moderator. He reports for his shift at 8 a.m.; it runs until 4.30 p.m. The office is quiet, apart from the soft, irregular clicking of computer mice. Black Dell PCs sit atop the bare white desks.

The Dutch - at that time there are four of them, later six or seven - sit next to the Greeks and the Portuguese; to their right sit the Italians and the Slovaks. They moderate all Dutch-speaking areas: Flanders, Suriname and South Africa. Nearly everyone wears headphones. Moderators stand or sit. The average age is about 27. They earn practically minimum wage: 8.90 euros per hour, about 1,500 euros per month.

A subject matter expert teaches Erik what to do: evaluate tickets. The four Dutch moderators receive about 8,000 tickets per day: reported messages involving hate, violence, child pornography, self-mutilation. The target is a minimum of 1,800 assessments per person. Break time is not work time: moderators are obliged to log out during breaks. The Dutch are far behind, Erik is told. There are 22,000 tickets on hold, including cries for help from people who are about to commit suicide.

Diseases

The hatred, that's what gets to him immediately. 'I was taken aback by it.' The intense hatred directed at asylum seekers, Moroccan Dutch, black people. In the Netherlands, many curse words refer to diseases. Erik: 'And in the Netherlands, it's always cancer. Cancer guy, cancer negro, cancer whore.' In Flanders, they blame the 'makakken' - a slur referring to the macaque monkey - in the Netherlands 'the Moroccans'. Every area has its own nuisances: violent pictures and videos from gangs in Latin America, porn and violence against women in the Middle East. For the Portuguese and Greek moderators it's relatively quiet; they can sometimes watch some Netflix. Not the Dutch: the Netherlands is the land of hatred, says Erik.

And the country where diseases serve as curse words. The other Europeans are alarmed by the number of tickets coming from the Netherlands. 'Cancer' is such a widely used curse word that Facebook doesn't even mark it as one. It's impossible to remove all of it. Because of the many tickets on hold, the Greeks temporarily help the Dutch team assess videos, the two moderators confirm.

Peak times in the Netherlands are the arrival of 'Sinterklaas' - the beloved legendary Dutch patron saint of children, under fire for his association with 'Black Pete', deemed racist by many - as well as elections and meetings of Geert Wilders' right-wing party PVV. In Berlin, they now know: that's when the number of threats increases immensely. 'Geert Wilders has called for a protest on Saturday, January 20th, 2018, in Rotterdam', says an e-mail sent to the Dutch moderators in January. 'We know that this type of situation always causes intense hatred and strong reactions.' And: 'Don't just focus on Saturday. Most messages about this topic will pour in on Sunday and Monday.'

Facebook is American, and American standards are leading. What constitutes terrorism is determined by the terrorism list of the U.S. Department of State. But Facebook is not consistent. The attack in Las Vegas in October 2017 is at first labelled a mass murder; it was committed by a white man. After severe criticism, Facebook changes its position a week later. Erik: 'It was terrorism after all.'

Every few weeks, there is an internal policy update. The Facebook policy is sacred, unless countries object to specific parts of it. In Turkey, political messages disappear from the platform. Criticism of Erdogan is prohibited and Facebook complies, even though its own policy is lenient when it comes to criticism of public persons. That is how a cartoon of Erdogan by Dutch cartoonist Ruben L. Oppenheimer came to be deleted: either because the report was assessed by Turkish moderators, or because someone misinterpreted the rules.

Human rights organizations have long accused Facebook of serving governments or applying censorship. In Myanmar, the company prohibited any reference to a rebel group of the threatened Rohingya. Facebook claims this is because it's a 'dangerous organization', yet it left untouched the official page of the army, which the UN accuses of 'ethnic cleansing' of the Rohingya.

Or take, for example, the female nipple. Erik: 'To us Europeans, that's normal. But images of a full nipple have to be removed. American prudishness.' Exceptions are made for activists and breast amputations. That this, too, causes debate is apparent from the fact that a well-known 1970s poster of the Dutch Pacifist Socialist Party (PSP), depicting a naked woman and a cow, is regularly removed.

In his third week, once Erik has assessed a couple of thousand tickets, videos automatically begin appearing in his queue, without warning. Immediately, he sees one that will haunt him for a long time. A man in orange overalls hops along an asphalt road. His legs are cuffed, which is why he moves in small skips. A tank pulls into view and runs over the man. Then the camera zooms in on his flattened, bloody remains.

Erik stands up and walks outside. Now he understands why people regularly leave the building screaming and crying. Outside, he smokes a cigarette, shaking. He's out there for an hour. Then he goes back to work. Nobody asks him anything.

In the weeks that follow, he witnesses every imaginable horror. Torture, executions, stonings. But also challenges. Teenagers who rub erasers on their arms until the muscle is exposed. Anorexia patients making appeals: for every like they get, they'll refrain from eating for another hour. 'And then the likes pour in. From 2 to 5, to 14.' Videos of SpongeBob that turn into footage of a decapitation. The images keep lingering in his head. 'We're not at all trained to see this kind of misery, day in, day out.'

The Berlin branch has one psychologist, a social worker and a so-called feel-good manager. The latter's tasks include sending e-mails about yoga classes and reminding employees that there is fresh fruit every day. One day, Erik reports to the company psychologist. She, too, emphasizes the healing properties of yoga. And she explains to him that he's doing tremendously important work. Difficult work, but important. She also recommends not taking the U-Bahn immediately after work, but walking to the next stop. It's not very helpful, says Erik.

A social movement

A Swedish colleague puts an idea into his head. One day, Erik sees him standing in front of the Edeka supermarket, drinking wine from a carton. Self-medicating seems like a good idea to Erik. First after work. Then he starts taking a drink before getting on the U-Bahn. The next thing he knows, he's drinking at work. It's not long before he's smuggling in a liter of wine in a blue Facebook bottle.

Like his colleagues, he starts going to a so-called Polish market, near Frankfurt. A return bus ticket costs five euros. There, they buy cheap cigarettes, but also diazepam (Valium) and other medication to numb their minds during working hours. They don't need prescriptions.

For a long time, Facebook ignores all criticism of the platform. Its mantra: it wants to connect people. It's a social movement. The trainer tells Erik that until 2014, Facebook was the most important channel for jihadists going to Syria. Only when the U.S. government complains does the company actively start preventing terrorist groups from spreading propaganda. But 2017 is the tipping point. First, CEO Mark Zuckerberg acknowledges that the medium has been abused for propaganda. Then former executives sound the alarm about the societal dangers posed by big tech companies like Facebook. They argue that everything these companies do is aimed at getting people addicted to their product. They are advertising businesses: the larger their audience, the more money they make.

People have to come back as often as possible. That is why Facebook managers attend conferences on influencing behavior. When users open the app, they have to be surprised. The algorithm sees to that. Posts that stand out, for whatever reason, get more attention. Those are the posts that stir emotion and provoke responses. A nuanced contribution doesn't score as well as a more explicit opinion piece. Social media thrive on controversy, as long as the controversy doesn't center on them.

That becomes clear from the way Sylvana Simons is treated. The Dutch television personality and politician is a favorite target of hate and racism. On Facebook, Simons is a public person, says Erik, and the policy states that nearly anything can be said about public persons. Erik sees a myriad of messages about Simons. 'Go back to your ape country', 'I'll throw your black cunt in the canal', or 'Die, whore'. All allowed.

In August of 2017, Sylvana Simons shares a private message she's received on Facebook from a man named Egbert. 'DIE!! Black fucking moron! Please get the fuck out and leave our country alone! Nauseating sand eater! We whites should have kept your black kin as slaves! At least then you were good and did as you were told!'

Public debate

Facebook allows it. Erik cannot get used to it. Everything he despises as a person, he has to accept as a content moderator. Hate, racism, cursing. But with the umpteenth message about Sylvana Simons, he's done. Bam. He removes the message and marks it as 'hate'. Good riddance. But then his superior questions him. Why did he do it? Erik gets a 'bad evaluation', immediately visible in his success rate: 85 percent of his assessments are in line with Facebook policy. That should be at least 98 percent.

It's not in Facebook's interest to ban a lot of content - unless the public debate shifts from that content to Facebook itself. That's what happens with Simons. At first, Erik has to allow almost all messages that pertain to her. But suddenly, that changes, he says. It happens in October 2017, shortly before a European NGO is due to visit Facebook to test how it deals with hate speech. Just before the visit, Simons' status changes: she becomes a 'protected person'. The reasoning: she's not only a politician, she's also an activist.

'Yes', Simons responds, 'I've noticed that things have calmed down lately. Since January, I think. Before that, the "Black Pete" debate was still going strong.' She would regularly report fake profiles and racist messages. 'Actually, they barely responded to that. Or they would reply: this is in line with our policy.' That has changed. Simons: 'Now they do take me seriously when I report hate and racism.'

After four months, Erik sees a video that he cannot forget. It lasts 30 seconds. A girl of about 11 is brutally sodomized. 'Horrendous. I'm afraid I'll never be able to forget it. That I'll always keep seeing the little girl's frightened face.' He self-medicates more and more with wine and pills. So do his colleagues. A Danish girl quits after a couple of months and is then hospitalized with post-traumatic stress disorder. Her doctors forbid her to use Facebook any longer. Within six months, the composition of the Dutch team, by now eight people, has completely changed. No one can stand it for longer than a couple of months. Erik continuously sees new faces.

He understands that Facebook has to do something. That someone has to clean up the mess. 'But what really bothers me is the lack of professional help. Police officers who have to view child pornography are trained for it and are only allowed to do it a few hours a day. We see misery all day. Unprepared.' Erik can't stand it anymore either. He notices that he's more anxious, more wary of other people. He has become more sensitive to violence and hate; he sees them all day. He gets counseling, but every time he enters the Facebook building, his anxiety returns. Eventually, after eight months, he quits his job. He still sees a therapist every week.

Shortly after team leader Sebastian sends his warning e-mail to the staff, several journalists, including from the New York Times, are shown hours of presentations about Facebook's moderation policy. Managers tell them how quickly the tech giant has adapted to the new German law. It doesn't lead to any publications: several weeks after the visit, the news breaks that the British data company Cambridge Analytica has stolen the profiles of 87 million users. Facebook knew about it, but did not act. After days of silence, Zuckerberg apologizes. Again, he promises change for the better.

Extra: how this story was verified

This article is based mainly on the story of 'Erik', a Dutch man who spent more than eight months last year working as a content moderator for Facebook. He wants to remain anonymous because he signed a nondisclosure agreement. De Volkskrant has verified his story with another moderator, who checked 36 statements and confirmed 31 of them. The five unconfirmed statements were removed. Furthermore, our reporters have seen company presentations and policy papers. Several parts of Erik's story are also supported by policy documents previously published by The Guardian.

Translated by Lisa Negrijn

Facebook's response:

Facebook was shown this story before publication. This is the response of its Dutch-German communications team: 'Worldwide, over 7,500 moderators assess the content of messages; they include Facebook staff and people who work for companies such as Arvato. In Berlin, over 700 people work for Arvato.

'Our standards constantly change, because our community is growing and social issues develop along with this. We are permanently in conversation with experts and local organizations about everything from child safety to terrorism and human rights. The spreading of hate is a global issue.

'We strongly disagree that Arvato employees are poorly taken care of. We realize that their jobs are often difficult. As soon as they start moderating, they are offered psychological help. Psychologists can be consulted at any time. We are always evaluating whether this service needs to be changed and whether our employees feel supported. The psychological help available has recently been extended, and we also have seven coaches, both male and female, who take cultural sensitivities into account.

'The atmosphere at the Facebook team's workplace is positive. Moreover, we increasingly support social and recreational activities and will soon complete a psychological risk analysis to evaluate where more support is needed.

'Arvato personnel is paid better than average in this industry. Additional compensation applies to weekend and night shifts.'