Facebook, Twitter still failing on hate speech in Germany as new law proposed

Facebook and Twitter have once again been criticized in Germany for failures to promptly remove hate speech being spread via their platforms.

At the same time, the German government has presented a draft bill aimed at more effectively combating hate criminality and criminal offenses on social network platforms, arguing that the companies’ ongoing failures necessitate tighter regulation — with a potential fine of up to €50 million being floated for social networks breaching what are intended to be binding standards for dealing with complaints and removing criminal content.

Like many EU countries, Germany has specific hate speech laws that criminalize certain types of speech, such as incitement to racial violence. The issue has risen up the domestic political agenda in recent years in the wake of the refugee crisis, which saw Germany take in large numbers of asylum seekers.

The country also has elections this year, heightening concern over the role social media can play in shaping and influencing public opinion.

“There can be just as little space in the social networks as on the street for crimes and slander,” said Federal Justice and Consumer Protection Minister Heiko Maas in a statement today (translated from German via Google Translate). “The biggest problem is that the networks do not take the complaints of their own users seriously enough.”

Back in December 2015, Facebook and Twitter gave a commitment to the German government that they would remove criminal hate speech from their respective platforms within 24 hours. Google also agreed to do so on YouTube.

Last May, Facebook, Twitter, Google and Microsoft also all agreed with the European Commission on a code of conduct that committed to removing hate speech within 24 hours.

However, in the latest German government-funded study monitoring the performance of the companies, the Ministry of Justice said Facebook has become worse at promptly handling user complaints: the company deleted or blocked just 39 percent of the criminal content reported by users, a seven-percentage-point decline versus the first test of its performance.

In addition, only one-third of content reported by Facebook users was deleted within 24 hours of the complaint being made, according to the survey.

Twitter’s performance is also criticized, with the survey finding that just one in 100 user-reported messages was deleted, and that none of the deletions took place within 24 hours.

The pair’s failings also contrast negatively with Google, which is reported to have made significant improvements in handling YouTube content complaints since the tests began: the study found that 90 percent of user-reported criminal content was deleted from the platform, and that 82 percent of the deletions occurred within 24 hours of the notification.

“Google shows with the platform YouTube that it is better,” said Maas. “Therefore, it is now clear that we must further increase the pressure on social networks. We need legal regulations to make companies even more obligated to eradicate criminal offenses.”

We reached out to Facebook and Twitter for comment on the findings. Twitter declined to make a statement, but a spokesman said the company has made changes in the past few weeks aimed at reducing the spread of abusive content on its platform, changes that may not have been in place at the time of the survey.

However, these tweaks appear mostly focused on using technology to automatically identify abusive or problem accounts, or on giving users more tools to filter their own feeds, rather than on putting more resources into content complaint processes specifically. (Although Twitter does claim to have improved the transparency of the reporting process, such as notifying users when a complaint has been received and whether it is being acted upon.)
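For illustration, here is a minimal sketch of the kind of user-side keyword filter described above, assuming a simple mute-list approach; the term list and function names are hypothetical, and this is not Twitter’s actual implementation.

```python
# A minimal sketch of a user-side keyword filter of the kind described
# above. Everything here (term list, function names) is illustrative,
# not Twitter's actual implementation.

MUTED_TERMS = {"example-slur", "example-threat"}  # hypothetical user mute list


def is_muted(post_text: str, muted_terms: set) -> bool:
    """Return True if the post contains any muted term (case-insensitive)."""
    text = post_text.lower()
    return any(term in text for term in muted_terms)


def filter_feed(posts, muted_terms=MUTED_TERMS):
    """Keep only posts that do not match the user's mute list."""
    return [p for p in posts if not is_muted(p, muted_terms)]


if __name__ == "__main__":
    feed = ["a perfectly normal post", "a post containing example-slur"]
    print(filter_feed(feed))  # -> ['a perfectly normal post']
```

The design point is the one the article makes: a filter like this hides content from the reporting user’s own feed, but leaves it on the platform, which is precisely why it does not address complaint-handling performance.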

At the time of writing, Facebook had not responded to our questions, but we’ll update this story with any response. Update: In an emailed statement a spokesperson for the company said: “We have clear rules against hate speech and work hard to keep it off our platform. We are committed to working with the government and our partners to address this societal issue. By the end of the year over 700 people will be working on content review for Facebook in Berlin. We will look into the legislative proposal by the Federal Ministry of Justice.”

Responding specifically to criticism that it does not take user complaints seriously enough, Facebook said it has changed its internal procedures for interpreting its community standards guidelines in recent months, and has also provided specific guidance for its Community Operations (CO) team regarding hate speech in Germany.

It also said it has “invested heavily” in the CO team — both globally and locally in Germany. By the end of 2017, Facebook says it will have more than 700 employees working with content moderation partner Arvato, based in Berlin.

It also disputes the results of the German government-funded survey (which was carried out by jugendschutz.net), saying the results “do not mirror our internal experience and tests by independent organizations such as FSM” — claiming the latter’s tests of its processes found the opposite: a significant improvement in removing illegal content. (An FSM test based solely on user flagging from January found 65 percent of illegal content was removed from Facebook within 24 hours, and overall it had a deletion rate of 80 percent, according to Facebook.)

“We do not believe that the results of the current jugendschutz.net tests reflect the progress we have made in improving our systems and will analyze this test thoroughly,” Facebook said, adding: “Obviously, we are disappointed by the results and we would like to thank Jugendschutz.net for testing our systems and will study all the reports carefully to help us improve the way we operate. We have clear rules against hate speech and work hard to keep it off our platform. We are committed to working with the government and our partners to address this societal issue.”

Last week Facebook also came under fire in the U.K. for content removal failures pertaining to child safety, after a BBC investigation found the platform failed to promptly respond to the vast majority of reports the journalist made. The broadcaster had been checking Facebook’s own claims of improved performance after an earlier BBC investigation unearthed secret Facebook groups being used to share child abuse imagery. But it concluded that Facebook had failed to improve over the past year.

Facebook is also facing continued pressure over how its platform is misappropriated to spread so-called “fake news” — an issue that has gained prominence in the wake of the U.S. election, when large numbers of bogus political stories were seen to have circulated via the platform, potentially influencing how Americans voted. CEO Mark Zuckerberg initially tried to shrug off the issue, before conceding at the turn of the year that the billion+ user platform does indeed have “a greater responsibility than just building technology that information flows through.”

Since December, Facebook has been piloting a series of measures aimed at fighting the spread of fake news, including in Germany since January, where it is working with local third-party fact-checking organization Correctiv to try to identify and flag dubious content.

However, early signs are that Facebook’s efforts on the fake news front are not too promising either, with Recode reporting earlier this month that it took the platform a full week to label one made-up story as “disputed” — despite the source being a self-confessed fake newspaper.

“Shouldn’t/couldn’t Facebook move faster on this stuff, especially when it’s a clear-cut case like this one? Yup! But that would require Facebook to make these kinds of (easy) calls on its own, and Facebook really doesn’t want to do that,” concluded Recode’s Peter Kafka.

Germany’s Maas also touches on the fake news issue, arguing that the proposed domestic law tightening content complaint procedures for social media platforms will help combat fake news as well.

“In a free society in which freedom of expression applies, we will not establish a commission of truth. But because the rules we propose are directed against the spread of criminal content, they are also a means against criminal ‘fake news,’” he said. “‘Fake news’ is punishable, for instance if it fulfills the elements of criminal offenses such as defamation or slander.”

Among the proposals in the draft law are that social networks operating in Germany be obliged to:

  • Provide users with a readily identifiable, directly accessible and constantly available procedure for submitting complaints about criminal content
  • Deal with user complaints without delay and examine whether the reported content is criminally relevant
  • Delete or block obviously criminal content within 24 hours of receipt of the complaint
  • Delete or block any other criminal content within 7 days of receipt of the complaint
  • Inform the user of any decision regarding his or her complaint

The obligation to delete or block criminal content reported by a user would also apply to all copies of the criminal content on the platform, under the proposals.
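To make the bill’s two-tier deadline concrete, here is a minimal sketch in Python of how a platform might compute the removal deadline for a given complaint under the proposed rules; the function and variable names are hypothetical, and the draft law itself of course prescribes no implementation.

```python
from datetime import datetime, timedelta

# A minimal sketch of the draft law's two removal deadlines: 24 hours for
# obviously criminal content, 7 days for everything else. All names here
# are illustrative assumptions, not anything specified by the bill.

OBVIOUS_WINDOW = timedelta(hours=24)
STANDARD_WINDOW = timedelta(days=7)


def removal_deadline(complaint_received: datetime, obviously_criminal: bool) -> datetime:
    """Latest time by which the reported content must be deleted or blocked."""
    window = OBVIOUS_WINDOW if obviously_criminal else STANDARD_WINDOW
    return complaint_received + window


if __name__ == "__main__":
    received = datetime(2017, 3, 14, 9, 0)
    print(removal_deadline(received, obviously_criminal=True))   # 2017-03-15 09:00:00
    print(removal_deadline(received, obviously_criminal=False))  # 2017-03-21 09:00:00
```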

The draft law would also require social network operators to report quarterly on the handling of complaints about content relevant to criminal law — including detailing the volume of complaints; their decision-making practices; and the “staffing and competence of the work units responsible for dealing with the complaints.”

These reports would also have to be made accessible to the public online.
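As a rough illustration of what such a quarterly report might need to capture, here is a hypothetical data model based on the obligations described above; every field name is an assumption on my part, since the draft text prescribes no particular format.

```python
from dataclasses import dataclass

# A hypothetical data model for the quarterly transparency report the
# draft law would require. Field names are illustrative, derived from
# the reporting obligations described in the article.


@dataclass
class QuarterlyComplaintReport:
    quarter: str                # e.g. "2017-Q3"
    complaints_received: int    # volume of criminal-content complaints
    complaints_upheld: int      # complaints that led to deletion or blocking
    decision_practices: str     # summary of the decision-making practice
    staffing: int               # headcount of the complaint-handling unit
    competences: str            # training/qualifications of that unit

    def deletion_rate(self) -> float:
        """Share of complaints that resulted in content being removed."""
        if self.complaints_received == 0:
            return 0.0
        return self.complaints_upheld / self.complaints_received
```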

Social networks that fail to create an effective complaint management system capable of deleting criminal content swiftly and effectively could face fines of up to €5 million against the individual person responsible for handling complaints, and larger fines of up to €50 million against the company itself. Fines could also be imposed if a company does not fully comply with its reporting obligation.

The proposed law would also require social networks to appoint a “responsible contact person” in Germany who could be served in criminal and civil proceedings, and who would personally be on the hook for a fine of up to €5 million.
