UK eyeing fines for social media content-moderation failures

After U.K. Prime Minister Theresa May secured a joint statement from the G7 on Friday backing a call for social media firms to do more to combat online extremism, a Conservative minister has suggested the party is open to introducing financial penalties, or otherwise changing the law, to push tech companies to act on problem content, should the party be returned to government at the U.K. general election on June 8.

The Guardian reports the comments by security minister Ben Wallace, who was speaking to BBC Radio 4 on Sunday. Wallace's words follow the newspaper's exposé of Facebook's moderation guidelines, which the minister dubbed "totally unacceptable," citing an example of Facebook's moderator guidance saying it's "OK to publish abuse of under-seven-year-old children from bullying as long as it doesn't have captions alongside." Facebook's rules have also been criticized by child safety charities.

The company declined to comment for this story. But Facebook has previously said it intends to make it simpler for users to report content problems, and will speed up the process for its reviewers to determine which posts violate its standards (although it has not specified how it will do this). It has also said it will make it easier for moderators to contact law enforcement “if someone needs help.”

Beyond bullying and child safety issues, concern about social media platforms being used to spread hate speech and extremist propaganda has also been rising up the agenda in Europe. Earlier this year the German cabinet backed proposals to fine social media platforms up to €50 million if they fail to promptly remove illegal hate speech — within 24 hours after a complaint has been made for “obviously criminal content,” and within seven days for other illegal content. It appears a Conservative-majority U.K. government would also be looking seriously at applying financial penalties to try to enforce content moderation standards on social media.

Wallace’s comments also follow a U.K. parliamentary committee report, published earlier this month, which criticized social media giants Facebook, YouTube and Twitter for taking a “laissez-faire approach” to moderating hate-speech content. The committee also suggested the government should consider imposing fines for content-moderation failures, and called for a review of existing legislation to ensure clarity about how it applies.

After chairing a counterterrorism session at the G7 on Friday, which included discussion of the role of social media in spreading extremist content, May said: "We agreed a range of steps the G7 could take to strengthen its work with tech companies on this vital agenda. We want companies to develop tools to identify and remove harmful materials automatically."

It’s unclear exactly what those steps will be — but the possibility of fines to enforce more control over platform giants is at least now on the table for some G7 nations.

For their part, tech firms have said they are already using and developing tools, including AI-based ones, to try to automate the flagging of problem content. But given the scale and complexity of the challenge, there will clearly be no quick technical fix for post-publication moderation in any near-term time frame.

Earlier this month Facebook also said it was adding a further 3,000 staff to its content reviewer team — bringing to 7,500 the total number of moderators it employs globally to review content being posted by its almost two billion users.
