In his comments this week, Zuckerberg seemed to take a measured approach to addressing his accusers: “We need to make sure that we don’t get to a place where we’re not showing content or banning things from the service just because it hurts someone’s feelings or because someone doesn’t agree with it – I think that would actually hurt a lot of progress.”
Being the “arbiter of truth” is not something that should fall on a single organization, nor is it simple to balance that arbitration across 2 billion users spanning a multitude of cultures and beliefs, all of them using a product fundamentally geared toward fostering a feedback loop: read this, like this, share this, see more of this.
And it’s certainly not easy to do any of that while trying to run an advertising business.
One of the accusers Zuck may be referencing is former Facebook product designer Bobby Goodlatte, who said: “Sadly, News Feed optimizes for engagement. As we’ve learned in this election, bullshit is highly engaging.”
Fair point.
Complicating matters more is the fact that everyone experiences Facebook differently, based on that all-knowing algorithm. The New York Times presents the same front page to every reader, every single day; there is only one New York Times, and you can take it or leave it. A Facebook user, on the other hand, may see a front page drawn from a mix of sources and topics that looks nothing like another user’s.
Add to that a president who has declared the media the enemy, and it’s hard to blame Zuck for not wanting to get his hands dirty.
Fake news is not easy to remedy.
But before Facebook, there was no website that was home to 2 billion users. There was no “connected world” as we know it today. And it falls to Facebook to be accountable for how its technology has changed the way we consume information.
Many smart people have suggested a public editor, but it would take an army to wade through the troves of content on Facebook with the same meticulous eye as a real news editor. Fortune’s Mathew Ingram put it well:
Depending on whom you believe, the problem of fake news on Facebook is either one of the most important issues facing mankind, or an over-blown controversy pumped up by the mainstream media. And in a way, that dichotomy itself points out the problem with defining — let alone actually getting rid of — “fake news.”
When someone uses that term, they could be referring to one of a number of different things: It might be a story about how Bill and Hillary Clinton murdered several top-level Washington insiders, or it might be one about how Donald Trump’s chief adviser is a neo-Nazi, or it might be one about how the most important election issue was Clinton’s emails.
The first of these is relatively easy to disprove just by using facts. The second is somewhat more difficult to rebut, since a lot of it is based on innuendo or implication. And the third is almost impossible to respond to because it is pure opinion.
Ingram suggested that, instead of a public editor, Facebook may implement some sort of Wikipedia-style crowdsourced truth filter. But suggestions aside, the first step toward a solution is Facebook taking responsibility for the problem, even if only in part. And thus far, the company has yet to admit anything beyond that it is not the arbiter of truth and is not responsible for what its users post, and that, despite that lack of accountability, it is taking steps toward a remedy, all while deepening the media’s dependence on Facebook as a platform.
Both as a business decision and as a matter of moral responsibility, the company is in a difficult position.
This is why I think we’ll never see Facebook admit that it’s a media company or that it has a responsibility for the content shared on its service, or make any legitimate moves that would suggest it is accountable for what you see on Facebook.
Based on the company’s response over the last four months, it’s starting to seem like we’ll get more of the PR runaround than an actionable, achievable solution.
Facebook and media publishers have grown their audiences together, but this particular issue has left them at an impasse. Publishers are calling for further action to stop the spread of fake news, which stands to discredit the whole lot of us, without taking a stand on their own sharing of content through Facebook. And even if Facebook made moves toward filtering content, the media would undoubtedly hold it accountable for any mistake. “Censorship!” they’d cry.
Meanwhile, Facebook will continue to insist that it’s the publishers’ problem, the users’ problem or the fact-checkers’ problem before it actually does something meaningful to solve the problem. After all, the first step is admitting that you have one.
“Move fast and break things” may have gotten us to this place, but that’s in the past now. “Be still, maybe it won’t see us” is the future.