Facebook's Fix for Fake News: Seven Steps to Normal Thought

[Image: "Not sure if fake news or real..." (credit: BBC)]
Last week, Facebook CEO Mark Zuckerberg responded to claims that fake news swung the result of the US Presidential election. His response, basically: "meh."

But it seems you can't please all of the people all of the time - because many weren't 100% satisfied with that. So, in the wake of a massive public outcry, Mr Zuckerberg has come back from the drawing board with another potential remedy: a seven-point plan released on Saturday which is designed to curtail the circulation of phony facts (and maybe opinions) on the site. All of which has been greeted by a resonant "meh" of our own.

But we should be payin' attention. So, let's start with the facts (if they can so be called). Facebook's plan: 

     1 - 'Stronger detection.' Facebook says it will improve its capacity to 'classify misinformation' using 'better technical systems to detect what people will flag as false before they do it themselves.' So, that probably means more algorithms scanning posts and preemptively flagging them for human analysis.
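To make that concrete, here's a toy sketch of what a "detect before people flag it" pipeline might look like: score each post with a classifier and queue anything above a threshold for human review. Everything here is invented for illustration - Facebook hasn't published its actual system, and a real classifier would be a trained model, not a phrase list.

```python
# Hypothetical pre-flagging pipeline. The phrase list, scoring function,
# and threshold are all made up for illustration.

SUSPECT_PHRASES = ["shocking truth", "doctors hate", "you won't believe"]

def misinformation_score(text):
    """Toy stand-in for a real classifier: fraction of suspect phrases present."""
    text = text.lower()
    hits = sum(phrase in text for phrase in SUSPECT_PHRASES)
    return hits / len(SUSPECT_PHRASES)

def flag_for_review(posts, threshold=0.3):
    """Return posts whose score crosses the threshold, queued for human analysts."""
    return [p for p in posts if misinformation_score(p) >= threshold]

posts = [
    "The shocking truth doctors hate about this one weird trick",
    "City council approves new bike lane budget",
]
queue = flag_for_review(posts)
```

The point of the sketch: the machine only *nominates* candidates; a human still makes the call. That's presumably why the official wording is "detect what people will flag" rather than "decide what's false."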

     2 - 'Easy reporting.' It's already painfully easy to report something as 'fake' on Facebook, but they're going to make it even easier. It couldn't hurt - but maybe the problem lies elsewhere...

     3 - 'Third party verification.' Ah, now we're talking. Facebook's already reached out to some 'respected fact checking organizations' to filter phony information; but now it plans 'to learn from many more.' So, more of the same is coming our way. 

     4 - 'Warnings.' The official release says: 'We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.' Right, so, we'll be seeing fake/real or true/false labels on our news stories. See below for how this is potentially really bad news. 

     5 - 'Related articles quality.' Facebook will be 'raising the bar for stories that appear in related articles under links in News Feed.' That really doesn't make sense. Will they be making sure the stuff displayed as 'related news' is less or more similar to the original thing? If it's the latter, won't that make the 'echo chamber' thing worse? If the former, what exactly are they going to choose? There's a lot of stuff out there. But maybe they mean they're just going to make sure 'related' articles are only ever those which come from mainstream news media. (No rebellion here!)

     6 - 'Disrupting fake news economics.' This is basically an expansion of the crackdown they announced last week on the advertising capacity of fake news sites, as well as the introduction of 'better ad farm detection.' 

     7 - 'Listening.' Finally, Facebook pledges to 'continue to work with journalists and others in the news industry...in particular, to better understand their fact checking systems and learn from them.' It makes sense as a necessary step, seeing as this is the first time Facebook has really faced a big backlash over fact checking.

Okay, that's the list. You can read the original source here (which I'd recommend, considering I've been putting it in my own words). The first question to arise from all that: should we give Facebook the benefit of the doubt?

On the one hand, Mark Zuckerberg is trying his darnedest to resolve an issue to which there is no easy fix. The more Facebook intervenes in our consumption of news, the more people like me are going to say 'you're encouraging people not to think for themselves'; or 'objectivity is unachievable'; or, perhaps eventually, 'you're censoring our input to suit your own agenda.' But if they do nothing, they'll be seen as ignoring user demands - and people like me will say 'you're an unaccountable corporation that doesn't listen to the people who depend on you.' How can they win?

Short answer: they can't. And that's what's really interesting about this whole fiasco. No fix will please all of the people. So, by coming up with a Seven-Point Plan of Action, Facebook has broadcast the notion that it's in control of the situation; even though its plan satisfies only some of the people, and probably won't work.

Still, that's the best they're going to get.

So, the next question we have to ask is: what does this mean going forwards?

We have to understand first that Facebook will only have the resources to fact-check a certain amount of news. Whatever the plan, it won't lead to a completely 'verified' social media experience. Facebook will have to choose which facts are checked and which are not. And the binary Trump-Clinton dichotomy is gone: people have gone back to thinking about a billion things at once instead of just two; and there are usually more than two sides to an argument anyway.

Maybe Facebook will decide that fact-checking only applies to stories with x level of popularity. But, equally, they might just use their own discretion to decide what they scrutinise and what they let through. Like airport security. Which is never controversial.
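If they did go the popularity route, the logic would be trivial - which is exactly what makes it attractive and worrying at once. A sketch (the threshold and field names are my invention):

```python
# Hypothetical popularity filter: only stories above some share count
# get sent to fact checkers. min_shares is an arbitrary made-up number.

def select_for_fact_check(stories, min_shares=10_000):
    """Pick only the stories popular enough to be 'worth' checking."""
    return [s for s in stories if s["shares"] >= min_shares]

stories = [
    {"headline": "Pope endorses candidate", "shares": 960_000},
    {"headline": "Local bake sale raises funds", "shares": 42},
]
to_check = select_for_fact_check(stories)
```

Note what a rule like this implies: a false story circulating just under the threshold never gets looked at, and whoever sets `min_shares` is quietly deciding what counts as news worth verifying.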

Beyond that, however, the million-dollar question: will the new plan lead Facebook to start filtering opinions and narratives, rather than just the facts from which they are drawn? 
