In a change of heart, Facebook says it will fact-check and label fake news

The social network will work with third-party organisations and users to determine what news is misleading.

The amount of fake news on Facebook has apparently reached critical mass. In a blog post published on Thursday, Facebook announced it is rolling out new measures to fact-check links and alert users to false stories. The move represents a capitulation for the company, which had vigorously disavowed the idea that fake stories appearing in its feed had swayed the presidential election.

The new measures fall into four areas: easier ways to report fake stories, working with third-party organisations to flag these stories, informed sharing and reducing financial incentives, such as advertising revenue, for what the platform calls "spammers."

On Facebook, Mark Zuckerberg wrote, "While we don't write the news stories you read and share, we also recognise we're more than just a distributor of news. We're a new kind of platform for public discourse—and that means we have a new kind of responsibility to enable people to have the most meaningful conversations, and to build a space where people can be informed."

On the ad-buying side, Facebook has eliminated the ability for spammers to spoof domains and will analyse publishers’ sites to detect where policy enforcement actions might be needed.

In the past, Facebook has relied on its community to report links as false, and going forward it will lean on users even more. It is a safe approach, considering that controlling what news users see has landed the platform in trouble before, such as when it was rebuked for relying on humans to hand-pick its trending topics.

In the blog post, Adam Mosseri, Facebook’s VP of News Feed, wrote, "We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully."

Facebook plans to test several ways to make it easier for users to weigh in on whether a piece of news is fake or not. Currently, the closest a user can get is to click on the upper-right-hand corner of a post and choose "report post" and then "it’s spam." Facebook will be adding the option "it’s a fake news story."


Furthermore, it will begin to factor sharing behaviour into how posts rank in its News Feed, demoting articles that people are less likely to share after reading them. "We’ve found that if reading an article makes people significantly less likely to share it," wrote Mosseri, "that may be a sign that a story has misled people in some way."

The platform will also begin to work with third-party fact-checking organisations. If one of these organisations reports a story as fake, it will be flagged as "disputed," appear lower in the News Feed and carry a link to an article explaining why.

Mosseri added, "We’ve focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organisations."

These changes come after both Facebook and Google were criticised over how fake news on their platforms might have influenced voters during the election. Brands like Pepsi and New Balance have also faced boycotts after appearing in fake news stories.

In a Facebook post on 19 November in which Zuckerberg previewed the above changes, he wrote that the network has "complex" problems with misinformation on the platform. The problem, he explained, is that Facebook wants the platform to be open for people to share anything, but that it also understands that people want correct information.

Digital agency Rain's Matt Lang believes it is a good approach to begin solving a larger problem on the internet. "It's interesting that it is a two-pronged approach of both proactively alerting users with 'disputed' tags and providing new options for users to directly participate in the process," he wrote in an email. "They are blending partner expertise and crowdsourcing to minimise this type of content."

Lang wonders if, in the future, Facebook will "implement some kind of penalty system" and blacklist fake news sites from the platform entirely.

In his Facebook post on Thursday, Zuckerberg alluded to more upgrades in the future. "We have a responsibility to make sure Facebook has the greatest positive impact on the world," he wrote. "This update is just one of many steps forward, and there will be more work beyond this."
