For example, the plague of fake news and abusive social media posts has spawned a new role: the social media news fact checker.
On Thursday, Facebook announced a plan to deal with the proliferation of fake news: third-party fact-checkers will flag what they think are false stories, and then Facebook will decide whether or not to demote them in people's News Feeds.
"If Facebook continues to grow as a trusted news source in its own right," Salmon argues, "then the result could be an existential crisis for news organizations with old-fashioned things like editors and fact-checkers and clear ethical guidelines."
It has used third-party fact-checkers to identify them, and then given such stories less prominence in the Facebook News Feed when people share links to them.
Facebook also hires third-party fact-checkers to manually review fake news and flag it for users when they see it.
From there, a group of fact-checkers from groups like Snopes, ABC News, and the Poynter Institute will determine if the story is, in fact, untrue.
But in a society in which many of us get our news from blatantly partisan outlets (if not Jon Stewart's comedic rendition on The Daily Show), these web-based fact checkers, however critically acclaimed, offer limited recourse.
Before heading out to one of our recent forays, I caught up on the news on the well-known climate blog "Watts Up With That" and read the 10/39/17 article, How Google and MSM Use "Fact Checkers" to Flood Us with Fake Claims, by Leo Goldstein.
It might be a relief for fact-checkers and news accuracy experts that repeating the misinformation when correcting it will most likely not make people's beliefs worse.
In their study, participants were shown "fake news" headlines that were sometimes accompanied by warnings that they were false ("disputed by independent fact checkers").
When a fact-checker rates a story as false, Facebook will show it lower in News Feed, significantly reducing its distribution.
Meanwhile, it's downranking hoaxes, partnering with outside fact checkers, and demonetizing fake news.
Several fact checkers who work for independent news organizations and partner with Facebook told the Guardian that they feared their relationships with the technology corporation, some of which are paid, have created a conflict of interest, making it harder for the news outlets to scrutinize and criticize Facebook's role in spreading misinformation.
Under the false-news strategy already in place, Facebook passed flagged stories to fact-checkers.
Most recently, over the weekend, Facebook said it would employ third-party fact-checkers to verify news posted on its site.
Demoting false news (as identified by fact-checkers) is one of our best weapons, because demoted articles typically lose 80 percent of their traffic.
This is clearly a problem that requires careful work, and since then we've done a lot to fight the spread of disinformation on Facebook, from working with fact-checkers to promoting and working with broadly trusted news sources.
"The fact checkers can give the signal of whether a story is true or false," says Facebook News Feed integrity product manager Tessa Lyons.
The company waged a global fight against fake news by cracking down on tens of thousands of fraudulent accounts, partnering with fact checkers, and running full-page ads in newspapers with tips to spot fabricated stories.
It will make fake news posts less visible, append warnings from fact checkers to fake news in the feed, make reporting hoaxes easier, and disrupt the financial incentives of fake news spammers.
But after being criticized for allowing fake news to proliferate during the 2016 US presidential election, Facebook began working with third-party fact checkers to append warnings to disputed articles.
Almost exactly one year ago, Facebook implemented several changes to fight fake news, including easier steps to report articles, partnerships with fact-checking organizations, and features, like Disputed Flags, that alert people when they are about to read or share articles that have been identified by fact-checkers as fake news.
"To reduce latency in advance of elections, we wanted to ensure we gave fact checkers that ability," says Facebook's News Feed product manager Tessa Lyons.
"It's not a traditional technology company," he said today, mirroring his exact words from last week when Facebook launched product updates and partnerships with outside fact-checkers to fight fake news.
Facebook has been hit with the brunt of the blame for the fake news phenomenon, though it's fighting back against misinformation with outside fact checkers and more features.
So far, it has used fact-checkers to flag false news reports in News Feed, for example, and the company said it had "taken action" against 30,000 fake accounts in France, which is currently in the midst of its own heated election.
Facebook launched a similar experiment in which it added context from fact checkers to known fake news articles.
In an email obtained by BuzzFeed, Facebook's manager of news partnerships, Jason White, wrote to one of the company's third-party fact checkers:
Facebook must evolve its policy to more broadly and forcefully define and delete fake news, whether that means taking the political heat of vetting content in-house and being accused of bias, or massively funding third-party fact-checkers to staff up so they can handle the volume of moderation Facebook requires.
In a call with reporters Thursday, Facebook executives elaborated on their use of human moderators, third-party fact checkers, and automation to catch fake accounts, foreign interference, and fake news, and to increase transparency in political ads.
This system will be put in place alongside Facebook's existing strategy for combating fake news, which uses machine learning to identify bogus stories before passing those pieces on to human fact-checkers.