
Facebook announced its efforts to battle fake news in December. So far, it's fighting a losing battle.
Facebook's automated antidotes to fake news aren't working.
The massive social network has gone on a fight-fake-news blitz after realizing it had a problem with hoaxes going viral on its site. It started with founder Mark Zuckerberg denying that fake news on Facebook affected the 2016 US presidential election, and evolved into Facebook doing everything it could to keep phony stories from spreading -- including taking out ads in UK newspapers on how to spot fraudulent articles.
Facebook started testing features like attaching fact-checked articles to questionable content, and in December it added warnings to fake news stories in people's news feeds.
Too bad it's apparently had the opposite effect.
According to a report from The Guardian, Facebook's fact-checking system may actually push people to click on and share flagged stories more, with users telling their friends that the social network is trying to censor the "news."
Facebook didn't respond to requests for comment.
When the fact checks do work, it's often too late. By the time fact-checkers get to a story, it has likely already spread across the social network and done its damage, Aaron Sharockman, PolitiFact's executive director, told the publication.
"These stories are like flash grenades," Jestin Coler, a fake-news writer told the Guardian. "They go off and explode for a day. If you're three days late on a fact check, you already missed the boat