Facebook Announces Latest Steps To Tackle Fake News
Hoping to prevent meddling in the 2018 midterm elections
March 26, 2018
Facebook announced new measures on Thursday that it hopes will prevent fake news and election meddling in the upcoming midterm elections this November.
The social media giant is now blocking the creation of millions of fake accounts per day, Facebook said in a blog post.
The company is also expanding its fact-checking efforts, improving ad transparency, doubling its security team and working to prevent "misleading or divisive" stories from going viral in advance of the midterm elections.
"This year, for example, we are doubling the number of people who work on safety issues overall from 10,000 to 20,000, and that includes content reviewers, systems engineers and security experts," Facebook Product Manager Samidh Chakrabarti told reporters during a press conference.
The company is also changing the way it works with third-party fact-checkers. "We're ramping up our fact-checking efforts to fight false news around elections," said product manager Tessa Lyons. "We're scaling in the US and internationally, expanding beyond links to photos and videos, and increasing transparency."
Advertisers will have to verify their identities and confirm that they are located in the US in the run-up to the 2018 elections, according to Product Management Director Rob Leathern.
Facebook began fighting fake news in the wake of the 2016 US presidential election, when Russian actors created fake accounts and posted false stories in an attempt to meddle in the vote.