Facebook is launching a campaign to help people spot fake news amid a growing advertising boycott putting pressure on the company to tackle misinformation and hate speech.
Steve Hatch, Facebook’s vice president for Northern Europe, says the media literacy campaign launched with fact-checkers FullFact is evidence that the company is “listening and adapting”.
But some experts and critics argue the effort across the UK, Europe, Africa and the Middle East is “too little, too late”.
The campaign will direct people to the website StampOutFalseNews.com and ask users key questions about what they see online: “Where’s it from?” “What’s missing?” and “How did you feel?”
In an exclusive interview with the BBC, Mr Hatch says “financial considerations” are not behind the new ads.
In recent days, more than 150 companies – including Coca-Cola, Starbucks and Unilever – have announced temporary halts to advertising buys on Facebook as a result of the #StopHateForProfit campaign.
‘Night and day’
Misinformation or viral “fake news” has been a persistent issue for years on the social network, and it flared up dramatically after the emergence of Covid-19.
In May, a BBC investigation found links between coronavirus misinformation and assaults, arsons and deaths, with potential – and potentially much greater – indirect harm caused by rumours, conspiracy theories and bad health advice.
Mr Hatch says Facebook employees have been working “night and day” to tackle false claims during the pandemic.
“If people were sharing information that could cause real-world harm, we will take that down. We’ve done that in hundreds of thousands of cases,” he says.
But the media literacy effort is “too little, too late”, says Chloe Colliver, head of the digital research unit at the Institute for Strategic Dialogue, an anti-extremism think tank.
“We’ve seen Facebook try to take reactive and often quite small steps to stem the tide of disinformation on the platform,” Ms Colliver says. “But they haven’t been able to proactively produce policies that help prevent users from seeing disinformation, false identities, false accounts, and false popularity on their platforms.” Facebook also owns Instagram and WhatsApp.
Facebook and other social media companies have also come under pressure over misleading information or comments that could arguably incite violence, in particular posts by US President Donald Trump.
Following widespread protests after the death of George Floyd, the President warned: “Any difficulty and we will assume control but, when the looting starts, the shooting starts.”
The post was hidden by Twitter for “glorifying violence”, but remained on Facebook.
Mr Hatch says that the US president’s posts “come under a high level of scrutiny” by Facebook bosses. Echoing earlier comments by chief executive Mark Zuckerberg, he denied that the comment in question broke Facebook’s rules, and stated that the company interpreted it as a reference to the possible use of National Guard troops.
“Whether you’re a political figure or anyone on the platform,” Mr Hatch says, you will be reprimanded for sharing posts that could cause real-world harm.