Facebook and other social media sites should use ‘counter-speech’ to fight extremism, says Monika Bickert. Photograph: Beck Diefenbach/Reuters

Facebook policy chief: social media must step up fight against extremism


Speaking at SXSW, Monika Bickert calls for social networks to build ‘counter-speech’, responding to extremism and hate speech with broad opposition

Social networks such as Facebook need a more proactive approach to countering extremism and hate speech than simply deleting extremist posts, the tech giant’s head of product policy said at SXSW on Saturday night.

“Even if we were perfect at keeping violent extremism from ever hitting our community and other technology companies were perfect, we know that alone isn’t enough to change minds or stop the spread of violent extremism,” said Monika Bickert, who appeared on a panel titled Taking Back the Internet: Countering Extremism.

She also said networks need what she called counter-speech, a crowd-sourced response to extremism where posts are met with disagreement or derision.

“The best remedy is good speech that gets people thinking and challenging ideologies. We focus on trying to amplify some of the voices to counter violent narratives,” Bickert said.

Facebook has partnered with the US Department of Homeland Security’s Countering Violent Extremism Task Force and EdVenture Partners, an analytics firm, to fund and scale campaigns against hate and extremism. The partnership, called Peer to Peer: Challenging Extremism, invites university students to develop prototypes and digital media campaigns designed to change the minds of people lured towards extremist groups such as Isis and neo-Nazi movements. The partners used the SXSW panel to outline some of the things they have learned since the program launched in 2015.

Who delivers the counter-speech is critical. “You have to be a credible speaker,” Bickert said. Someone from government or a senior executive from a tech company is “not likely to resonate in the same way as a young person’s voice speaking to a young person’s community” would, she said.

Matthew Rice, chief digital officer of the homeland security department taskforce, agreed that authorities struggled to persuade people at risk of becoming radicalised. “The government isn’t the voice,” he said. “A lot of this speech is first-amendment protected, so the government isn’t the best person to act in this space. You empower folks outside.”

The student project that won the Challenging Extremism program was called It’s Time: ExOut Extremism, and was created by a team from Rochester Institute of Technology. It creates videos, infographics and other educational tools and resources to empower people, who might otherwise stay silent, to stand against extremist content.

“People who are radicalised were searching for camaraderie, community,” said Olivia Hauck, ExOut CEO. ExOut aims to provide that sense of community for people on the other side of the fence.

Keeping messages positive rather than negative makes a bigger impact. “If you say you’re wrong, your ideas are stupid, it doesn’t shift opinions. If you use humour, it’s more likely to be shared and ignites the community,” Bickert said.

Hauck added: “If you make someone uncomfortable to have a seat at the table, you never get to have those conversations.”

One challenge for Facebook and the homeland security department is how extremism and hate speech are defined, particularly in an administration that includes officials who arose from the far right. The White House chief strategist, Steve Bannon, for instance, was head of the site Breitbart News, which features antisemitic, racist and misogynistic articles, and an appointee to the Department of Energy was dismissed last week after a history of anti-Muslim remarks was exposed.

“The administration is still young and still figuring out its way,” Rice said. “But we are still combatting violent extremism regardless of ideology.” He added that although many of the student campaigns focused on the terror group Isis, others focused on rightwing and xenophobic groups.

Facebook stopped short of accepting responsibility for intervening with users who appear to have been radicalised, although it already has similar tools. The social network recently revealed that it can use artificial intelligence to identify, from someone’s posts, whether they might be suicidal. If so, a member of Facebook’s team is notified to make contact with the user at risk.

Facebook is also using AI to determine the difference between news stories about terrorism and actual terrorist propaganda, a system revealed by Mark Zuckerberg in a manifesto he published in February.

In a version of Zuckerberg’s letter sent to media outlets, the CEO described an even more specific and invasive tool: in the long term, he said, AI would be used to “identify risks that nobody would have flagged at all, including terrorists planning attacks using private channels”.

However, when the Guardian asked Bickert whether Facebook would use AI to detect and intervene among people posting extremist material, she returned to an oft-cited company line: Facebook is a neutral platform.

“We are not the creators of this content,” she said. “We facilitate people using Facebook.”

That’s not to say that the algorithmic selection of content shown to targeted individuals can’t sway people’s opinions, as demonstrated by Facebook’s own advertising sales team, which has told advertisers it can help get senators re-elected.

