Facebook and Google will face Congress over white nationalism

Representatives from Facebook and Google will be on Capitol Hill today to face questions from lawmakers about how their platforms are used by white supremacists.

The hearing, which is being conducted by the House Judiciary Committee, comes just a few weeks after a terror attack in New Zealand that was streamed live on Facebook. Fifty people at two mosques were killed in the attack.

The representatives from the two big tech companies' policy teams will appear on an eight-person panel that will also include representatives from civil rights groups such as the Anti-Defamation League, and Candace Owens of the conservative group Turning Point USA. Google has received criticism for the role its online search plays in spreading hateful ideologies, but its video-sharing site YouTube has increasingly been slammed for hosting such content and for the role its algorithms play in surfacing it.

The New Zealand attack "underscores the urgency" of addressing the white supremacy problem on social media, Kristen Clarke, the head of the Lawyers' Committee for Civil Rights Under Law, told CNN Business.

The attack, Clarke said, is "exhibit A in how violent white supremacists abuse the Facebook platform to promote their dangerous, fatal activities." She will be part of the panel testifying on Tuesday.

The mass shootings in New Zealand highlighted two key challenges for the social media platforms: the way they are used to spread extremist ideologies and rally people to them, and the way people who commit violence in the name of those ideologies use the platforms to promote their actions.

Two weeks after the massacre, Facebook announced that it would ban all "praise, support and representation of white nationalism and separatism" on Facebook and Instagram. Previously, the company had banned white supremacy, but had viewed white nationalism differently. The company said it had decided to ban white nationalism after months of consultation with civil rights groups.

Neither YouTube nor Twitter has enacted a similar blanket ban on white nationalism, but both companies say they have policies to fight hate and the incitement of violence on their platforms.

Despite investments in human moderators and artificial intelligence, Facebook failed to interrupt the broadcast of the mass murder as it was streamed live.

Facebook and YouTube said they spent the days after the attack removing millions of reuploads of the video. Facebook said it had stopped the upload of 1.2 million versions of the video, but that 300,000 copies had made it onto the platform and were later removed.

A statement from the House Judiciary Committee said Tuesday's hearing "will examine hate crimes, the impact white nationalist groups have on American communities and the spread of white identity ideology. The hearing will also foster ideas about what social media companies can do to stem white nationalist propaganda and hate speech online."