Major tech firms on Wednesday pledged to pursue a range of new measures aimed at stamping out violent extremist content on the internet, amid growing pressure from governments in the wake of the massacres at two New Zealand mosques in March.
The “Christchurch Call” was spearheaded by New Zealand’s premier Jacinda Ardern and French leader Emmanuel Macron, who gathered tech executives and world leaders to launch the initiative at a meeting in Paris.
The vow came after a gunman killed 51 people at two mosques in Christchurch in March while broadcasting his rampage live on Facebook via a head-mounted camera.
“The dissemination of such content online has adverse impacts on the human rights of the victims, on our collective security and on people all over the world,” the signatories said in a statement.
Facebook in particular has faced withering criticism since the Christchurch attack, after the horrific footage was uploaded and shared millions of times despite efforts to remove it.
The social media giant, which participated in crafting the new commitments, said earlier Wednesday that it would tighten access to its livestreaming feature.
Google and its YouTube unit also joined the pledge, along with Twitter, Wikipedia, Dailymotion and Microsoft.
The companies said they would cooperate on finding new tools to identify and quickly remove extremist content, such as sharing databases of violent posts or images to ensure they don’t spread across multiple platforms.
They also said they would explore tweaking their algorithms to prevent violent or hateful content from going viral, while making it easier for users to report harmful posts.
“For the first time, governments, international organisations, companies and digital agencies have agreed on a series of measures and a long-term collaboration to make the internet safer,” Macron’s office said in a statement.
But it will be up to companies to develop specific tools or policies.
‘Before harm is done’
The largely symbolic initiative is intended to keep up the pressure on social media companies, which face growing calls from politicians across the world to prevent their platforms from becoming stages for broadcasting extremist violence.
“We need to get in front of this (problem) before harm is done,” Ardern told CNN in an interview Wednesday. “This is not just about regulation, but bringing companies to the table and saying they have a role too.”
Many countries have already tightened legislation to introduce penalties for companies that fail to take down offensive content once it is flagged by either users or the authorities.
Facebook said it would ban Facebook Live users who shared extremist content and reinforce its internal controls to stop the spread of offensive videos.
“Following the horrific recent terrorist attacks in New Zealand, we’ve been reviewing what more we can do to limit our services from being used to cause harm or spread hate,” Guy Rosen, the firm’s vice-president for integrity, said in a statement.
‘Can’t prevent uploads’
But analysts say the tighter controls pledged Wednesday will go only so far in preventing people from circumventing rules and policies already in place against disseminating violence and hate speech.
“You can’t prevent content from being uploaded: it would require the resources for tracking everything put online by all internet users,” said Marc Rees, editor in chief of the technology site Next INpact.
“Can you imagine trying to get TV or radio to prevent libellous, abusive or violent speech that someone might say?” he asked.
The “Christchurch Call” meeting ran in parallel to an initiative launched by Macron called “Tech for Good” which brought together 80 tech executives to discuss how to harness technologies for the common good.
Top officials from Wikipedia and US tech giants Uber, Twitter, Microsoft and Google attended, but not Facebook chief Mark Zuckerberg, who met privately with Macron last week.
The US government has not endorsed the Christchurch Call and was represented only at a junior level at a meeting of G7 digital ministers which also took place Wednesday in Paris.
In an opinion piece in The New York Times over the weekend, Ardern said the Christchurch massacre underlined “a horrifying new trend” in extremist atrocities.
Ardern said Facebook removed 1.5 million copies of the video within 24 hours of the attack, but she still found herself among those who inadvertently saw the footage when it auto-played on their social media feeds.
Around 8,000 New Zealanders called a mental health hotline after seeing the video, she told CNN.