Australia’s media watchdog launched a formal investigation Monday into possible breaches of rules by networks that broadcast or put online images from a livestream of the New Zealand mosque massacre.
Some Australian outlets used clips or images taken from the 17-minute livestream broadcast on Facebook by Australian Brenton Tarrant, the alleged gunman, as he mowed down worshippers at two mosques in Christchurch on Friday, killing 50 people.
The Australian Communications and Media Authority (ACMA) said its “formal investigation” would include content from the livestream broadcast by commercial, national and subscription television.
ACMA chair Nerida O’Loughlin would write to broadcasters’ CEOs demanding “urgent information on the nature, extent and timing of the broadcast of content relating to the shootings, in particular from the day of the attack,” it said.
The regulator said it would also meet with industry bodies to discuss whether current rules provide “adequate protections” for the public from such content.
The ACMA’s authority does not extend to online content, but the regulator said it would work with the Australian Press Council to determine if broadcasters may have breached rules by posting images from the Tarrant livestream on their websites.
The ACMA has the authority to levy stiff fines for breaches of its broadcast rules.
In the wake of Friday’s mosque rampage there have been heightened demands in New Zealand and Australia for social media companies to take greater measures to prevent the spread of violent imagery and speech.
Facebook said it “quickly” removed Tarrant’s video and scrubbed 1.5 million videos worldwide from the platform within 24 hours.
But despite those efforts, the footage proliferated widely online, and experts said the video was still easily retrievable several hours after Friday’s attack.
New Zealand Prime Minister Jacinda Ardern said she would raise her concerns over the issue directly with Facebook.
Australian Prime Minister Scott Morrison acknowledged for his part that while social media companies have been willing to take action, “clearly the capability to deliver on that willingness hasn’t been present.”
“There needs to be the capability to be able to shut this — these horrific things — down immediately and if you can’t do that, then the responsibility of having those features available is something that really genuinely needs to be questioned,” he said.