
A lot has been said of New Zealand Prime Minister Jacinda Ardern’s leadership in the wake of the Christchurch massacre that left 51 Muslim worshippers dead in March.

Key points:

  • The initiative will ask nations to adopt measures to ban objectionable material online
  • But experts warn regulation is difficult to implement
  • Facebook and other tech giants have resisted pushes for regulation in the past

15 May 2019


She has earned praise for her resolve and her humility, and now she’s embarking on an ambitious plan to curtail the spread of extremism online across the world.

Later today, Ms Ardern will co-chair a meeting with French President Emmanuel Macron in Paris, on the sidelines of a Tech For Humanity meeting attended by digital ministers from G7 nations, to discuss her plan which has been dubbed the ‘Christchurch Call’.


The bold initiative will ask signatory nations to adopt measures to ban objectionable material online and create a framework for media to report on atrocities without amplifying them.

Queensland University of Technology’s Nicolas Suzor, who studies online regulation, said the Christchurch Call remained a non-binding agreement despite a growing sense that the regulatory status quo online was no longer acceptable.

“Governments aren’t really willing to defer to platforms anymore … right now everyone agrees this needs to be taken more seriously,” Dr Suzor said.

Reality is ‘majority of people watched video after’

 

In the wake of the Christchurch attack, discussions have been swirling around what could have been done to prevent the real-time spread of a video of the atrocity filmed by the attacker.

In an op-ed published earlier this week in the New York Times, Ms Ardern said New Zealand will push to “prevent the use of live streaming as a tool for broadcasting terrorist attacks”.

 

But Dr Suzor said the live stream function was not to blame for the video’s rapid proliferation.

“The great majority of people who watched this video watched it after the fact,” he said.

There are also few, if any, effective options to filter out violent content using solely algorithms.

Dr Suzor said identifying and removing extremist content required significantly more resources, manpower and vigilance from online platforms.

“You can’t reliably train a machine to tell the difference between news footage that’s permissible and violent material,” Dr Suzor said.

Many media outlets also played sections of the video, fuelling a debate about what role they played in its spread.

“[The post-Christchurch period] is a productive opportunity for a discussion on media ethics … particularly what we do to reduce the impact and demand for atrocities like this,” Dr Suzor said.

 


According to Andre Oboler, senior lecturer in cybersecurity at the La Trobe Law School and chief executive of the Online Hate Prevention Institute, the intense focus on live streaming has also distracted from other important discussions about online regulation.

“An effective response to regulation needs to take account not only of attacks, but also of incitement to violence and the incitement to hate that precedes it,” he said.

Dr Oboler added that public attitudes needed to change alongside legislative change in order to “nudge companies in the right direction”.

“What’s needed is a greater investment in creating standards, backing them with laws, and ensuring international cooperation between governments,” he said.

Silicon Valley’s tech giants have been reluctant to shoulder responsibility for what appears on their platforms, arguing they’re not publishers.

That claim has riled critics who say it’s a convenient answer that allows them to circumvent responsibility.

“I think for many years the tech industry has not taken it seriously enough … they haven’t spent a lot of money historically in improving those processes in a way that engenders trust, so they now face a point of crisis where people want them to do better,” Dr Suzor said.

‘Zuckerberg alone decides how to configure Facebook’

Facebook and other tech giants have long been accused of dragging their feet over managing the spread of extremism on their platforms.

Facebook chief executive Mark Zuckerberg will be conspicuously absent from today’s meeting, prompting criticism he is yet to fully acknowledge the need for more regulation.

Last week, Facebook’s co-founder Chris Hughes called for the company to be broken up in an op-ed published in the New York Times, citing Mr Zuckerberg’s near unchecked powers.

“Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their news feeds … he sets the rules for how to distinguish violent and incendiary speech from the merely offensive,” he said.

Mr Zuckerberg, for his part, has maintained that Facebook is serious about preventing extremism, and earlier this week welcomed recent efforts by the French Government to work with Facebook to tackle the spread of hate content.

The ABC also understands Facebook will sign up to the Christchurch Call action plan.

Facebook also said in a statement that “we share the commitment of world leaders to curb the spread of terrorism and extremism online”.

Mr Macron meanwhile appears a natural ally for Ms Ardern to help establish a framework to stamp out hate speech online.

 


Mr Macron has made no secret of his frustration with Facebook’s reticence to act on online extremism and with concerns over Russian political interference.

His government is pushing ahead with reforms to make the company more accountable for content that appears on its platform.

But Facebook does have a history of actively resisting government intervention in its practices.

According to a report published by Politico in January, Facebook told the EU Commission in 2016 that “the industry does not need a regulatory push to improve” and insisted its own usage rules should take precedence over EU regulation.

Those efforts came alongside record spending on lobbying by Google and Facebook to influence policy in Washington as politicians continue to mull new privacy and censorship laws in order to rein in the powers of Silicon Valley’s tech giants.

‘This can succeed only if we collaborate’

But Dr Oboler said Facebook is not sitting idly by when it comes to weeding out extremist content.

“Facebook has a number of experts in counter-terrorism on staff — that’s a far greater engagement than most other companies,” he said.

“They also have artificial intelligence that works to spot violent content in real time, as well as reporting mechanisms.”


But today Ms Ardern is up against an industry seemingly resistant to wide-reaching change, although after Christchurch she is resolved to bring it about.

In her New York Times op-ed, Ms Ardern said “our aim may not be simple, but it is clearly focused: to end terrorist and violent extremist content online. This can succeed only if we collaborate”.

Ms Ardern has gained a powerful ally in Mr Macron, and her cause will be buoyed if more nations agree to her framework later today.

Dr Oboler remains sceptical as to whether the Christchurch Call will be an effective solution to end online extremism, but he does say “it could well be the start of international cooperation that eventually leads to stronger solutions”.
