We condemn the horrific Wakeley terrorist attack and wish Bishop Mar Mari Emmanuel and the Christ the Good Shepherd Church community a speedy recovery. We agree the video of the attack in circulation is horrific and disturbing. However, we support evidence-based policy, and seeking global censorship is an ineffective, dangerous, and inappropriate response.
The Facts
It’s important to separate the facts from emotion:
- Content depicting the Wakeley attack was uploaded, hosted and accessible on the ‘X’ platform (formerly Twitter).
- The video was graphic.
- The video was posted and shared by users internationally (not just in Australia).
- Data on the ‘X’ platform is stored in the United States.
- The video did not violate US law.
- The eSafety Commissioner requested that ‘X’ permanently remove the video from its global platform.
- In response, ‘X’ initiated a ‘geo-block’ of the content in Australia, preventing Australians from viewing or accessing it.
Our Analysis
‘X’ has complied with its obligations by restricting Australian access to the video in question. However, it is unreasonable to expect a platform to comply with a demand for a global ban, especially when that demand reaches beyond Australian law and jurisdiction.
Here’s why a ban would be ineffective.
We understand VPNs can be used to circumvent the existing ‘geo-block’. With a VPN, Australians appear to be based in another location where the content remains available on ‘X’ (such as Argentina). Even without a VPN, the video can be accessed with ease on thousands of other websites and forums. The effort required to remove the content from all of these platforms would be extreme and impractical.
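To show why the ‘geo-block’ is so easily bypassed, here is a minimal sketch of how IP-based geo-blocking typically works. Everything in it is hypothetical: `country_of` stands in for a real GeoIP lookup (platforms commonly rely on databases such as MaxMind’s GeoLite2), and the IP addresses are toy values.

```python
BLOCKED_COUNTRIES = {"AU"}  # the geo-block applies only to Australia

def country_of(ip: str) -> str:
    """Hypothetical GeoIP lookup mapping a source IP to a country code."""
    demo_table = {"1.2.3.4": "AU", "5.6.7.8": "AR"}  # toy data for illustration
    return demo_table.get(ip, "US")

def is_blocked(request_ip: str) -> bool:
    # The platform can only act on the source IP it actually sees.
    return country_of(request_ip) in BLOCKED_COUNTRIES

print(is_blocked("1.2.3.4"))  # True: a direct Australian connection is blocked
# Via a VPN, the platform instead sees the VPN exit node's address
# (here, an Argentine one), so the same Australian user is not blocked:
print(is_blocked("5.6.7.8"))  # False
```

The platform never learns where the user really is; it only sees the connection’s apparent origin, which is exactly what a VPN changes.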
As this latest episode shows, governments, political actors and the media are drawing more attention to the content than it would otherwise receive, and the issue now has international coverage.
A better course of action would be to give users the option to ‘hide’ graphic content so it does not automatically appear in their feeds, which would protect most Australians from shocking material.
A request to delete the content globally is, in effect, ‘global censorship’. If X were to take down the content globally, it could establish a dangerous precedent of global takedowns, undermining public discourse, transparency, and accountability.
Imagine if the United States issued a takedown notice for a video depicting soldiers committing war crimes in Iraq, uploaded by an Australian for an Australian audience. Or if the People’s Republic of China did the same for a video depicting the Tiananmen Square massacre or ‘Tank Man’.
The core objective of the eSafety Commissioner is to protect our most vulnerable online. Instead of demanding censorship, the Commissioner should focus on life-threatening content that requires immediate attention, such as child abuse material and scams.
Our Position
Instead of seeking a reasonable, evidence-based approach to the horrific event, the Australian Government has pursued a populist and political endeavour. Social media companies have a social responsibility as corporate citizens, and we should take practical steps to reduce the proliferation of violent content on these platforms. This is our evidence-based position.
The Role of the eSafety Commissioner
The office of the eSafety Commissioner requires significant reform of its focus if it is to be effective, democratic, and able to protect our most vulnerable. Australians deserve protection from serious online criminal activity. In 2021, the Australian Centre to Counter Child Exploitation received over 33,000 reports of online child sexual exploitation. We should strengthen the capacity of the Commission so it can respond to every report, help hold perpetrators accountable and proactively prevent child abuse.
The ABS reports that two-thirds of Australians aged 15 years or older were exposed to a scam in 2021-22, with estimated losses of $3.1 billion reported by the ACCC for 2022. The Commission should take an active role in fighting scammers, taking down fake ads, and holding social media companies accountable for hosting impersonation scams. A transparent report on its efforts in this regard should be made public.
While extremism presents a danger online, the Commission must not take on a political role. Censorship of ideological discourse, even when that discourse is radical, does not change minds; per Pew Research, it forces communities into ideological bubbles. This restricts individuals’ exposure to new ideas and opposing viewpoints, impairing critical thought. These bubbles, where ‘everyone agrees’, often fall outside regulatory control or influence and can lead to cult-like ideological radicalisation.
The eSafety Commission should be strictly apolitical and independent. Political parties and politicians should not influence its activities or direct censorship. All interactions and communications between partisan officials and the Commission concerning its decision-making should be made public. Officeholders should be appointed by an independent, apolitical commission, and all actions should be reported proactively and transparently.
Social Responsibility & Social Media Companies
Social media companies have a responsibility to protect their users, and this starts with transparency and accountability. The position that we should punish platforms for publishing ‘unfavourable’ content is misleading, dangerous and will not work. The Australian Democrats understand that social media is inherently complex, and social responsibility starts with the algorithms on these platforms.
Social media giants should provide comprehensive reports on their algorithms and outline how they manage content. These reports would cover the factors that determine what content Australians see, the impact of personalisation, and the use of ‘deboosting’ and account reputation scores. This information should be available to the public and the media, who can scrutinise these platforms and incentivise responsible, transparent algorithms in line with societal standards.
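As an illustration of what such a report would need to disclose, here is a toy ranking function of the kind a transparency report might document. All factors, weights and names below are hypothetical; this does not describe any real platform’s algorithm.

```python
from dataclasses import dataclass

@dataclass
class Post:
    relevance: float          # personalised relevance to this user (0..1)
    engagement: float         # predicted engagement (0..1)
    reputation_score: float   # author's account reputation (0..1)
    deboosted: bool           # flagged for reduced distribution

def rank_score(post: Post) -> float:
    """Hypothetical feed-ranking score; weights are illustrative only."""
    base = 0.6 * post.relevance + 0.4 * post.engagement
    base *= post.reputation_score   # low-reputation accounts rank lower
    if post.deboosted:
        base *= 0.1                 # 'deboosting' quietly suppresses reach
    return base

# The same post, with and without a deboost flag:
print(rank_score(Post(0.9, 0.8, 1.0, deboosted=False)))  # 0.86
print(rank_score(Post(0.9, 0.8, 1.0, deboosted=True)))   # 0.086
```

It is precisely these factors, weights and thresholds, currently invisible to users, that public reporting would expose to scrutiny.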
We support stricter data protection and data localisation laws in Australia where practical. Australians deserve to have their data protected and free from unlawful surveillance and interference. Individual Australians should be able to customise how social media companies use their data, consistent with the EU’s GDPR.
Children and vulnerable people must be protected on social media platforms. Inappropriate content must be restricted from their view where possible, and social media companies should be transparent about how they intend to accomplish this. Platforms should hire Australian-based staff to combat child abuse material.
We oppose government-mandated ID checks on social media, whether for age verification or anti-spam purposes. Any company that voluntarily implements a verification system should ensure strict compliance with privacy legislation. Strict penalties must be enforced for misuse of user identification.
Politicians should not direct the public or internal processes of social media companies. Any correspondence between government bodies and social media companies intended to influence decision-making, whether on algorithms, censorship or otherwise, must be made public.
Social media giants should provide region-specific artificial intelligence reports and frameworks that assess how their AI will impact Australia. This includes impacts on platform users, creators, professionals, government agencies and society at large.
We encourage social media companies to adopt a content curation system in which users select and customise the type of content they would like to see (for example, whether or not to see graphic content), promoting safety while minimising the risk of censorship. The system would be strictly optional, not controlled by government, and would empower users to curate their own experience.
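A minimal sketch of that opt-in model follows, with all field and function names invented for illustration: the user’s own settings, not a government rule, decide whether graphic content is shown.

```python
from dataclasses import dataclass, field

@dataclass
class UserPreferences:
    hide_graphic_content: bool = True   # a safe default the user can change
    muted_topics: set[str] = field(default_factory=set)

def should_show(post_labels: set[str], prefs: UserPreferences) -> bool:
    """Apply the user's own curation settings to a labelled post."""
    if prefs.hide_graphic_content and "graphic" in post_labels:
        return False                    # hidden by the user's own choice
    return prefs.muted_topics.isdisjoint(post_labels)

prefs = UserPreferences()                        # default: graphic content hidden
print(should_show({"news", "graphic"}, prefs))   # False
prefs.hide_graphic_content = False               # the user explicitly opts in
print(should_show({"news", "graphic"}, prefs))   # True
```

The design choice matters: the content stays on the platform and remains available to those who opt in, so safety is achieved without takedowns.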
Sources:
Australian Bureau of Statistics, 22 February 2023 <https://www.abs.gov.au/media-centre/media-releases/132-million-australians-exposed-scams>.
Australian Competition and Consumer Commission, 17 April 2023 <https://www.accc.gov.au/media-release/accc-calls-for-united-front-as-scammers-steal-over-3bn-from-australians>.
eSafety Commissioner, 23 April 2024 <https://www.esafety.gov.au/newsroom/media-releases/statement-on-removal-of-extreme-violent-content>.
The Conversation, Dan Jerker B. Svantesson, 23 April 2024 <https://theconversation.com/elon-musk-vs-australia-global-content-take-down-orders-can-harm-the-internet-if-adopted-widely-228494>.
The Guardian, 16 August 2021 <https://www.theguardian.com/australia-news/2021/aug/16/australian-minor-parties-revolt-against-new-rules-that-could-bar-up-to-30-from-next-election>.
Pew Research, 29 March 2017 <https://www.pewresearch.org/internet/2017/03/29/the-future-of-free-speech-trolls-anonymity-and-fake-news-online/>.