Australia Outlaws Stalkerware and Deepfake Nude Generators in Tech Safety Overhaul

  • 02/09/2025

The Australian government has unveiled significant reforms to combat technology-facilitated abuse, specifically targeting the proliferation of stalking applications and AI software used to generate non-consensual deepfake nude images. Communications Minister Anika Wells stated the new measures will hold technology platforms responsible for blocking access to these "abhorrent technologies," which are causing widespread and irreparable harm. The approach aims to be proactive, developed in partnership with industry, while ensuring legitimate, consent-based AI and online services remain unaffected.

This crackdown is a response to growing alarm over the accessibility of AI tools that can create photorealistic explicit content without consent. A recent survey highlighted the scale of the issue, finding that 10% of young people knew someone targeted by deepfake nude imagery, with 6% reporting being victims themselves. Although the government acknowledges these reforms won't instantly eradicate such abuse, they are designed to build upon Australia's existing online safety framework to provide significantly stronger protections for citizens.

The move is part of a broader series of aggressive legal reforms Australia has introduced to address digital harms. It follows the country's landmark law—the first of its kind globally—to ban minors under 16 from having social media accounts, which is set to take effect late next year. Companies that fail to take "reasonable steps" to comply with any of these new regulations, including the latest bans, could face massive fines of up to 49.5 million Australian dollars.