Australia’s eSafety Commissioner will soon have strengthened powers to protect Australians from exposure to harmful illegal online content, including child sexual exploitation and terrorist material, no matter where in the world it is hosted, under new regulatory guidance released today.
Under updates to Australia’s Online Content Scheme, online service providers who fail to comply with eSafety removal notices for illegal content that is accessible to Australians face financial penalties of up to $111,000 per offence for individuals and $555,000 for corporations.
Those services may also have their content delinked from search engines and their apps removed from app stores if they fail to comply.
As a last resort, where a service is deemed to pose a serious threat to the safety of Australians, eSafety may also apply for a Federal Court order that the provider of a particular social media service, relevant electronic service, designated internet service, or internet carriage service stop providing that service in Australia.
“One of eSafety’s main focuses since we were created in 2015 has been protecting children online and one of the most important ways we are achieving this is by tackling the online proliferation and trade in child sexual exploitation material,” eSafety Commissioner Julie Inman Grant said.
“Australia’s Online Content Scheme has been in operation for over 20 years and through it we have been very successful within our own borders, to the point that very little if any child sexual exploitation material is actually hosted in Australia.
“But of course, Australians are still exposed to this very harmful material through sites and apps that operate from more permissive hosting environments overseas.
“And our statistics show this. In 2020, we received over 21,000 public reports through our Online Content Scheme, the majority of which involved child sexual exploitation material. This was the most in the scheme’s 20-year history and a 90 per cent increase compared to 2019. Sadly, we have seen a continuation of this elevated level in 2021 and it represents a new normal.
“With these new powers, we will now be able to take real action to disrupt the trade in this distressing material and if online service providers fail to comply with our removal notices, they will face very real and significant consequences.”
The updates to Australia’s Online Content Scheme form part of Australia’s new Online Safety Act.
In addition, eSafety has released new regulatory guidance for its Abhorrent Violent Conduct powers, which protect Australians from exposure to harmful material, including terrorist material such as the live-streamed footage of the 2019 Christchurch attacks that spread virally online.
Under the new powers, eSafety can request or require an internet service provider to block material that promotes, incites or instructs in abhorrent violent conduct, including manifestos like the one produced and shared by the Christchurch gunman. eSafety already has a 93 per cent success rate in tackling abhorrent violent material hosted overseas through its notification powers.
Updates to Australia’s Online Content Scheme also include provisions for new industry codes, which are currently being developed by the online industry. The codes will address illegal and restricted online content, including child sexual exploitation material.
The new Online Safety Act will come into force on 23 January 2022, and the new industry codes are also expected to be in place next year.
See the new regulatory guidance here.