Development of new industry codes to better protect Australians online

Australia’s eSafety Commissioner will today present the online industry with a blueprint to guide the development of new industry codes regulating harmful online content, raising the bar for the safety of Australians online.

Following months of consultation with industry, the outcomes-based position paper details eSafety’s expectations for the development of the new codes, providing a foundation industry can build on during the drafting process.

The new codes, which will operate under Australia’s new Online Safety Act, will address issues like the proactive detection and removal of illegal content like child sexual abuse material, while also putting a greater onus on industry to shield children from pornography and other harmful content.

“eSafety has been working closely with industry to ensure that robust codes are developed which offer meaningful safety protections for Australians of all ages online and we will continue to work closely with them as they begin the drafting process,” eSafety Commissioner Julie Inman Grant said. “We want the online industry to succeed in this because their success is our success and will help protect more Australians online.”

“The codes we have now were developed almost 20 years ago before the explosion in the use of social media, messaging apps, interactive games, livestreaming and the widespread use of smart phones. With modernisation of the Online Safety Act, we’re also taking more of a harms-based approach to this range of issues.

“We’re here to support industry to create a modern fit-for-purpose online safety ecosystem that encourages industry to proactively detect and remove the most harmful content, because we believe eSafety and the online industry have a critical co-regulatory role to play here in helping keep Australians safe online.”

The codes will be drafted by industry and apply to eight industry sections, including social media services, websites, search engines, app stores, internet service providers, device manufacturers, hosting services, and electronic services, including email, messaging, gaming and dating services.

eSafety’s position paper proposes an outcomes-based framework, which aims to achieve proactive detection and removal of the worst-of-the-worst Class 1 content like child sexual abuse material and pro-terror content.

The responsibility then falls to industry to propose how they might achieve this, including through measures such as proactive human and machine monitoring, account suspensions and deactivations, deindexing of search results, and forms of age assurance or parental controls.

The codes will also require that the industry limits children’s exposure to pornography and other harmful content, which could be achieved through age verification and age assurance mechanisms, internet filters and default safe search modes.

They will also be required to have tools in place that empower users to control their own access, as well as children’s access, to harmful content.

However, if industry is unable to establish appropriate codes, the eSafety Commissioner has the power under the Act to declare industry standards.

eSafety will be able to receive complaints and investigate potential breaches of the codes or standards, which will be enforceable through civil penalties, enforceable undertakings and injunctions.
