Australia’s eSafety Commissioner has approved an industry code requiring search engine operators to safeguard against risks arising from generative AI.
The commissioner registered the new code after forcing the sector to rethink protections earlier this year.
The code aims to ensure that class 1 material, including child sexual abuse material, created with AI isn’t returned in search results, and can’t be “synthesized” using AI that is built into the search engine.
In June, eSafety declined to register an earlier version of the search code after Microsoft and Google gave notice that they would incorporate generative AI functionality into their respective internet search engines.
The revised code now requires search engine services such as Google, Bing, DuckDuckGo and Yahoo to incorporate protections ensuring that the AI functionality built into their search portals cannot be used to produce “synthetic” versions of child abuse material.
eSafety Commissioner Julie Inman Grant said in a statement that the “use of generative AI has grown so quickly that I think it’s caught the whole world off guard to a certain degree.”
“When the biggest players in the industry announced they would integrate generative AI into their search functions we had a draft code that was clearly no longer fit for purpose and could not deliver the community protections we required and expected,” she said.
“We asked the industry to have another go at drafting the code to meet those expectations and I want to commend them for delivering a code that will protect the safety of all Australians who use their products.”
The revised code is expected to come into effect six months after registration.
Two other industry codes are also being drafted at present: one relating to designated internet services, covering file and photo storage services like iCloud and OneDrive, and another covering relevant electronic services, which include private messaging services.
Once these codes are completed, another set of standards will follow, focused on cracking down on inappropriate content targeting children.