Gov to make automated decision-making transparent


The government will draft laws to make automated decision-making more transparent, criminalise malicious re-identification of information, and add technical requirements for protecting data.



Today Attorney-General Mark Dreyfus released a response [pdf] that “agreed” to 38 of the 116 proposals his department made in its review of the Privacy Act [pdf].

The government plans to legislate the agreed proposals next year. Its response to the review said more consultation was required on another 68 proposals that were “agreed in principle”, such as the introduction of a tort of serious invasion of privacy; the remaining 10 proposals were simply “noted”.

Automated decision-making 

The Privacy Act will be amended to define “types of personal information that will be used in substantially automated decisions” affecting “an individual’s rights” and enshrine “a right to request meaningful information about how automated decisions are made.” 

“The information provided to individuals should be jargon-free and comprehensible and should not reveal commercially sensitive information,” the government’s response said.

Noting that the “parameters” of “substantially automated” would need to be “considered,” the report said decisions could include those made in “financial and lending services, housing, insurance, education enrolment, criminal justice, employment opportunities and health care services.” 

Examples of processes that might fall under this definition could include the former government’s Robodebt scheme, which raised incorrect debts against welfare recipients.

The report said that the reforms would be “implemented…as part of the government’s response to the Royal Commission” into Robodebt, which recommended both a legal framework to audit automated decision-making systems used by government agencies and an authority to enforce it.

There should be “a clear path for those affected by decisions” to review and understand the implications of automated decision-making, the July report said, “explaining in plain language how the process works, and business rules and algorithms should be made available, to enable independent expert scrutiny”.

The regulator would “monitor and audit automated decision-making processes with regard to their technical aspects and their impact in respect of fairness, the avoiding of bias, and client usability.”

Criminal penalties

Late last year the government passed amendments to the Privacy Act to incentivise data protection with increased civil penalties for organisations that experience “serious” or “repeated” privacy breaches. 

Which organisations this will apply to remains unresolved: removing the exemption that shields around 2.3 million small businesses from obligations under the Australian Privacy Principles was agreed only in principle, as was the introduction of a tort of serious invasion of privacy.

However, the government did commit to a proposal to add criminal penalties for “malicious re-identification…where there is an intention to harm another or obtain an illegitimate benefit.”

This would require “consultation” on questions about how de-identification and re-identification would be defined.

“Importantly, the government considers that an individual may be reasonably identifiable where they are able to be distinguished from all others, even if their identity is not known,” it stated.

“For example, if a website publisher uses persistent cookies, device fingerprinting, or similar unique identifiers, the publisher may be able to identify a visitor, even if the visitor’s IP address is not unique to that visitor.”
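The distinction the response draws can be illustrated with a minimal sketch (all attribute names and values below are hypothetical): combining a handful of browser or device attributes into a single token can distinguish one visitor from every other, even when no name is collected and the IP address is shared.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Combine device/browser attributes into one stable identifier."""
    # Canonicalise the attributes so the same device always hashes the same way.
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Two visitors behind the same IP address (e.g. an office NAT) are still
# distinguishable once enough attributes are combined.
visitor_a = {"user_agent": "Mozilla/5.0 (X11; Linux)", "screen": "2560x1440",
             "timezone": "Australia/Sydney", "fonts": "Arial,Helvetica"}
visitor_b = {"user_agent": "Mozilla/5.0 (X11; Linux)", "screen": "1920x1080",
             "timezone": "Australia/Sydney", "fonts": "Arial,Roboto"}

print(fingerprint(visitor_a) != fingerprint(visitor_b))  # the tokens differ
```

Under the government’s stated view, a publisher holding such a token may be handling “reasonably identifiable” personal information even though it never learns the visitor’s identity.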

Technical obligations

The government agreed it was problematic that the “reasonable steps” the Australian Privacy Principles require covered organisations to take to protect user data make no reference to “technical” or “organisational” measures.

“The government agrees the Office of the Australian Information Commissioner should provide additional guidance to entities about what reasonable steps an entity should take to keep personal information secure, and what reasonable steps an entity should take to destroy or de-identify personal information.”

However, the government agreed only in principle “that entities should be required to comply with a set of baseline privacy outcomes, aligned with relevant outcomes of the Government’s 2023–2030 Australian Cyber Security Strategy.”
