Australia’s privacy regulator has found that Kmart’s use of facial recognition technology without its customers’ consent, in dozens of stores around the country, breached the long-standing Privacy Act.
The Office of the Australian Information Commissioner (OAIC), led by Privacy Commissioner Carly Kind, found that one of the country’s leading retailers indiscriminately captured sensitive biometric data from every customer entering or visiting the returns counters at 28 stores between June 2020 and July 2022—all as part of a pilot program aimed at catching refund fraud.
Kmart argued it relied on an exemption under the Privacy Act meant to allow collection of personal information to tackle unlawful conduct. But the OAIC determined Kmart failed to meet critical conditions: the system was not proportionate, it collected biometric data of all entrants—not just suspected fraudsters—and there were less intrusive alternatives available. Sensitive information was collected without notice or consent.
In a blog titled “Is there a place for facial recognition in Australian society?”, Commissioner Kind lays out the privacy risks and legal tests businesses must satisfy before deploying facial recognition technology. She stresses that biometric data is classified as “sensitive information” under the Privacy Act, meaning its collection, storage, and use face heightened legal scrutiny. Entities must show that the use of facial recognition tech is reasonably necessary for their function or activity, that it is effective for that purpose, and that the interference with privacy is proportionate to the harm they aim to prevent.
How Kmart’s Pilot Worked and Why It Failed
The Kmart pilot collected five to six images of each person entering the stores or presenting at returns counters, regardless of whether they were under suspicion. Facial recognition software matched faces against a database of people suspected of refund fraud across Kmart stores. Non-matches were ostensibly deleted. Staff were alerted when there was a match and could refuse refunds.
OAIC reviewed internal documentation and found that Kmart’s belief in the FRT system’s necessity didn’t hold up across the board. The technology was useful in a subset of fraud cases—but it was not reliably effective in others, and Kmart did not clearly assess or document those limitations, the Privacy Commissioner said. The pilot’s benefits did not outweigh the privacy harms caused by capturing biometrics of many individuals not suspected of wrongdoing.
Kmart told local media the Privacy Commissioner’s determination was “disappointing” and that it was reviewing its appeal options.
“Like most other retailers, Kmart is experiencing escalating incidents of theft in stores, which are often accompanied by anti-social behaviour or acts of violence against team members and customers,” a company spokesperson said. “Kmart remains committed to finding tools to reduce crime in our stores, so we deliver on team member and customer safety.”
Regulatory Standards and Risk Factors
Commissioner Kind’s blog sets out six key considerations that any organization must assess before using FRT in commercial or retail settings. These include necessity (are less privacy-intrusive alternatives available?), transparency (telling people their images are being collected), consent (where required), safeguards against false positives and watchlist misuse, retention policies, and access controls.
Public opinion bears out the concerns. Surveys show Australians are uneasy with facial recognition in retail. While many support its use for law enforcement purposes, fewer believe businesses should collect biometric data in the course of normal commerce.
For companies considering facial recognition technology, the OAIC decision should give them pause. Privacy law enforcement in Australia is no longer testing boundaries; it is drawing lines. Using facial recognition for fraud prevention is not automatically defensible. Organizations must document effectiveness and necessity, consider whether proportionality is met, ensure consent and notice, and evaluate alternatives.
Time to Have the Ethics Dialogue
Commissioner Kind’s blog invites reflection beyond legal compliance. She raises questions about societal values: how much surveillance is acceptable, to whom, and under what oversight? Facial recognition technology may help with safety and fraud prevention, but deploying it without transparency and without guardrails can erode public trust.
Kind writes that just because a technology can help doesn’t mean it should be used in every situation. She advocates for “privacy by design,” robust assessment of trade-offs, and ensuring individuals have real notice, control, and protection when their biometric data is at stake.