Australian Federal Police officers “potentially” used online facial image search tools on 10 occasions for operational matters in the first half of the year, in breach of internal guidelines.
A freedom of information (FoI) request by Guardian Australia reporter Ariel Bogle, published earlier this month [pdf], showed 35 “connections” were made from inside the AFP to FaceCheck.ID and 742 “connections” to reverse image search tool PimEyes between January and August this year.
This was clarified at a Senate estimates hearing late last night, where chief operating officer Charlotte Tressler said the AFP had so far identified 41 users of the tools in total.
Of the 41, 27 accessed the sites “for research and browsing purposes only”; eight “to understand the existing open source AI tools available online”; three “to test the platforms and assess the capability”; and the remaining three actually used the tools in active investigations.
What that last cohort of three officers did with the tools is now subject to internal investigation.
“We had three users that have potentially used the platforms for operational purposes,” Tressler said.
“We believe it may be [on] around 10 occasions, we’re still reviewing [it].
“We believe maybe one photo has been uploaded – we’re still confirming the details – and possibly one file.”
Tressler said that nine of the occasions related to use of PimEyes, with only one potential operational use of FaceCheck.ID.
She said the AFP had been “surprised” by the use of the two tools, and only learned of their use while fulfilling the FoI request.
“The AFP’s not endorsed … either of these platforms for use, which is why we’ve been, as a matter of priority, doing a review of all use of the platforms and ensuring that does not occur again,” Tressler said.
“I’m now doing a full stocktake of all use of all of our technologies to ensure that we’re not surprised again in future.”
Tressler said that in addition to the tool usage “stocktake”, a number of other actions have flowed out of the internal investigations.
“We are looking at all of our governance around those and looking to strengthen our arrangements,” she said.
“I have now contacted every command within the AFP to reinforce that they need to follow this governance when using new technologies.
“We have now established a new area to ensure that we are oversighting the use of AI and emerging technologies.
“We need to take this seriously. We were not satisfied when we saw that there was potential use of these platforms without the required checks in place and we are strengthening our governance appropriately.”
The AFP has previously been caught out using controversial open source facial image search tools.
In 2020 it admitted to briefly trialling Clearview AI, a platform that scraped billions of images from Facebook and was later found to have breached Australian privacy laws.
In response, the AFP restricted its use of free software trials and said it had centralised evaluation of emerging technology, though it is not clear how these existing guardrails were circumvented in the case of PimEyes and FaceCheck.ID.