By Guest (a concerned New Zealand resident)
Introduction
Facial Recognition Technology (FRT) is an incredibly insidious invasion of privacy. Facial data can be collected without a person’s knowledge, as stated in the 2021 report by Nessa Lynch and Andrew Chen. Although the report is over two years old, it appears to be the latest detailed analysis of the problems with FRT.
The report’s 10 recommendations, including the pausing of live FRT, have been fully accepted by the police.
The report states,
“Facial Recognition has a significant potential impact on individual and societal privacy.”
Further,
“FRT could have a chilling effect on rights to freedom of expression and peaceful assembly. An expansion of facial comparison systems to include large scale collection from those who have not been convicted or charged could impact on a person’s right to be presumed innocent until proven guilty” (See page 4 of the Lynch & Chen report).
Reliability of Facial Recognition Technology
The police have stopped using live FRT after it was determined to be “inaccurate and ineffective for use” (See page 4 of the Lynch & Chen Report).
In terms of live FRT, the report concludes at page 37 that it represents a “disproportionate shift in the balance of power between individuals and the police.” The lack of consent leads to public mistrust. The report recommends that the police continue to pause the use of live FRT, and that the pause should be permanent unless the following conditions are met:
- the technology has improved accuracy
- the public has been consulted
- a clear, lawful purpose has been identified
The question to the Privacy Commission is: how can it be acceptable for Foodstuffs to use this unreliable technology when the police, who have specialist training and experience, have responsibly decided to pause it following the recommendations in the report? Bunnings and Kmart have followed the lead of the police and paused their FRT.
The starting point for any such invasive, privacy-breaching technology should be that it is reliable beyond reasonable doubt. Until live facial recognition passes this fundamental test, it should be banned completely. Period.
Jacinda Ardern is reported to have commented on supermarkets’ use of FRT with the statement: “A tool that is that inaccurate really does prove to be very, very problematic.” Proving the point, there has been yet another recent case of wrongful identification in New Zealand.
The report highlights practical examples of the technology’s inaccuracy, particularly with regard to different ethnicities. There are also the questions of subjecting minors to this surveillance and, of course, the inaccuracy for children, who do not have fully developed features. It is notable that there is no indication the supermarkets have liaised, as suggested, with the Office of the Children’s Commissioner.
The unreliability of FRT is explored in Part 7 of the Lynch & Chen Report, providing further evidence of the concerns about the accuracy of FRT in other jurisdictions.
When an unreliable system is used the consequent impact on individuals can be devastating.
“When it comes to identifying people accused of a crime, getting it wrong has a severe impact on the person affected”
Privacy Commissioner John Edwards, quoted in “Prime minister expresses concern over facial recognition technology used by supermarkets”, Madison Reidy, Stuff, 15 May 2018
In addition to reliability, there is the question of efficacy.
Liz MacPherson, Deputy Privacy Commissioner, discusses in a Stuff article the concerns regarding Foodstuffs’ use of FRT, asking the company to consider whether its use of FRT was a “necessary, proportionate and effective response to harmful behaviour”. The same sentiments are discussed in detail in the Lynch & Chen Report in relation to the police[1] under the concept of “efficacy.”
The principles of efficacy are wider than technical accuracy and include:
- How accurate is the technology across different demographics?
- How effective is it in achieving a stated purpose?
- What are the alternatives, and are they more effective?
Legal, Privacy and Human Rights Concerns
In relation to legal issues, the pause on the police’s use of FRT also precludes them from using private FRT systems. The report points out:
“It would be problematic if third-party cameras and processing systems were used as a loophole to do things that Police cannot do with their own systems. We therefore recommend that Police use the same policies and rules for handling data derived from third-party FR systems as they would do their own. For example, if Police decide to place a moratorium on the use of live FR on their own systems, then they should extend that to live FR on third-party systems and Police should refuse offers from private entities.”
Pages 38 and 39 of the Lynch & Chen Report
This poses the question: how would the Foodstuffs stores conducting trials practically have their FRT data introduced into evidence at court?
Human Rights
The use of FRT conflicts with international human rights treaties, particularly Article 17 of the International Covenant on Civil and Political Rights. There are also conflicts with the New Zealand Bill of Rights, including freedom of thought, freedom of movement, freedom from discrimination, respect for private life, the right to be free from unlawful search and seizure and the presumption of innocence.
The detection of emotions, a capability identified in the report, is dystopian and disturbing, especially in its potential breach of human rights such as freedom of thought. Whilst the Privacy Commission does not currently support the use of FRT for this purpose, it is the start of a slippery slope towards an extension of powers to include thought-crime surveillance in the future.
Natural Law
The unreliability of the technology, and the legal, privacy and human rights issues, need to be promulgated in detail to the public by the Privacy Commission, in accordance with the principles of natural law.
Whilst these fundamental issues remain, the use of high-risk facial recognition should be banned. Legislation banning the technology, until such time as all the issues raised in the report are resolved, is the answer, not a Code. The current trials by supermarkets have completely ignored all the above issues and have shown no ability to comprehend the risks of deploying FRT.
The proposal for a Code, when the first issue is the unreliability of the technology, is like sending out a faulty helicopter and putting in place a Code to deal with the inevitable accidents. The fault needs to be fixed before letting the helicopter fly. Until the technology is proven to be accurate, the requirements of efficacy are met, the legal, privacy and human rights issues are resolved, and appropriate checks and balances are put in place as recommended in the report, the Code is an empty ambulance at the bottom of the cliff.
The Privacy Commission should therefore be supporting the position of the police in pausing high-risk FRT and liaising with the authors of the report, focusing on the reliability, efficacy and legality of this technology and exploring the checks and balances required, as discussed below.
The use of such invasive surveillance technology by corporations who do not comprehend the rules of evidence is frightening. It is evident from the correspondence sent to Foodstuffs that the security guard in charge of the FRT had no understanding of evidence principles and could not even hold an informed conversation with a customer.
On page 31 of the Lynch & Chen Report, the checks and balances recommended for police in respect of AI and FR include:
- The police would not have direct access to this technology; instead, it would be dealt with by formal request to a specialist body called the National Biometric Information Office (NBIO).
- It is proposed that staff at the NBIO would be required to undergo a three-year training course, leading to a recognised qualification, with a further requirement for continuing education to keep up to date with emerging developments.
- Consideration of Privacy Impact Assessments.
- Defined business processes and system rules.
- The starting point would be the investigation of a crime, not wholesale surveillance including the innocent.
- Other governance checks, including strict search criteria.
- A requirement for corroborative evidence from other sources.
These checks and balances should be the starting point for any future FRT deployment. There cannot be self-policing by private organisations. There has to be accountability. There has to be independent review and governance, with penalties for non-compliance, over any future use of FRT, in accordance with natural justice principles.
The authors of the report reiterate that live FRT poses the highest risk, raising ethical and legal considerations. They point out the problematic issue of all persons being captured, which raises ethical concerns, as the fundamental principles of justice dictate the presumption of innocence. Further, there is the issue of lack of consent, and there are accuracy concerns.
Facial Recognition Technology and Essential Services
Essential food services should never allow this invasive breach of civil liberties. It can never be proportionate to cut off access to food for those who value their privacy and have justifiable, serious concerns about being wrongly accused.
If FRT ever does meet the requirements necessary to be reliable and effective, there always needs to be an opt-out alternative.
Foodstuffs and Transparency
Foodstuffs have not at any point demonstrated a willingness to be transparent. They failed to inform the Government Digital Services Minister, Clare Curran, about their use of FRT. The Privacy Commissioner has already expressed concerns regarding the assertion by Foodstuffs that this technology would have any practical effect on the safety of customers. Foodstuffs have failed to recognise any of the risks to customers, which indicates they are incapable of conducting a balanced benefit-risk analysis. This does not inspire confidence that they can effectively manage this intrusive surveillance, with all its risks to the public. Lay staff cannot be expected to comprehend the scope of ethical, privacy and human rights concerns, so cannot be relied upon to successfully adhere to any Code.
It is notable in public surveys that many members of the public cite ‘I have nothing to hide’ as a reason why they do not object to intrusive surveillance (See page 57 of the Lynch & Chen Report). This is then used to bolster the narrative that FRT is accepted. However, this is a false perception. Those who profess to hold this notion do not realise they are not in control of deciding what is deemed to be problematic. Someone else with potentially very different views is.
Conclusion
FRT is an insidious breach of privacy and human rights. It has so far proven to be inaccurate, unreliable, and ineffective in achieving a stated purpose. The Privacy Commission should recommend that, until these issues are resolved, New Zealand follow the lead taken by the police. All live FRT should be paused. It should be permanently banned if the issues highlighted in the report cannot be resolved.
[The content of any Opinion pieces represents the views of the author and the accuracy of any content in a post labelled Opinion is the responsibility of the author. Posting of this Opinion content on the CityWatch NZ website does not necessarily constitute endorsement of those views by CityWatch NZ or its editors. CityWatch NZ functions to provide information and a range of different perspectives on New Zealand’s cities and local councils. If you disagree with or dispute the content, CityWatch NZ can pass that feedback on to the author. Send an email to feedback@citywatchnz.org and clearly identify the content and the issue.]