Facial recognition technology and privacy 

Watch any modern-day cop drama and you may well see the good guys using facial recognition technology (FRT) to hunt down the bad guys. They tap the tech to home in on the villains by finding them in a crowd and then catch them. While it is all well and good to see facial recognition being used in a fictionalised police procedural, it is another thing in the real world. And it is in the real world that the use of the technology is growing in popularity – and with it, concerns over privacy. 

What is facial recognition technology? 

Facial recognition is a biometric technology that involves identifying or verifying individuals based on their facial features. The software uses advanced algorithms, artificial intelligence (AI) and machine learning to scan human faces and analyse unique facial characteristics to create a digital representation known as a facial template. The template is then compared against a database of known faces (based on existing biometric data) to determine a match. 
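As a rough illustration of the matching step, the sketch below treats a facial template as a plain list of numbers and compares two templates using cosine similarity against a threshold. The vectors and the 0.9 threshold are hypothetical assumptions for demonstration; real systems derive templates from deep neural networks and tune match thresholds carefully.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two facial templates (feature vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(enrolled, probe, threshold=0.9):
    """Decide whether a probe template matches the enrolled template."""
    return cosine_similarity(enrolled, probe) >= threshold

# Hypothetical templates -- real systems produce these from a neural network.
enrolled = [0.12, 0.87, 0.45, 0.33]
probe_same = [0.11, 0.88, 0.44, 0.35]   # same person, slightly different capture
probe_other = [0.90, 0.10, 0.05, 0.70]  # a different person

print(is_match(enrolled, probe_same))   # True
print(is_match(enrolled, probe_other))  # False
```

The key design point is that matching is probabilistic: the system reports a similarity score, and the operator chooses a threshold that trades off false accepts against false rejects.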

According to Monash University’s Australian Public Attitudes to Facial Recognition Technology report, there are two main uses of FRT. The first, known as one-to-one use, ensures someone is who they say they are, for example to enable a smartphone to be unlocked or to gain access to a secure building. The second, known as one-to-many use, enables the identification of an unknown suspect or a face in the crowd. 
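The two modes can be sketched in a few lines: one-to-one verification compares a probe template against a single enrolled template, while one-to-many identification searches a whole database for the nearest match. The Euclidean-distance measure, the names and the 0.2 threshold below are all illustrative assumptions, not any particular vendor's method.

```python
import math

def distance(a, b):
    """Euclidean distance between two facial templates (smaller = more alike)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(probe, enrolled, threshold=0.2):
    """One-to-one: is the probe the person it claims to be?"""
    return distance(probe, enrolled) <= threshold

def identify(probe, database, threshold=0.2):
    """One-to-many: search a database of known faces for the closest match."""
    best_name, best_dist = None, float("inf")
    for name, template in database.items():
        d = distance(probe, template)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Hypothetical enrolment database -- names and vectors are illustrative only.
database = {
    "alice": [0.1, 0.9, 0.4],
    "bob":   [0.8, 0.2, 0.6],
}
probe = [0.12, 0.88, 0.41]

print(verify(probe, database["alice"]))  # True  (one-to-one check)
print(identify(probe, database))         # "alice" (one-to-many search)
```

Note how one-to-many use compares a face against everyone enrolled, which is why it raises sharper surveillance concerns than a one-to-one unlock.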

Consumers are embracing facial recognition 

Many individuals use FRT on a daily basis, commonly to access online accounts, apps or devices simply by having their face scanned. For example, many smartphones, laptops and tablets use FRT to unlock the device, providing a fast and secure alternative to traditional passwords or PINs. FRT is also commonly used as a component of multi-factor authentication: combining facial recognition with other authentication methods, such as passwords or SMS codes, adds an extra layer of security. 
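In a multi-factor flow, the biometric check is just one gate among several. A minimal sketch, assuming a boolean face-match result and a one-time SMS code (both names and values are hypothetical):

```python
def authenticate(face_matched: bool, code_entered: str, code_expected: str) -> bool:
    """Multi-factor check: both the biometric factor and the code must pass."""
    return face_matched and code_entered == code_expected

print(authenticate(True, "123456", "123456"))   # both factors pass -> True
print(authenticate(True, "000000", "123456"))   # wrong code -> False
print(authenticate(False, "123456", "123456"))  # face not matched -> False
```

The point is simply that a failed face match does not grant access on its own, and neither does a stolen code; an attacker must defeat both factors.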

A poll conducted by Visa found over 86% of respondents prefer using biometrics, such as facial recognition, over standard passwords to verify their identity or make payments.  

The research from Monash University found Australians are more comfortable with one-to-one uses of the technology, such as accessing devices or government services. It also found that around three-quarters support the use of FRT for identifying criminal suspects, and eight out of 10 support its use to help verify the identities of people who lose their credentials during disasters or war. 

Use of facial recognition among businesses is growing 

FRT is also making inroads into the workplace. Some businesses use the technology for access control, granting or denying entry to secure premises, and for logging in to platforms and software in place of traditional credentials. It is also used for identity verification, for example to log hours on-site, which helps streamline payroll, and it can enhance fraud prevention by safeguarding against unauthorised transactions and access. 

The tech can also be tapped to monitor worker productivity, track worker location and even analyse mood, although support for these uses among Australians is low (according to the Monash research). 

One of the main uses for FRT among businesses is in the realm of public safety and law enforcement. It acts as a ‘virtual detective’, able to identify individuals in the workplace or in surveillance footage and help authorities address security threats and apprehend wrongdoers, among other applications. 

Key benefits of FRT within businesses include: 

  • Enhanced security 

By accurately verifying identities, FRT provides a high level of security. Passwords and PINs can be forgotten, stolen or hacked, but facial features are unique and difficult to replicate, making it harder to gain unauthorised access.   

  • Speed and convenience 

FRT simplifies access and authentication processes, enabling users to quickly unlock devices, authorise payments, or gain entry to secure areas. Facial recognition processes require no action on the user’s part (unlike a fingerprint scan) and negate the need to remember passwords or carry physical keys or swipe cards.   

  • Improved efficiency 

FRT streamlines operations and reduces wait times by automating identity verification processes. 

  • Enhanced customer experiences

FRT can offer customers a seamless authentication process and enable contactless transactions. It can also provide accessibility features, offering individuals with disabilities alternative methods of interaction, and give businesses the opportunity to personalise the customer experience. 

With increased use comes privacy concerns 

As it captures and analyses biometric data, FRT raises significant privacy concerns. Users may be concerned about how their facial information is collected, stored and used. Concerns may also be raised over the potential for the technology to enable unauthorised surveillance. Storing large amounts of facial data also poses a risk of unauthorised access and misuse.  

“The technology is more widespread than many realise – and it’s poised to spread rapidly. In the future it could be as common as CCTV. This raises real privacy issues,” said Monash’s Professor Mark Andrejevic. 

“People need a better understanding of how, why and where facial recognition systems operate, how their personal data will be processed, used and stored, what kinds of risks they might confront from participating, and what mechanisms hold the technology accountable.” 

The ability to recognise individuals automatically at a distance raises human rights issues such as the right to privacy, and the freedoms of assembly, expression and movement. Professor Andrejevic also noted that it is crucial that the technology “is deployed in ways that are in accordance with Australian values and commitments”. 

A case in point – Bunnings found to have breached privacy laws 

The Office of the Australian Information Commissioner (OAIC) has ruled that Bunnings breached the Privacy Act 1988 through its use of FRT. 

During a trial of FRT between November 2018 and November 2021, CCTV captured the face of anyone who entered 63 Bunnings stores in NSW and Victoria, likely collecting the facial data of “hundreds of thousands of individuals”. 

Bunnings said it trialled the use of FRT “with the sole and clear intent of keeping our team and customers safe and preventing unlawful activity by repeat offenders”. 

However, the OAIC said Bunnings breached the Privacy Act by collecting sensitive biometric data without customer consent, not including required information in its privacy policy, and failing to inform individuals that their data was being collected.  

“We can’t change our face. The Privacy Act recognises this, classing our facial image and other biometric information as sensitive information, which has a high level of privacy protection, including that consent is generally required for it to be collected,” the OAIC noted. 

“Individuals who entered the relevant Bunnings stores at the time would not have been aware that facial recognition technology was in use and especially that their sensitive information was being collected, even if briefly. 

“In this instance, deploying facial recognition technology was the most intrusive option, disproportionately interfering with the privacy of everyone who entered its stores, not just high-risk individuals.” 

The OAIC concluded: “We acknowledge the potential for facial recognition technology to help protect against serious issues, such as crime and violent behaviour. However, any possible benefits need to be weighed against the impact on privacy rights, as well as our collective values as a society.” 

How to protect privacy 

In an era where private data is more valuable than ever, facial recognition raises significant privacy concerns regarding the collection and storage of biometric data without individuals’ consent, and how that information will be used by the company and any third parties.  

According to the Monash research, 90% of Australians want to know when and where FRT is being used on them, and they want to be provided with the opportunity to consent to its use. 

Under the Privacy Act, biometric templates and biometric information collected by FRT are considered sensitive information, and stricter requirements apply to the collection, use and disclosure of sensitive information than to other personal information. 

Law firm Norton Rose Fulbright notes: 

  • individuals must consent to the collection of sensitive information (unless an exception applies under the Privacy Act) 
  • the information must be reasonably necessary for one or more of the organisation’s functions or activities, and  
  • the information must only be used for the purpose for which it was collected or for a directly related purpose. 

To help businesses using FRT in commercial or retail settings, the OAIC has issued guidance. 

The guidance states businesses should implement a ‘privacy by design’ approach and consider factors including (as outlined by Norton Rose Fulbright): 

  • Necessity and proportionality – personal information for use in FRT must only be collected when it is necessary and proportionate in the circumstances and where the purpose cannot be reasonably achieved by less privacy intrusive means. 
  • Consent and transparency – individuals need to be proactively provided with sufficient notice and information to allow them to provide meaningful consent to the collection of their information. 
  • Accuracy, bias and discrimination – organisations need to ensure that the biometric information used in FRT is accurate and steps need to be taken to address any risk of bias. 
  • Governance and ongoing assurance – organisations that decide to use FRT need to have clear governance arrangements in place, including privacy risk management practices and policies that are effectively implemented and regularly reviewed. 

Compliance with Australian privacy laws needs to be actively considered at the design and implementation stage, notes law firm Gadens. At a minimum, considerations should include:  

  • the use of privacy-by-design and security-by-design methodologies 
  • carrying out privacy impact assessment(s) on the proposed technology and its anticipated implications for the privacy of affected individuals, and 
  • implementing any recommended actions to mitigate privacy and ethical risks arising. 

Data may be subject to cyberattack 

Businesses deploying FRT need to ensure there is proper management and robust security measures in place to protect sensitive data from breaches and unauthorised access.  

Although FRT enhances security, the technology is not foolproof and may be subject to cyberattack. Cybercriminals could use techniques like deepfakes or high-resolution images to defeat less robust facial recognition systems. And, as biometric data cannot be changed (unlike a password), a compromise could cause long-term security issues. 

Protect your business 

The use of FRT is on the rise, but there are significant privacy compliance requirements associated with its use. The OAIC has put organisations on notice that they must ensure biometric information is not collected unlawfully or unnecessarily through the use of FRT. In addition, all information collected must be stored and used in a manner that is consistent with the requirements of the Privacy Act and general expectations of privacy. 

Talk to your EBM Account Manager about the risks associated with using facial recognition technology and the insurances available (including cyber and liability covers) to help protect your business.