Assistant Editor, Informa

September 16, 2022



Automatic Facial Recognition – The debate in 2022 

As the technological world advances rapidly, the use of AI in video surveillance devices is becoming increasingly apparent. In particular, public and private sectors are exploring the use of solutions utilising Automated Facial Recognition (AFR) to prevent crime and improve the safety of their facilities – but where are we now in the AFR debate? 

In IFSEC Global’s Video Surveillance Report of 2021, the second highest reason for adopting AI/video analytics was facial recognition, ‘with 39% citing this as a key driver behind them or their customers adopting video analytics. Now an everyday phenomenon in personal mobile phone authentication, there is a growing awareness that facial recognition systems can be integrated into access control software to support multi-factor authentication principles. A barrier to this adoption may be for those businesses operating in public spaces, which would have to consider data privacy laws and standards, such as GDPR and the UK’s Surveillance Camera Commissioner code of practice.’  

The use of facial recognition by businesses in ‘public spaces’ means that the general public are then under surveillance for business purposes – but is this a fair reason to have facial data collected, in many cases without the public’s knowledge? 


The debate is also one that extends beyond the realms of the security industry. Mainstream television programmes have highlighted its widespread use in law enforcement, particularly in London – the BBC drama, The Capture, providing an obvious example. Yet, because the technology is embedded in physical security devices, such as CCTV and access control, it is an issue that security professionals simply can’t ignore.  

Firstly, what is Automated Facial Recognition (AFR)? 

AFR is a way of identifying or confirming an individual’s identity using their face, with camera technology capturing the measurements of the features on an individual’s face and collecting this data. AFR systems can be used to identify people in photos, videos, or in real-time. 

AFR is a category of biometric security. Other forms of biometric software include voice recognition, fingerprint recognition, and iris recognition. The technology is used widely in cases such as law enforcement, on phone devices, and in airport passport control to name a few. 
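The matching step behind these use cases can be made concrete with a minimal sketch. Real AFR systems use trained neural networks to turn the captured facial measurements into a high-dimensional numeric embedding; the tiny 4-dimensional vectors, the `verify` function and the 0.8 threshold below are all hypothetical, intended only to illustrate 1:1 verification of the kind used in passport control or phone unlocking:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(probe, enrolled, threshold=0.8):
    """1:1 verification: does the probe face match the enrolled identity?"""
    return cosine_similarity(probe, enrolled) >= threshold

# Hypothetical 4-dimensional embeddings; production systems derive
# 128+ dimensional vectors from the captured facial measurements.
enrolled = [0.9, 0.1, 0.2, 0.4]
print(verify([0.88, 0.12, 0.25, 0.38], enrolled))  # same person -> True
print(verify([0.10, 0.80, 0.70, 0.20], enrolled))  # different person -> False
```

The threshold is the key operational choice: set too low it produces false matches, set too high it rejects genuine users.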

Dave Quinn, Product Manager for Fire and Security at G4S UK explains: “Facial recognition is gaining interest and traction with our clients, and we are increasingly asked for advice and guidance on the matter. It’s important to help the client understand what they are asking for. 

“We broadly categorise facial recognition into two types of application: 

“1. A closed and participant collaborative application – this is where all users of a system are actively aware and have given unambiguous permission that their face will be used to identify them for a particular use. A hypothetical example of this is where the face is used as a credential in an access control system.  

“2. An open and mandatory / involuntary application – this is where some or all of the users of a space have their facial recognition algorithm compared to a database for individual identification. A hypothetical example would be a retailer with a database of shoplifters that they want to be made aware of if they enter their premises.”  
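Quinn’s second category corresponds to what is often called 1:N identification: a captured face is compared against every entry in a watchlist and an alert is raised only on a sufficiently strong match. A minimal sketch, with hypothetical embeddings, identifiers and threshold (real systems compare high-dimensional embeddings produced by a trained network):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, watchlist, threshold=0.8):
    """1:N identification: return the best-matching watchlist entry
    above the threshold, or None if the probe matches nobody."""
    best_id, best_score = None, threshold
    for identity, embedding in watchlist.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Hypothetical watchlist entries (illustrative identifiers and values)
watchlist = {
    "offender-001": [0.9, 0.1, 0.2, 0.4],
    "offender-002": [0.1, 0.8, 0.7, 0.2],
}
print(identify([0.88, 0.12, 0.25, 0.38], watchlist))  # -> offender-001
print(identify([0.0, 0.0, 1.0, 0.0], watchlist))      # -> None (no alert)
```

The crucial difference from the closed, collaborative case is that here every passer-by is screened against the database, whether or not they have consented – which is exactly where the data protection questions discussed below arise.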


The BSIA’s guide to AFR was released in 2021

The British Security Industry Association (BSIA) states in its AFR guide that “Automated Facial Recognition is a technology that has been designed to improve the safety and wellbeing of people, as well as providing a tool to assist and speed up operational processes”. However, it also cautions that “the ethics of AI and its application need to be regularly reviewed to ensure that it is not allowed to act autonomously without human oversight and it should not be used in any way which causes harm to individuals.” 

AFR as a crime prevention tool and a smart city opportunity? 

For organisations that have experienced regular incidents of crime, such as those in retail or leisure, there is an argument for using AFR to alert store managers and security officers to offenders already known to them.  

Retailer Co-Op, for instance, recently made the news for using AFR. As reported, it created biometric profiles to protect staff at stores with a history of crime. In total, 35 branches in southern areas such as Southampton and Portsmouth used the system to capture the faces of people entering the stores; their biometric profiles were then compared against images of people who may have stolen from a store in the past.  

Therefore, it could be argued that directly public-facing businesses, such as those in retail and hospitality, need this type of protection to prevent criminal behaviour before it begins – in line with the BSIA’s description of the technology as a means to ‘improve the safety’ of people. 

Rob Watts, CEO at Corsight AI, has also argued that AFR would benefit the private sector as well as the public sector, improving general public safety and evolving with smart cities to provide a “seamless” experience in their environment. The public could use their “face as their ticket” instead of contactless access via a mobile phone app or code, which would be applicable not only to sectors such as transport but to building access control as well. 

“Building sites, maternity wards and critical national infrastructure can all benefit from facial recognition software, as it can enable the seamless flow of people and facilitate the protection of sensitive locations by restricting access to approved individuals only,” Watts concluded.  

“From the perspective of the public, if a building is facing a bomb threat, a system could monitor exactly who is coming in and out of that building. To me, it is no different to passport control, we’re all boarding a plane and want to feel safe on that plane, what’s the difference?” 

Problems in practice

There are, however, wide debates over privacy and data rights with the use of AFR, and concerns that biometric technology of the kind the Co-Op used could infringe on the privacy rights of the general public.  

The complaint against the Co-Op was that the system breaches data protection laws because the information is processed in ways which are not proportionate to the need to prevent crime. 

Silkie Carlo, Director of Big Brother Watch commented on the situation saying: “Our legal complaint to the Information Commissioner is a vital step towards protecting the privacy rights of thousands of people who are affected by this dangerously intrusive, privatised spying. 

“The Southern Co-Op’s use of live facial recognition surveillance is Orwellian in the extreme, highly likely to be unlawful, and must be immediately stopped by the Information Commissioner. People need to provide their consent for their data to be used and stored, in the Co-Op’s case data would be stored for two years.” 

Global conversation 


Co-Op has been under scrutiny recently for its use of AFR in several branches

In the US, popular department store chain Macy’s also uses facial recognition to track customers, ‘without their knowledge’ according to digital rights organisation Fight for the Future, which tweeted that “Macy’s is flat-out on the wrong side of human rights when they use invasive surveillance technology to harvest sensitive biometric data on customers and their kids.” 

The state of Illinois recently filed a lawsuit against Macy’s and other retailers, alleging biometric data privacy violations after faces captured on cameras were searched ‘against the company’s database of images scraped from the internet’.  

On the other side of the globe, China is widely reported to use facial recognition systems, operating a network of AFR-enabled cameras to log members of the public on a mass scale. A database leak in 2019 revealed more than 6.8 million records captured in a single day, from cameras positioned around hotels, parks and tourist spots collecting information on the public’s daily lives. 

Safety ethics? 

The debate highlights how ethics and technological advancement in security can be at odds, particularly when surveillance becomes a preventive choice rather than a reactive one; to what lengths do we go to ensure safety? 

And, with the technology readily available, how do organisations, security installers, contractors and end-users ensure they are using it in compliance with regulations?  

Ben Linklater from the BSIA discussed the BSIA’s guide to AFR with IFSEC Global last year, and its importance in creating a reference for the use of this type of data: “A lot of people are saying it is the first step towards creating a natural standard around the ethical and legal implications of AFR. The guide walks you through the process of understanding, so that end-users can have more confidence in the software they use and start conversations with their solution provider. 


However, the BSIA guide does note that ‘there is no single global ethical framework for the safe use of AI’, which explains the complexity of the issue for businesses.  

Quinn, from G4S adds: “We don’t agree that there is necessarily enough clarity in the legislation, especially GDPR, to be sure that the collation and production of the database is compliant and therefore legal. GDPR also raises the question if these systems should even be allowed to run the analytic on the general population without their permission – which obviously cannot be given by the general public. 

“Our approach to these opportunities is to advise the client to ensure their requirements meet the guidelines of the GDPR and have this officially ratified before looking for solutions to their issues. We can then assist in providing a solution that complies with that policy. 

“Whilst we have had enquiries from some clients for the above systems and have given this advice, we have not yet gone on to provide a technical solution. We assume that this is because when tested, their requirement for facial recognition cannot pass the requirements of the GDPR for proportionality or public interest.” 

Without settled guidance and legislation around AFR, the debate will surely continue over the coming years as to what extent it can be used in a way that is both ethical and practical for businesses.  



