Ron Alalouff is a journalist specialising in the fire and security markets, and a former editor of websites and magazines in the same fields.
June 28, 2021

Warning on non-police deployment of live facial recognition technology from Information Commissioner

The Information Commissioner pulls no punches about the need to follow data protection legislation and best practice when it comes to non-police deployment of live facial recognition systems. Ron Alalouff reports.

In a series of investigations into live facial recognition systems, the UK Information Commissioner, Elizabeth Denham, found that insufficient consideration was often given to the necessity, proportionality and fairness of their use, and that processes were not sufficiently transparent. She also found that “[data] controllers did not always do enough to demonstrate a fair balance between their own purposes and the interests, rights and freedoms of the public.”

These comments are found in the latest Information Commissioner’s Opinion – published by the Information Commissioner’s Office (ICO) – on the use of live facial recognition technology in public places. Live facial recognition (LFR) is defined as facial recognition that is directed towards everyone in a specific area, rather than a “one-to-one” process such as passing through automated passport control. It has the ability to capture the biometric details of everyone passing within the field of view of a camera – automatically and indiscriminately – without any direct engagement with, or co-operation by, individuals. It involves the processing of personal data and biometric data, which the law recognises can be particularly sensitive and potentially highly intrusive.

The ICO assessed or investigated 14 instances of LFR deployments and proposals, and conducted wider research in the UK and internationally. The technology can be used to prevent crime and other unwanted behaviours in retail, leisure and transport settings, and increasingly for marketing, targeted advertising and other commercial purposes.

The technology has the potential to be used in conjunction with big data ecosystems from multiple sources such as social media, potentially leading to LFR becoming “supercharged CCTV”, according to Denham. As with any new technology, building public trust and confidence is essential to ensuring that its benefits can be realised.

The ICO’s document identifies a number of key data protection issues which can arise where LFR is used. These include:

  • the governance of LFR systems, including why and how they are used
  • the automatic collection of biometric data at speed and scale without clear justification, including assessment of the necessity and proportionality of the processing
  • a lack of choice and control for individuals
  • transparency and data subjects’ rights
  • the effectiveness and the statistical accuracy of LFR systems
  • the potential for bias and discrimination
  • the governance of watchlists and escalation processes
  • the processing of children’s and vulnerable adults’ data
  • the potential for wider, unanticipated impacts for individuals and their communities.

Data protection law

Outside of law enforcement, which operates under a separate legal regime, the use of LFR is governed by the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018. Data controllers seeking to deploy LFR must comply with both, says the document, including the data protection principles set out in Article 5 of the UK GDPR. These include lawfulness, fairness and transparency, including a robust evaluation of necessity and proportionality. Any processing of personal data must also be fair, so users should consider the potential adverse impacts on individuals of using LFR and ensure that they are justified. They should also consider and mitigate any potential biases in their systems, and take a “data protection by design and default” approach from the outset.


Before deploying LFR in public places, the ICO document says users must assess the risks and potential impact on individuals’ interests, rights and freedoms, including on their wider human rights such as freedom of expression, association and assembly.

The organisations examined by the ICO have all stopped processing personal data using LFR and, where relevant, have provided assurances that all biometric data collected has been deleted. These six cases under investigation were therefore closed with no further action, but a number of further investigations into LFR cases are in progress.

Writing a blog on the publication of the Opinion, Denham wrote: “I am deeply concerned about the potential for live facial recognition (LFR) technology to be used inappropriately, excessively or even recklessly. When sensitive personal data is collected on a mass scale without people’s knowledge, choice or control, the impacts could be significant.

“We should be able to take our children to a leisure complex, visit a shopping centre or tour a city to see the sights, without having our biometric data collected and analysed with every step we take.”

Face recognition accuracy and bias

If LFR systems are not sufficiently statistically accurate, they may produce false positives or false negatives. In some cases this may have no significant consequences, but in others it could lead to unwarranted interventions such as additional surveillance, removal from premises, or being detained by the police.

“The Commissioner believes that lawfulness, fairness and transparency, including a robust evaluation of necessity and proportionality, are the crucial issues for controllers to address before deploying LFR in a public place,” says the ICO document. “This is based on her assessment of current uses of LFR and her interpretation of data protection legislation. Controllers must comply with all part[s] of the legislation, but these issues are key challenges in the context of LFR.”

The document goes on to say that “where LFR is used for the automatic, indiscriminate collection of biometric data in public places, there is a high bar for its use to be lawful…[and] any investigation or regulatory assessment would be based on the facts of the case, considering the specific circumstances and relevant laws.”


READ: Why AI and facial recognition software is under scrutiny for racial and gender bias 


The compilation of watchlists must itself comply with data protection law, meeting the same requirements of lawfulness, fairness, necessity and proportionality. Whenever a watchlist is used for LFR, the Commissioner expects controllers to:

  • strictly limit the images they include on the watchlist to those which are necessary and proportionate
  • ensure watchlist images are retained only as long as necessary, in line with the data minimisation and storage limitation principles
  • include only images that are lawfully acquired and accurate, ensuring that they understand their provenance
  • process images fairly, considering possible adverse impacts for the individual
  • ensure transparency and that individuals can exercise their rights – including the right to be informed, to erasure, and to object – unless relevant exemptions apply
  • ensure watchlists are compiled and maintained by staff who have sufficient knowledge of data protection to comply with the requirements of the law.

Addressing the LFR industry, the document recommends that developers, vendors and service providers should:

  • put a ‘data protection by design and default’ approach at the heart of any new products and systems
  • address and reduce the risks of bias and discrimination in LFR systems, and the algorithms that power them
  • be transparent about the effectiveness of LFR systems and consider adopting common standards to assess and describe their statistical accuracy
  • educate and advise controllers on how systems work, and be transparent about the potential data obligations that controllers need to meet.

These steps, it says, will be crucial to building and maintaining the trust and confidence of the public.

Commenting on the publication of the Opinion, Denham concluded: “It is not my role to endorse or ban a technology but, while this technology is developing and not widely deployed, we have an opportunity to ensure it does not expand without due regard for data protection.

“We will work with organisations to ensure that the use of LFR is lawful, and that a fair balance is struck between their own purposes and the interests and rights of the public. We will also engage with Government, regulators and industry, as well as international colleagues to make sure data protection and innovation can continue to work hand in hand.”

The British Security Industry Association (BSIA) has released a best practice guide on the use of facial recognition for the security industry, which you can read more about here: Industry first ethical automated facial recognition framework launched by BSIA

You can also watch an exclusive IFSEC Global interview on the subject with the Surveillance Camera Commissioner, Fraser Sampson, here.

