PRIVACY CONCERNS

Automated facial recognition can benefit society – if we support its development properly


Founder, Cloudview


James is a co-founder of Cloudview, which leads the way in cloud-based video surveillance with a secure, scalable, user-friendly and affordable platform that can be managed and accessed from a browser on a notebook, tablet or smartphone from anywhere in the world.
November 12, 2018



There is an approach to urban design which looks at the whole landscape, and creates environments around core principles.

For example, doing certain things, and not doing others, can increase people’s desire to walk or cycle from A to B rather than get in their cars.

What’s that got to do with automated facial recognition (AFR) technology? Well it seems to me that we – both the industry and those who regulate it – don’t do enough to look at the technology in the round, considering its potential across all aspects of our lives, and taking steps to promote ways it can be used in our favour.

AFR has genuinely positive benefits for modern society. Facial recognition can be used as a biometric for verification – indeed it is increasingly used in this way. Passports, computer and phone logins, building access, even mobile payments can use face recognition to smooth our way through 'gateways'.
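At its simplest, verification of this kind works by comparing a mathematical 'embedding' of the live face against a stored template and accepting the claimed identity only if the two are close enough. The sketch below illustrates the principle only: the four-dimensional vectors, the `verify` function and the 0.8 threshold are all invented for illustration; real systems use embeddings with hundreds of dimensions produced by a neural network, and tune the threshold to trade off false accepts against false rejects.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify(enrolled_embedding, live_embedding, threshold=0.8):
    """Accept the claimed identity only if the live capture's embedding
    is close enough to the enrolled template (hypothetical threshold)."""
    return cosine_similarity(enrolled_embedding, live_embedding) >= threshold

# Toy embeddings, purely for illustration.
enrolled = [0.9, 0.1, 0.3, 0.2]
same_person = [0.88, 0.12, 0.28, 0.22]  # slightly different capture, same face
impostor = [0.1, 0.9, 0.2, 0.8]         # a different face entirely

print(verify(enrolled, same_person))  # True
print(verify(enrolled, impostor))     # False
```

The point of the sketch is that verification (is this the person they claim to be?) is a one-to-one comparison, which is a much narrower task than the one-to-many identification used in public-space surveillance.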

When combined with other capabilities such as analysis of body movements, AFR can contribute to the detection of unusual behaviours that might be indicators of ill health such as heart attacks or strokes. Used in public places or other locations like care homes, this could trigger calls for assistance – and ultimately save lives.


And, by the way, what’s wrong with using AFR to identify those who are known to law enforcement agencies? These agencies have many tools at their disposal, and AFR is a legitimate part of the mix.

Successes in initial trials of AFR in South Wales include the identification of a body in a non-suspicious sudden death case, the arrest of a man wanted on a recall to prison and a number of other arrests, charges and prison sentences. A recent trial by Humberside Police on foot passengers arriving at the port of Hull successfully identified three police officers acting as ‘controls’ and a number of people on their watch list.

Yet all too often when AFR hits the news it is cast in a negative light. For example, there was a lot of reporting in the mainstream press earlier this year when Big Brother Watch issued a claim in the High Court requesting permission to proceed with a judicial review of the Metropolitan Police's use of facial recognition surveillance, which they claim "tramples over civil liberties", breaches people's rights to a private life and signals a "slippery slope towards an Orwellian society".

Towards a better approach to policy making

Whatever the pros and cons of that particular argument, its very existence creates an adversarial approach to policy making around AFR, with those using the technology immediately put onto the back foot, having to defend their approach, while governments are challenged for even allowing AFR to exist.

Police and law enforcement agencies, and others using AFR, may indeed be going through learning curves as the technology matures. After all it is early days for the development of the complex artificial intelligence and data analytics that lie behind AFR systems.

But rather than just reacting to challenges, let’s acknowledge this, do our best to understand where the problems lie and work towards solving them.

For the AFR sector to make genuine progress towards efficient systems backed up by highly effective artificial intelligence, the policy-making and regulatory system needs to be able to stop watching its back and start looking to the future. We need to move away from a challenge/response model into a more collaborative one.

So, for example, policy makers need to have the confidence to proceed knowing that personal privacy is protected by existing legislation including the European Convention on Human Rights, the EU Charter of Fundamental Rights, the EU General Data Protection Regulation and the UK Data Protection Act 2018.

Working within the existing legal framework, policy makers can take a stance that makes the desire to balance risks and benefits transparent. After all, a binary 'good versus bad', or 'yes versus no', approach applies to very little in modern life.

A new direction

Public perception of AFR suffers from the historical legacy of a CCTV sector that has often been found wanting when it comes to data security: cameras sold with firmware back doors that are badly secured – or not secured at all – and visual data storage systems whose security is so lax that basic port scanning can find a way through, often in minutes.
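To see how low the bar is, consider what 'basic port scanning' actually involves. The sketch below is a minimal TCP connect scan in Python's standard library – no exploit, just connection attempts. The address `192.0.2.10` is a reserved documentation address standing in for a hypothetical camera; the port list reflects services commonly left exposed on poorly secured devices.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Attempt a plain TCP connection to each port; any port that
    accepts the connection is reported as open."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Hypothetical device address; telnet (23), HTTP (80) and RTSP (554)
# are typical of the services insecure cameras leave reachable.
found = scan_ports("192.0.2.10", [23, 80, 554])
```

That a dozen lines of standard-library code constitute a credible first probe is precisely why unsecured devices are discovered, and compromised, in minutes.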

This is a sign of an industry in which vendors have no incentive to secure their hardware. The whole burgeoning connected devices market is open to the same problems.

It is perhaps not surprising that insecure CCTV, coupled with the inevitable issues that stem from fledgling AI and data analytics, creates a perfect storm of negativity for AFR.

Policy and lawmaking are not the only ways to change behaviours. Social pressures can be just as effective. For example, the anti-smoking movement in the UK benefited from a mix of legislative and social pressures to turn smoking from something that was chic and fashionable into something that is frowned upon. When it comes to AFR, the focus of social pressure could be self-regulation.

Consider this thought experiment. The industry could benefit from working with government to develop a kitemarking system that recognises a set of minimum standards of data security, product reliability and so on. Once in place, central and local government agencies could be encouraged (or even required) to only purchase from kitemarked suppliers.

Other organisations, from schools and hospitals through to businesses, may start to see value in using only kitemarked suppliers, since this guarantees certain standards are met. Suppliers may experience tough competition as the number of users of AFR grows, and at the same time the kitemark proves a good way of separating better suppliers from weaker ones.

Over time, competition among providers means they develop the standards further, adding new elements or simply raising the bar.

In this imaginary scenario, the industry competes for custom by proving its ability to provide high quality, secure services, and striving to progress the technical competence of AI and data analytics in accordance with market demand. Those without kitemarks become insignificant in the market. Meanwhile, back in the real world, policy makers continue to simply react to challenges, leaving little room for a more imaginative approach.

I think I prefer the imaginary world. So, how do we make it real?

More information about AFR is available in the whitepaper CCTV, Data Analytics and Privacy: The Baby and the Bathwater, commissioned by Cloudview and written by Andrew Charlesworth, Professor of Law, Innovation & Society at the University of Bristol.

