“Perpetual, irrevocable”: The privacy implications of FaceApp and facial recognition
Although no comprehensive studies have been conducted, estimates suggest that somewhere between less than 1% and 8% of users read the terms and conditions when installing software or downloading a new app.
As a result, very few users know what permissions they’re giving away to companies. Indeed, a 2017 study by York University in Toronto found that over three-quarters of subjects unwittingly signed their first-born child over to a fake social media platform simply because they’d failed to read the end-user agreement.
Understanding our right to information security and personal privacy is increasingly important – particularly as many commonly used apps collect and retain ever more of your personal data. Take a look at any end-user agreement and you’ll realise you’re granting any number of apps the ability to read your files, use your camera and microphone, and even post to your social media accounts.
The trouble with FaceApp
We’ve seen those privacy concerns come to the fore in the last week with the re-emergence of FaceApp, an app that lets you edit your selfies with filters that change your age, appearance or gender presentation. Twitter users were happily sharing hilarious, elderly, wrinkled selfies of their future selves under the hashtag #faceappchallenge – happily, that is, until somebody pointed out FaceApp was developed by a Russian company.
The internet was filled with scare stories about how much data the Russians were seizing, and what nefarious schemes their “perpetual, irrevocable, nonexclusive, royalty-free, worldwide” right to your aged-up selfies might serve. As it turned out, not nearly as much as other social apps such as WhatsApp, Facebook and Instagram. Google has admitted to using eight million user images to train AI facial recognition software, while Facebook may have tapped up the photos of as many as 10 million users.
FaceApp itself reassured users, telling 9to5Mac that although its “core R&D team is located in Russia, the user data is not transferred to Russia.” Scant relief, perhaps: Russia may be the current privacy bogeyman, but Western companies and governments are equally culpable.
So, panic over? Maybe, but the FaceApp saga speaks to the way privacy is becoming a much more complex subject.
One such warning doing the rounds on Twitter: “⚠️ FACEAPP WARNING ⚠️ Security experts say be careful if you’re going to use the viral app FaceApp. The Russian-owned app collects your photos, and more. Be sure to share this warning with your friends and family. https://t.co/RqgLqUwzT1 #faceappchallenge #faceapp”
This is a particular concern when it comes to facial recognition software, which is increasingly being adopted by police forces to help identify criminals. Like FaceApp, however, this technology can indiscriminately store information about anyone it captures, regardless of their relevance to a criminal case. Nor is it confined to authoritarian governments – a man was fined £90 for covering his face during a trial of facial recognition software in London earlier this year.
Such is the level of concern around privacy that the House of Commons Science and Technology Committee recently released a report recommending the UK government cease all facial recognition trials, citing a lack of oversight.
Under current rules, police must remove images of unconvicted individuals from their custody databases after six years – something forces have so far failed to do consistently. As a result, innocent people’s pictures have remained on police databases illegally, meaning they will continue to be scanned by facial recognition software. As of January 2018, the police held 12.5 million images.
Who watches the watchmen?
Companies are already working to tackle this invasion of privacy. At IFSEC International this year, IDIS launched its new cost-free dynamic privacy masking technology, designed to anonymise video surveillance footage by automatically blurring sensitive areas such as faces and vehicle number plates.
It’s a neat response to the demands of GDPR, which gives anyone the right to request access to personal data recorded by surveillance cameras but requires all other individuals in that footage to be anonymised. This kind of technology is becoming more and more important, helping to protect the privacy of individuals in an increasingly watched world.
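For a sense of how this kind of masking works in principle – this is a minimal sketch using OpenCV’s stock face detector, not IDIS’s own implementation, and the input filename is hypothetical – a few lines of Python can blur every detected face in a recorded clip:

import cv2

# Load OpenCV's bundled Haar cascade for frontal faces.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def mask_faces(frame):
    # Detect faces on a greyscale copy, then blur each region beyond recognition.
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        frame[y:y+h, x:x+w] = cv2.GaussianBlur(frame[y:y+h, x:x+w], (51, 51), 0)
    return frame

cap = cv2.VideoCapture("cctv_clip.mp4")  # hypothetical input file
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("masked", mask_faces(frame))
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()

A production system would need far more robust detection (faces at an angle, partial occlusion, number plates) and would have to un-mask only the person who made the GDPR request, but the underlying principle – detect, then irreversibly obscure – is the same.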
Olaf Jensen