IFSEC Insider is operated by a business or businesses owned by Informa PLC and all copyright resides with them. Informa PLC's registered office is 5 Howick Place, London SW1P 1WG. Registered in England and Wales. Number 8860726.
James Moore is the Managing Editor of IFSEC Insider, the leading online publication for security and fire news in the industry.
James writes, commissions, edits and produces content for IFSEC Insider, including articles, breaking news stories and exclusive industry reports. He liaises and speaks with leading industry figures, vendors and associations to ensure security and fire professionals remain abreast of all the latest developments in the sector.
Growing use of deepfakes puts pressure on integrity of video evidence

There is growing concern over deepfake videos, making it increasingly important for video technology users to prove the integrity of their evidence.
As The Guardian reported back in 2019, deepfakes – the ability of AI to fabricate apparently real footage of people via digital video manipulation techniques – are a growing problem. So much so that both the British Government and US Congress have been looking into regulation surrounding the issue.
According to Forbes, at the beginning of 2019 there were close to 8,000 deepfake videos circulating online. By mid-2020, the number had almost doubled to just under 15,000, and since then it has only continued to rise.
There is also concern that deepfakes are being used for increasingly fraudulent purposes that present significant security and political risks. Social media can help to spread fake footage quickly, with the lines becoming ever more blurred between artificially created and real video footage. The Brookings Institution, a US public research organisation, notes that the “capacity to generate deepfakes is proceeding much faster than the ability to detect them,” causing widespread confusion as officials are increasingly able to “challenge the authenticity” of video testimony.
IDIS, the South Korean video tech manufacturer, is one company that has highlighted that the use of deepfakes – and the confusion created simply by awareness of their existence – will put pressure on both video tech users and prosecutors to demonstrate the integrity of footage being used as evidence.
“As we look ahead, wherever video is presented for use as legal evidence, or as part of internal disciplinary proceedings, we will see more attempts to assert that footage is not genuine. Courts will dismiss evidence where tampering cannot be ruled out,” says Dr Peter Kim, Global Technical Consultant, IDIS.
The company says its end-to-end video solutions have in-built “protection of video footage integrity” thanks to its patented Chained Fingerprint algorithm. This technology allows a unique numerical fingerprint to be assigned to each frame, meaning that “every single frame of the video is linked by an encryption chain with its neighbouring image frames”. The chain is then stored with the video recording via the IDIS ClipPlayer. If any part of the footage has been tampered with, the chain will be broken and an alert raised, the company explains.
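IDIS’s Chained Fingerprint algorithm is patented and its internals are not public, but the idea the company describes – each frame linked to its neighbours so that tampering breaks the chain – resembles a standard hash chain. The sketch below is purely illustrative and assumes SHA-256 and raw frame bytes; the function names and the seed parameter are hypothetical, not part of any IDIS API.

```python
import hashlib

def chain_fingerprints(frames, seed=b"stream-id"):
    """Compute a hash chain over video frames: each fingerprint covers
    the frame's bytes plus the previous fingerprint, so altering,
    removing, or reordering any frame breaks every later link."""
    fingerprints = []
    prev = hashlib.sha256(seed).digest()
    for frame in frames:
        prev = hashlib.sha256(prev + frame).digest()
        fingerprints.append(prev)
    return fingerprints

def verify_chain(frames, fingerprints, seed=b"stream-id"):
    """Recompute the chain and compare against the stored fingerprints.
    Returns the index of the first frame where the chain breaks,
    or -1 if the footage is intact."""
    prev = hashlib.sha256(seed).digest()
    for i, frame in enumerate(frames):
        prev = hashlib.sha256(prev + frame).digest()
        if i >= len(fingerprints) or prev != fingerprints[i]:
            return i
    return -1
```

Because each fingerprint depends on the one before it, a verifier replaying the chain against stored fingerprints can pinpoint the first tampered frame rather than merely flagging that something changed – consistent with the alert-on-broken-chain behaviour the company describes.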
Deepfake videos have already caused significant national security issues, such as in Gabon in 2018. As the technology continues to improve, security professionals will need to ensure the integrity of their footage can withstand lines of questioning they may not have encountered before.
Discover the latest developments in the rapidly evolving video surveillance sector by downloading the 2022 Video Surveillance Report. With responses from installers and integrators through to consultants and heads of security, we explore the latest trends, including AI, the software and hardware most in use, cyber security challenges, and the wider economic and geopolitical events impacting the sector.
Download for FREE to discover top industry insight around the latest innovations in cameras and video surveillance systems.