
Freelance journalist and copywriter, Textual Healing

June 21, 2019


IFSEC 2019

Overcoming the disruptive effects of criminality with AI video analytics

Warren Stein of BriefCam gave a detailed rundown of how AI and machine learning were being leveraged against crime in the IFSEC 2019 Keynote Theatre.

Stein is the Vice President of Sales, EMEA & APAC, at BriefCam, the industry’s leading provider of Deep Learning and Video Synopsis solutions for rapid video review and search, face recognition, real-time alerting and quantitative video insights.

Introduced by security journalist Frank Gardner, Stein set out the high cost of crime (estimated at £37 billion each year in the UK) as well as its wider consequences: trauma, mental health issues, lost productivity and the deterrent to investment when an area is blighted by crime.

Stein suggested three approaches, acknowledging that he was stating the obvious with the first: “to reduce the cost and impact of crime by reducing the amount of crime.” Also on the list were “saving money in the long run by spending in the short run” and finally “spending smart to spend more.”

He described the rearguard action against crime fought with security technology: some 350 million IP video surveillance cameras have been shipped globally, 88% of which record non-stop. Some organisations, he noted, had up to 200 cameras across their premises.


The use of AI is not new, said Stein, but the way it is now being used “to transform video into actionable intelligence” is, given the massive amount of data now available. The investigation into the 7/7 tragedy, for example, involved studying 80,000 video segments – over 6,000 hours of footage.

Stein also highlighted the versatility of surveillance: cameras installed for one purpose often serve several, so retail, hotel and traffic cameras have a role to play alongside face recognition and licence plate recognition, for example.

He gave an example of an AI/machine learning/deep learning security initiative: a case study of Hartford, Connecticut, a high-crime area with an enormous drug problem. Stein explained: “The police in Hartford were reluctant to pay for video analytics unless it could be justified to help get more investment.”

What they eventually got was a system that could zero in on specific behaviours, drawing pathways of human traffic that, for example, located a crack house (visited by 656 people in one 12-hour period), then applying filters to isolate suspects by attributes such as gender and clothing. Ultimately, the technology helped bring investment back into the city, as it made a real impact on the crime rate.
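That filtering step is easy to picture in code. Below is a minimal, purely illustrative sketch of attribute-based filtering over pre-extracted detections; the schema, field names and values are hypothetical and not BriefCam's actual API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Detection:
    """One person detected in analysed footage (hypothetical schema)."""
    timestamp: datetime
    location: str       # camera or zone identifier
    gender: str         # classifier output, e.g. "male"
    upper_colour: str   # dominant clothing colour, e.g. "red"

def visits_in_window(detections, location, start, hours=12):
    """Collect visits to one location within a time window,
    e.g. the 656 visits to the crack house in 12 hours."""
    end = start + timedelta(hours=hours)
    return [d for d in detections
            if d.location == location and start <= d.timestamp < end]

def filter_suspects(detections, gender=None, upper_colour=None):
    """Narrow a set of detections by appearance attributes."""
    out = detections
    if gender is not None:
        out = [d for d in out if d.gender == gender]
    if upper_colour is not None:
        out = [d for d in out if d.upper_colour == upper_colour]
    return out
```

The point of such a pipeline is that investigators never trawl raw footage: they query detections already extracted by the deep learning models, narrowing thousands of hours down to a handful of candidates.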

This level of capability is far more advanced technically than the first iteration of AI twenty years ago, but also more nuanced and less reliant on classification. It is, as Stein admitted, still expensive, though it will become cheaper.

But cost isn't the only issue, and Stein was probed by Gardner on the ethical issues around security and privacy. “Where is the right balance?” asked Gardner. “The world has gone a little bit nuts,” admitted Stein. “Compliance is essential.”

“Where I mark the line is not easy to say,” he continued. “There’s a big difference between a security agency pursuing an immediate threat, e.g. terrorism, and a commercial situation, e.g. retail.”

“I’m not sure that privacy will win the war,” Stein concluded. “I started in this business nine years ago and Germany didn’t have cameras; now they have started deploying them.”

Privacy is by no means the only concern that could limit security technology. False alarm rates (FARs), for example, mean that relying on AI to identify people carrying a knife or a gun is problematic, and Stein noted that the pressure to deliver predictions in real time suffers from this too. He gave examples of behavioural analysis, such as gait recognition, but said that when it came to, say, predicting that someone would start a fight, there was room for misunderstanding. “What if I jump for joy when I meet someone at the airport? We still need a person behind a screen to understand the situation.”
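The problem Stein describes is partly a base-rate effect: when the event being watched for is rare, even an accurate detector produces mostly false alarms. A rough worked example (the numbers are invented for illustration, not figures from the talk):

```python
# Invented numbers for illustration: a knife detector that is right
# 99% of the time, run over 1,000,000 frames of which only 50
# actually contain a knife.
frames = 1_000_000
true_events = 50
sensitivity = 0.99        # share of real knives correctly flagged
far = 0.01                # false alarm rate on innocuous frames

true_alarms = true_events * sensitivity          # ~49.5
false_alarms = (frames - true_events) * far      # ~10,000

precision = true_alarms / (true_alarms + false_alarms)
print(f"{false_alarms:.0f} false alarms vs {true_alarms:.1f} real ones "
      f"-> only {precision:.1%} of alerts are genuine")
```

Under these assumptions a 99%-accurate system still raises roughly 200 false alarms for every genuine one, which is why, as Stein put it, a person behind a screen is still needed.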
