Assistant Editor, Informa

February 2, 2023




Tensions rise over France’s plan for AI video surveillance at Paris Olympics

The French government is ‘fast-tracking’ special legislation for the 2024 Paris Olympics that would allow the use of video surveillance assisted by artificial intelligence (AI) systems.

Ministers say ‘exceptional security measures’ are needed to ensure the smooth running of the Games, which are expected to attract 13 million spectators, but human rights groups have warned that France is seeking to use the event as a ‘pretext’ to extend police surveillance powers, which could then become permanent.

The measures include a proposal to legalise the use of AI-assisted video surveillance.

For the first time in France, according to The Guardian, this would allow automated video camera surveillance, in which AI algorithms would be used to detect suspicious or “abnormal” activity in crowds. These algorithms would analyse video images from fixed CCTV cameras or drones, highlighting behaviour deemed abnormal or suspicious, which would be automatically signalled to police, who could then act.

The AI systems would be able to monitor crowds in and around the stadiums, on streets and on public transport – all designed to avoid a repeat of the chaotic scenes at last year’s Champions League final, where fans were reportedly teargassed and mugged around the Stade de France.
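The pipeline reported here, in which a model labels behaviour in each camera feed and only events flagged as abnormal are signalled to operators, can be sketched in rough terms. The behaviour labels, confidence threshold and `Detection` type below are illustrative assumptions, not details of the French system:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    camera_id: str
    timestamp: float
    behaviour: str   # label produced by an upstream video model
    score: float     # model confidence, 0.0 to 1.0

# Behaviours a hypothetical policy deems "abnormal", and the
# confidence threshold above which a human operator is alerted.
ABNORMAL_BEHAVIOURS = {"crowd_surge", "abandoned_object", "counterflow"}
ALERT_THRESHOLD = 0.8

def triage(detections):
    """Return only the detections that should be signalled to an
    operator; everything else is dropped rather than escalated."""
    return [
        d for d in detections
        if d.behaviour in ABNORMAL_BEHAVIOURS and d.score >= ALERT_THRESHOLD
    ]

feed = [
    Detection("cam-12", 1690000000.0, "walking", 0.99),
    Detection("cam-12", 1690000030.0, "crowd_surge", 0.91),
    Detection("cam-07", 1690000031.0, "abandoned_object", 0.55),
]
alerts = triage(feed)
print([d.behaviour for d in alerts])  # prints ['crowd_surge']
```

The key design question, as the critics quoted below point out, is who chooses the contents of `ABNORMAL_BEHAVIOURS` and where the threshold sits.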

The Government has said that facial recognition technologies are not included in the security proposals, adding that any automated video surveillance would be used for a “defined period”.

A “turning point” in AI surveillance?

Plans for additional surveillance measures commonly draw criticism on data privacy grounds. Human rights groups, such as Amnesty International, argue that the proposal is set to cover “all sporting, festive and cultural events in a broad time frame”, including after the Games end – citing concerns that the measures could then become permanent law.

Amnesty International called the proposal a “turning point” in the use of AI surveillance technologies in France and said it marked a “dangerous step” for human rights and privacy rights.

Katia Roux, the Advocacy Officer on technology and human rights at Amnesty International France, said: “We’re deeply worried by the fact that these algorithms will be able to analyse images from fixed CCTV cameras or drones to detect ‘abnormal or suspect’ behaviour. First, there is the issue of defining abnormal or suspect behaviour – who will decide what behaviour is the norm or not?

“Also, in terms of human rights and fundamental freedoms, we consider the proposal presents a danger to the right to a private life; it could also impact freedom of expression, freedom of assembly, and the principle of non-discrimination.”


Roux said that even though the government had said no biometric data would be used, “in reality the algorithms will analyse behaviour, and physical data, which is data that must be protected”.

Bastien Le Querrec, of the French NGO La Quadrature du Net, which campaigns on data and privacy issues, said that until now in France, video surveillance required “a human behind a screen” analysing CCTV footage, which made it impossible to monitor the entire public space all the time.

“But with the use of algorithms, that human limit is no longer there – it would be possible to permanently survey all video camera images – and there are several hundreds of thousands of video cameras in France. This means that any person filmed could see their behaviour analysed, their movements detected and classified, and an algorithm deciding whether they are normal or abnormal.”

The French sports minister, Amélie Oudéa-Castéra, has described the bill as introducing “essential adjustments” in order for the Games to run smoothly. The interior minister, Gérald Darmanin, said the bill would give a French “framework” for security for the Games.

The bill will be considered by the National Assembly, parliament’s lower house, in February after Senate approval on Tuesday, but campaigners have called for more public debate.

Exclusive security industry insight: AI as a support mechanism for trained professionals

Pauline Norstrom, CEO at Anekanta Consulting, an AI innovation and strategic advisory company, offered IFSEC Global her thoughts on the French Government’s plans:

“It is widely known that humans cannot maintain their attention on multiple cameras simultaneously. To solve this problem, analytics-based detection technology was developed and has been used routinely with surveillance cameras for many years.

“More sophisticated AI technologies have brought the professional security process a means of locating people very quickly across multiple scenes using defined search criteria such as clothing type.

“The bill passed in France on 31 January 2023 specifically permits the use of such AI technology at the Paris Olympics 2024 – in this case, to automatically analyse human behaviour in public places such as stadia and public transport networks, to look for activities which just feel wrong.

“The elephant in the living room here, rather than the fact that AI-based cameras will be used, may be the subtle reference in French government reports to the Court of Auditors’ estimate that there may be a shortfall of over 30,000 security officers per day required to properly secure the events. Without AI technology, security professionals on the ground would use specially trained eyes to spot suspicious behaviour and act appropriately.

“If AI is trained with the experts in the room, so to speak, there is a greater chance of it working as intended – in this case, mimicking human understanding of the behaviours which precede terrorism, and in doing so creating a form of intuition about people that can alert limited resources to investigate further and potentially act in a professionally trained manner.

“The potential impact of such use of AI in public places on human rights is significant, for a multitude of reasons – privacy, discrimination and so on. However, if used within the guardrails of professional security processes and the law, and with trained operators, there is a lower risk of causing harm and creating the chilling effect associated with mass surveillance, and a lower risk of blind spots where bad actors may seek an opportunity to prevail.”
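The attribute-based search Norstrom describes, locating a person across multiple scenes by defined criteria such as clothing type, reduces to filtering per-camera detections against a query. A minimal sketch, with entirely hypothetical camera names, tracks and timestamps:

```python
from collections import defaultdict

# Hypothetical per-camera detections, each tagged by an upstream model
# with simple appearance attributes (no biometric identifiers).
detections = {
    "cam-north-gate": [
        {"track": "t1", "clothing": "red jacket", "time": "14:02"},
        {"track": "t2", "clothing": "blue coat", "time": "14:03"},
    ],
    "cam-platform-2": [
        {"track": "t7", "clothing": "red jacket", "time": "14:11"},
    ],
}

def search(criteria):
    """Collect sightings matching the clothing criteria across all scenes,
    grouped by the camera that produced them."""
    hits = defaultdict(list)
    for camera, dets in detections.items():
        for d in dets:
            if d["clothing"] == criteria:
                hits[camera].append(d["time"])
    return dict(hits)

print(search("red jacket"))
# prints {'cam-north-gate': ['14:02'], 'cam-platform-2': ['14:11']}
```

In a real deployment the attribute tags would come from a video model rather than hand-entered data, which is exactly where the definitional and accuracy concerns raised by Amnesty International apply.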



