IFSEC 2019

Modelling risk and crisis management for 2030

Julian Hall

Freelance journalist and copywriter, Textual Healing

July 16, 2019


“I imagine that, compared to 10 years ago the world looks pretty damn messy to you at the moment”, began Dr David Rubens, executive director at the Institute of Strategic Risk Management, in a panel debate on day three at IFSEC 2019.

Introducing the session about identifying risk and crisis management – part of the launch of the ‘ISRM Manifesto: Crisis 2030’ – he continued:

“The things that we thought we knew about security and risk management seem to be dissolving in the face of challenges that were inconceivable 20 years ago, and they seem to be developing at a pace which is leaving behind our capacity to model and engage with them.”

Rubens imagined what risk and security managers felt like waking up to that week’s power failures in South America and realising that everything they had put in place would probably not work. At this point, he asked the assembled delegates whether they thought their own crisis management programmes would work.

“If your crisis management plan works, it’s not a crisis”

It was a trick question: “If your crisis management plan works it’s not a crisis; the crisis is when it does not work.”

He went on to observe that if you read executive summaries of disaster reports – be it Grenfell, Deepwater Horizon, Boeing, Volkswagen or Hurricane Katrina – the one word that keeps cropping up is “overwhelming”, and this was “the definition of a crisis”.

“Are the changes to threats evolutionary – part of the natural change and development – or revolutionary – knocking down and rebuilding – or, as is my own personal opinion, are the threats mutational and the tools and methodologies we have are not strong enough?”

“There’s no question that if we look at natural disasters, technology failures, mutational viruses (technological or biological), global warming, the frequency of threat occurrence is getting faster and the impact greater.”

Dr David Rubens

Dr David Alexander, Professor of Risk and Disaster Reduction at UCL, introduced his contribution with a 1997 quote from sociologist of disasters Professor Enrico Quarantelli: ‘Technology leads a double life, that which its designers and makers intended it to and that which they didn’t.’

“I’m part of a cascading disasters research group” began Alexander, “and we do a lot of black sky thinking. Blue sky thinking is when you look out of the window and let your mind wander a bit. Black sky thinking is asking ‘what do you do when the lights go out?’”

The idea of the lights going out universally might seem fantastical, but Alexander gave the example of a solar storm in 1859 that knocked out the telegraph – the only method of telecommunication at the time. If it happened today “it would knock out satellites, transformers and we would have problems for years on end with electricity supply. In 2012 a solar storm just missed the Earth…we have to think about worst case scenarios.”

Rick Cudworth, Crisis and Resilience Partner at Deloitte, characterised risks as “evermore complex and uncertain and travelling towards us at increased velocity, meaning less time to prepare for them.” In his opinion, crisis management may have been partially successful in controlling and reporting risk but “a lot less successful in being agile and resilient to risk, and that is one of the big challenges for risk management going forward.”

However, Cudworth said he was starting to see a changing mindset in government circles and in corporations, with their attitude towards crisis shifting across the last 10-15 years from “it’s never going to happen” to “if it happens” to “when it happens.”

The 2010 explosion on Deepwater Horizon killed 11 crewmen and ignited a fireball visible from 40 miles away

Kev Brear, a consulting partner at Wipro Ltd, began by quoting analyst and consultant in crisis intelligence Patrick Lagadec and his term ‘megacrises’, which Brear interpreted as meaning “not having the bandwidth to absorb the shocks and knocks that we have done in the past.” One factor that has exacerbated this is ‘just in time’ production, with industries coming to regard the stockpiling of resources as bad management. But ‘just in time’ relies on connectivity and on organisations being aligned. “When we think about disrupting this chain, you can begin to see how these crises emerge and impact on our daily lives.”

“It’s not about plans, it’s about building adaptable structures”

Outlining a response to an existential challenge, Brear said: “It’s not about plans, it’s about building adaptable structures and having the people within those structures being able to adapt and evolve to a challenging situation.” Cultural change, he added, is key.

Grenfell and strategic and operational concerns

The panel then took questions, with one juxtaposing the strategic and operational measures around the Grenfell and Notre Dame fires. The issue raised the question of what resources and capabilities an organisation has at its disposal. Meanwhile, Dr David Alexander said that “there’s nothing that doesn’t have antecedents” and so disasters such as Grenfell should not be regarded as ‘Black Swans’.

David Rubens, who had earlier placed the emphasis on ‘Grey Rhinos’ (immovable problems demanding our attention) rather than ‘Black Swans’ (unexpected events that we notice and then move on from), pointed out that it was the local council’s response to Grenfell that made it a disaster. What also characterised Grenfell as a disaster was the communications failure. “The function of crisis management is not to tell people what to do, it’s to let them know what is going on.”

Getting it right first time?

“How do we teach people to learn to deal with a crisis?” asked Rubens. “You have to deal with one first.” He gave the examples of learning from the first foot and mouth outbreak and the difference between the management of the Gatwick and Heathrow drone situations. Then he asked: “How do we create capability that works first time?”

This was also the concern of another delegate, who asked whether security and risk managers would always be stuck in a vicious circle: “If a crisis by nature is an unknown, what are you supposed to do about it?”

Rick Cudworth emphasised the value of an organisation’s capabilities being demonstrated in rehearsals and teaching people to “organise the chaos”. He mentioned that an increasing number of organisations now have to plan for an existential crisis, for example banks and insurance firms, and enable failsafe options.

Dr David Alexander testified to the value of scenarios, “not as predictions, but as methodologies of what possible future might occur.” They can also be used to argue for more funds at the next budget meeting, he added.

Kev Brear of Wipro Ltd admitted to being wary of planning and playbooks that give “predetermined scenarios and outcomes.” He referenced Patrick Lagadec again who once said: “We cannot know the unknowable, guess the unguessable, or imagine the unimaginable – all we can do is prepare, train and equip our people to deal with the challenges that these threats create.”

Reputation management

There was also a question about whether some corporate reputations will always be protected, no matter the disaster – VW, KFC, BP and Exxon Mobil, for example.

Rick Cudworth believed that “no brand is totally insulated.” Multiple events can leave a company irreparably damaged. Conversely, a company can emerge stronger if it responds and makes changes.

Kev Brear referenced some Oxford University research that showed that share prices go down after a cyber breach but rally when the company involved addresses it. Likewise, a company’s fortunes can pick up when they start making efforts to rehabilitate with ethical measures.

Dr David Alexander reminded delegates that some companies – Pan Am and TWA, for example – were considered too big to fail, but they failed nonetheless. He then set out the difference between how Iceland and the UK dealt with the banking crisis, drawing out from that the role political support plays in the eventual crisis outcome.

Finally, David Rubens observed that crisis management is the study of failure and so “we often spend most of the time looking at what went wrong but we can learn from case studies of success”, for example KFC taking responsibility for their supply crisis last year.

He also made the point that the organisations that believe they are too big to fail are often the ones that bypass regulatory oversight – e.g. the banks in 2008, Boeing and VW – and do so in a way that factors in the fines for taking those risks.
