Lessons from Iraq for cyber security analysts
"I am going to inoculate you against over-confidence,” said George Beebe to the hundreds of cyber security specialists attending his keynote on day 1 of the ITWeb Security Summit 2020.
In a thought-provoking talk, titled ‘Lessons from failure: How cyber security professionals can learn from analytic mistakes in other fields’, Beebe, VP & director of studies at the US’s Center for the National Interest (CFTNI), discussed the challenges of anticipating attacks by adversaries.
A former head of Russia analysis at the Central Intelligence Agency, and author of ‘The Russia trap: How our shadow war with Russia could spiral into nuclear catastrophe’, Beebe drew a parallel between the incorrect judgements made about Iraq’s possession of weapons of mass destruction (WMDs) and the challenge cyber analysts face in trying to predict the next attack.
He said that while topics may vary and technologies may advance, the principles of good analysis – breaking something down into its component parts and understanding how it works – don’t change. “Regardless of whether we’re looking at zero-day exploits or Chinese intentions toward Taiwan, the things that lead to analytic failure are almost always the same. So what I want to talk about today is exactly that – failure – and how you can learn from mistakes made by other people in other organisations.”
Addressing the cyber security specialists in the audience, Beebe talked about the cognitive traps that experts should be aware of.
“Expertise is a good thing, but it also means you’ve got a big problem – you are going to be bad at anticipating discontinuous change. Even the best experts correctly forecast discontinuous future developments only about 30% of the time; most come closer to 20%. This means you’ll be surprised a lot in your career,” he said.
There are three reasons for this. First, education does not prepare us for discontinuous change or departures from the norm – things we have not seen before. Second, such change involves complexity – the interaction of a wide range of factors – while experts specialise in narrow fields. Third, over-confidence makes you prone to ignoring or underplaying important pieces of information that do not fit familiar patterns, and more vulnerable to deception by adversaries who are familiar with our biases and assumptions – and who devise tactics that exploit them.
“And let’s face it – the field of cyber security is all about deception.”
Three lessons
But cyber analysts can learn from mistakes made in other fields, and Beebe shared three lessons from the Iraq WMD intelligence failure and the cognitive traps that led to it, saying that they apply to all analyses:
Lesson 1: Explore alternative explanations for the things you are seeing – seek different explanations for emerging developments.
Lesson 2: Take a walk in the other guy's shoes – attempt to see things through the eyes of potential adversaries.
Lesson 3: Look to disconfirm rather than to confirm – i.e. try to rule out plausible analytic hypotheses rather than gather support for them. Don’t fall into the trap of confirmation bias.
“One of the most significant problems facing intelligence analysts is that the information you have is nearly always consistent with multiple explanations. A piece of data that seems to prove one hypothesis can, in fact, be completely consistent with a different hypothesis that you hadn’t considered,” Beebe explained.
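To make that point concrete, here is a minimal, hypothetical sketch – all of the numbers are invented for illustration and none come from the talk – of why evidence that fits every hypothesis barely shifts your beliefs at all. A simple Bayesian update over three competing explanations for a suspicious traffic spike shows the posteriors landing almost exactly where the priors started:

```python
# Illustrative only: invented numbers showing how one observation can be
# consistent with several hypotheses at once, and why that makes it weak
# evidence for any of them.

# Competing explanations for an observed spike in outbound traffic.
priors = {
    "exfiltration": 0.05,  # data theft in progress
    "backup_job":   0.60,  # scheduled off-site backup
    "misconfig":    0.35,  # chatty logging misconfiguration
}

# P(observation | hypothesis): the spike is likely under ALL of them,
# which is exactly the problem Beebe describes.
likelihoods = {
    "exfiltration": 0.90,
    "backup_job":   0.80,
    "misconfig":    0.70,
}

# Bayes' rule: P(H | D) = P(D | H) * P(H) / P(D)
evidence = sum(priors[h] * likelihoods[h] for h in priors)
for h in priors:
    posterior = priors[h] * likelihoods[h] / evidence
    print(f"{h:13s} prior = {priors[h]:.2f}  posterior = {posterior:.2f}")
```

Run it and the posteriors barely move (exfiltration goes from 0.05 to roughly 0.06): because the observation is consistent with everything, it confirms almost nothing – which is why disconfirming tests, ones that would be unlikely under one hypothesis but not the others, carry far more analytic weight.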
These lessons can help cyber security professionals mitigate risk, avoid failure and improve their chances of successful prevention and detection.
AI is a powerful tool
Beebe said he expects artificial intelligence to dominate cyber security in 2020, with both attackers and defenders exploiting the advantages the technology offers.
“AI is becoming an increasingly powerful tool, one that can enable the penetration of systems and can be programmed in ways that really complicate the task of defenders,” said Beebe.
On the defence side, the same technology can be used just as effectively to identify breaches, expose anomalies and help cyber defenders cope with the ‘data fire hose’ problem – the ever-increasing volume of data that has to be worked through or filtered.
“It is important to find a way to filter the data, to focus on what is most relevant. Don’t start with the information that is already available. That is the natural response, but you are basically inviting confirmation bias,” Beebe added.
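As a rough illustration of that ‘filter first’ approach, the sketch below uses an off-the-shelf unsupervised model – scikit-learn’s IsolationForest, with synthetic data invented for the example; neither detail comes from the talk – to reduce a fire hose of events to the handful of anomalies worth an analyst’s attention:

```python
# A minimal sketch, not production code: anomaly-based filtering of a
# "data fire hose" so analysts start from the unusual, not the familiar.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic event features, e.g. [bytes_out, session_duration] -- invented
# for illustration. Most traffic is routine; a handful of events are odd.
routine = rng.normal(loc=[500, 60], scale=[50, 10], size=(10_000, 2))
odd = rng.normal(loc=[5_000, 5], scale=[200, 1], size=(10, 2))
events = np.vstack([routine, odd])

# Fit on everything, then keep only what the model flags as anomalous
# (predict() returns -1 for outliers, 1 for inliers).
model = IsolationForest(contamination=0.001, random_state=0).fit(events)
flagged = events[model.predict(events) == -1]

print(f"{len(events)} events in, {len(flagged)} flagged for review")
```

The design point is the one Beebe makes: rather than starting from whatever data is already in front of you, let the model surface the records that depart from the normal pattern, and spend analyst time there.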