What ’80s pop can teach us about rocket failure and incident management


Most accidents originate in actions committed by reasonable, rational individuals who were acting to achieve an assigned task in what they perceived to be a responsible and professional manner.

(Peter Harle, Director of Accident Prevention, Transportation Safety Board of Canada and former RCAF pilot, ‘Investigation of human factors: The link to accident prevention.’ In Johnston, N., McDonald, N., & Fuller, R. (Eds.), Aviation Psychology in Practice, 1994)

I don’t just read infosec blogs, or cartoons vaguely related to infosec; I also read blogs from “normal” people. One such blog is from a chap called Wayne Hale, who was a Flight Director (amongst other things) at NASA until fairly recently. As a career NASA’ite he saw NASA from its glory days, through the doldrums, and back to the force it is today. There are a number of reasons I like his blog, but mostly because I have loved the idea of space since I was a little kid – I still remember the first space shuttle touching down, watching it on telly and whooping with joy, much to my mother’s consternation and chagrin. The whole space race has captured my imagination, as a small child and as an overweight adult. I encourage anyone to head to his blog, not only for fascinating insider stories of NASA, but also for the engineering behind space flight.

What Wayne’s blog frequently shows is one thing: space is hard. It is an unforgiving environment that will exploit every weakness, known and unknown, to destroy you. Even just getting into space is hard. Here is Wayne describing a particular incident the Russians had:

The Russians had a spectacular failure of a Proton rocket a while back – check out the video on YouTube of a huge rocket lifting off and immediately flipping upside down to rush straight into the ground. The ‘root cause’ was announced that some poor technician had installed the guidance gyro upside down. Reportedly the tech was fired. I wonder if they still send people to the gulag over things like that.

This seems like such a stupid mistake to make, and one that is easy to diagnose: the gyro was installed upside down by an idiot technician. Fire the technician, problem solved. But this barely scratches the surface of root cause analysis. Wayne continues:

better ask why did the tech install the gyro upside down? Were the blueprints wrong? Did the gyro box come from the manufacturer with the ‘this side up’ decal in the wrong spot? Then ask – why were the prints wrong, or why was the decal in the wrong place. If you want to fix the problem you have to dig deeper. And a real root cause is always a human, procedural, cultural, issue. Never ever hardware.

What is really spooky here is that the latter part of the above quote could so easily apply to our industry, especially the last sentence – it’s never the hardware.

A security breach could be traced back to a piece of poor coding in an application:

1. The developer coded it incorrectly. Fire the developer? Or…

2. Ascertain that the developer had never had secure coding training, and…

3. The project was delivered on tight timelines and with no margins, and…

4. As a result the developers were working 80-100 hours a week for three months, which…

5. Resulted in errors being introduced into the code, and…

6. The errors were not found because timelines dictated that no vulnerability assessments were carried out, but…

7. A cursory port scan of the application by unqualified staff didn’t highlight any issues.

It’s a clumsy example, I know, but there are clearly a number of points (funnily enough, seven) throughout the lifecycle of the environment that would have highlighted the possibility of vulnerabilities, all of which should have been acknowledged as risks, assessed, and decided upon accordingly. Some of these may fall outside the direct bailiwick of the information security group – working hours, for instance – but the impact is clearly felt when there is a security breach.
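The chain of “whys” above can be sketched in a few lines of code. This is purely a hypothetical illustration of the idea – walking back from the symptom through each answer until you reach the real root cause – and not any real tool; all the names and strings are made up for this example.

```python
# A minimal "five whys" sketch: start from the symptom and record each
# successive answer, so the chain from symptom to root cause is explicit.
def five_whys(symptom, answers):
    """Return the full chain from symptom through each 'why?' answer."""
    chain = [symptom]
    for answer in answers:
        chain.append(answer)
    return chain

breach = five_whys(
    "Security breach via vulnerable application code",
    [
        "Developer coded it incorrectly",
        "Developer never had secure coding training",
        "Project timelines left no margin for training or review",
        "Developers worked 80-100 hour weeks, introducing errors",
        "No vulnerability assessment was scheduled to catch them",
    ],
)

# The root cause worth fixing is the last answer in the chain,
# not the first ("fire the developer").
print(breach[-1])
```

The point of the structure is that stopping at `breach[1]` gives you a scapegoat, while following the chain to the end gives you a procedural or cultural fix.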

A true root cause analysis should always go beyond the first response of “what happened?” If in doubt, just recall the immortal words of Bronski Beat:

“Tell me why? Tell me why? Tell me why? Tell me why….?”

Why do we put brakes on cars? Perhaps not for the reason you think.

Bosch Predictive Emergency Braking System

I have never liked the analogy;

Why do we put brakes on cars? So we can go faster. Therefore we put security controls in place so we can do riskier things.

I mean, I get it, the analogy makes sense, but like many analogies, if we are not careful it becomes a little too one-dimensional. We also have brakes on cars to slow down for traffic lights, to make sure we don’t run into the back of the car in front, and to stop quickly to avoid someone crashing into us. I am sure with a squeeze and a shove we could fit these into an infosec analogy too, but why bother?

I was reminded of this particular analogy, and why I don’t like it, this morning as I read my paper. The headline really resonated with me:

‘Living rooms’ on wheels put drivers at risk

The Times, Monday 23rd February 2015


The article discusses how the increase in technology in cars has actually led to an increase in accidents in recent years. Anti-lock brakes, stability control and the like are creating complacency amongst drivers, and putting them and others at risk.

If we are not careful we are shifting towards this in our industry. It is of course a good thing to focus on secure coding practices, OWASP, secure by design etc., because that is as important as a seat belt and an air bag in a car (oops, see how easy it is?!), but if we try to put everything into those particular controls, we take more and more responsibility away from the user. By creating an insulated and isolated environment in which they operate, there is no positive/negative feedback loop, no opportunity to learn from mistakes, near misses or even dumb good luck. They are, quite literally, on their own, guided only by what their immediate vicinity is reporting to them. Another quote:

They are as uninvolved in the process as they can possibly be

This could be describing our users and clients, from whom we are removing more and more responsibility when it comes to making sensible, thought-out decisions about basic security. We are removing their perceived responsibilities, as they say to themselves “if the system is letting me do this, it must be alright” while downloading malware specifically designed to undermine so-called built-in security. (The quote is actually from Peter Rodger, chief examiner for the Institute of Advanced Motorists, commenting on cars being turned into living rooms.)

Let us continue to understand how mature our security development framework is, let’s observe the OWASP top ten, but let’s also continue to establish clear guidelines, education and expectations of our people at the same time. If we don’t, we may be congratulating ourselves a little too early for running a good security programme.

If we do that, we risk going back over a century in time, and putting the cart before the horse, let alone putting better brakes on the car.

(If you want good analogies, however – ones that can help your people truly understand the information security environment they are operating in – head over to The Analogies Project.)

Securi-Tay IV

I will be spending the end of the week with the Abertay University Ethical Hackers at their annual Securi-Tay conference in Dundee. It’s a great conference, so if you are at a loose end on Friday and in the area, make sure you rock up and say hello to the lovely folks up there!

Not All Risks Are Bad (even the bad ones…)

The very term ‘risk’ often makes people feel uncomfortable, with connotations of bad things happening, and a sense that if risk is not minimised or removed then life (or business) becomes too dangerous to continue.

Crossing the road is risky, especially if you live in a busy city, and yet people, young and old alike, do it every day. In fact it is riskier than flying, and yet I would argue that more people are afraid of flying than of crossing the road. Hugh Thompson of RSA put it very well in his 2011 RSA Conference Europe presentation when he raised the issue of “Sharkmageddon”: more people are killed every year by coconuts falling on them as they sit on the beach than are killed by sharks, yet there is an almost universal fear of sharks. We irrationally consider swimming in the sea safer (less risky?) than sitting under a coconut tree.

Risk is an inherent part of our lives, and if we let the realities of risk take control of our business decisions we become the corporate version of an agoraphobic: staying in the safe confines of the environment we know and never venturing out to be active in the outside world. Ultimately we wither and fail, be it as individuals or as a business.

In my experience, one of the most misunderstood approaches to treating a risk is to accept, or manage, it. Most people are comfortable with mitigating, transferring or avoiding a risk, as each involves some kind of act to deal with it, something we are all familiar with: we fix the problem, give the problem to someone else, or stop doing the thing that causes the problem in the first place. It often feels wrong, however, to simply accept a risk – in essence, to do nothing.

Although doing nothing is not strictly what acceptance is, it is essentially how it feels. You are accepting that there is either nothing you can do, or nothing you are willing to do, to reduce the risk. You are not blindly accepting it at face value, though; rather, you remain cognisant of the risk as you continue your operational activities. You know it is there as you carry on your day job. Those activities, and the very environment you operate in, can change without notice, and make the decision to accept a risk the wrong course of action.

For instance, it may now be cheaper to fix the risk than it previously was, or the highly lucrative contract that made the risk acceptable is now over, and the potential financial loss outweighs the revenue you are bringing in. The reasons for change are often financial, although not always: your risk appetite may have reduced, or the industry you operate in may have become more regulated. All of these examples mean your decision to accept needs to be reviewed.

All risk decisions need to be reviewed regularly, for exactly the reasons given above, but in my opinion it is risk acceptance decisions that should be reviewed most often: they are the ones made on the basis of the most transient and changing factors, and the ones with the potential to harm the organisation the most.
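One simple way to act on this is to give accepted risks a shorter review cycle than treated ones in your risk register. The sketch below is a hypothetical illustration only – the intervals, treatment names and function are all made up for this example, not taken from any standard or tool:

```python
from datetime import date, timedelta

# Hypothetical review intervals: accepted risks rest on the most
# transient factors, so they come up for review far more often.
REVIEW_INTERVAL_DAYS = {
    "accept": 90,      # quarterly
    "mitigate": 365,   # annually
    "transfer": 365,
    "avoid": 365,
}

def next_review(decision_date, treatment):
    """Return the date by which a risk decision should be revisited."""
    return decision_date + timedelta(days=REVIEW_INTERVAL_DAYS[treatment])

decided = date(2015, 2, 23)
print(next_review(decided, "accept"))    # reviewed within the quarter
print(next_review(decided, "mitigate"))  # reviewed the following year
```

The exact numbers matter far less than the principle: the register itself nags you to revisit the acceptances first.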

It’s a bit like keeping a tiger as a pet – it looks awesome, and maybe even draws admiring glances, but if you forget you locked it in your bathroom overnight you are going to have a very big surprise when you get up to go to the toilet in the middle of the night. You can’t accept risks without truly understanding them in the first place.

Why I am an Analogies Project contributor

That devilishly handsome bloke you see to the right is Bruce Hallas. I went to school with him nearly 25 years ago, and then last summer, at the first old boys’ school reunion our year had organised since leaving, I met him again – it turns out we are in the same infosec business. I spoke to him about all of the good work I am doing, the company I work for, and the many countries I have visited, and generally tried to make myself feel more important than the skinny eighteen-year-old I was when I last saw him. He told me that he runs his own infosec consultancy and his own blog, works with the UK government, and was in the process of setting up “a project”: a freely available, self-funding resource of analogies and stories to help people better understand information security. (Bruce immediately won the “my life is awesome since leaving school” competition, of course.)

Since that time, The Analogies Project has grown from one man, an idea and a website to something producing real, quality content, and with a very promising and bright future.

In the words of the Project itself;

The Analogies Project has a clear mission. To tackle the unintelligibility of information security head on and secure the engagement of a much broader audience. Its aim is to bridge the chasm between the users, stakeholders and beneficiaries of information security and those responsible for delivering it.

Through a series of innovative initiatives the Analogies Project will enable information security professionals to effectively communicate with their chosen audiences. The content will be delivered through a variety of alternative communication techniques, media and partners.

The part of this project that I like the most is that it is essentially a community project. Bruce isn’t charging for membership to the analogies as they are written (and they are coming thick and fast now!), and none of the contributors are charging for their work either. As well as the web contributions in the form of a library, there is a book planned, a conference, and even an opera! With the momentum currently behind the project, there is every reason to believe in its future success.

So why am I contributing? Honestly, I have both selfish and philanthropic reasons to do so. Obviously it gets my name out there, allows me to practise my writing, test some ideas and also say “I was there from the start”. All that aside, though, I have frequently struggled in my day job to get infosec concepts across to people, whether directly, in meetings or even in awareness training. To have had a resource like this available to me five years ago would have made my life so much easier, allowed me to advance the infosec “cause” more effectively, and given me a set of tools I knew were consistent with the prevailing thoughts of industry commentators. Having a centralised, peer-validated toolkit available is fundamental to us as professionals when it comes to the messaging we give to our users, clients, bosses, teams and even the infosec community as a whole.

It’s still early days, but I submitted my first contribution just last week (soon to be published, I hope) and I am already inspired enough to be working on my second and third. There are a number of analogies already in place, and I would urge you to read them and consider them in the context of your current communications to your audiences, whomever they may be. The book will be another important milestone, and one I hope to play a part in; indeed I hope to play a part in the project for the foreseeable future, which is why I am happy and proud to display my “contributor” badge at the top right of this site.


If you feel you have something to contribute, then head over to The Analogies Project and let Bruce and the organisers know. If you don’t feel ready to, then certainly check it out anyway. You won’t regret it.