Everything that is happening now has happened before

While looking through old notebooks, I found this piece that I wrote in 2014 for a book that never got published. Reading it through, it surprised me how much we are still facing the same challenges today as we did four years ago. Security awareness and security training are no different…

So, you have just been given responsibility for your company’s information security awareness programme and you have rolled out an off-the-shelf training product to the company. Job done? Unfortunately, probably not, because like so many things in security, there is far more to an education and awareness programme than meets the eye. The nine areas presented here are intended to give you guidance when establishing or improving your programme. Some may not be relevant to your organisation, some will be very relevant, but all of them are intended to provide ideas and insight into what is often a very emotive and personal subject.

 

Start at the Top

No business programme, least of all a security awareness one, is going to have any ongoing impact in an organisation if it doesn’t have the full support of the senior leadership. Depending upon the type and size of the organisation, this could be the Board, the senior management team or even the C-level executives.

Be wary of them just paying lip service, too, as they are crucial for the ongoing engagement of the company and your programme’s success. If they are the ones who haven’t taken their training, then they are not committed to your programme. Senior leadership should be not only helping to communicate the training, but also reinforcing key messages and certainly leading by example.

Finally, make sure you report back to senior leadership on the value of the training on a regular basis, be it every three, six or twelve months. However you choose to do this, bear in mind that the key purpose is to ensure your awareness programme is aligned with the business goals, and that it is seen as a part of your organisation’s continued success.

Don’t Rely on Compliance

Using compliance as a key driver for acquiring investment for an education programme does work, but it is a short-sighted approach that will limit what you can do in the future. This is because compliance is a very specific business problem that awareness addresses, and once the compliance requirement has been met there is no reason for the business to invest more money, investigate alternative approaches or expand the programme. That tick in the box limits the future of your programme.

Instead, use compliance as just one of many drivers to build your programme, alongside profit retention, reputational damage control and protection against lost billable time, for instance. These drivers will, again, help your programme align better with the company’s goals.

Teach Them to Fish

Now onto the content! No training is going to be able to put across the correct response to every single threat, every single implication of regulations and laws, and every single type of social engineering approach. The goal of the training is to arm people with a mindset, not all the answers.

Educating people on the implications of their actions, and not their actions alone, is key here. Understanding that clicking on a link could result in something bad happening is more effective than just being told not to click on links. Helping people appreciate that social engineers use an array of techniques to build a picture of the environment is more important than telling them to mistrust every single interaction.

In your position as an InfoSec professional, how do you know when a link or a question is dangerous? Try to put that across, and you should end up with an awareness programme that educates people rather than programs them.

Make it Relevant

Off-the-shelf awareness programmes are often seen as a quick, cost-effective and easy approach to educating people. Many of the courses are very good, too. However, you should be aware of your own organisational culture. Large, regulated organisations probably couldn’t train effectively through regular lunchtime briefings, and smaller organisations probably wouldn’t take well to sitting in a room for three hours having a PowerPoint shouted at them.

Additionally, there are going to be activities, lexicon and even teams and roles that are unique to your organisation. Try to avoid people having to “translate” the training to make it relevant to their daily lives, as much of the impact of the training will be lost.

Make it Useful

Not only should the training be useful in someone’s working life, but also in their personal life. In a world of Bring Your Own Device (BYOD), the lines between the workplace and home are increasingly blurred, and home networks, tablets and computers are increasingly being used to deliver work into the workplace.

Educating people on how to secure their home network and WiFi, how to use a VPN in a cafe with their personal laptop, and even how to manage their own online lives not only helps secure the workplace, but also gives them a sense of being valued for the contributions they are making to the organisation.

Don’t be Too Serious

Humour is always an awkward subject when it comes to education and awareness, as it is rarely something everyone agrees on. However, it is worth bearing in mind that, given the large amount of “compliance” training often required these days (ethics, anti-bribery, harassment and so on), making your course stand out is important.

Wherever possible draw upon the culture of the organisation, use in-house references (so everyone understands them) and try to avoid obscure internet humour, as many people in the workplace may not understand it. Never, ever use offensive humour, or even anything that comes close to it. If your grandparents are unlikely to laugh, then don’t use it!

Go Multichannel

Taking a leaf out of the book of marketeers and advertisers, your awareness programme should be multichannel, using a number of different approaches to ensure the message gets across. Consider using videos wherever possible, leaflets, internal blogs, “sponsoring” internal events, and using town halls and company meetings to present on specific security awareness projects. Poster campaigns are also a useful method of putting core concepts and points across, although a key part of their success is that they are changed on a regular basis so that people don’t become blind to them over time.

Also consider branding items like stickers, pens and pencils with a tagline or advice that ties in with your overall campaign, in order to keep your security message regularly in view. Again, whether this seems like a cheap gimmick or a good idea depends very much on the culture of your organisation.

The core concept here is to constantly engage with people through different means, to maintain their attention and their recollection of your security training.

Confirm Their Understanding

Making sure people actually understand the fruits of your hard labour goes beyond asking ten banal and blindingly obvious questions at the end of the training. Those questions are table stakes when it comes to meeting compliance requirements, but do nothing to actually confirm understanding. Conducting social engineering tests, sending fake phishing emails (a whole topic in and of itself) and even leaving trackable USB sticks lying around are valid ways to test people’s knowledge. The results of these tests can be written up in articles for the intranet and email updates, providing even further educational opportunities.
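As a minimal sketch of how such test results might be turned into something reportable (the data, team names and function names here are entirely hypothetical), a phishing simulation can be summarised into per-team click and report rates:

```python
from collections import defaultdict

# Hypothetical phishing-simulation results: (team, clicked_link, reported_email)
results = [
    ("finance", True, False),
    ("finance", False, True),
    ("engineering", False, True),
    ("engineering", False, False),
    ("sales", True, False),
    ("sales", True, True),
]

def summarise(results):
    """Return per-team click and report rates from simulation results."""
    teams = defaultdict(lambda: {"total": 0, "clicked": 0, "reported": 0})
    for team, clicked, reported in results:
        teams[team]["total"] += 1
        teams[team]["clicked"] += int(clicked)
        teams[team]["reported"] += int(reported)
    return {
        team: {
            "click_rate": t["clicked"] / t["total"],
            "report_rate": t["reported"] / t["total"],
        }
        for team, t in teams.items()
    }

summary = summarise(results)
```

A summary like this highlights where follow-up education is most needed (a high click rate paired with a low report rate), rather than simply recording who passed the end-of-course quiz.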

Get Feedback & Start Again

The only way your awareness programme is going to improve over time is to gather open and honest feedback from everyone you engage with throughout every phase of the programme. Feedback from the recipients of the training, after every talk or awareness session, and certainly on the overall programme on an annual basis, is an important way of ensuring good elements are enhanced and bad elements are removed.

Gathering feedback, however, is only half of the story; providing feedback on the effectiveness of the security awareness programme to senior leadership is also important. Consider metrics, and correlate elements of the training as they roll out over the year with reported security incidents. Wherever possible, do your best to monetise the incidents in terms of cost to the business so that over time, as security incidents decline (which they should!), you can demonstrate the value of the programme and its contribution to the business.
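To make the monetisation idea concrete, here is a small sketch (all figures are invented; substitute your own incident-cost model) that turns quarterly incident counts into costs and into savings against a pre-training baseline quarter:

```python
# Assumed average cost per incident - replace with your own incident-cost model.
AVG_COST_PER_INCIDENT = 4_000

# Hypothetical incident counts, declining as the programme rolls out.
incidents_by_quarter = {"Q1": 25, "Q2": 21, "Q3": 16, "Q4": 12}

def quarterly_costs(counts, unit_cost):
    """Express each quarter's incidents as a cost to the business."""
    return {q: n * unit_cost for q, n in counts.items()}

def savings_vs_baseline(counts, unit_cost, baseline="Q1"):
    """Money saved each quarter relative to the pre-training baseline quarter."""
    base_cost = counts[baseline] * unit_cost
    return {q: base_cost - n * unit_cost for q, n in counts.items() if q != baseline}

costs = quarterly_costs(incidents_by_quarter, AVG_COST_PER_INCIDENT)
savings = savings_vs_baseline(incidents_by_quarter, AVG_COST_PER_INCIDENT)
```

Presented this way, "incidents fell from 25 to 12" becomes "the programme saved roughly £52,000 in Q4 alone", which is the language senior leadership responds to.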

Not all of these may be applicable to you and your organisation, but they should provide some guidance and ideas for you and your security awareness programme.


Ground Control to Major Thom

I recently finished a book called “Into the Black” by Rowland White, charting the birth of the space shuttle from the beginnings of the space race through to its untimely retirement. It is a fascinating account of why “space is hard” and exemplifies the need for compromise and the balancing of risks in even the harshest of environments.

Having seen two shuttles first hand in the last nine months (the Enterprise on the USS Intrepid in New York and the Atlantis at Kennedy Space Center), it boggles my mind that something so big could get into space and back again, to be reused. Facts like this one go to show the kind of engineering challenges that needed to be overcome: the exhaust from each of the three main engines burns hotter than the melting temperature of the metal the engine ‘bells’ are made of, so supercooled fuel is ingeniously piped down the outside of the bells, not only acting as an afterburner of sorts but also cooling the bells themselves.

There was one incident, however, that really struck me regarding the relationship between the crew onboard and the crew on the ground. On the Shuttle’s maiden flight into space, STS-1, the orbiter Columbia carried out 37 orbits of the Earth with two crew on board: mission commander John W. Young and pilot Robert L. Crippen. Once orbit was achieved, an inspection of the critical heat tiles on the underside of the shuttle showed some potential damage. If the damage was too extensive, the return to Earth would (as later events in the Shuttle’s history proved) be fatal.

The crew, however, were tasked with a variety of other activities, including fixing the onboard problems they could address. They left the task of assessing and calculating the damage to those on the ground, who were better equipped and more experienced to deal with the situation. This they duly did, and as we know, Columbia landed safely just over two days later.

It struck me that this reflects well the way information security professionals should treat the individuals we are tasked with supporting. There is much that individuals can do to help, of course, and that is why training and awareness efforts are so important, but too often the refrain is “we would be secure if it wasn’t for the dumb users”. The sole purpose of the Columbia ground crew was to support and ensure the safe return of those on board STS-1 so that they could get on with their jobs in space. Ours is the same.

Even though the crew had extensive training to deal with issues as they arose, the best use of their time was to focus on the job in hand and let the ground crew worry about other problems. The people we support should also be trained to deal with security issues, but sometimes they really need to just get on with the deliverables at hand and let us deal with the security issue. They might be trained and capable, but we need to identify when the best course of action is to deal with their security issues for them, freeing them to do their work.

Never forget that we support our organisations and businesses in doing their jobs. We provide tools to allow them to be more effective in their end goals, but it is still our responsibility to do the heavy lifting when the time comes. Except in very rare cases, we are there because of them, not in spite of them.

(Photo courtesy of William Lau @lausecurity)


Security is Not, and Should not be Treated as, a Special Flower

My normal Wednesday lunch yesterday was rudely interrupted by my adequate friend and reasonable security advocate Javvad calling me to ask my opinion on something. This in itself was surprising enough, but the fact that I immediately gave a strong and impassioned response told me this might be something I needed to explore further…

In this report, the UK Parliament has recommended that CEO salaries should be linked to their attitude towards, and the effectiveness of, their company’s cybersecurity. I am not normally one for histrionics when it comes to government reports, partly because they are often impenetrable and not directed at me or my lifestyle, but I will make an exception in this case. I think this attitude is quite simply short-sighted and a knee-jerk reaction to a very public breach that was admittedly caused by a lackadaisical attitude to security.

I have argued for a long time that the security function is not a “special flower” in the business, and that treating it as one makes security an inhibitor of the business, restricting it from taking the kind of risks that are vital to a growing and agile business. The only way I would agree to this demand would be if the CEO’s compensation were also directly tied to financial performance, staff attrition, the number of court cases levelled and the number of fires or false alarms on the premises, all supported by a change in the law. If that happened, there would suddenly be a dearth of well-paid, well-motivated CEOs in the country.

Calling security out individually means the security function will all too easily slip back into its old behaviour of saying NO! to every request, only this time the reason given is not just “it’s not secure”, but also “Bob’s pay depends on it”.

This can only work if every other function under the CEO were also covered by similar laws, as I said above. Sure, there are basic behavioural laws around finance, people, legal, facilities and so on, such that a company can’t be embezzled and people can’t be exploited or put into danger. But this recommendation makes security far too prominent a concern. It also doesn’t take into account the fact that determined hackers will get in anyway in many cases, or that data can easily be stolen through softer, social engineering techniques. A zero-day exploit, never seen before? Sorry, Mr CEO, you need to take a pay cut for not having a cyber crystal ball and defending against it. Determined nation-state attacks? Tough luck, your cyber budget is a fraction of the size of your attackers’; back to reduced pay.

I get that many folks are angry with the level of CEO pay and reward in the workplace these days. In the case of TalkTalk, I find it astounding that Dame Dido Harding has been awarded £2.8 million in pay and shares after what has to be an absolutely disastrous year for TalkTalk. That said, I also don’t know the details of her contract and the performance-related aspects of it; maybe she hit all of her targets, and cyber risk was not one of them.

This is where we need to address the problem: not in law and regulation, but in cyber-savvy contracts and performance metrics within the workplace, enforced by the Board. Not an emphasis on cybersecurity, but a balanced view across the entire business.

No single part of a business is the special flower, we all have an equal and unique beauty and contribution to make.


“And the winner is… Compliance!”

Disclaimer: My comments below are based upon quotes from both Twitter and The Times of London on the UK’s TalkTalk breach; as a result, the subsequent investigation and analysis may find that some of the assertions are in fact incorrect. I will post clarifying statements should this happen to be the case.

I am not normally one to pick over the bones of company A or company B’s breach, as there are many people more morbid and qualified than me to do so, and I also hate the feeling of tempting fate. All over the world, I would guarantee, there are CISOs breathing a sigh of relief and muttering to themselves/their psychoanalysts/their spouses, “thank God it wasn’t us”. Bad things happen to good people, and an industry like ours, which tends to measure success on the absence of bad things happening, is not a great place to be when those bad things appear to happen far more frequently than ever before.

So it took me a while to decide if I should write up my feelings on TalkTalk’s breach, although I had Tweeted a few comments which were followed up on.

[Tweet from Quentyn W]

(the original quote I Tweeted from The Times, dated 25th October 2015)

Initially I was shocked that people are still using the same password across so many crucial accounts. After a ten-minute rant about it in the car with my wife, she calmly (one of the many reasons I married her) explained that not everyone thinks like me, a security professional, and that I should remember my own quote: “convenience eats security for breakfast”. Having calmed down a little, I was then shocked by something else. That something else was the TalkTalk CEO, Dido Harding, on national television, looking clearly exhausted (I can only imagine how little sleep she had been getting over the previous few days), giving out unequivocally bad advice such as “check the from address on your emails; if it has our address, it is from us”. Graham Cluley’s short analysis was spot on here:

As if TalkTalk’s customers hadn’t gone through enough, they are then being given shoddy advice from someone in a supposed position of trust that is going to put them at even more risk. The scammers and phishers must have been rubbing their hands with invisible soap and glee as they prepared their emails and phone calls.

Now, it seems the attack did not disclose as much information as was first thought, which is good news. Credit card numbers were tokenised and therefore unusable, so no direct fraud could be carried out there (again, depending upon the form of that tokenisation, of which I am sure more details will emerge in the coming months). Bank details were disclosed, however, but there is a limited amount of damage that can be done there too (some, I acknowledge, but it takes time and is more noticeable… another time for that discussion). Here is Problem Number One, though: with Harding’s poor advice, many people subsequently (and allegedly) fell for phishing attacks through either phone calls or emails, and lost hundreds of thousands of pounds. TalkTalk’s response? Credit monitoring.

And then we move to Problem Number Two: why weren’t the bank details stored safely? Why were they not encrypted? Armed with the knowledge of customers’ bank account details, scammers can make a much more convincing case that they are actually from TalkTalk, especially if other account information was also lost (time will tell). TalkTalk’s response?

[Image: quote from The Times]

Dido Harding talking to The Times, 24th October 2015

So TalkTalk was technically compliant? Shouldn’t this kind of thinking be consigned to the same mouldering scrapheap where “we’ve always done it this way” and “we’re here to secure the business, not help it” lie? I sincerely hope that this episode will at the very least highlight that “compliance” and “security” are two very different things, and that the former most certainly doesn’t automatically result in the latter. What has transpired is the perfect storm of a breach, unforgivably poor advice, and complacency based upon compliance, resulting in a lot of pain for a lot of people and involving large amounts of money.

If an example like this does not spur you into doing more as regards your own security awareness activities, then please go back to the beginning and start again. Why? I have been accused of “victim blaming” somewhat (see the above Tweets), but if individuals had had an ounce of sense or training, they wouldn’t have fallen for the subsequent scams and would have been more careful when responding to emails supposedly from TalkTalk. I will leave the last word to Quentyn Taylor, and as you carry on with your internet residencies, don’t forget to wear protective clothing at all times.

[Tweet from Quentyn W]


What 80s pop can teach us about rocket failure and incident management


Most accidents originate in actions committed by reasonable, rational individuals who were acting to achieve an assigned task in what they perceived to be a responsible and professional manner.

(Peter Harle, Director of Accident Prevention, Transportation Safety Board of Canada and former RCAF pilot, ‘Investigation of human factors: The link to accident prevention.’ In Johnston, N., McDonald, N., & Fuller, R. (Eds.), Aviation Psychology in Practice, 1994)

I don’t just read infosec blogs or cartoons vaguely related to infosec; I also read blogs from “normal” people. One such blog is by a chap called Wayne Hale, who was a Flight Director (amongst other things) at NASA until fairly recently. As a career NASA’ite he saw NASA from its glory days, through the doldrums, and back to the force it is today. There are a number of reasons I like his blog, but mostly it’s because I have loved the idea of space since I was a little kid – I still remember the first space shuttle touching down, watching it on telly and whooping with joy, much to my mother’s consternation and chagrin. The whole space race has captured my imagination, as a small child and as an overweight adult. I encourage anyone to head to his blog, not only for fascinating insider stories of NASA, but also for the engineering behind space flight.

What Wayne’s blog frequently shows is one thing: space is hard. It is an unforgiving environment that will exploit every weakness, known and unknown, to destroy you. Even just getting into space is hard. Here is Wayne describing a particular incident the Russians had:

The Russians had a spectacular failure of a Proton rocket a while back – check out the video on YouTube of a huge rocket lifting off and immediately flipping upside down to rush straight into the ground. The ‘root cause’ was announced that some poor technician had installed the guidance gyro upside down. Reportedly the tech was fired. I wonder if they still send people to the gulag over things like that.

This seems like such a stupid mistake to make, and one that is easy to diagnose: the gyro was installed upside down by an idiot engineer. Fire the engineer, problem solved. But this barely touches the surface of root cause analysis. Wayne continues:

better ask why did the tech install the gyro upside down? Were the blueprints wrong? Did the gyro box come from the manufacturer with the ‘this side up’ decal in the wrong spot? Then ask – why were the prints wrong, or why was the decal in the wrong place. If you want to fix the problem you have to dig deeper. And a real root cause is always a human, procedural, cultural, issue. Never ever hardware.

What is really spooky here is that the latter part of the above quote could so easily apply to our industry, especially the last sentence – it’s never the hardware.

A security breach could be traced back to a piece of poor coding in an application:

1. The developer coded it incorrectly. Fire the developer? Or…

2. Ascertain that the developer had never had secure coding training, and…

3. The project was delivered on tight timelines and with no margins, and…

4. As a result the developers were working 80-100 hrs a week for three months, which…

5. Resulted in errors being introduced into the code, and…

6. The errors were not found because timelines dictated that no vulnerability assessments were carried out, but…

7. A cursory port scan of the application by unqualified staff didn’t highlight any issues.

It’s a clumsy example, I know, but there are clearly a number of points (funnily enough, seven) throughout the lifecycle of the environment that would have highlighted the possibility of vulnerabilities, all of which should have been acknowledged as risks, assessed, and had decisions made accordingly. Some of these may fall outside the direct bailiwick of the information security group – working hours, for instance – but the impact is clearly felt with a security breach.
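As a light-hearted sketch, the why-chain above can be recorded as ordered data rather than stopping at the first, human-shaped answer; the helper names here are purely illustrative:

```python
# The seven-step why-chain from the example, paraphrased and held in order,
# so the investigation is forced past "fire the developer".
why_chain = [
    "The developer coded the application incorrectly",
    "The developer had never had secure coding training",
    "The project was delivered on tight timelines with no margins",
    "Developers worked 80-100 hour weeks for three months",
    "Errors were introduced into the code",
    "Timelines dictated that no vulnerability assessments were carried out",
    "A cursory port scan by unqualified staff didn't highlight any issues",
]

def why_pairs(chain):
    """Pair each finding with the deeper factor behind it, five-whys style."""
    return list(zip(chain, chain[1:]))

def deepest_cause(chain):
    """The root cause is the last 'why' you can still answer, never the first."""
    return chain[-1]
```

Writing the chain down like this makes it obvious when an investigation has stopped one "why" too early: if the last entry is still a person's name, you haven't finished.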

A true root cause analysis should always go beyond just the first response of “what happened?”. If in doubt, just recall the words of Bronski Beat:

“Tell me why? Tell me why? Tell me why? Tell me why….?”