I spent last night with five eloquent, passionate and above all opinionated colleagues arguing the pros and cons of security awareness training. We were doing this at the monthly Acumin RANT forum in front of a packed crowd who, as always, were not shy in sharing their opinions.
We had two stand-ins replacing Christian Toon and Kai Roer in the form of Bernadette Palmer and Andrew Agnes, both of whom brought a huge amount of experience, opinion and humour to the evening. The lineup therefore was:
(The Award Winning) Javvad Malik, @j4vv4d
Andrew Agnes @sirjester
Rowenna Fielding @infosecgeeklady
We did a standard pre-vote just before starting (we garnered no votes and a lot of good-natured laughs, as expected!) and then went straight into the standard For and Against cycle with me kicking off. Nobody had briefed me (or perhaps I hadn’t listened…) that we were reducing our standard six minutes each down to three! A quick reshuffle in my head and we were off. The photos may look like I am singing karaoke, but beneath the entertaining exterior was my serious message!
I have posted my core arguments to this blog before so I won’t rehash them here, but what followed over the next eighty minutes was hugely interactive, passionate, thought-provoking and hilarious! With a few dongle and fork gags thrown in, this debate had everything! Of course there was no real conclusion, but at the closing vote there was a small but very definite swing in our favour, hooray!
What I found most interesting, however, was that on the whole our arguments converged; we all acknowledged that information security training as it stands is simply not working. What to do about it, however, was where the real debate lay. Do you throw the whole lot out and start from scratch, or do you continue to try and fix what we have? I think this is the dilemma the industry needs to face up to sooner rather than later, once of course we accept that our training programmes don’t work. That part is where the industry needs the most help.
I normally try and stay around after these kinds of events to listen to other people’s opinions, gather feedback and generally mingle. Tonight however I had dinner with a few folks (@jimshout, @j4vv4d, @sirjester, @jee2uu) to discuss an upcoming project. More on that in the next few months, but it was a productive and exciting evening overall.
Finally, there was some footage taken of the evening by Gemma of Acumin; like all my footage if it ever sees the light of day I will get it posted here as soon as possible! As always a huge thank you to Gemma, Simon, Chris et al from Acumin for not only making this happen but asking me to be a part of it.
Originally attributed to Mark Twain, who in turn attributed it to Benjamin Disraeli (although no evidence has been found that he actually said it), the famous quote about “lies, damned lies, and statistics” sums up how the use of statistics can blur the line between supporting a powerful argument and simply using numbers to confuse and deceive.
When used properly, statistics in your risk management programme help support your recommendations, allow you to build effective business cases and even allow for a certain amount of self-analysis and performance reporting. When used badly, you run the risk of undermining the credibility of your entire risk management programme.
Consider the following two statements made by security awareness training companies:
“Reduce phishing click-throughs by 75%!”
KnowBe4 Internet Security Awareness Training
“…successfully trained over 7000 employees” (Fox Entertainment)
TerraNova Security Awareness
In the first instance there is a bold claim that click-through rates reduce by 75%, which on the face of it sounds great. Reading in detail there are some more impressive results, but I can’t help thinking of the somewhat artificial nature of the test, i.e. “I have just taken anti-phishing training and I am suddenly getting five phishing-type emails, hmmmmm”. Perhaps a more suitable test would have been to wait two months before sending the test emails? (The time between training and testing is unfortunately not specified.) There was also no mention of any feedback given in between each test. Security awareness training is such a hot topic, however, that I will leave that well alone for now!
In the second case the banner across the top of the website proudly announces how many people have been successfully trained; unfortunately it makes no mention of the other 5,500 Fox Entertainment employees who were not trained (headcount checked at 12,500).
Now this is just standard sales patter and I certainly don’t mean to pick on these two companies specifically, but both statements illustrate the point perfectly. In both cases the products are probably very good in their own field, but when you “reverse” what they are saying, the numbers speak volumes. Some foods, for instance, are labelled as 90% fat-free, but in reality that means they contain 10% fat. By the same token, there is still a 25% group of people who did click through, and there are still 5,500 people who were not trained (and why not?). This is related to the fear, uncertainty and doubt that is often touted in the industry and can be used to scare people and subsequently encourage them to buy products.
As risk professionals we need to take a more balanced, calmer route. We need to use statistics more carefully and responsibly, especially when what we are presenting makes its way into the core of the business, the leadership, the board, and ends up being used to make business decisions with serious implications. We can’t, for instance, take sales statistics at face value and use them to recommend a product or emphasise a point.
A Google search of “risk management statistics” produces over a billion results (in and of itself a bad and useless statistic to present), so there is plenty of work out there on how to present your data, and I won’t be suggesting anything specific here. There are also plenty of other issues with statistics, for instance causation and inference, which can be looked at in more detail at a later date.
I will however close on three key points I use whenever I am producing statistics from the data gathered by a risk management programme:
- “Reverse” the statistic (see above). If you don’t like what you see, don’t use it.
- Be careful of your sample size; too small and the statistics are meaningless, too big and the resulting statistic you are focusing on may still be a big and scary number even when you are trying to emphasise quite the reverse.
- Look at what you come up with cynically; is it a lie, or a damn lie?
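The “reverse the statistic” check above is simple arithmetic, and can be sketched in a few lines of Python. The figures come from the two vendor claims quoted earlier; the function names are illustrative, not from any real library:

```python
def reverse_rate(claimed_pct: float) -> float:
    """Return the complement of a claimed percentage improvement,
    i.e. the part of the population the claim quietly leaves out."""
    return 100.0 - claimed_pct


def untrained_headcount(trained: int, total: int) -> int:
    """Return how many people a 'successfully trained' figure omits."""
    return total - trained


# "Reduce phishing click-throughs by 75%!" reversed: 25% still click.
print(f"Still clicking through: {reverse_rate(75.0):.0f}%")

# "Trained over 7,000 employees" against a 12,500 headcount: 5,500 untrained.
print(f"Not trained: {untrained_headcount(7000, 12500)}")
```

If you don’t like what the reversed number says, the point above stands: don’t use the statistic.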
And to underscore how statistics can mess with your head, statistically there are six Popes per square mile in the Vatican; go figure.
In my last post I referred to ensuring that your risk management programme produces output of sufficient quality that the business information it feeds into can be trusted; in other words, maintaining the integrity of your programme.
If there is one thing that can be done to improve the integrity of your risk assessments it is simply to get your hands dirty during them. I have had a number of conversations with people who have been on the receiving end of an assessment where the assessor simply sits at the table and asks for evidence in the form of documentation, verbal responses or even just PowerPoint presentations to confirm the effectiveness of the information security programme in question. Personally I have sat in a conference room for one or two days at a time and only left the room for a short thirty minute ‘walkabout’. Quite how the assessor felt they were getting a representative view of what we were doing was beyond me.
There are a number of problems with this hands-off approach:
Firstly, the ability of those being assessed to ‘play’ the assessor increases with the assessor’s reluctance to physically move around the organisation. Pre-prepared evidence (the so-called “audit box”, as it was once described to me) can be made available, the organisation’s SMEs can be wheeled in to ensure the right things are said at the right time, and the people who never seem able to say the right thing at the right time (and every organisation has them!) can be told to work in a different building that day.
Secondly, unless the assessor is actually looking at the evidence first hand, even down to rifling through the physical pieces of paper or reviewing server logs, there is absolutely no way any kind of discrepancy will ever be found. Of course this is a sampling exercise, and of course there is no way every single piece of evidence, paper or electronic, can be reviewed, but some kind of benefit can be gleaned from going through it. Quite apart from anything else, it gives the clear impression that no stone is left unturned during the assessment process. I have come up with a surprising number of findings simply from taking a few minutes to look through large piles of paper records.
Finally, and perhaps slightly more esoterically, a walkabout can give a very good “feel” for a place. If the presence of the auditor brings hurried and furtive glances everywhere they go, it may indicate nervousness or unwillingness regarding the assessment (or, of course, just a healthy distrust of strangers). If there are rows of empty desks that are obviously normally in use but seem to have been vacated for the day, this may indicate that special plans have been laid on for the assessment (or that the sales team are in a meeting). This last point is not as clear-cut as the other two, and should only be used as an indicator of what is already coming out of your assessment, but it is a useful one nonetheless.
I have a colleague who, every time he enters a “serious” meeting, undoes his cufflinks and rolls up his cuffs a couple of times; this is his way of mentally preparing for the challenge ahead by literally rolling up his sleeves. When it comes to risk assessments, that is exactly what you need to do: prepare yourself to get your hands dirty.
I presented at the eCrime and Information Security Congress on Wednesday, and had a terrific time presenting my thoughts on making risk assessments more effective for the business. It was probably the largest audience I have presented to, and the stage and AV setup was suitably impressive. I had the support of two fine upstanding members of the infosec community (as well as @j4vv4d and @sirjester…) throughout the day, and was fortunate enough to get some great feedback from both the organisers (in the form of @jonhawes) and Javvad after the event.
The key points I was making were:
- Ensure your risk management programme is producing the quality data that subsequently becomes business information.
- Know how to present your information in a compelling manner to ensure your message (and business information) gets across to the right people.
- Understand the connection between your activities and your organisation’s primary purpose, whatever that may be.
The presentation ran to just under twenty minutes, but unfortunately the house style appeared to be not to field questions at the end. I felt I engaged well with the audience and had some unsolicited feedback to that effect afterwards, but I would have welcomed the opportunity to chat about the ideas and concepts I was putting forward. If anybody who watched the presentation reads this post, please don’t hesitate to ask something!
As usual I have posted the slides below; I also intend to post a movie of the slides with a voiceover, but those of you who are still waiting for the footage from an event I did in September will know how prompt I am in creating these films. Javvad I am not!
The event itself appeared to be very well attended by both the public and sponsors, in fact a huge number of sponsors compared even to RSA Europe last year. The breakout sessions were apparently very useful (I was unable to attend any as I arrived only for the last half of the second day, but heard good things about them), and above all the food was excellent!
Thanks to the folks at AKJ Associates for inviting me to speak, and especially to Jon Hawes. With a bit of luck I will be doing more of this in the coming months.