When Auditors Attack!

Although I am not a formally qualified auditor, I have had a fair amount of experience carrying out audits and risk assessments in various roles on my way to becoming a CISO. I have also presented on the topic and articulated many of the unique challenges faced by auditors and auditees alike.

Reading about auditors in articles, on social media and on LinkedIn is never a pretty affair, and there is rarely any love lost between them and those posting about them. Take, for instance, the QSA who asked for (amongst other things) a list of usernames and plain text passwords. This auditor then doubled down when pressed, accusing the auditee of trying to hide a poorly maintained system.

A similar thing happened to a (barely adequate) friend of mine recently, when his auditor reported a finding that “users have read access to the Windows System32 folder”, flagging it as a high risk. Even Microsoft states that this is how its operating system works and that, under “normal operation”, it cannot be changed. My (barely adequate) friend does not run nuclear power stations, by the way.

And attack they will.

Pushing back against these decisions in a formal manner is the only approach you can take; remove the emotion from the conversation and engage as soon as possible, even if it means potentially derailing the audit for an hour or so. If you are able to get team members to research the subject, or call in recognised SMEs, then all the better; establishing the facts early is important. The longer the matter goes on, though, the harder it is to resolve.

If that fails, wait until the draft report comes in. This is an opportunity to formally respond and present evidence to the contrary. This response should be sent not just to the auditor, but also to the company they work for (i.e. up the chain of command), as well as to other stakeholders such as the clients that commissioned the audit. Their input is important, as they are the ones both paying for the audit and with the most vested interest in its outcomes.

Finally, getting everyone involved around an actual table (difficult at the moment, I know, but a videoconference will do the trick too) is the last course of action. Hopefully having line management, client/stakeholder, SMEs etc. facing off will produce a more amenable result. Don’t expect the finding to disappear, though; it may just be downgraded to medium or low.

Being an auditor has a complex dynamic. Third party auditors need to show value to whomever is paying the bills and can sometimes extend the scope or severity of issues to demonstrate “value for money”. They can also, ironically, be risk averse and refuse to stand down for fear of being accused of wasting time, and of a subsequent lawsuit. An auditor is also trying to be an expert across multiple disciplines at once, as well as in the discipline of actually being an auditor, so there are always going to be knowledge gaps. Acknowledging that is a huge step towards being a better auditor, and taking time to do independent research on topics you might not have understood as well as you thought is vital.

For me, auditing and risk assessing were always an opportunity to help the people being assessed; this is a skill, as well as a level of emotional intelligence, that was shown to me by an ISO 27001 auditor in India, someone I remain friends with after more than 12 years. That two-way engagement has been vital to establishing trust and subsequent transparency during audits, and has resulted in better quality findings and a willingness to address them.

Worst case, when it comes to an auditor who won’t back down, you can always just Accept the Risk and move on with the day job.

(TL)2 Security has experience in risk assessment and audit across the security organisation, from high level risk and gap assessments through to advisory and support services for meeting various certification audits. Contact us to find out more.

Too Much of a Good Thing

The one thing the current lockdown has taught me is that you really can eat too much chocolate… who knew?

Left to my own devices and without the distraction of a routine, regular work and people observing my unhealthy eating habits, my faulty brain tells me that more chocolate can only be a good thing and that I should continue to eat it until physical discomfort forces me to stop (in spite of my brain’s protestations). It is an obsessive and compulsive behaviour that I recognise in myself and do my best to contain, but it is a constant struggle arguing with myself that chocolate is not the most important thing in my life.

The same could be said of many security professionals and their desire to roll out security practices across their organisations, implementing new procedures, standards, policies and ways of working that are designed to make the organisation very secure. They do this despite the protestations of the organisation itself, which is telling them it has had enough: the new ways of working are too restrictive and difficult to follow, and ultimately leave it with a security stomach ache.

This week’s Lost CISO episode talks about when too much security, like chocolate, is a bad thing.

This compulsion to think that security is the most important part of a business’ life is one that leads to users having security headaches all day and the business itself feeling slovenly, bloated and sluggish. (OK, that’s enough of the analogies.)

It is ultimately self-defeating, as users will do their best to work around draconian working practices, and the perception of the security organisation will be one of business prevention rather than vital service. I, and many others, have spoken about not being the department of “no”, but it goes well beyond just saying “yes”.

Agreeing to everything without thought for the consequences is potentially even more dangerous than saying no, especially in the short term. The vital distinction that needs to be made is that of a two-way conversation between security and the end users and the business. Finding out what someone is trying to achieve is far more valuable than just focusing on what is being asked for. Requests can be addressed in many different ways, not just by punching a hole in the firewall or switching off 2FA on the VPN, for instance.

In fact, this very conversation helps create even stronger relationships as it highlights two things:

  1. How seriously you take their request.
  2. How much you care about the organisation you both work for.

A great example of this in the above video is that of companies relaxing their security stance during the remote working ramp up of the lockdown. If the response had simply been “no”, or even a straight “yes” with no thought of the consequences, there would have been issues sooner or later. Working with the business, relaxing the standards for the initial growth, and then methodically scaling and tightening the security once that growth is over is absolutely the right way to go.

So next time you feel yourself reaching for the chocolate, wanting to say “no”, think beyond the immediate consequences to how you can use security for the long term betterment of your organisation rather than for your simple security stats.

And one bar of chocolate/security is always enough for everyone, right?

Do you need to re-align your security team to your business and don’t know where to start? (TL)2 Security has a proven track record of helping security leaders and teams create strategies and business plans that make real, competitive differences to organisations. Contact (TL)2 to find out more.

Shameless Coronavirus Special Promotion – Risk Edition!

Many, many moons ago, my good friend and learned colleague Javvad Malik and I came up with a way to explain how a risk model works by using an analogy to a pub fight. I have used it in a presentation that has been given several times, and the analogy has really helped people understand risk, and especially risk appetite, more clearly (or so they tell me). I wrote a brief overview of the presentation and the included risk model in this blog some years back.

And now the Coronavirus has hit humanity AND the information security industry. Everyone is losing their minds deciding whether they should self-isolate, quarantine, or just generally ignore advice from the World Health Organisation (as some governments have shown a propensity to do), carry on as usual and listen to the Twitter experts. During a conversation of this nature, Javvad and I realised that the Langford/Malik model could be re-purposed to help not only those who struggle with risk generally (most humans) but also those in our own industry who really struggle to know what to do about it (most humans, again).

Disclaimer: we adopted the ISO 27005:2018 approach to measuring risk as it is comprehensive enough to cover most scenarios, yet simple enough that even the most stubborn of Board members could understand it. If you happen to have a copy you can find it in section E.2.2, page 48, Table E.1.


The approach is that an arbitrary, yet predefined (and globally understood) value is given to the Likelihood of Occurrence – Threat, the Ease of Exploitation, and the Asset Value of the thing being “risk measured”. This generates a number from 0-8, going from little risk to high risk. The scores can then be banded together to define whether they are High, Medium or Low, and can be treated in accordance with your organisation’s risk appetite and risk assessment procedures.
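As a rough illustration, that additive scheme can be sketched in a few lines of Python. The Low/Medium/High numeric mappings and the band cut-offs below are my own illustrative assumptions rather than values quoted from ISO 27005, so treat this as a sketch of the shape of the model, not a reference implementation:

```python
# Sketch of an ISO 27005-style additive risk score. The numeric mappings
# and band thresholds here are illustrative assumptions, not the standard's.

LEVELS = {"low": 0, "medium": 1, "high": 2}

def risk_score(likelihood: str, ease: str, asset_value: int) -> int:
    """Combine likelihood of threat, ease of exploitation (low/medium/high)
    and asset value (0-4) into a single 0-8 score."""
    if not 0 <= asset_value <= 4:
        raise ValueError("asset value must be 0-4")
    return LEVELS[likelihood] + LEVELS[ease] + asset_value

def band(score: int) -> str:
    """Band the 0-8 score; these cut-offs are an assumed example, and in
    practice they should follow your organisation's risk appetite."""
    if score <= 2:
        return "Low"
    if score <= 5:
        return "Medium"
    return "High"

print(risk_score("high", "medium", 4))  # 7
print(band(7))                          # High
```

The point of keeping it this small is exactly the one made above: the numbers are arbitrary but predefined, so everyone reading the output can understand how it was derived.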

In our model, all one has to do is define the importance of one’s role, from “Advocate” (low) to “Sysadmin” (high), one’s personality type (how outgoing you are), and the level of human interaction the role requires. Once ascertained, you can read off your score and see where you sit in the risk model.

To make things easier for you, dear reader, we then created predefined actions in the key below the model, based upon that derived risk score, so you know exactly what to do. In these troubled times, you can now rest easy in the knowledge that you not only understand risk better but also know what to do in a pandemic.

You’re welcome.

Note: Not actual medical advice. Do I really need to state this?

The Lost CISO who?

And why am I being spammed about him on Twitter and LinkedIn all the time at the moment?

I came up with the concept of The Lost CISO when I was working late in the office one night. I decided to start writing and doing something about it straight away, and even created the banner and took my own picture for it, sat at my desk. I also pulled the graphics together there and then, not in Photoshop but in Apple Pages (I was an executive at the time and, to my shame, do not know how to use Photoshop). It still came out alright, I think, though.


The idea was to create short informational videos, 2-3 minutes long, almost like a high energy presentation, in front of a green screen onto which I could then superimpose relevant imagery. It was a good concept, I thought, and within my technical skills with a camera and Final Cut Pro X. Or so I thought. I could also put all of my other InfoSec videos under the same brand, tying it up into a neat piece of branding. The films would be aimed at people who are simply keen to learn, and no more. Not all of it would be groundbreaking stuff, but it would be researched, experienced, or just advice that flies in the face of common knowledge. The basics, Plus, I suppose.

I created a test and shared it with some friends, who gave me some honest feedback on quality, imagery etc. I then did a first episode (bearing in mind each one took me about seven days of intermittent work to edit), shared it again, and excitedly held my breath.

“Do not release this… it will do your personal brand more damage than good…”


Back to the drawing board; except I didn’t go back to it. Life and work got in the way until twelve months had gone by, and I decided to just get this done properly once and for all. So I invested in some quality lighting, foley and a decent green screen, even hired someone to do the filming and editing for me, and got to work. Of course, now that I run my own business, I wasn’t able to prepare the topics as well as I wanted. To be honest, I pretty much flew through the filming so I could get onto the next job on my increasingly long to-do list, but the quality, and frankly the creative talent of the person I hired, shines through far more than before.

As always, my success (such as it is) is tied to the talent of others. A lesson for everyone there, I think…

What’s the infosec lesson here? None really, although perhaps at a stretch I could say that just because my original idea failed didn’t mean it was a bad one; I just needed the right resources. I don’t know, parallels to infosec education and awareness training, maybe.

I hope you enjoy the series, and please do comment on them, let me know what you think and also if you would like a particular topic covered.





Keeping It Supremely Simple, the NASA way

Any regular reader (hello to both of you) will know that I also follow an ex-NASA engineer and manager by the name of Wayne Hale. Having been at NASA for much of his adult life, and involved across the board, he brings a fascinating view of the complexities of space travel and, just as interestingly, of risk.

His recent post is about damage to the Space Shuttle’s foam insulation on the external fuel tank (the big orange thing), and the steps NASA went through to return the Shuttle to active service after it was found that loose foam was what had damaged the heat shield of Columbia, resulting in its destruction. His insight into the machinations of NASA, the undue influence of Politics as well as politics, and the fact that ultimately everything comes down to a risk based approach make his writing compelling and, above all, educational. This is writ large in the hugely complex world of space travel, something I would hazard a guess virtually none of us are involved in!

It was when I read the following paragraph that my jaw dropped a little, as I realised that even in NASA many decisions are based on a very simple presentation of risk, something I am a vehement supporter of:

NASA uses a matrix to plot the risks involved in any activity.  Five squares by five squares; rating risk probability from low to high and consequence from negligible to catastrophic.  The risk of foam coming off part of the External Tank and causing another catastrophe was in the top right-hand box:  5×5:  Probable and Catastrophic.  That square is colored red for a reason.

What? The hugely complex world of NASA is governed by a five by five matrix like this?

Isn’t this a hugely simplistic approach that just sweeps over the complexities and nuances of an immensely complex environment where lives are at stake and careers and reputations are constantly on the line? Then the following sentence made absolute sense, and underscored the reason why risk is so often poorly understood and managed:

But the analysts did more than just present the results; they discussed the methodology used in the analysis.

It seems simple and obvious, but the infosec industry very regularly talks about how simple models like a traffic light approach to risk just don’t reflect the environment we operate in, and how we have to look at things in a far more complex way to ensure the nuance and complexity of our world is better understood. “Look at the actuarial sciences”, they will say. I can say now that I don’t subscribe to this.

The key difference with NASA, though, is that the decision makers understand how the scores are derived and then discuss that methodology, so the interpretation of that traffic light colour is far better understood. In his blog Wayne talks of how the risk was talked down based upon the shared knowledge of the room and careful consideration of the environment in which the risks were presented. In fact, the risk as initially presented was de-escalated and a decision to go ahead was made.

Imagine if that process hadn’t happened; decisions may have been made based on poor assumptions and poor understanding of the facts, the outcome of which had the potential to be catastrophic.

The key point I am making is that a simple approach can be taken to complex problems, and that, ironically, it can be harder to make that happen. Everyone around the table needs to understand how the measures are derived, be educated on the implications, and be in a position to discuss the results in a collaborative way. Presenting an over-complex, hard to read but “accurate” picture of risks will waste everyone’s time.
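For what it’s worth, the whole of the 5×5 matrix Wayne describes can be captured in a handful of lines. The colour cut-offs below are my own illustrative assumptions (NASA and other programmes set their own), but they make the point that the model itself is tiny; all the real work is in the discussion of how each rating was derived:

```python
# Sketch of a NASA-style 5x5 risk matrix: probability and consequence are
# each rated 1 (low/negligible) to 5 (high/catastrophic). The colour
# cut-offs below are illustrative assumptions, not NASA's actual bands.

def cell_colour(probability: int, consequence: int) -> str:
    """Map a (probability, consequence) pair to a traffic light colour."""
    if not (1 <= probability <= 5 and 1 <= consequence <= 5):
        raise ValueError("ratings run from 1 to 5")
    score = probability * consequence
    if score >= 15:
        return "red"      # the top right corner: probable and catastrophic
    if score >= 6:
        return "yellow"
    return "green"

print(cell_colour(5, 5))  # red: the foam-loss square from Wayne's post
print(cell_colour(1, 1))  # green
```

Anyone around the table can hold this entire model in their head, which is precisely what makes the methodology discussion, rather than the matrix itself, the valuable part.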

And if they don’t have time now, how will they be able to read Wayne’s blog?