Category Archives: Culture

NBlog April 23 – David v Goliath


Thanks to a mention in the latest RISKS-list email, I've been reading a blog piece by Bruce Schneier about the Facebook incident and changing US cultural attitudes towards privacy.
"As creepy as Facebook is turning out to be, the entire industry is far creepier. It has existed in secret far too long, and it's up to lawmakers to force these companies into the public spotlight, where we can all decide if this is how we want society to operate and -- if not -- what to do about it ... [The smartphone] is probably the most intimate surveillance device ever invented. It tracks our location continuously, so it knows where we live, where we work, and where we spend our time. It's the first and last thing we check in a day, so it knows when we wake up and when we go to sleep. We all have one, so it knows who we sleep with."
With thousands of data brokers in the US actively obtaining and trading personal information between a far larger number of sources and exploiters, broad-spectrum and mass surveillance is clearly a massive issue in America. The size and value of the commercial market makes it especially difficult to reconcile the rights and expectations of individuals against those of big business, plus the government and security services. This is David and Goliath stuff.

GDPR is the EU's attempt to re-balance the equation by imposing massive fines on noncompliant organizations: over the next few years, we'll see how well that works in practice. 

Meanwhile, US-based privacy advocates such as EPIC and EFF have been bravely fighting the individuals' corner. I wonder if they would consider joining forces? 

NBlog April 12 – bringing managers up to speed

Today I Googled across a thought-provoking opinion piece in Computerworld back in 2008. Jay Cline's top 5 mistakes of privacy awareness programs were:
  1. Doing separate training for privacy, security, records management and code of ethics. 
  2. Equating "campaign" with "program." 
  3. Equating "awareness" with "training." 
  4. Using one or two communications channels. 
  5. No measurement. 

Hmmm, not a bad list, that. I've cut almost all of it here, so if those few remaining words intrigue you, please read the original article.

We've been addressing all those points ever since NoticeBored was launched way back in 2003. It's galling, though, to note that those 'top 5 mistakes' are still evident today in the way that most organizations tackle awareness. 

We're doing our best to take current practice up a level through this blog, our awareness materials and services, and occasional articles. Perhaps we need a change of approach ... and we're working on that.


Jay's list of mistakes could be extended. In particular, most awareness programs focus on general employees or "end users". While Jay mentions offering role-based training for particular specialists, I feel that still leaves a gaping hole in awareness coverage, namely management. You could say they are specialists in managing, although there's no hint of that in Jay's piece.

Looking again at the list, all those mistakes could be classed as management or governance issues, being problems in the way the awareness and training programs and activities are structured and driven ... which, to me at least, implies the need to address that. It's a root cause. If management doesn't first notice that mistakes are being made, and then join-the-dots to figure out that the way security awareness as a whole is handled is probably causing the mistakes, then we're unlikely to see much improvement.

So, raising management's awareness of information security, risk, compliance, privacy, accountability, governance, assurance and so forth makes a lot of sense ... which is exactly what we aim to do through the management stream in NoticeBored. If management truly 'gets it', the awareness task becomes much more straightforward, giving the awareness and training program as a whole a much greater probability of success, leading to a widespread culture of security.

That leaves us with a chicken-and-egg conundrum though. If management doesn't quite 'get it', in other words if this security awareness stuff doesn't presently register with them as an issue worth investing in (or, more often, is treated as something trivial best left to IT or HR, with no real support and bugger all resources), then how can we tackle management's lack of awareness and break the deadlock?

I'll leave you now to contemplate that question, as I will be doing over the weekend. Maybe the vague thoughts I have in mind will crystallize into something more concrete for the blog next week. Meanwhile, by all means chip-in through the blog comments, or email me directly. I'd love to know what you think, especially any innovative and effective solutions you can offer. Is this an issue you face? How are you tackling it, or planning to do so? 

NBlog April 3 – blowing the whistle

No, Panera Bread Doesn’t Take Security Seriously is a heartfelt piece by Dylan Houlihan about a company that was responsibly notified of a privacy breach but apparently failed to act until, some eight months later, it was informed by Brian Krebs. Then, all of a sudden, it reacted. 

This is far from the first time a whistleblower has been rebuffed.

Organizations clearly need strategies, policies and procedures for receiving and dealing with incident notifications and warnings of all sorts. 

Doing so makes sense for several good reasons:
  • Business reasons e.g. hacking, fraud, privacy breaches and other inappropriate disclosures;
  • Compliance reasons e.g. PCI-DSS and [soon] GDPR;
  • Ethical/social reasons e.g. offensive/inappropriate behavior or bribery & corruption by workers, failure to uphold corporate social responsibilities;
  • Accountability reasons e.g. holding those responsible for various issues to account. 
So why don't they? Lame excuses include:
  • It's not the done thing;
  • We can't be bothered - we don't give a hoot - we simply don't care;
  • It hasn't occurred to us;
  • It is too risky to open Pandora's box;
  • It positively invites trouble;
  • It is too expensive;
  • It is not a priority - there are More Important Things we'd rather do;
  • We didn't invent it;
  • We are not formally required to do anything of the sort, therefore we won't even consider it - it's not on the table.
More sinister reasons include:
  • We are scared of being found out and held to account;
  • We know we have big issues already - telling us won't help;
  • So long as we have our eyes closed and fingers in our ears, we can pretend everything is alright.

NBlog February 9 – mapping awareness memes

Yesterday I came up with the suggestion of using memes to spread security awareness messages from person to person, in a similar fashion to the way that computer viruses and worms spread from IT system to IT system. 

Today I'm trying to come up with something that people will spread among each other by word of mouth, through email and TXT etc., something funny, shocking or useful - such as tips to avoid falling prey to malware maybe, or rumors about a serious malware infection within or close to the organization.

'Too close for comfort' has potential, perhaps a malware incident and business crisis narrowly averted by sheer good fortune. Or maybe we could fool workers into believing that the auditors will soon be coming to check up on the antivirus controls?

Such an approach could be unethical, risky even (e.g. if it prompted workers to meddle inappropriately with antivirus configurations or audit trails, rather than ensuring that the antivirus controls were operating correctly). It would need to be carefully considered and planned, which itself constitutes an awareness activity even if, in the end, the decision is taken not to go ahead.

The 'meme map' (derived from "Meme Maps: A Tool for Configuring Memes in Time and Space" by John Paull) represents the lifecycle and spatial or geographical spread of the meme. Reading from the bottom up, both the yellow area prior to the meme's release, and then the green area, are awareness opportunities.  

Mapping and demonstrating the gradual spread of a security awareness meme within the organization (e.g. mapping the source of clicks on a link to a fake internal memo about the fictitious antivirus audit, or tracking calls about the audit to the Help Desk) is yet another possible awareness activity, with similarities to the spread of malware ... at which point I recurse up my own backside, so that's enough idle musing for today's blog.
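
For the curious, here's a minimal sketch of how that click-mapping might work in practice. It assumes a purely hypothetical setup: the fake memo links to a tracking URL (/av-audit-memo) carrying a per-department token, and the web server writes ordinary combined-format access logs. The path, the token scheme and the log file name are all illustrative, not a description of any real system.

```python
# Illustrative only: tally clicks on a hypothetical tracking link (/av-audit-memo?dept=...)
# from a combined-format web access log, to chart how the "audit" meme spreads.
import re
from collections import Counter, defaultdict

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<ts>[^\]]+)\] "GET /av-audit-memo\?dept=(?P<dept>\w+)[^"]*"'
)

def map_meme_spread(log_path: str):
    clicks_by_dept = Counter()
    first_seen = {}                      # department -> timestamp of its first click
    clicks_by_hour = defaultdict(Counter)

    with open(log_path, encoding="utf-8") as log:
        for line in log:
            m = LOG_LINE.search(line)
            if not m:
                continue                 # not a click on our tracking link
            dept, ts = m.group("dept"), m.group("ts")
            clicks_by_dept[dept] += 1
            first_seen.setdefault(dept, ts)
            hour = ts[:14]               # e.g. "23/Apr/2018:09" - crude hourly bucket
            clicks_by_hour[hour][dept] += 1

    return clicks_by_dept, first_seen, clicks_by_hour

if __name__ == "__main__":
    totals, first, by_hour = map_meme_spread("access.log")
    for dept, count in totals.most_common():
        print(f"{dept:<12} {count:>4} clicks, first seen {first[dept]}")
```

Plotted over time, those per-department counts give a crude but serviceable picture of how far and how fast the 'rumor' travels - much the same epidemic curve you'd draw for a worm outbreak.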

Quit Talking About "Security Culture" – Fix Org Culture! – The Falcon’s View

I have a pet peeve. Ok, I have several, but nonetheless, we're going to talk about one of them today. That pet peeve is security professionals wasting time and energy pushing a "security culture" agenda. This practice of talking about "security culture" has arisen over the past few years. It's largely coming from security awareness circles, though that's not always the case (looking at you, anti-phishing vendors intent on selling products without the means and methodology to make them truly useful!).

I see three main problems with references to "security culture," not the least of which is that it perpetuates the bad old practices of days gone by.

1) It's Not Analogous to Safety Culture

First and foremost, you're probably sitting there grinding your teeth saying "But safety culture initiatives work really well!" Yes, they do, but here's why: safety culture can - and often does - achieve an outcome of zero incidents. That is to say, you can reduce safety incidents to ZERO. That fact is excellent for when you're around construction sites or going to the hospital. However, I have very bad news for you. Information (or cyber or computer) security will never get to zero incidents. Until the entirety of computing is revolutionized, removing humans from the equation, you will never prevent all incidents. Just imagine your "security culture" sign by the entrance to your local office environment, forever emblazoned with "It Has Been 0 Days Since Our Last Incident." That's not healthy or encouraging. That sort of thing would be outright demoralizing!

Since you can't be 100% successful through preventative security practices, you must then shift mindset to a couple of things: better decisions and resilience. Your focus, which most of your "security culture" programs are trying to address (or should be), is helping people make better decisions. Well, I should say, some of you - the few, the proud, the quietly isolated - have this focus. But at the end of the day/week/month/year you'll find that people - including well-trained and highly technical people - will still make mistakes or bad decisions, which means you can't bank on "solving" infosec through better decisions.

As a result, we must still architect for resiliency. We must assume something will break down at some point, resulting in an incident. When that incident occurs, we must be able to absorb the fault and continue to operate despite degraded conditions, while recovering to "normal" as quickly, efficiently, and effectively as possible. Note, however, that this focus on resiliency doesn't really align well with the "security culture" message. It's akin to telling people "Safety is really important, but since we have no faith in your ability to be safe, here's a first aid kit." (yes, that's a bit harsh, to prove a point, which hopefully you're getting)

2) Once Again, It Creates an "Other"

One of the biggest problems with a typical "security culture" focus is that it once again creates the wrong kind of enablement culture. It says "we're from infosec and we know best - certainly better than you." Why should people work to make better decisions when they can just abdicate that responsibility to infosec? Moreover, since we're trying to optimize resiliency, people can go ahead and make mistakes, no big deal, right?

Part of this is ok, part of it is not. On the one hand, from a DevOps perspective, we want people to experiment, be creative, be innovative. In this sense, resilience and failure are a good thing. However, note that in DevOps, the responsibility for "fail fast, recover fast, learn fast" is on the person doing the experimenting!!! The DevOps movement is diametrically opposed to fostering enablement cultures where people (like developers) don't feel the pain from their bad decisions. It's imperative that people have ownership and responsibility for the things they're doing. Most "security culture" dogma I've seen and heard works against this objective.

We want enablement, but we don't want enablement culture. We want "freedom AND responsibility," "accountability AND transparency," etc, etc, etc. Pushing "security culture" keeps these initiatives separate from other organizational development initiatives, and more importantly it tends to have at best a temporary impact, rather than triggering lasting behavioral change.

3) Your Goal Is Improving the Organization

The last point here is that your goal should be to improve the organization and the overall organizational culture. It should not be focused on point-in-time blips that come and go. Additionally, your efforts must be aimed toward lasting impact and not be anchored around a cult of personality.

As a starting point, you should be working with org dev personnel within your organization, applying behavior design principles. You should be identifying what the target behavior is, then working backward in a piecemeal fashion to determine whether that behavior can be evoked and institutionalized through one step or multiple steps. It may even take years to accomplish the desired changes.

Another key reason for working with your org dev folks is that you need to ensure any "culture" work you're pursuing is fully aligned with other org culture initiatives. People can only assimilate so many changes at once, so it's often better to align your work with efforts that are already underway in order to build reinforcing patterns. The worst thing you can do is design for a behavior that is in conflict with other behavior and culture designs underway.

All of this is to underline the key point that "security culture" is the wrong focus, and can in some cases even detract from other org culture initiatives. You want to improve decision-making, but you have to do this one behavior at a time, and glossing over it with the "security culture" label is unhelpful.

Lastly, you need to think about your desired behavior and culture improvements in the broader context of organizational culture. Do yourself a favor and go read Laloux's Reinventing Organizations for an excellent treatise on a desirable future state (one that aligns extremely well with DevOps). As you read Laloux, think about how you can design for security behaviors in a self-managed world. That's the lens through which you should view things, and this is where you'll realize a "security culture" focus is at best distracting.

---
So... where should you go from here? The answer is three-fold:
1) Identify and design for desirable behaviors
2) Work to make those behaviors easy and sustainable
3) Work to shape organizational culture as a whole

Definitionally, here are a couple of starters for you...

First, per Fogg, Behavior happens when three things come together: Motivation, Ability (how hard or easy it is to do the action), and a Trigger (a prompt or cue). When Motivation is high and it's easy to do, then it doesn't take much prompting to trigger an action. However, if it's difficult to take the action, or the motivation simply isn't there, you must then start looking for ways to address those factors in order to achieve the desired behavioral outcome once triggered. This is the basis of behavior design.
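
To make that interplay concrete, here's a toy numeric encoding of the idea (my own simplification for illustration, not Fogg's formal model): a behavior fires only when a trigger actually reaches the person and the combination of motivation and ability clears some threshold.

```python
# Toy illustration of the Fogg-style trade-off between motivation, ability and triggers.
# The 0-1 scores and the threshold are invented for the example, not part of Fogg's model.
from dataclasses import dataclass

@dataclass
class BehaviorContext:
    motivation: float   # 0.0 (couldn't care less) to 1.0 (highly motivated)
    ability: float      # 0.0 (very hard to do) to 1.0 (trivially easy)
    triggered: bool     # did a prompt/cue actually reach the person?

ACTIVATION_THRESHOLD = 0.5  # arbitrary: where motivation x ability is "enough"

def behavior_occurs(ctx: BehaviorContext) -> bool:
    """The behavior happens only when a trigger lands AND motivation x ability
    clears the activation threshold - low ability can be offset by high
    motivation, and vice versa."""
    return ctx.triggered and (ctx.motivation * ctx.ability) >= ACTIVATION_THRESHOLD

# Reporting a phishy email: motivated-ish, but the reporting process is painful.
print(behavior_occurs(BehaviorContext(motivation=0.7, ability=0.3, triggered=True)))   # False
# Same motivation, but a one-click "report phish" button makes it easy.
print(behavior_occurs(BehaviorContext(motivation=0.7, ability=0.9, triggered=True)))   # True
```

The practical point is the one Fogg himself makes: if the behavior isn't happening, first check whether people are actually being prompted, then make the action easier, before trying to crank up motivation.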

Second, when you think about culture, think of it as the aggregate of behaviors collectively performed by the organization, along with the values the organization holds. It may be helpful, as Laloux suggests, to think of the organization as its own person that has intrinsic motivations, values, and behaviors. Eliciting behavior change from the organization is, then, tantamount to changing the organizational culture.

If you put this all together, I think you'll agree with me that talking about "security culture" is anathema to the desired outcomes. Thinking about behavior design in the context of organizational culture shift will provide a better path to improvement, while also making it easier to explain the objectives to non-security people and to get buy-in on lasting change.

Bonus reference: You might find this article interesting as it pertains to evoking behavior change in others.

Good luck!