Tuesday, March 19, 2013

The Accidental Insider

   This week I did a national webcast with Capella University.  The topic was the Insider Threat.  But my take on this is a bit different from what's usually said on the subject.  You can see my slides here.

   The typical story about insider threat is about theft or fraud.  Here are some recent articles.  This is a real and present danger.

   But there is another category of internal issues... accidents.

   One very graphic example with which many people are familiar is the story of Bob Quick.  This was an unfortunate situation in which Mr. Quick, a senior law enforcement officer, on his way to a briefing at 10 Downing Street, inadvertently exposed (physically) a secret document detailing a pending anti-terrorism raid.  This caused the operation to be greatly accelerated and compromised its overall effectiveness.
   Mr. Quick was both well-respected and in a leadership position.  I am not bashing Mr. Quick but pointing out that anyone with access to sensitive information can make a mistake that exposes that information.

   We read about hacking and stolen information all the time.  For how many of these is the root cause an accident or mistake?  Unpatched systems, incorrect configurations, failure to follow process, vulnerabilities in new software, debugging settings left in production systems, unencrypted (lost) media, lost devices... these situations can all lead to some kind of "hack" and subsequent data loss.  But these are all accidents!  (One of the best sources of data loss information is the OSF DataLossDB.)

   And it's not just information that's at risk.  All organizations dependent upon technology suffer some kind of outage from time to time.  How many of those are caused by accidents or mistakes?  Consider issues like: fiber cuts, system maintenance issues, routing/re-routing issues, and many others.

   The well-known CSI/FBI Computer Crime Survey has started to differentiate between malicious and non-malicious insiders.  The equally well-known Verizon DBIR does not.  If you don't read these reports, you should!

   Now, this is absolutely not about blame or pointing fingers.  This is about identifying an issue and helping good, honest people do the right thing.  I think that many accidents happen because good people are trying to get work done fast.  And perhaps sometimes this is caused by security people putting too many hurdles in the way?

   So what do we do about this?

   CERT does some great research on this issue.  They list a set of good practices including:
  • Document and enforce policies/controls.
  • Provide security awareness training.
  • Use Separation of Duties and Least Privilege.
  • Use additional controls for privileged users.
  • Use Change Control.
  • Log, monitor and audit.
  • Have an Incident Response plan.
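   To make a couple of these practices concrete (Least Privilege plus logging and auditing), here is a minimal sketch.  The role names, permission map, and the `authorize` helper are all hypothetical illustrations, not part of CERT's guidance; the point is simply that each role gets only the access it needs and every access decision leaves an audit trail.

```python
import logging

# Hypothetical role-to-permission map: each role is granted only the
# actions it needs (Least Privilege).
PERMISSIONS = {
    "analyst": {"read_report"},
    "admin": {"read_report", "edit_report", "delete_report"},
}

# Every authorization decision is logged so it can be audited later.
audit_log = logging.getLogger("audit")
logging.basicConfig(level=logging.INFO)

def authorize(user, role, action):
    """Return True only if the role grants the action; log every attempt."""
    allowed = action in PERMISSIONS.get(role, set())
    audit_log.info("user=%s role=%s action=%s allowed=%s",
                   user, role, action, allowed)
    return allowed
```

   For example, `authorize("bob", "analyst", "delete_report")` is denied (and logged), while the same request under the "admin" role is allowed.  An honest mistake by an analyst simply can't delete a report, and the log shows who tried what.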
   I'm a big fan of Bruce Schneier's work.  He has 5 basic techniques around trust:
  • Limit the number of trusted people.
  • Ensure that trusted people are trustworthy.
  • Limit the amount of trust each person has.
  • Use overlapping spheres of trust.
  • Detect breaches of trust.
   So what do you think?  How many events, incidents, breaches or outages at your organization are actually due to mistakes when the blame was put on some external factor?  What do you do to help honest people do the right thing?


  1. Barry, it is refreshing to see a different take on the subject of insider threat.

    I agree with you, and I think if we dig a little deeper (I will use road accidents as an analogy), it comes down to three root causes.

    (1) How do you know if someone is going to turn in front of you if they do not use their indicators? Root cause #1: Lack of signalling/communication.

    (2) Somebody who fails to check their mirrors, spends more time looking at foot traffic than road traffic, or strays across the center line into oncoming traffic can be summed up in one simple word... Root cause #2: Carelessness.

    (3) Did you ever come across someone who looks like they only just got their licence yesterday and seems to be a little too clueless for comfort about the rules of the road? Root cause #3: Ignorance.

    As a security strategist and advisor, the best solution I have found for these three scenarios:
    (1) Lack of communication
    (2) Carelessness
    (3) Ignorance
    is not to rely on prevention, but instead to focus on detection (monitoring), response (countermeasures) and correction (long-term strategies such as education).

    1. Hi Andrew, Great comments.
      I'll add one (possibly 2) groups. These actually might be part of "Lack of Communication".
      (4) Emergencies - not necessarily the police or an ambulance, but a driver may have a legitimate emergency need to get somewhere. In this case, the driver may ignore laws (controls) or drive in a way that may cause someone else to have an accident.
      (5) Disregard of laws - have you ever been sitting at an intersection where it seems that everyone has a red light? Or found other traffic flow designs that are a mess? Especially if there is no traffic, the impulse is to go through the red light. The problem here is poor control design.

      In each of these cases, the user has a good (well, at least to them) reason for ignoring the controls. Sometimes it's because something has to get done for a customer, but the security controls are in the way. Sometimes there's a (non-security) problem that needs to be fixed now.
      I didn't really explore these ideas in the post, but usually do when I talk about this topic.


  2. Barry, yes, these are excellent points. I have seen many organizations compromise on security due to impatience (let's just temporarily open up all TCP ports for this new application because it will be quicker than troubleshooting... and "temporary" becomes permanent) or because of an emergency (such as network or application performance taking priority over security).

    It is another topic all by itself to explore what happens when controls are too rigid, forcing individuals to purposely seek out ways to override or bypass them, or when controls are in place but weakly enforced and open to interpretation by individuals.