Security mindset

A continual debate on the agile-testing mailing list is to what degree testers think differently than programmers and are therefore able to find bugs that programmers won’t. Without ever really resolving that question, the conversation usually moves on to whether the mental differences are innate or learnable.

I myself have no fixed opinion on the matter. That’s probably because, while vast numbers of testers are better than I am, I can imagine myself being like them and thinking like them. The same is true for programmers. In contrast, I simply can’t imagine being the sort of person who really cares who wins the World Cup or whether gay people I don’t know get married. (I’m not saying, “I couldn’t lower myself to their level” or anything stupid like that—I’m saying I can’t imagine what it would be like. It feels like trying to imagine what it is like to be a bat.)

However, I’ve long thought that security testers are a different breed, though I can’t articulate any way that they’re different in kind rather than degree. It’s just that the really good ones are transcendentally awesome at seeing how two disparate facts about a system can be combined and exploited. (A favorite example)

Bruce Schneier has an essay on security testers that I found interesting, though it doesn’t resolve any of my questions. Perhaps that’s because he said something I’ve been thinking for a while:

The designers are so busy making these systems work that they don’t stop to notice how they might fail or be made to fail, and then how those failures might be exploited. Teaching designers a security mindset will go a long way toward making future technological systems more secure.

The first sentence seems to make the second false. When I look back at the bugs I, acting as a programmer, fail to prevent and then fail to catch, an awful lot of the time the root cause isn’t my knowledge. It’s that I have a compulsive personality and also habitually overcommit. As a result, there’s a lot of pressure to get done. The problem isn’t that I can’t flip into an adequate tester mindset, it’s that I don’t step back and take the time.

So, I suspect the interminable and seemingly irresolvable agile-testing debate should be shelved until we solve a more pressing problem: few teams have the discipline to adopt a sustainable pace, so few teams are even in a position to know if programmers could do as well as dedicated testers.

4 Responses to “Security mindset”

  1. BlueRaja Says:

    When he says ‘a security mindset,’ he doesn’t mean knowledge of good security practices - he means a completely different way of looking at the world in general, a mindset in which security is always a primary concern. This can be done while still being compulsive and habitually overcommitting.
    It’s not quite the same as bug-testing, though I can see where you’re coming from.

  2. George Dinwiddie Says:

    Very nice post, Brian! Thinking about the process of stepping back and taking the time to switch between the mindset of making something work and the mindset of discovering where and how it might fail, it seems to me that this is almost guaranteed to fail over time. It’s not that I can’t think in both modes–it’s that the overhead of the context switching as I try to alternate between them repeatedly is bound to lead to shortcuts in my thinking. Otherwise my sustainable pace would become glacial, as I tried to restore more and more of the appropriate context over time.

    I’ve got an unfinished article on context switching at http://idiacomputing.com/moin/ContextSwitching

  3. Markus Gärtner Says:

    I heard that within our company there was an attempt a few years ago - before I was employed, so I don’t have firsthand experience - to rotate people between the testing and development departments. The attempt seems to have been abandoned, since it is no longer practiced.

    Personally I thought that testers are usually put under pressure to give the best results as fast as possible. But after reading your point of view, I began to think that all parties within software development are squeezed like a sponge. What would the outcome be if developers had the time to develop a mind for security?

    James Shore and Shane Warden describe this “taking time” process as using “Slack” during the iteration. Slack for investigation of related topics and for fighting technical debt. Personally I think the company’s processes need to give the whole team opportunities for the things you try to address here.

  4. Tom Macklin Says:

    Good security testers almost always need a decent knowledge of programming. This is because for all but the most obvious errors, a huge amount of time can be saved finding security flaws if the tester can perform a code inspection. This is one of the many, many reasons security testing generally gets relatively little consideration. Whether you have dedicated testers or not, security testers should have at least some programming experience.

    Then again, if there aren’t any security people at the table at an app’s design time, you probably won’t need a very good security tester to find problems anyway.
