Monday, June 14, 2010

the security user conversion problem

i'm going to start out by saying that i believe in the effectiveness of user education in making users better able to protect themselves. i have to - i'm a product of self-directed user education - not believing in user education would be the same as not believing in myself.

and what's not to believe in? over 20 years of computing with only a single partial compromise (malware got in but was effectively neutered due to my precautions and environment). that's a better track record than a lot of people who work in the security industry, and i don't work in that industry. that doesn't make me a security expert, mind you (in fact, i refuse to accept that title), but simply what i like to call a security user (a user of security, its concepts, its techniques, etc).

i don't know what specific security goals other proponents of user education have in mind. i've never asked any of them and perhaps i should have. mine is pretty simple, though. it seems to me that other people would be a lot better off, or at least a lot more secure ("better off" might be too open ended), if they were more like me. i know that seems rather egocentric, but i was a teenager when i arrived at that conclusion so a certain amount of egocentricity is not surprising, and to be perfectly honest there hasn't been anything in the years since to change my mind.

so the question i have been grappling with since i was a teenager is 'how do i make others more like me?', which is to say how do i turn ordinary users into security users? it's a challenging problem and one that i've been working on for years. everything from providing strategies for people to follow (in the form of the anti-virus cookbook, originally written in the pre-windows days), to making information more available and easily found (through the anti-virus reference library), to simply trying to guide the way people think (which i use this blog for), to even trying a bit of memetic engineering (over at security memetics - and i use the term memetic engineering loosely). unfortunately those efforts haven't had the effect i'd been hoping for so i can definitely see both sides of the user education efficacy debate - on the one hand i know it works (it worked on me), but on the other hand it doesn't seem to be working.

obviously something is missing, but what? how is it that i became a security user while the people around me generally don't even ask for advice? therein, i think, lies the clue. i've already framed security as a broad class of strategies for satisfying one's need for safety. if the people around me felt their need for online safety wasn't being met, then asking their friendly neighborhood security nut would be one of the easiest approaches to changing that. in the absence of that happening i'm left to conclude that they don't actually feel their need for safety is going unmet. the lack of adoption of security best practices could easily be due to this fact alone - people feel safe enough already and don't see the need to take any added measures. their perceived needs are already being met.

does that mean, in contrast, that i became a security user because i didn't feel safe? that's certainly an easy conclusion to jump to. but what about now? i'm still learning, still evolving as a security user - am i doing that because i still feel unsafe? that doesn't ring true to me. i feel pretty safe and i think i've got most of my bases covered. if i look back at the beginning, at my beginning on this path, i have to go back pretty far. i've recounted before the story of how i got interested in malware when i was 14, but what i haven't discussed openly before is that my association with security (even computer security) predates the events in that story. i started teaching myself programming at the age of 10 and my first user input prompt was a password prompt. never mind the fact that it was a vic20 with a tape drive and at 10 i didn't have anything that needed to be protected - i obviously already had a pre-existing appreciation for security (and a rudimentary understanding of how to apply those concepts to computers). i can think of any number of early childhood experiences that could be responsible - all of them, admittedly, incidents after a fashion - but virtually everyone has encountered those sorts of incidents in their lives at one time or another without those incidents instilling in them an appreciation for security. more pointedly, people encounter computer security incidents now and still don't develop an appreciation for security.

that, i think, is an important point, because the basic premise of user awareness is to make the user aware of how unsafe they are - nothing should drive that point home better than an actual incident. by showing people that they are not actually safe you are creating (or revealing) a state where their needs are not being met, and the universal reaction to this is fear (and possibly anger if you're the one threatening their needs). inevitably it's the application of fear in order to drive change, and personally i find the concept of playing on people's fears distasteful. i also suspect that it is an exercise in futility in the presence of security vendor marketing types, who have a long and successful history of dispelling fears as a means of selling product.

beyond that, i don't really think of myself as being afraid, so using fear on others doesn't really mesh with the idea of making people more like me. before i bore the remaining 3 readers to death i'll try and get to the point. i was taught at a very early age the value of arguing as a learning tool. it taught me the importance of looking up facts and figures in order to support or disprove my own hypotheses, but more importantly it taught me to question and not believe everything i heard or read. it taught me to be skeptical. it taught me doubt. one of the things i've observed over the years is that others don't regard arguments in quite as positive a light as i do - and they also don't seem as quick to form doubts, to question or challenge those who supposedly know more. that's a shame, because skepticism is the foundation of critical thinking; it is the cornerstone of the advancement of human knowledge. if we believed everything we were told we'd still be living in caves and using stone tools.

and that, i think, is the missing ingredient in making people more like me - not fear that their needs aren't being met, but skepticism about whether X, Y, or Z can really make them as safe as the box says. skepticism about whether what their local smart guy says is right. even skepticism about whether security experts have it right. security marketing may be good at dispelling fears, but when it comes to doubts (especially reasonable ones) it's an entirely different ball game - and once people start doubting the easy answers, those answers won't be able to distract people from the search for what will really satisfy their needs for safety. every time you use fear to drive change you're just feeding the marketing machine more fuel to turn that change you hoped for into mindless consumerism. we need to sow the seeds of reasonable doubt, to foster skepticism and train people to question and challenge more - not just so that they'll become more secure but so that they'll become fundamentally better at critical thinking.

1 comment:

Peachy said...

People just aren't scared, and the proof is that when I'm asked to go look at this PC or that PC it's because "hey, it's acting a little slow" - that's all people worry about when security affects their lives. And judging by the 900 infections on a PC/laptop that all seem to be toolbars ("but it said it would make it faster"), speed is the only motivating force for 'the others' to even think about getting AV (yes, there are people out there without any form of basic protection).