Security and Usability

ewuehler writes "I don't think I've ever heard a security application, be it a consumer anti-virus application or an enterprise IPS application, described as 'user-friendly' or 'easy to use.' When I read the title of the O'Reilly book Security and Usability: Designing Secure Systems That People Can Use, I took the bait and requested a copy for review. The title could also double as my current job description, so I was equally interested from a 'job education' point of view. The book is a collection of (mostly) academic articles, grouped into sections and chapters. Each article/chapter is written by different authors, from Bruce Tognazzini, who founded Apple's Human Interface Group, to Blake Ross of Firefox fame, to names previously unknown to me. Read on for ewuehler's review.
Security and Usability
author Edited by Lorrie Faith Cranor & Simson Garfinkel
pages 714
publisher O'Reilly
rating 8
reviewer Eric Wuehler
ISBN 0-596-00827-9
summary Designing Secure Systems That People Can Use


Along with the variety of authors, their backgrounds are equally diverse. The majority of the articles come from academia, with a few corporate names and open source authors. While the authors are not exclusively American, the majority of the articles come from US institutions. Generally, I would expect the "new author every chapter" approach to be a distraction, but the editors have done a good job of grouping the articles and cross-referencing chapters where appropriate. However, I did not find this a cover-to-cover read; the book lends itself well to "flipping and skipping" around.

The editors claim the goal of the book is "first for researchers in the field of security and usability, then for students, and finally for professionals." While I fit in the "professionals" category (not a term I'd use, but I had to pick one of the three), I found the information very helpful and educational with respect to my current job. With a majority of the chapters coming from an academic perspective, there is room for debate and interpretation of the conclusions. For example, several chapters discuss the fallibility of passwords, making it obvious that the issue of password security is not simply whether or not to write them down.

The book is divided into six parts. The first, Realigning Usability and Security, introduces the premise of the book. These five chapters discuss the importance of usability when designing security applications. It is well known that the human element is "the weakest link in the chain" of system security. For example, "Kevin Mitnick revealed that he hardly ever cracked a password, because it 'was easier to dupe people into revealing it' by employing a range of social engineering techniques. He points out that to date, attackers have paid more attention to the human element in security than security designers have." The implication is that the less usable the security, the less likely it is to be used correctly, no matter how good it actually is. The chapters go on to describe different processes for designing usable secure systems and applications.

The Authentication Mechanisms section discusses the usability requirements around passwords and other authentication techniques. The information in this section dealt more with implementation than theory as compared to the other sections. The expected chapters are there, covering the prevailing forms of authentication: text passwords, challenge questions, graphical passwords, and biometrics. The chapter I found most interesting was Identifying Users from Their Typing Patterns. It covers "keystroke biometrics," which "seeks to identify individuals by their typing characteristics." The concept has been around for a while, first suggested for identification in 1975. (A random fact I found interesting: it has its roots in 19th-century telegraph operators, who could often identify each other by listening to the rhythm of each individual's Morse code keying pattern.) Despite its long history, the concept does not seem to appear much on the authentication mechanism radar.
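To give a flavor of the idea (my own toy sketch, not anything prescribed by the book, and every name, number, and threshold below is made up for illustration): a keystroke-biometric system records the gaps between key presses and compares them against a stored profile.

```python
import statistics

def timing_profile(timestamps):
    """Mean and stdev of inter-key gaps from a list of key-press times (seconds)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return statistics.mean(gaps), statistics.stdev(gaps)

def same_typist(profile_a, profile_b, tolerance=0.05):
    """Crude match: mean inter-key gaps within `tolerance` seconds of each other."""
    return abs(profile_a[0] - profile_b[0]) < tolerance

# Two samples from one steady typist vs. a slower typist (hypothetical data).
alice_1 = timing_profile([0.00, 0.12, 0.25, 0.36, 0.49])
alice_2 = timing_profile([0.00, 0.13, 0.24, 0.37, 0.50])
bob     = timing_profile([0.00, 0.30, 0.55, 0.85, 1.10])
```

A real system would use many more features (per-digraph timings, hold times) and a statistical classifier rather than a single threshold, but the shape of the problem is the same.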

Secure Systems is the "make or break" section covering the secure user experience. These chapters cover things such as fighting "phishing" at the user interface, making PKI easy, and "deleting" files vs. really deleting files. One of the more interesting chapters looks at security tools and practices based on ethnographic field studies. While ethnography (the study of customs of individual peoples and cultures) initially does not sound like a "security or usability" issue, it is used (and the author claims quite effectively) to understand "the work practices of computer users in context, informing the design of computer systems to better suit their needs."

The section Privacy and Anonymity Systems contains a chapter discussing Google's Gmail privacy policy with respect to "informed consent." For the most part, the authors give Gmail high marks, with the privacy concerns rooted in differing definitions of "privacy." For example, if you believe that content-targeted advertisements should require the consent of all parties involved, then you (in sending an email to a Gmail user) have not consented to the targeted ads the recipient sees. The chapter describes two other cases of informed consent and presents a list of ten design principles based on them. Another chapter discusses leveraging social processes for managing browser cookies. The concept is that when a cookie is stored, the browser provides aggregate community data, such as the percentage of people visiting the site who blocked that cookie. One obvious benefit is the ability to educate "less knowledgeable" users about "good" cookies vs. "bad" cookies.
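As a toy illustration of that cookie idea (again my own sketch; the chapter does not prescribe an implementation, and the cookie names and numbers are invented), the aggregate community data might be surfaced like this:

```python
# Hypothetical store of community decisions: cookie -> (times blocked, times seen)
community_stats = {
    "ads.example.com/track_id": (930, 1000),
    "shop.example.com/cart": (40, 1000),
}

def cookie_advice(cookie, stats):
    """Build an advisory string from the fraction of the community that blocked this cookie."""
    blocked, total = stats[cookie]
    pct = 100 * blocked // total
    return f"{pct}% of visitors who saw this cookie blocked it."
```

A tracking cookie blocked by 93% of the community gets a very different advisory than a shopping-cart cookie blocked by 4%, which is exactly the kind of signal a less knowledgeable user can act on.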

The remaining sections discuss the usability of products, for which the chapter titles are description enough. Overall, I found the book useful. The variety of authors and subject matter made it easy to skip around and choose what piqued my interest at the time. In keeping with the book's academic feel, each chapter is generally descriptive enough to give an idea of the subject matter to be covered. While the book's target is "researchers and students" first, as a "professional" working for a security company, I found it helped me better explain the pros and cons of these topics to the less technical people I work with every day. I'd recommend it to anyone involved with the usability of security applications and systems."


You can purchase Security and Usability from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Comments Filter:
  • by 44BSD ( 701309 ) on Wednesday November 02, 2005 @02:35PM (#13934651)
    The reviewer seems to think that security and usability are meant to be attributes only of security applications. That presumption is in error.
    • by Proaxiom ( 544639 ) on Wednesday November 02, 2005 @02:48PM (#13934768)
      Good point. Designing security into general applications that does not interfere with the user experience is a far more interesting problem than designing usable security systems.

      The adage that security is the opposite of usability is false, of course. The problem is that people aren't very good at making intelligent design decisions when faced with both sets of requirements.

      There is a great paper on this subject by Ka-Ping Yee here [berkeley.edu] (PDF link).

      • The adage that security is the opposite of usability is false, of course.

        IMHO they generally are. Having worked at a secure facility, it is expensive and inconvenient. That trickles all the way down to the desktop PC, having apps broken by firewalls, not being able to install software needed to get the job done, being unable to get access to network services I need because I can't keep track of dozens of randomly generated passwords that change every 6 months, having a computer that runs like molasses

        • by Jherek Carnelian ( 831679 ) on Wednesday November 02, 2005 @03:46PM (#13935277)
Having worked at secure facilities for longer than I care to remember, I am familiar with all of the problems you list and more. But the key at all of the secure sites I've worked at is that usability is not a requirement put upon those who create and implement security policy. So they often make decisions based on how easy it makes their job as a security officer and not how difficult it makes life for the users. Thus they tend to come up with all manner of silly policies that marginally improve security while seriously degrading usability. As long as they don't cause work to completely halt, they usually get away with such draconian policies.

If the security guys had to take responsibility for the indirect costs of their policies, I believe we would see a marked improvement in the usability of the facilities being secured, probably bringing with it an increase in actual security, because there would be less incentive for people to do bad things like write down a list of their 15 different passwords and whatnot.
          • >If the security guys had to take responsibility for the indirect costs of their policies,

            There would be no security.
          • Thus they tend to come up with all manner of silly policies that marginally improve security while seriously degrading usuability. As long as they don't cause work to completely halt, they usually get away with such draconian policies.

            They either get away with them, or the users find subtle ways to work around them, thus negating the policy completely. Either that or the users ignore the policy completely. The best security policies are the ones you can explain to the people who are affected by it and ha

        • It's easier for me to copy a file using a passwordless ssh key than it is to use ftp. It's easier for me to remotely run a X11 app over ssh than it is to manually telnet into the box and run the app. It's easier for a client to have a client side certificate to authenticate to a web site than to have them remember a userid and password.
          • It's easier for me to copy a file using a passwordless ssh key than it is to use ftp.

            But don't you see, the inconvenient password-protected applications you mention wouldn't need passwords at all if not for security.

            It's true that a particular app can be more secure and more usable than another particular app; insecurity certainly doesn't guarantee usability! So, yes, I think a book like the one being discussed could be a great thing. Besides, in the real world security is sometimes more important tha

      • Ka-Ping Yee also maintains a blog on the subject: usablesecurity.com [usablesecurity.com].
      • Another very useful resource at that site is a bibliography on security and usability compiled by Rachna Dhamija [berkeley.edu].

        Unfortunately it hasn't been updated for some years, but it's a good starting point for someone looking to put together a more complete bibliography.

  • Interesting (Score:3, Insightful)

    by Pandora's Vox ( 231969 ) on Wednesday November 02, 2005 @02:41PM (#13934703) Homepage Journal
    This looks like an excellent and long-overdue book. As the t-shirt says, "there is no patch for human stupidity" - making secure systems usable is really the more challenging part.
    • "there is no patch for human stupidity"

I believe it's called the clue stick [sillyjokes.co.uk]
    • >As the t-shirt says, "there is no patch for human stupidity"

      Industrial and aerospace safety made their big improvements when they stopped blaming accidents on stupidity (or more politely, "pilot error" or "operator error"). Professionals in those fields took a step back and investigated whether the "stupid" user was even getting the information s/he needed, whether that information was being buried by irrelevancies, and whether the user was trained to understand the system well enough to build a correct
  • security != easy.

    In fact, lim v->oo s/v = 0, where s=security and v=variables in your environment. No real security, but you try.

I cringe when I see all those books at the local computer mart with titles like "TCP/IP Security" with a cheesy rainbow-colored logo beside them that says "Made Easy" in an italic font. Do publishers actually think people trying to secure networks will be fooled by a logo that belongs on a home decorating book or a manual can opener package?

    Having ranted, I'm sure the O'Reilly book
    • by hal9000(jr) ( 316943 ) on Wednesday November 02, 2005 @03:02PM (#13934900)

      security != easy.

But security doesn't have to be hard, either. Look at desktop firewalls. I last looked at them (Zone, Sygate, Symantec) maybe two years ago, so perhaps they have gotten better, but: the user installs one, and all of a sudden they get a bunch of pop-ups asking if this or that can access the Internet and do you want to let it. No context, no explanation; you're lucky if you get the file path. So users start saying yes, and pretty soon that desktop firewall is Swiss cheese. Couldn't the vendors have at least profiled Windows services and common applications and told the user something like "The Windows Messenger service is trying to listen for connections. If you're a home user, you probably don't need this, so say No. If you're a company user, ask your IT staff if you need it," rather than some long path? That's usability.

      What about all the vendors selling home internet firewalls? Most home users don't need a firewall, they need a NAPT router. If they are running games or an on-line service, then perhaps they need to port forward, but all the rest of that stuff is cruft. But for $50 more you can get a stateful firewall. You don't need it, but you can get it.

These are examples of making the deployment of security needlessly complex. Oh, and it gets no better in enterprise security.

      There is a lot that could be done to make security easier to deploy while still being robust.
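The contextual prompt described above could be sketched roughly like this. (My own sketch: the service profile, executable name, and wording are all hypothetical, not taken from any shipping firewall product.)

```python
# Hypothetical profile of well-known Windows services; a real product would
# ship a much larger, vendor-maintained database.
SERVICE_PROFILES = {
    "msmsgs.exe": (
        "The Windows Messenger service",
        "If you're a home user, you probably don't need this, so say No. "
        "If you're a company user, ask your IT staff if you need it.",
    ),
}

def prompt_text(executable_path):
    """Build a firewall prompt with plain-language context instead of a bare path."""
    name = executable_path.replace("\\", "/").rsplit("/", 1)[-1].lower()
    if name in SERVICE_PROFILES:
        friendly, advice = SERVICE_PROFILES[name]
        return f"{friendly} is trying to listen for connections. {advice}"
    # Unknown binaries still fall back to the path, but at least the common
    # cases get an explanation the user can act on.
    return f"{executable_path} is trying to listen for connections. Allow?"
```

The point is not the lookup table itself but where the burden lands: the vendor profiles the common cases once, instead of every user guessing at a file path every time.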

But security doesn't have to be hard, either. Look at desktop firewalls. I last looked at them (Zone, Sygate, Symantec) maybe two years ago, so perhaps they have gotten better, but: the user installs one, and all of a sudden they get a bunch of pop-ups asking if this or that can access the Internet and do you want to let it.

        Security is a process, not a product. More on this below.

        Now.... People usually say "there is a tradeoff between usability and security." What they really mean is "There is a tradeoff between
    • by Anonymous Coward
      you iz 31337 calculus skillz!!! pwned!
  • by kmactane ( 18359 ) on Wednesday November 02, 2005 @02:47PM (#13934750) Homepage

The "social engineering" concerns that were brought up in the paragraph about Mitnick have now become large-scale issues that affect everyone from site operators to end users. Every phishing attack is essentially a social engineering attempt. Many worms, from "I Love You" on to the current IM worm, spread by convincing each new recipient that they're safe to execute; again, a social engineering attack.

    It's too bad this 714-page book probably won't be read by the average end-user; the fact that the current IM worm is still spreading is ample proof that users still aren't sufficiently aware of social engineering issues. And that affects all of us; the spam I just cleared out of my inbox probably came through a zombie machine that got infected by just such a worm.

    • Social engineering will always be the broken part of security. You can have a system that requires users to have a 20 character randomly generated password, as well as a smart card, and they will still pass their password around along with their card, which will have the password written on the back. Until people start taking computer security seriously, there will be no end to the security breaches. One of the real problems is that we tend to put a lot of the blame on those breaking into the systems. T
      • > You can have a system that requires users to have a 20 character randomly
        > generated password, as well as a smart card, and they will still pass their
        > password around along with their card, which will have the password written
        > on the back.

        Don't miss the correlation here. If you require users to have a 20 character randomly generated password, how could they remember it? Like rational beings presented with an impossible problem, they wouldn't--they'd write it down.

        When security peop

The problem is that if you allow people to choose any password they want, many will choose a short, easy-to-guess password. I guess if they don't have to write it down, then it's more secure than a 20 character password which they do write down, but it is still very insecure. Making the system vulnerable to stuff that people can give away will always make it less secure. Until we move to something like retinal scans, the password will always be able to be given away by the user. And social engineeri
          • > The problem is, is that if you allow people to choose any password
            > they want, many will choose a short, easy to guess password.

            A good password needs to have some significant randomness in it and
            people are really bad judges of randomness. Even some pretty nerdly
folks can be really naïve on this.
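The randomness point is easy to quantify with standard entropy arithmetic. (A quick sketch of my own; the 50,000-word dictionary size is an assumption for illustration.)

```python
import math

def entropy_bits(charset_size, length):
    """Entropy of a truly random password: length * log2(charset size)."""
    return length * math.log2(charset_size)

# A random 20-character password over the 62 alphanumerics:
strong = entropy_bits(62, 20)   # roughly 119 bits

# A password that is really just one word from a 50,000-word dictionary
# carries at most log2(50000) bits, roughly 15.6, no matter how it is spelled.
weak = math.log2(50_000)
```

That gap, over a hundred bits, is why a guessable password is weak even when it is never written down, while a written-down random one can still be strong against online guessing.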

            > I guess if they don't have to write it down, then it's more secure
            > than a 20 character password which they do write down, but it is
            > still very insecure.

            What's wrong with writing down a pass
Sometimes this seems to work out well, sometimes not. I thought that the Pattern Languages of Program Design [hillside.net] editors did a nice job of making that work, and the same goes for the excellent Game Programming Gems [gameprogramminggems.com] series.

On the other hand, sometimes you can really tell that one author wrote a book and was interested in the topic - e.g., Component Development for the Java Platform [awprofessional.com] by Stuart Halloway comes to mind. This was an excellent book for intermediate to advanced Java programmers and Stuart's interest in
I think residential products are getting easier to use now, with easier and cleaner interfaces. But this is easier to do there than in corporate applications, due to the complexity of what's involved.
Security/cryptography is not easy, almost by definition, because you are building systems that withstand all attacks -- even ones you cannot foresee at the time of design. Defense is hard when you don't know exactly what to defend against, as opposed to traditional software design, where you have a very well-defined goal to meet. Such paranoia almost always leads to a lack of usability.
In the same way, it's very hard to have a home PC that functions without bugs. It's impossible to know what kind of software the user will be running, as well as what stuff the user doesn't even know they're running. I've run many Windows systems that were very stable over the years. The trick is: keep the installed software base small, and only install that which is completely necessary. Most of the buggy, crashy Windows installations I see are owned by people who install everything they see, without even s
So don't get immersed in cataloging all the things in the wild; that burns too many cycles and changes every day. Just catalog that which needs to be done. Once you understand what needs to run, and how, and with whom, etc., configure your systems and networks to accommodate those needs, and only those needs. Apply all rules such that you permit what you need to, and default to drop. It makes the world much smaller and less scary, and much easier to manage. This is a condensed version of Markus Ranum's "Six
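The "permit what you need, default to drop" rule fits in a few lines. (A toy sketch of the policy logic only; the ports and service notes below are hypothetical examples, not a recommended ruleset.)

```python
# Hypothetical allowlist built from the catalog of what actually needs to run.
ALLOWED = {
    ("tcp", 22),    # ssh for the admins
    ("tcp", 443),   # the public web front end
    ("udp", 53),    # dns to the internal resolver
}

def verdict(proto, port):
    """Permit only what the catalog lists; everything else defaults to drop."""
    return "permit" if (proto, port) in ALLOWED else "drop"
```

Note the asymmetry: adding a need is an explicit, reviewable edit to the catalog, while everything unanticipated, including next year's worm traffic, is dropped without anyone having to predict it.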
      • So don't get immersed in cataloging all the things in the wild, that burns too many cycles and changes everyday.

        Aha, here goes the point. One can pare down a software system by limiting the number of situations in which the software is supposed to work correctly. This is not possible in security -- you can limit what you do, not what a malicious adversary can do. It is a much scarier situation to deal with, and that is why crypto is hard.
No, you CAN pare down the variables into discrete sets so that you cover 95% of the threats for a reasonable cost and time. Occam's Razor and Pareto's Rule apply in security just as in other parts of IT. You cannot make a system 100% secure for all situations unless you have infinite time or money, so you have to compromise wisely. People and weak security processes are more likely to cause loss than a purely technological attack. After all, 90% of the hacks come from INSIDE the company.
But if you architect your environment such that network checks and host checks complement your policy, so you don't rely solely on one or the other sentry to do its job, there's much less wiggle room for an exploit to work. One example is not allowing IRC if you don't use it. Many worms and rootkits attempt to open up an IRC channel, so this layered approach would prevent the potential zombie from "reporting in" via the firewall, which is configured to block IRC ports. Likewise configuring hosts to only r
    • Information access control and verified identity (and the resulting explicit, rather than implicit, trust) can be quite simple. My company deals with security professionals every day, and they are very pleased with the simultaneous high security and ease of use in our product. We're handling compliance requirements for some tightly-regulated industries, without requiring any changes in the way they do their business.

      (Unfortunately for this comment, I've promised not to do any astro-turfing for my company
    • by Anonymous Coward
Security and usability are only mutually exclusive when you are stuck with old (traditional modern) security approaches. It does not have to be this way. The book contains an article by Ka-Ping Yee that points the way to resolving this apparent conflict. Software systems that demonstrate the points made by Yee include the Polaris secure desktop being developed at HP, the CapDesk secure distributed desktop developed by Combex, and the Web Calculus secure bookmark system developed by Waterken. These examples
  • Firewalls? (Score:3, Interesting)

    by LilGuy ( 150110 ) on Wednesday November 02, 2005 @03:45PM (#13935269)
I'd say most consumer-grade firewall applications are incredibly easy to use. I have had friends fire up that one really popular ZoneAlarm or whatever and get it to lock down their connections, even though they barely know how to install a piece of software. It doesn't get much easier than that.

    Then again, once the little popup messages start rolling in, they have a few questions for me, but all in all I'm impressed with the ease of use.
    • Re:Firewalls? (Score:5, Insightful)

      by Tankko ( 911999 ) on Wednesday November 02, 2005 @03:50PM (#13935301)
      Problem is, those little boxes start to pop up almost immediately. One of the first is "Microsoft Subsystem Spooler is trying to access the Internet. OK?".

What the hell is the Subsystem Spooler? Can it be used by other programs to avoid detection? I have no idea. It would be nice if ZoneAlarm spent an ounce of effort explaining what all these things are.
  • Instant Messaging (Score:3, Informative)

    by saskboy ( 600063 ) on Wednesday November 02, 2005 @04:14PM (#13935506) Homepage Journal
    Before UPnP routers started to play nicely with Yahoo and MSN Messenger, getting a messenger with voice communication to work properly behind NAT was a NATMARE [said with southern accent for some humour]! Hardware and software can definitely be improved to be more user friendly, while at the same time be more secure, since it's better to have a router with a firewall or at least NAT, than to be connected to the Internet naked just so you can use MSN Messenger with Voice. And I mean the computer being naked, not the user.
    • And this is a MAJOR security issue! Allowing a PC to open outside ports to itself is the perfect vehicle for creating a backdoor into the system. I always turn UPnP off for that reason.

      It may be possible to make something that is both incredibly easy to use and secure but this is definitely not it.
  • by G4from128k ( 686170 ) on Wednesday November 02, 2005 @04:14PM (#13935510)
Security is all about preventing undesirable events. Yet these systems will always be subject to false positives -- preventing the user from taking legitimate actions. For example, I hate *nix file permission systems because it's too easy to set the permissions too restrictively and run into "permission denied" problems when working across several accounts and file systems. Sure, I can chmod things, but that's an added hassle (each added step or command is to the detriment of ease of use).

    In contrast, the easiest systems require no passwords, no authentication, and let the user do anything they want to any file, anywhere. But that is not secure.

Yet I'd argue that security can be made easier. Single sign-on and password keychains help (although these arguably reduce security somewhat). The bigger solution is goal-oriented UIs, not mechanism-oriented UIs. Current security often assumes that the user understands the internals of the system -- that particular ports provide particular functions or vulnerabilities. Easy-to-use security software would guide the user in defining what they want to allow or disallow, and hide the technical details of how that is implemented.

Even if we add a wonderful UI to security, it will never be perfect. Security is about saying "no"; ease of use is about saying "yes." To that extent, the gap between security and ease of use is permanent.

    • Security is all about preventing undesirable events.

      True — but so is usability. It improves both security and usability to ensure that desirable events do happen and undesirable events don't happen.

      Even if we add a wonderful UI to security, it will never be perfect.

      Real usability isn't about "adding a wonderful UI" to things. Real usability means changing what the system requires you to know, changing how the system forces you to do things, and changing how the system responds to your actions,

  • Little Snitch [obdev.at] (which unfortunately is only OS X) is the most usable security type of application I've ever used.

    It's essentially a network monitor app similar to Zone Alarm for the PC but IMHO 1000% more usable.

    When a request is made either by an app on your machine or from a remote machine to access a port, Snitch pops up a dialogue that asks you whether you want to allow or deny it, for that port or all ports, until the app quits or forever.

    This simple selection is enough to cover 99% of all cases. ie: if
I found a user-friendly and absolutely secure solution!

    .

    .

    It's called a power switch!
Once you flip it "off", you can be sure your system is safe!

  • Instinct tells us that computer security and computer usability are inversely proportional to each other. In other words, the tougher and stricter the security is, the less usability there is, and vice versa. However, there have been plenty of cases where both computer security and computer usability went hand in hand with each other and actually improved together. In the last few years security has been the biggest buzzword in computer systems and as such has become part of our computer systems. Before tha
