Geekonomics

Ben Rothke writes "First the good news — in a fascinating and timely new book, Geekonomics: The Real Cost of Insecure Software, David Rice clearly and systematically shows how insecure software is a problem of epic proportions, from both an economic and a safety perspective. Currently, software buyers have very little protection against insecure software, and often the only recourse they have is the replacement cost of the media. For too long, software manufacturers have hidden behind a virtual shield that protects them from any sort of liability, accountability or responsibility. Geekonomics attempts to pierce that shield and can be deemed the software equivalent of Unsafe at Any Speed. That tome warned us against driving unsafe automobiles; Geekonomics does the same for insecure software." Read on for Ben's take on this book.
Geekonomics: The Real Cost of Insecure Software
Author: David Rice
Pages: 362
Publisher: Addison-Wesley
Rating: 9
Reviewer: Ben Rothke
ISBN: 978-0321477897
Summary: How insecure software costs money and lives
Now the bad news — we live in a society that tolerates 20,000 annual alcohol-related fatalities (40% of total traffic fatalities) and cares more about Britney Spears' antics than the national diabetes epidemic. Expecting the general public or politicians to somehow get concerned about abstract software concepts such as command injection, path manipulation, race conditions, coding errors, and myriad other software security flaws is somewhat of a pipe dream.
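
(For readers who haven't met those flaw classes, here is a minimal, hypothetical sketch of one of them, command injection, in Python. The function and the filename handling are invented for illustration; they are not taken from the book.)

    import subprocess

    def archive_log_unsafe(filename):
        # VULNERABLE: the user-supplied filename is pasted into a shell command,
        # so input like "app.log; rm -rf ~" runs arbitrary commands.
        subprocess.run("gzip " + filename, shell=True)

    def archive_log_safer(filename):
        # Safer: pass the argument as a list and skip the shell entirely,
        # so the filename is treated as data, not as shell syntax.
        subprocess.run(["gzip", filename], check=True)

The point is not this particular fix but that the unsafe version looks perfectly functional until someone feeds it hostile input, which is exactly why such flaws stay invisible to buyers.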

Geekonomics is about the lack of consumer protection in the software market and how this impacts economic and national security. Author David Rice considers software consumers akin to proverbial crash test dummies. This, combined with how little recourse consumers have for software-related errors and the lack of significant financial and legal liability for vendors, creates a scenario in which computer security is failing.

Most books about software security tend to be about actual coding practices. Geekonomics focuses not on the code, but rather on how insecurely written software is an infrastructure problem and an economic issue. Geekonomics has three main themes. First, software is becoming the foundation of modern civilization. Second, software is not sufficiently engineered to fulfill the role of foundation. And third, economic, legal and regulatory incentives are needed to change the state of insecure software.

The book notes that bad software cost the US roughly $180 billion in 2007 alone (see Pete Lindstrom's take on that dollar figure). Not only that, the $180 billion might be on the low end, and the state of software security is getting worse, not better, according to the Software Engineering Institute. Additional research shows that 90% of security threats exploit known flaws in software, yet software manufacturers remain immune to almost all of the consequences of their poorly written software. Society tolerates these failure rates largely because it is unaware of the problem. The huge number of software flaws also entices attackers, who attempt to take advantage of those vulnerabilities.

The book's seven chapters are systematically written and provide a compelling case for the need for secure software. The book tells of how Joseph Bazalgette, chief engineer of the city of London, used formal engineering practices in the mid-1800s to deal with the city's growing sewage problem. Cement was a crucial part of the project, and the book likens the development of secure software to that of cement that can withstand decades of use and abuse.

One reason software has significant security vulnerabilities, as noted in chapter 2, is that software manufacturers are primarily focused on features, since each additional feature (whether it has real benefit or not) offers a compelling value proposition to the buyer. On the other hand, a lack of software security functionality and controls imposes social costs on the rest of the populace.

Chapter 4 gets into the issues of oversight, standards, licensing and regulations. Other industries have lived under the watchful eyes of regulators (FAA, FDA, SEC, et al.) for decades, but software is written, free of oversight, by unlicensed programmers. Regulations exist primarily to guard the health, safety and welfare of the populace, in addition to the environment. Yet oversight of software programmers is almost nil, and this lack of oversight, combined with immunity from consequences, breeds irresponsibility. The book notes that software does not have to be perfect, but it must rise to the level of quality expected of something that is the foundation of an infrastructure. And the only way to remove the irresponsibility is to remove the immunity that the lack of regulation has created.

Chapter 5 gets into more detail about the need to impose liability on software manufacturers. The book's premise is that increased liability will lead to a decrease in software defects, reward socially responsible software companies, and redistribute the costs of protecting software from exploitation, which consumers have traditionally paid, back to the software manufacturers, where they belong.

Since regulations and the like are likely years or decades away, chapter 7 notes that, short of litigation, contracts are the best legal leverage software buyers have for addressing software security problems. Unfortunately, most companies do not use this contractual option to the degree they should, even though it could clearly benefit them.

Overall, Geekonomics is an excellent book that broaches a subject left uncharted for too long. The book does have its flaws, though: its analogies to physical security (bridges, cars, highways, etc.) and safety events don't always coalesce with perfect logic, and the trite title may diminish the seriousness of the topic. As the book illustrates, insecure software kills people, and I am not sure a corny book title conveys the importance of the topic. But the book does bring to light significant issues about the state of software, from legal liability and the licensing of computer programmers to consumers' rights and more, that are imperative.

It is clear that regulation of the software industry is inevitable, and it is doubtful that Congress will do it right whenever it eventually gets around to it. Geekonomics shows the effects that the lack of oversight has caused, and how beneficial it would have been had such oversight been there in the first place.

Readers of this review may get the impression that Geekonomics is a polemic against the software industry. To a degree it is, but the reality is that it is a two-way street. Software is built for people who buy certain features, and to date, security has not been one of those top features. Geekonomics notes that software manufacturers have little to no incentive to build security into their products. Post-Geekonomics, let's hope that will change.

Geekonomics will create different feelings amongst different readers. The consumer may be angry and frustrated. The software vendors will know that their vacation from security is over. It's finally time for them to get to work on fixing the problem that Geekonomics has so eloquently written about.

Ben Rothke is a security consultant with BT INS and the author of Computer Security: 20 Things Every Employee Should Know.

You can purchase Geekonomics: The Real Cost of Insecure Software from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

Comments Filter:
  • by jejones ( 115979 ) on Monday January 21, 2008 @04:56PM (#22130476) Journal
    Regulation is a means by which the established companies keep possible competition from developing. MS can pay for that overhead from pocket change; can Open Source developers?
  • as the review says (Score:5, Insightful)

    by ILongForDarkness ( 1134931 ) on Monday January 21, 2008 @04:57PM (#22130488)
    "Software is not sufficiently engineered to serve as a foundation" [for society] - I agree whole hardily. Things are getting better but we still have very little idea whether what we code "works" or not, let alone is secure. For example: a software vendor will say we have 80% path coverage. Great, now tell me: do you have 80% path coverage because only that 80% was deemed risky, or because writing tests for the remaining 20% was deemed too time consuming (or worse your test/dev team weren't skilled enough to write tests for those paths)?

    In my experience there is so much feature creep in software projects that there always seems to be that last feature that needs to get squeezed into the next release at the last moment, and there isn't time to test. "Let's just hope that 10k-line module works and is secure. Even if it's not, we can always release a SP after we have the product on the market." It has even gotten to the point where major software companies (MS comes to mind) have a concept of Zero Bounced Bugs (ZBB): the point where the rate of bugs getting fixed equals the rate of bugs being found. If there are no "major" bugs and you've reached ZBB, you ship. Now, I can see you can't wait forever to ship, but there is this inherent acceptance of flaws in software that you won't see in, say, bridge building.
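
    A toy illustration of the path-coverage point above (a hypothetical sketch in Python, not drawn from the book or the comment): a test suite can exercise most of a function while never touching the one path that actually matters.

        def withdraw(balance, amount):
            # Happy path: sufficient funds.
            if amount <= balance:
                return balance - amount
            # Error path: a naive suite that only tests the happy path never
            # executes this line, so the bug (silently going negative instead
            # of rejecting the withdrawal) ships unnoticed.
            return balance - amount  # BUG: should raise or signal an error

        # A single happy-path test exercises only one of the two paths...
        assert withdraw(100, 30) == 70
        # ...while withdraw(100, 500) would quietly return -400 in production.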

  • by jorghis ( 1000092 ) on Monday January 21, 2008 @05:06PM (#22130586)
    I always thought the bridge building analogy was a little bogus.

    Bridge building isn't really all that complex; there is a hell of a lot more going on in a software product of any real magnitude than in a bridge. Sure, there are a few things like wind you have to take into account, but there really aren't as many variables in bridge building as there are in software development.

    In addition to that, software has to be exactly right; with a bridge you can just say "screw it, let's reinforce/add supports/whatever here, here, and there just to be safe" and you are good to go. (I know I am oversimplifying to some degree, but you see my point.) It is possible to give yourself a lot more room for error.
  • by kebes ( 861706 ) on Monday January 21, 2008 @05:14PM (#22130670) Journal
    Indeed. Analogies to bridges and cars only make sense for software that can endanger lives: medical systems, bridge-designing systems, vehicle-control systems, etc. As you point out, in all those cases, the software (as well as any designs the software spits out) will be verified in detail and validated. The software vendor will usually be bound by stringent contracts and will indeed be contractually and legally responsible for defects.

    The rest of software, like word processors, and spreadsheets, and music apps, doesn't need that kind of stringent oversight. A better analogy in such cases is to other mundane things: books, binders, pencils. Poorly designed binders and pencils can lead to lost productivity in the same way that poorly designed software can. Those who care will go for the higher-quality product (which may require more money, either in initial expenditure or in staff expertise). Again, errors in books can certainly lead to lost productivity, but is there really any need for more "book security" and "book oversight" and "book regulations" to make sure that the contents of books are robust and error-free?

    I submit that such oversight is not really necessary (again, except in issues of health and physical safety). Most people can tolerate the occasional annoyances of breaking pencils, typos in books, and crashes in software. Ideally people should be educated about risk (e.g. don't put important documents in a flimsy box, put them in a safe; similarly, don't put important data in a low-security computer, get a properly administered server), so they can make informed choices. But more laws and regulation? Not necessary.
  • Re:I can't wait (Score:1, Insightful)

    by Anonymous Coward on Monday January 21, 2008 @05:17PM (#22130706)
    Which will only run Microsoft software, since all other competitors have been regulated out of existence, especially open source software.
  • by HappySmileMan ( 1088123 ) on Monday January 21, 2008 @05:19PM (#22130722)

    Again, errors in books can certainly lead to lost productivity, but is there really any need for more "book security" and "book oversight" and "book regulations" to make sure that the contents of books are robust and error-free?
    I've yet to see a flaw in a book steal my, or anyone else's, credit card number, or delete all my other books. Have you?
  • by mcpkaaos ( 449561 ) on Monday January 21, 2008 @05:20PM (#22130738)
    Bridge building isnt really all that complex

    (I know I am oversimplifying to some degree, but you see my point)

    Have you ever stopped to wonder if you are actually over-complicating software design rather than over-simplifying the analogy?
  • by nullchar ( 446050 ) on Monday January 21, 2008 @05:21PM (#22130750)
    Heh, and how many people/companies are really replacing XP with Vista?
  • by flabbergast ( 620919 ) on Monday January 21, 2008 @05:27PM (#22130808)
    The book tells of how Joseph Bazalgette, chief engineer of the city of London used formal engineering practices in the mid-1800's to deal with the city's growing sewage problem.
    Why is it that any time someone talks about software engineering they always bring up bridge/house/skyscraper building? Yes, Joseph Bazalgette used "formal engineering practices" to build London's sewers, but where did these formal practices come from? Why, yes: through trial and error. Thousands of years of trial and error. Use concrete. Yes, it makes sense looking back because it worked, but what if it didn't work? What if the concrete failed? Or what if he used clay pipes instead? Then we'd be saying "[insert name here] used formal engineering practices to deal with the city's growing sewage problem. Some guy before him failed miserably, though." We simply haven't built up the software engineering toolbox yet. Software hasn't even been around for 100 years! But we're learning, and if you look at specific industries (medicine, banking, and avionics spring to mind), they all spend billions of dollars to make sure their software works correctly, because being correct for them is worth the cost.
  • by davidwr ( 791652 ) on Monday January 21, 2008 @05:29PM (#22130834) Homepage Journal
    Flaws in books can have disastrous consequences if someone depends on them to be flawless.

    Imagine a repair manual for a gas stove that said "blow out pilot light, turn on gas, wait one hour, invite your friends over, and light a match." Sure, it might not steal credit card numbers but in the face of an ignorant and trusting user, it could prove fatal nonetheless.
  • OT: Drunk driving (Score:4, Insightful)

    by operagost ( 62405 ) on Monday January 21, 2008 @05:30PM (#22130848) Homepage Journal

    Now the bad news -- we live in a society that tolerates 20,000 annual alcohol-related fatalities (40% of total traffic fatalities) and cares more about Brittany Spears' antics than the national diabetes epidemic.
    I love analogies, but I'm going to have to go way OT here and set you straight. In the USA, drunk driving is NOT tolerated. After years of onerous regulations, infringements on drivers' (and sometimes passengers') rights in the form of sobriety checkpoints, and ridiculously low BAC requirements (now commonly .08), we still have fatalities due to drunk driving.

    But this isn't because we don't care.

    Obviously, all those things I listed show that people do care; however, they are doing the wrong things to address the problem. We have allowed special interests like MADD, who are modern-day temperance societies, to dictate these changes to us with little review or oversight. It has been statistically proven [ct.gov] that fatalities do not decrease with a .08 BAC law, yet 15 states have passed such laws and MADD continues to pressure more. Sobriety checkpoints were begrudgingly allowed by the courts in the 1980s and 1990s to address the drunk driving "emergency"; but since judicial decisions don't have a sunset, and no one wants to challenge a policy that protects "the children", this infringement on our personal rights continues. The federal government infringed on states' rights in order to force the drinking age to 21 in the USA, even though Canada (with age limits of 18 and 19) has shown that drunk driving could be greatly reduced without infringing on the rights of young adults. Now MADD wants to require breathalyser interlocks in all new motor vehicles, ignoring the privacy rights, expense, and technological issues raised by such draconian policies.

    Think about how many miles passenger cars travel in a year, and decide in practical terms how many fatalities are acceptable. Think about other oppressive regulations you could impose if safety were truly paramount: reducing the speed limit to 25 MPH, requiring 15 MPH bumpers, requiring driver retesting annually, etc. Rationalizing these kinds of laws in absolute terms such as "for the children" and "if it saves one life" makes no sense, as we deal in statistics and weigh everything in the balance every day. Life is truly precious, but we live in an evil, dangerous world -- not a rubber room.

    Maybe we need to do more. But remember that there will always be people who insist on doing the wrong thing, and finding a way to do it.
  • It's a gamble (Score:2, Insightful)

    by davidwr ( 791652 ) on Monday January 21, 2008 @05:33PM (#22130884) Homepage Journal
    If it costs 10x as much to fix a problem as to prevent it, but for every $100 you spend on prevention you only prevent one failure (a fix worth roughly $90), you are in the hole $10. That's rational math at work.

    If you are a greedy-bastard manager and you expect to be in your position for only a few years, all you care about are the failures that will come back to haunt you. You don't care that spending $5M now will save $1M in expenses over the next 5 years and an additional $20M 10 years down the road. By then you and your greedy, self-interested wallet will be out of harm's way.

    On the other hand, if you are a human manager with a conscience, you'll look at things long-term and either ante up now or make sure the problem is addressed before it is too late.
  • by Digital_Mercenary ( 136288 ) on Monday January 21, 2008 @05:48PM (#22131020) Homepage Journal
    "Now the bad news -- we live in a society that tolerates 20,000 annual alcohol-related fatalities (40% of total traffic fatalities) and cares more about Brittany Spears' antics than the national diabetes epidemic. Expecting the general public or politicians to somehow get concerned about abstract software concepts such as command injection, path manipulation, race conditions, coding errors, and myriad other software security errors, is somewhat of a pipe dream. "

    Pipe dream... not quite...
    It just hasn't led to catastrophic loss of life... yet... when it does, that's when we'll take notice... right now most of us are living week to week on our paychecks, trying to get ahead... think of the public as what Morpheus talked about in The Matrix...

    "...You have to understand, most of these people are not ready to be unplugged. And many of them are so inured, so hopelessly dependent on the system, that they will fight to protect it."

    When a 6-block radius of New York City is turned into dust, that's when we'll take notice... at least for a few weeks.

    -dml337ira (Resident ground Zero)
  • by moderatorrater ( 1095745 ) on Monday January 21, 2008 @05:49PM (#22131038)
    He's not, and here's why. In building and designing a bridge, you're not going to have your boss walk in halfway through the construction and tell you that you need to use this new concrete that only comes from LargeHard(c). You're not going to build the bridge so that you can take it from a two lane bicycle bridge to a 12 lane, double decker toll bridge with a minimum of work. You're never going to have someone walk over the bridge and promptly say, "sorry, this river is actually 50 feet wider, and I don't like the color, can you change that?" Feature creep is the biggest killer of productivity and security.

    Another reason is that you have too many people involved in building a bridge for the majority of it to be badly built. You have the engineers, the construction company, the foremen and the workers all looking at the bridge. Are all these people going to be qualified to catch an error? No, but enough of them will be qualified enough to catch an error that it's unlikely to be a problem. On the other hand, we have software, where there are lines of code that have never been seen by anyone but the original programmer.
  • by Fulcrum of Evil ( 560260 ) on Monday January 21, 2008 @06:02PM (#22131176)
    It's worse than that - 20,000 alcohol related deaths doesn't really mean anything. If anyone involved in an accident has measurable alcohol in their system, it's alcohol related. If you're looking for the number of DUI style fatalities, it's probably around 3000/yr, but we don't know because nobody tracks that. But yeah, everything else you said is right - the 3000 deaths are committed by people who blow .15 or more and often have multiple DUIs - lowering the BAC limit only serves MADD's agenda, which is prohibition. If you want to stop drunk driving, raise the limit back to .10 and imprison people who get multiples or cause any sort of injury (and keep their license/ban them from owning a car).
  • by lennier ( 44736 ) on Monday January 21, 2008 @06:20PM (#22131356) Homepage
    The problem is that with the rise of 1) mass e-commerce, e-government and Internet banking, and 2) Internet-enabled desktops, now EVERY piece of conceivably internet-facing software installed on a consumer desktop carries the risk of exploitation, criminal intrusion and identity theft.

    Yes, a security hole in a web browser won't directly cause loss of *life*. However, what it *can* do by allowing a trojan in is:

    a) Drain all your life savings from your bank
    b) Place illegal pornography on your computer, leading to serious prison time
    c) Propagate spam, worms, viruses and botnet epidemics
    d) Activate your webcam remotely and film you in your bedroom
    e) Directly financially support criminal organisations

    Those are now serious enough consequences - and given a single security hole in a mass-produced product, easy to reproduce on a mass planet-wide scale - that ALL developers of even the most trivial desktop software need to start thinking in terms of the kind of hard security requirements of banking, military, avionics and medical gear.

    But they're not, because they haven't caught up with reality.
  • by Naturalis Philosopho ( 1160697 ) on Monday January 21, 2008 @06:23PM (#22131380)
    Oddly enough, you just made one of the best arguments I've heard to date for regulation and licensing of software designers and engineers. If we can't trust people to make rational decisions, then we may very well have to regulate them into it.
  • by capnchicken ( 664317 ) on Monday January 21, 2008 @06:43PM (#22131568)
    I would argue that the burden would lie on ACME Electric to make sure they can get, at the very least, accountability from Plinko that their phones are capable of being mission critical equipment. Otherwise, why should Plinko invest in creating mission critical phones if all they are going to be used for is to discuss Britney Spears?

    Should geeks really start shooting themselves in the foot over this? Should we really be screaming out: "Please fine me, jail me, and fire me because I wasn't writing code with the aid of a crystal ball"?

    That's why we have these protections, so we aren't at fault when someone decided to go beyond the scope and requirements with the tool we created.
  • by jvkjvk ( 102057 ) on Monday January 21, 2008 @07:29PM (#22132058)
    Well, that may be true. How much is good software going to cost us if everyone is liable for the code they write?

    There are three avenues I can see that a company or individual doing development in the US could take if this becomes law:

    1) Pay the costs to develop bug free software.
    2) Stop developing software.
    3) Move to a country with a less onerous position.

    Of the three, the only one that is actually not feasible is 1! Why, you might ask? Because the company must make a profit, and thus must sell the software for more than it cost to develop.

    Yes, the shuttle software has ~0 bugs. The cost of that has also been estimated at $1,000 per LOC. Apache, for example, might have around 81,852 lines of code: $81,852,000, which is not bad, considering! The Linux kernel (2.6) is ~5.2M LOC: roughly $5.2B. Not to mention the glacial pace at which shuttle software moves. The pace Hurd is moving at would look like light speed compared to changes to any medium-to-large-sized codebase.

    But, you might say, what about people who give their software away for free? After all, I just used Apache and linux as examples of what it might cost if commercially developed but they were not! We could just get all that work for free. Free!

    Well, show of hands - who wants to give some software away for free and be liable for the results? Put something up as an individual and one lawsuit (even if wrongly brought) is enough to bankrupt you. I guess there is always posting anonymously, but I assume any distributor of the software would then be liable. How many projects on SourceForge would be available if either the contributors (non-anonymous) or SourceForge (for anonymous projects) were liable? Likewise, a distributor such as RedHat could be held responsible not only for code it wrote but also for what it distributes, if that code was contributed anonymously.

    Then there are shared objects like libraries. Is it misuse of the library by the end developer that caused the issue, or a bug in the library itself? Or should this have been caught by the QA of the end developer? Are both liable? It could get very entertaining.

    So, we may be experiencing a $180B loss from bad software, but I happen to think that we might lose much, much more if software liability were a reality.

    Not that MS, IBM, Oracle, Apple, Adobe, RedHat, etc... would ever allow this to happen.

    Please note: Nothing in the above states that I'm for buggy software being written. I believe that we simply don't have the tools to liability-proof these types of products in a cheap, fast way yet. We can write good software. We can even write great software. But that one bug you didn't catch is the one they will sue you for.
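
    For what it's worth, a quick back-of-the-envelope check of the figures above (the ~$1,000-per-line number is the commonly quoted estimate for shuttle-grade software; treat it as an assumption, not a sourced cost):

        # Rough cost estimate: lines of code times an assumed cost per line.
        COST_PER_LOC = 1_000  # dollars per line, assumed shuttle-grade figure

        for name, loc in [("Apache httpd", 81_852), ("Linux kernel 2.6", 5_200_000)]:
            print(f"{name}: ~${loc * COST_PER_LOC:,}")
        # Apache httpd: ~$81,852,000
        # Linux kernel 2.6: ~$5,200,000,000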
  • Re:Factual Error! (Score:2, Insightful)

    by Jansingal ( 1098809 ) on Monday January 21, 2008 @10:41PM (#22133362)
    dude, this is /., not Harvard Spelling bee.

    you want the truth or pretty spelling?
