Geekonomics
Ben Rothke writes "First the good news — in a fascinating and timely new book Geekonomics: The
Real Cost of Insecure Software, David Rice clearly and systematically
shows how insecure software is a problem of epic proportions, both from an
economic and safety perspective. Currently, software buyers have very
little protection against insecure software and often the only recourse they
have is the replacement cost of the media. For too long, software
manufacturers have hidden behind a virtual shield that protects them from any
sort of liability, accountability or responsibility. Geekonomics
attempts to stop them and can be deemed the software equivalent of Unsafe at
Any Speed. That tome warned us against driving unsafe automobiles;
Geekonomics does the same for insecure software." Read on for Ben's take on this book.
Now the bad news — we live in a society that tolerates 20,000 annual
alcohol-related fatalities (40% of total traffic fatalities) and cares more
about Britney Spears's antics than the national diabetes epidemic.
Expecting the general public or politicians to somehow get concerned about
abstract software concepts such as command injection, path manipulation, race
conditions, and myriad other coding errors is somewhat of a pipe dream.
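For readers unfamiliar with the jargon, here is a minimal, hypothetical sketch of the first item on that list, command injection, along with the usual fix (the greeting function and payload are invented for illustration; they are not from the book):

```python
import subprocess

def greet_unsafe(name):
    # Vulnerable: user input is pasted into a shell command string, so
    # input like "world; echo INJECTED" runs a second command.
    out = subprocess.run("echo Hello " + name, shell=True,
                         capture_output=True, text=True)
    return out.stdout

def greet_safe(name):
    # Safer: arguments are passed as a list, so no shell ever parses the
    # input and the payload stays a literal string.
    out = subprocess.run(["echo", "Hello", name],
                         capture_output=True, text=True)
    return out.stdout

payload = "world; echo INJECTED"
print(greet_unsafe(payload))  # the injected command actually executes
print(greet_safe(payload))    # the payload is printed as inert text
```

The whole class of bug is that one missing distinction between "data" and "command" — which is exactly the kind of abstraction the review doubts the public will ever care about.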
Geekonomics: The Real Cost of Insecure Software
author | David Rice
pages | 362
publisher | Addison-Wesley
rating | 9
reviewer | Ben Rothke
ISBN | 978-0321477897
summary | How insecure software costs money and lives
Geekonomics is about the lack of consumer protection in the software market and how this impacts economic and national security. Author David Rice considers software consumers akin to the proverbial crash test dummy. This, combined with how little recourse consumers have for software-related errors and the lack of significant financial and legal liability for vendors, creates a scenario in which computer security is failing.
Most books about software security tend to be about actual coding practices. Geekonomics focuses not on the code, but rather on how insecurely written software is an infrastructure problem and an economic issue. Geekonomics has three main themes. First — software is becoming the foundation of modern civilization. Second — software is not sufficiently engineered to fulfill the role of foundation. And third — economic, legal and regulatory incentives are needed to change the state of insecure software.
The book notes that bad software cost the US roughly $180 billion in 2007 alone (see Pete Lindstrom's take on that dollar figure). Not only that, the $180 billion might be on the low end, and the state of software security is getting worse, not better, according to the Software Engineering Institute. Additional research shows that 90% of security threats exploit known flaws in software, yet software manufacturers remain immune to almost all of the consequences of their poorly written software. Society tolerates these failure rates in software due to its unawareness of the problem. The huge number of software vulnerabilities also entices attackers, who attempt to take advantage of them.
The book's seven chapters are systematically written and provide a compelling case for the need for secure software. The book tells of how Joseph Bazalgette, chief engineer of the city of London, used formal engineering practices in the mid-1800s to deal with the city's growing sewage problem. Cement was a crucial part of the project, and the book likens the development of secure software to that of cement, which can withstand decades of use and abuse.
One reason software has significant security vulnerabilities, as noted in chapter 2, is that software manufacturers are primarily focused on features, since each additional feature (whether it has real benefit or not) offers a compelling value proposition to the buyer. On the other side, the lack of software security functionality and controls imposes social costs on the rest of the populace.
Chapter 4 gets into the issues of oversight, standards, licensing and regulation. Other industries have lived under the watchful eyes of regulators (FAA, FDA, SEC, et al.) for decades, but software is written by unlicensed programmers, removed from oversight. Regulations exist primarily to guard the health, safety and welfare of the populace, in addition to the environment. Yet oversight of software programmers is almost nil, and this lack of oversight and immunity breeds irresponsibility. The book notes that software does not have to be perfect, but it must rise to the level of quality expected of something that is the foundation of an infrastructure. And the only way to remove the irresponsibility is to remove the immunity that the lack of regulation has created.
Chapter 5 gets into more detail about the need to impose liability on software manufacturers. The book's premise is that increased liability will lead to a decrease in software defects, will reward socially responsible software companies, and will redistribute the costs consumers have traditionally paid to protect software from exploitation, shifting them back to the software manufacturer, where they belong.
Since regulations and the like are likely years or decades away, chapter 7 notes that, short of litigation, contracts are the best legal leverage software buyers have to address software security problems. Unfortunately, most companies do not use this contractual option to the degree they should, even though it could benefit them.
Overall, Geekonomics is an excellent book that broaches a subject left uncharted for too long. The book does have its flaws, though: its analogies to physical security (bridges, cars, highways, etc.) and safety events don't always hold together with perfect logic. Also, the trite title may diminish the seriousness of the topic. As the book illustrates, insecure software kills people, and I am not sure a corny book title conveys the importance of the subject. But the book does bring to light significant issues about the state of software, from legal liability and the licensing of computer programmers to consumer rights, all of which are imperative.
It is clear that regulation of the software industry is inevitable, and it is doubtful that Congress will do it right whenever it eventually gets around to it. Geekonomics shows the effects that such lack of oversight has caused, and how beneficial it would have been had oversight been there in the first place.
Someone reading this review may get the impression that Geekonomics is a polemic against the software industry. To a degree it is, but the reality is that it is a two-way street. Software is built for people who buy certain features, and to date, security has not been one of the top features. Geekonomics notes that software manufacturers have little to no incentive to build security into their products. Post-Geekonomics, let's hope that will change.
Geekonomics will create different feelings in different readers. Consumers may be angry and frustrated. Software vendors will know that their vacation from security is over. It's finally time for them to get to work on fixing the problem that Geekonomics so eloquently describes.
Ben Rothke is a security consultant with BT INS and the author of Computer Security: 20 Things Every Employee Should Know.
You can purchase Geekonomics: The Real Cost of Insecure Software from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Go back and read _Free to Choose_... (Score:5, Insightful)
as the review says (Score:5, Insightful)
In my experience there is so much feature creep in software projects that there always seems to be that last feature that needs to get squeezed into the next release at the last moment, and there isn't time to test. "Let's just hope that 10k-line module works and is secure. Even if it's not, we can always release a SP after we have the product on the market." It has even gotten to the point where major software companies (MS comes to mind) have a concept of Zero Bug Bounce (ZBB): the point where the rate of bugs getting fixed equals the rate of bugs being found. If there are no "major" bugs and you've reached ZBB, you ship. Now, I can see you can't wait forever to ship, but there is this inherent acceptance of flaws in software that you won't see in, say, bridge building.
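The ship/no-ship logic described above can be sketched as a toy model (this is an illustration of the idea as the comment states it, not Microsoft's actual release process; all names and thresholds are invented):

```python
def reached_zbb(found_per_day, fixed_per_day):
    # "Zero bug bounce": the fix rate has caught up with the find rate,
    # so the active bug count has stopped growing.
    return fixed_per_day >= found_per_day

def ship_decision(major_open_bugs, found_per_day, fixed_per_day):
    # Toy criterion from the comment: ship once there are no open
    # "major" bugs AND the team has reached ZBB.
    return major_open_bugs == 0 and reached_zbb(found_per_day, fixed_per_day)

print(ship_decision(0, 12, 15))  # fixing faster than finding, nothing major open
print(ship_decision(3, 12, 15))  # major bugs still open, so no ship
```

Note what the criterion does not measure: whether the remaining "minor" bugs include exploitable security flaws, which is exactly the acceptance of defects the comment is pointing at.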
Re:as the review says (Score:5, Insightful)
Bridge building isn't really all that complex; there is a hell of a lot more going on in a software product of any real magnitude than in a bridge. Sure, there are a few things like wind you have to take into account, but there really aren't as many variables in bridge building as there are in software development.
In addition to that, software has to be exactly perfect; with a bridge you can just say "screw it, let's reinforce/add supports/whatever here, here, and there just to be safe" and you are good to go. (I know I am oversimplifying to some degree, but you see my point.) It is possible to give yourself a lot more room for error.
Re:Software is under the eyes of regulators (Score:4, Insightful)
The rest of software, like word processors, and spreadsheets, and music apps, doesn't need that kind of stringent oversight. A better analogy in such cases is to other mundane things: books, binders, pencils. Poorly designed binders and pencils can lead to lost productivity in the same way that poorly designed software can. Those who care will go for the higher-quality product (which may require more money, either in initial expenditure or in staff expertise). Again, errors in books can certainly lead to lost productivity, but is there really any need for more "book security" and "book oversight" and "book regulations" to make sure that the contents of books are robust and error-free?
I submit that such oversight is not really necessary (again, except in issues of health and physical safety). Most people can tolerate the occasional annoyances of breaking pencils, typos in books, and crashes in software. Ideally people should be educated about risk (e.g. don't put important documents in a flimsy box, put them in a safe; similarly, don't put important data in a low-security computer, get a properly administered server), so they can make informed choices. But more laws and regulation? Not necessary.
Re:I can't wait (Score:1, Insightful)
Re:Software is under the eyes of regulators (Score:5, Insightful)
Re:as the review says (Score:3, Insightful)
(I know I am oversimplifying to some degree, but you see my point)
Have you ever stopped to wonder if you are actually over-complicating software design rather than over-simplifying the analogy?
Re:Who will advocate change? (Score:2, Insightful)
My favorite example... (Score:2, Insightful)
Why is it that any time someone talks about software engineering they always bring up bridge/house/skyscraper building? Yes, Joseph Bazalgette used "formal engineering practices" to build London's sewers, but where did those formal practices come from? Why, yes: through trial and error. Thousands of years of trial and error. Use concrete? Yes, it makes sense looking back because it worked, but what if it hadn't worked? What if the concrete had failed? Or what if he had used clay pipes instead? Then we'd be saying "[insert name here] used formal engineering practices to deal with the city's growing sewage problem. Some guy before him failed miserably, though." We simply haven't built up the software engineering toolbox yet. Software hasn't even been around for 100 years! But we're learning, and if you look at specific industries (medicine, banking, and avionics spring to mind), they all spend billions of dollars to make sure their software works correctly, because being correct for them is worth the cost.
Think bad repair manuals (Score:3, Insightful)
Imagine a repair manual for a gas stove that said "blow out pilot light, turn on gas, wait one hour, invite your friends over, and light a match." Sure, it might not steal credit card numbers but in the face of an ignorant and trusting user, it could prove fatal nonetheless.
OT: Drunk driving (Score:4, Insightful)
But this isn't because we don't care.
Obviously, all those things I listed show that people do care; however, they are doing the wrong things to address the problem. We have allowed special interests like MADD, who are modern-day temperance societies, to dictate these changes to us with little review or oversight. It has been statistically proven [ct.gov] that fatalities do not decrease with these measures. Maybe we need to do more. But remember that there will always be people who insist on doing the wrong thing, and finding a way to do it.
It's a gamble (Score:2, Insightful)
If you are a greedy-bastard manager and you expect to be in your position for only a few years, all you care about are the failures that will come back to haunt you personally. You don't care if spending $5M now will save $1M in expenses over the next 5 years but save an additional $20M 10 years down the road. By then you and your greedy, self-interested wallet will be out of harm's way.
On the other hand, if you are a human manager with a conscience, you'll look at things long-term and either ante up now or make sure the problem is addressed before it is too late.
The public is hopelessly dependent... (Score:1, Insightful)
Pipe dream... not quite...
It just hasn't led to catastrophic loss of life... yet... when it does, that's when we'll take notice... right now most of us are living week to week on our paychecks trying to get ahead... think of the public as what Morpheus talked about in The Matrix...
"...You have to understand, most of these people are not ready to be unplugged. And many of them are so inured, so hopelessly dependent on the system, that they will fight to protect it."
When a six-block radius of New York City is turned to dust, that's when we'll take notice... at least for a few weeks.
-dml337ira (Resident ground Zero)
Re:as the review says (Score:5, Insightful)
Another reason is that you have too many people building a bridge for the majority of bridges to be badly built. You have the engineers, the construction company, the foremen and the workers all looking at the bridge. Are all of these people qualified to catch an error? No, but enough of them will be qualified enough that an error is unlikely to become a problem. On the other hand, we have software, where there are lines of code that have never been seen by anyone but the original programmer.
Re:OT: Drunk driving (Score:3, Insightful)
But all desktop software is now identity-critical (Score:3, Insightful)
Yes, a security hole in a web browser won't directly cause loss of *life*. However, what it *can* do by allowing a trojan in is:
a) Drain all your life savings from your bank
b) Place illegal pornography on your computer, leading to serious prison time
c) Propagate spam, worms, viruses and botnet epidemics
d) Activate your webcam remotely and film you in your bedroom
e) Directly financially support criminal organisations
Those are now serious enough consequences - and given a single security hole in a mass-produced product, easy to reproduce on a mass planet-wide scale - that ALL developers of even the most trivial desktop software need to start thinking in terms of the kind of hard security requirements of banking, military, avionics and medical gear.
But they're not, because they haven't caught up with reality.
Re:as the review says (Score:4, Insightful)
Re:It varies by industry (Score:2, Insightful)
Should geeks really start shooting themselves in the foot over this? Should we really be screaming out: "Please fine me, jail me, and fire me because I wasn't writing code with the aid of a crystal ball"?
That's why we have these protections: so we aren't at fault when someone decides to go beyond the scope and requirements of the tool we created.
Bad software costs $180B! (Score:4, Insightful)
There are three avenues I can see that a company or individual doing development in the US could take if this becomes law:
1) Pay the costs to develop bug free software.
2) Stop developing software.
3) Move to a country with a less onerous position.
Of the three, the only one that is actually not feasible is 1! Why, you might ask? Because the company must make a profit, and thus must sell the software for more than it cost to develop.
Yes, the shuttle software has ~0 bugs. The cost of that has also been estimated at $1,000 per LOC. Apache, for example, might have around 81,852 lines of code... $81,852,000, which is not bad, considering! The Linux kernel (2.6) is ~5.2M LOC. Hmm, $5.2B??? Not to mention the glacial pace at which shuttle software moves. The pace Hurd is moving at would look like light speed compared to making changes that way to any medium-to-large-sized codebase.
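The back-of-the-envelope arithmetic above can be checked in a few lines (the $1,000-per-LOC figure and the line counts are the comment's own rough estimates, not audited data):

```python
COST_PER_LOC = 1_000  # rough shuttle-grade development cost, in dollars

# Line-count estimates taken from the comment above.
projects = {
    "Apache": 81_852,
    "Linux kernel 2.6": 5_200_000,
}

for name, loc in projects.items():
    cost = loc * COST_PER_LOC
    print(f"{name}: ${cost:,}")
# Apache comes out to $81,852,000 (~$82M);
# the 2.6 kernel comes out to $5,200,000,000 (~$5.2B).
```

At that rate, shuttle-grade assurance for just these two open source projects would cost more than many national software markets, which is the comment's point about option 1 being infeasible.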
But, you might say, what about people who give their software away for free? After all, I just used Apache and linux as examples of what it might cost if commercially developed but they were not! We could just get all that work for free. Free!
Well, show of hands: who wants to give some software away for free and be liable for the results? Put something up as an individual and one lawsuit (even if wrongly brought) is enough to bankrupt you. I guess there is always posting anonymously, but I assume any distributor of the software would then be liable. How many projects on SourceForge would be available if either the contributors (for non-anonymous projects) or SourceForge (for anonymous projects) were liable? Likewise, a distributor such as Red Hat could be held responsible not only for code it wrote but also for anonymous code it distributes.
Then there are shared objects like libraries. Is it misuse of the library by the end developer that caused the issue, or a bug in the library itself? Or should it have been caught by the end developer's QA? Are both liable? It could get very entertaining.
So, we may be experiencing $180B loss for bad software, but I happen to think that we might lose much much more if software liability were a reality.
Not that MS, IBM, Oracle, Apple, Adobe, RedHat, etc... would ever allow this to happen.
Please note: nothing in the above says that I'm in favor of buggy software being written. I believe that we simply don't have the tools to liability-proof these types of products yet in a cheap, fast way. We can write good software. We can even write great software. But that one bug you didn't catch is the one they will sue you for.
Re:Factual Error! (Score:2, Insightful)
you want the truth or pretty spelling?