
Internet Site Security

Mirko Zorz writes "Internet Site Security - what a name for a book. When I first heard about it I was thinking: '1400 pages, 6 CDs,' but when the book came and I began to read through it, I realized how much good information the authors were able to fit into just over 400 pages. We all want 'big books' but with this one, the authors take a somewhat different approach, one that is less connected to software versions and that will endure in time. But, before we get into the core of the book, let's take a look at the people behind it." Mirko's review continues below.
Internet Site Security
author: Erik Schetina, Ken Green and Jacob Carlson
pages: 432
publisher: Addison-Wesley
rating: 8
reviewer: Mirko Zorz
ISBN: 0672323060
summary: This book manages to shed new light on the problems of security implementation; a good gift idea for both your IT manager and your system administrator.

About the authors

Erik Schetina, CISSP, is the CTO for TrustWave Corporation. He spent 14 years with the U.S. Department of Defense developing information security systems and public key cryptosystems. Jacob Carlson is a senior security engineer for TrustWave Corporation. His primary role is leading the penetration testing and vulnerability assessment team. In his copious free time he likes breaking things and writing code. Ken Green is a senior security engineer for TrustWave Corporation where he works extensively on intrusion detection systems, firewalls, and virtual private network initiatives.

When you read biographies like the ones above, you can be somewhat reassured that the content of the book is good. All of the authors come from TrustWave Corporation, and the fact that they work together has influenced the writing of this book in a very good way.

The basics

At the very beginning of the book the authors show us that the starting point of building a secure environment is not the implementation of a solution but rather the definition of the assets we want to protect. You have to know what threatens your assets in order to choose the best security solution.

The authors successfully illustrate how different things such as system administration, policy and audits fit into an overall security plan. Throughout the book, the authors educate the reader by making sure he sees "the big picture." The bottom line is that "the transition from a techie to a security professional consists in recognizing the importance of all the components of security." The second chapter covers some great material: a description of the security process, assessment and policy, asset protection, and monitoring and detection.

Which one is better?

When describing the way things can be done, the authors always give you the pros and the cons. For example, at one point they describe the difference between using commercial scanners in penetration testing and using a team of people who will do it by hand. They provide good pros and cons for both approaches, and that's one of the great things about this book: you always get to look at the other side of the coin.

The insecurities

What we all know is that the Internet is inherently insecure -- that's why this book was published in the first place. The authors explain why it's insecure, who administers it and how it works. Some of the topics presented here are: an overview of TCP/IP, the Domain Name System (DNS), Whois databases, anonymity, and much more.

History is also present in this book. Chapter 4 begins with a brief overview of the history of the Internet and the TCP/IP protocol suite. Also mentioned is the Morris Worm (November 1998). As we move on, the DNS is explained in greater detail (with some security issues addressed specifically), and we are slowly presented with an abundance of technical details that stretches over several chapters. Some of the things that are explained in the book include: secure protocols, virtual private network protocols and encapsulation, the secure shell (SSH) and authentication systems.

As an inevitable part of a book of this kind, there's a part dedicated to passwords (and good rules for their generation), and another to digital certificates. The authors present the shortcomings of certificates as well as their best uses. Although neither of these is explained in great detail, you'll be able to get a solid overview of the things presented.
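The review doesn't reproduce the book's password rules, but as a rough illustration of machine-generated passwords (my own sketch, not an excerpt from the book), most Unix systems can pull one out of /dev/urandom:

```shell
# Draw random bytes and keep only printable password characters;
# 256 input bytes comfortably yields a 16-character password.
head -c 256 /dev/urandom | LC_ALL=C tr -dc 'A-Za-z0-9!@#%^&*' | head -c 16
echo    # trailing newline for readability
```

Longer is better, of course; bump the final `head -c` to taste.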

Moving on, we get a plethora of information covering: firewalls, DMZs, VPNs, external and internal threats, the security of wireless networks, workstation management issues, intrusion detection systems and log processing, etc.

Operating systems

The book also gives some good information when it comes to operating systems and server software. Some of the covered topics include:

  • Windows NT and 2000 - authentication, access tokens, security identifiers, object access control lists, tightening Windows user rights, etc.
  • Linux - overview of the Linux Kernel, file system permissions, authentication mechanisms, how PAM works, etc.
  • Server security: web, mail, FTP, etc.
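As a quick illustration of the Linux file system permissions the book covers (a generic sketch of my own, not taken from the text), the classic owner/group/other model looks like this on the command line:

```shell
# Create a file and restrict it to its owner: read/write for the
# owner, no access at all for group members or anyone else.
touch secrets.txt
chmod 600 secrets.txt
ls -l secrets.txt    # first column reads -rw-------
```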
Attack and defense

If you want information about attacks, denial-of-service attacks are covered in great detail, along with many other attack scenarios. Since you also want to protect yourself from all of these attacks, there's naturally much material dedicated to firewalls: their functions, implementation issues and vulnerabilities. Now that's not enough, is it? You want more. There's a whole chapter dedicated to intrusion detection systems and one dedicated to incident response and forensics. The latter will be of particular interest to all of you who want more knowledge of legal and privacy issues.

Secure Code

To complete the book, there's a chapter dedicated to developers, which discusses the development of secure Internet applications. Here you'll be able to read about common sources of programming mistakes, exploiting executable code, application-level security, coding standards, and more.

The verdict

This book manages to shed new light on the problems of security implementation by explaining the positions of both the system administrator and the IT manager, helping each understand their role in the company's overall security process. It's a good idea to give it to both your IT manager and your system administrator; they will both learn from it and in the process start to understand each other on a new level. With this book, you basically learn to think on a larger scale.

There are not many downsides. There are basically only two things I didn't like about this book: the lack of resources, and (in parts) the writing style. There are not enough resources listed, and I always like pointers to more information. As for the writing style, it's obvious that this book was not meant to entertain in any way, but it sometimes seems a bit too serious. I always believed that learning should be fun. That's just me :)

Overall, this is an excellent book, two thumbs up!


If you're interested in hearing what one of the authors of the book has to say, you can check out an interview with him here. You can purchase Internet Site Security from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

  • by _Sambo ( 153114 ) on Tuesday November 26, 2002 @11:34AM (#4758961)
    "We all want 'big books'"

    I thought we all wanted bigger manhood. At least that's what 30% of my spam has been promising. And for only $50!
  • little suggestion (Score:5, Insightful)

    by newsdee ( 629448 ) on Tuesday November 26, 2002 @11:37AM (#4758996) Homepage Journal
    This may have been suggested before: /. should add the "average price" of the book to the review summary. I know some subjects are "priceless" for some, but for us common mortals affordability is the main concern :-)
    • Re:little suggestion (Score:4, Informative)

      by Lev13than ( 581686 ) on Tuesday November 26, 2002 @12:01PM (#4759194) Homepage
      Looks like you can pick it up for about US$35.99 [informit.com] (I have no connection to this vendor).
      • It's not about "finding out how much it cost" (many ways to find out :-D ), but rather about "knowing the cost at a glance".

        Time is money :-)
      • Re:little suggestion (Score:2, Informative)

        by pbrammer ( 526214 )
        Why would you post such a high price? You can get it much cheaper... $23.95 [bookpool.com].

        BTW, the $ symbol tells everyone it is US currency, hence the US is redundant in your price post.

        • >> BTW, the $ symbol tells everyone it is US currency, hence the US is redundant in your price post.

          Really? Maybe we outta go smack those damn Canadians around then for weezing our gig</Pauly_Shore> ;-)
        • BTW, the $ symbol tells everyone it is US currency, hence the US is redundant in your price post.

          Yep, Australia, Canada, New Zealand, Hong Kong, the Bahamas, Barbados, Bermuda, Eastern Caribbean, Fiji, Jamaica, Singapore, Taiwan, Trinidad and Tobago, and probably some others I'm not aware of, have STOLEN the name of the AMERICAN currency!

          This is one of the funniest things I have seen on /. in a long time, because it was serious. My sig is aimed at people like you.

  • by FreeLinux ( 555387 ) on Tuesday November 26, 2002 @11:37AM (#4759000)
    The article states "We all want big books" but I want to know if this is true. Lately I've been getting tired of suffering through the massive books being published. Especially vexing is that many (if not most) of these 1000+ page books are artificially inflated, size-wise. They don't seem to have any more really valuable content than books half their size. Compound this with the fact that the wordy inflation seems to make them much harder reads, not to mention taking an eternity to get through.

    So, I ask the Slashdot crowd; Do you really want big/bigger books? Or, is 300 pages plenty, provided the information is in there?
    • Quality over size (Score:5, Interesting)

      by newsdee ( 629448 ) on Tuesday November 26, 2002 @11:42AM (#4759035) Homepage Journal
      Frankly, most IT books have more than 300 pages and are priced at $50 or more. Most of these, however, are filled with a lot of "tricks" to increase page count - large fonts, large margins, useless chapters, you name it. Editors have to make a living, but if, as consumers, we are always saying "the bigger the better," then it will only result in editors believing that "size is everything," as their daily spam suggests.
      • Re:Quality over size (Score:5, Informative)

        by pommiekiwifruit ( 570416 ) on Tuesday November 26, 2002 @12:33PM (#4759437)
        The books I buy tend to be ~256 pages (Scott Meyers, Herb Sutter, Kernighan, etc.) or ~64 pages (the Xenophobe's Guides). I fail to see why people would read a 1500-page listing of windows.h or something like that.

        Some people buy cars with the turning radius of an oil tanker, books with 10 pages of useful content and 1000 pages of bug-ridden listings, and big plastic boxes with a couple of silicon chips in them, so maybe this is a cultural thing. I leave admiring the bigger is better idea to personal attributes (Jouko Ahola/Lola Ferrari/Filip Smirnov for example) or possibly monumental architecture rather than consumer items.


        • ...of books called the C++ In-Depth Series. Very well-written, all by experts.

          One of the series' rules is that the main body text of a book must be no more than 300 pages. Be clear, be concise, get to the point and shut up.

          Most excellent books.

      • Re:Quality over size (Score:4, Interesting)

        by rossz ( 67331 ) <ogre@@@geekbiker...net> on Tuesday November 26, 2002 @02:40PM (#4760603) Journal
        This is one of my pet peeves. Computer books are too damn expensive, especially given the fact that most of those $40 or $50 books will be obsolete within six months (I'm being generous by saying six months).

        Yes, it's a niche market. Geek books will never have the market of someone like Stephen King. However, attempting to gouge me only means I buy a couple of computer books a year. Whenever I look at a computer book, I ask, "do I NEED this book," as opposed to just wanting it. Very few books can get by this rule.

        And why the hell is a paperback geek book more expensive than a hardbound novel?
    • I don't want to wade through 1500 pages of crap if the text can be condensed into 50 pages. I always got pissed off at those certain Linux books that had 100 pages of introductory material written by the author and 1400 pages of reprinted HOWTOs and man pages. That just plain sucks.

      The K&R book was OK. Anything beyond 500 pages is way too much, unless you're aiming at "The World Explained for Really Dumb People: From Physics to Philosophy."
      • Stroustrup's book on C++ is big, but that's because it's a big language. The content is just as dense as K&R. It's an example of a huge technical book that is not worthless. I admit that most of the monstrosities you see in the bookshop are not worth buying (unless they come with a free Linux CD, and probably not even then).
    • Most of those huge books contain several hundred pages of pure reference in the back - for example, a large number of appendices. An html book I have here contains quick tag listings, number-->symbol conversions, etc. Sometimes they're more useful than the rest of the book's content.
    • Big books are only worth it if they're a decent reference. Take my HTML reference book. It's huge, but most of it is a dictionary-like reference to HTML tags, entity tables, etc. Most of the books I learn from these days aren't beginner books, and unless the subject is huge, they don't need to be very thick. The thinnest book I bought in recent years (4 years ago) was UML Distilled - it was excellent and downright thin.
    • by StRex ( 32430 ) on Tuesday November 26, 2002 @12:11PM (#4759278)

      I honestly will not buy a technical book in the 1,000+ page range, especially if the title:

      • Includes the words "bible", "unleashed", or "secrets"
      • Is entitled Learn x in y days/hours

      Why? Because I know I'm unlikely to digest the contents of 1000+ pages of text on one subject, if I even manage to finish it. I also generally suspect large books of rehashing FAQs or other widely-available docs just to fill pages.

      I don't consider myself an O'Reilly bigot, though I do lean towards their books since they tend to publish smaller, focused books. If a book is pure reference, I may consider buying it if it's 1000+ pages. Following are examples of some great books I've bought that I found very useful and readable due to their small size:

      The Internet already offers me an overwhelming, disorganized pile of information on any subject--and at least it's searchable via Google. Dead tree books have use when they're usable and organized, and I've found that generally translates into a smaller book.

      • Yup, for some strange reason BIG is seen as the way it should be in the States. In Britain our lovely publishers generally seem to understand that we like things a bit more petite. My experience over the last 10 years, through university and into my career, is that books from British publishers are clear, concise and to the point; although the American texts do contain the information, it is hidden away amongst the padding.

        Just my own personal experience.

    • It does not matter what size the technical book is; it only takes 5 pages to induce deep sleep and drooling!


      Life is like an elevator, sometimes you get the elevator and sometimes you get the shaft

    • Or, is 300 pages plenty, provided the information is in there?

      Not only is a big book not necessarily better; it is almost invariably worse. For simple reasons: Writing well is difficult. It takes a lot of work for an author--or a team of two or three working closely--to produce 200 quality pages. 1000 quality pages would be a monumental effort that, frankly, nobody's going to put into a book on Visual Basic. Further, concision is a mark of good writing, so when you see a big book, you should wonder, "why were they not able to get this into 200 pages?". Not to mention that a big book takes longer to read, is harder to find things in, and is less convenient to use and carry.

      For most technical books, the only ways to get to 1000 pages are to write sloppily, add filler, or employ many authors working independently. The last tends to produce an incoherent whole, and make each author care less about his contribution.

      The main exceptions are books with a justifiably large reference section (large because there is truly a lot of valuable material to reference, which is uncommon), and, to some extent, books that have been through several editions, whose authors have put new effort into each edition.

    • Actually, we just want good books.

      The idea behind "big" books is that they're reference books. You only open one once or twice a week when you want a specific piece of information (aka a phone book).

      If you want a book to introduce you to a subject, it shouldn't be 1000+ pages... just a small, good read that you read once and can easily re-read when you want to refresh your memory.
    • The last "big" book I had a hand in was 824 pages long: Hack Proofing Your Network, Second Edition. The publisher was concerned about it getting any larger; apparently, once you get to 850, you have to have some special binding.

      It would have been extremely difficult to fit the material we covered in 300 or so pages. The standard layout stuff that tends to add pages is there, but no appendices with code listings or anything like that.

      There are some sections I would call beginner sections that could have been left out if we were targeting specifically advanced users; without them, it could have been smaller, maybe 500 pages.

      Still, even at 800 pages, there are topics that didn't get the coverage they could have, so I don't think page count by itself is any measure of how good a book will be.

      It's not the size, it's how you use it. :)
    • I think two [mis-]quotes just about sum it up:
      • In the preface to Brian Kernighan & Dennis Ritchie's The C Programming Language [pearsoned.com], they write:
        C is not a big language, and it is not well served by a big book.
      • When I was a high school student, my English teacher quoted us a great line from Blaise Pascal [york.ac.uk]:
        Je n'ai fait celle-ci plus longue que parce que je n'ai pas eu le loisir de la faire plus courte.
        Translated: I am sorry for the length of my letter, but I had not the time to write a short one.

      What else is there to say? There is wisdom in these two statements that I can't really expand on, and that the trend towards bigger tech books certainly ignores.

      Like K&R's book, a lot of these tech books of arcana are about highly specialized areas. Taking a random stroll down the books that have a current place on my desk shelf, I've got books on: MySQL & Perl for the Web, HTML4, Programming Internet Email, VPNs, NFS/NIS, SQL, Perl DBI, TCP/IP admin, Bash, mod_perl, Perl LWP, and others. All of these are small subjects, and at a glance it doesn't look like any of these books goes much above 400 pages. Ah, I tell a lie -- the mod_perl "eagle book" is much longer, but then it gets deep into the Apache API, so it isn't exactly padded.

      On the other hand, some of the longer books I've got -- one on FreeBSD, one on MySQL, the Perl Cookbook, etc. -- tend to cover a much wider variety of sub-topics within their stated area, but it's hard to do this in a non-superficial way. It's one thing to go down a checklist & mention every subject area (the FreeBSD & MySQL books seem guilty of this); it's much harder to say just enough about each area to be continually useful (the Perl Cookbook does well here).

      In general, there's a sweet spot between brevity & long-windedness. A proper density of information is hard to strike. If there is much to be said about a subject, then I personally would rather see aspects of the larger subject broken out into more coherent texts -- witness all the Perl books that, aside from the language itself, really don't have anything to do with one another (algorithm theory, database programming, client-side HTTP, server-side HTTP, graphics programming, Win32 administration, web database automation, XML, bioinformatics, etc.). If the subject is big enough & cohesive enough to cover overlapping, related areas in one text -- Apache/mod_perl being a good example -- then fine, keep them together and let the book grow longer. But on the other hand, if everything you need to know about the SQL implementation of half a dozen RDBMS engines will fit in a skinny little 200-page pamphlet, then let's just save everyone some time and not try to pad it out any further. I for one will never spend my money on those 1000+ page monsters unless they're in the remainder bin & it seems like five bucks would be worthwhile if I ever have to deal with one of those beasts some day.

  • by burgburgburg ( 574866 ) <splisken06@@@email...com> on Tuesday November 26, 2002 @11:40AM (#4759016)
    David Lightman was able to wardial his way into NORAD and almost start WWIII?

    Joshua: "Do you want to play a game?"

  • on the shelves of my former colleagues. One fellow in particular was adamant about collecting these books. Unfortunately, he was not as rabid about IMPLEMENTING security. My point is, regardless of the size of the book or the library, it is all worthless unless the measures outlined and detailed are followed.
  • Synopsis. (Score:5, Funny)

    by grub ( 11606 ) <slashdot@grub.net> on Tuesday November 26, 2002 @11:44AM (#4759056) Homepage Journal

    Don't use IIS.

    what were the other 1399 pages for?

    • Re:Synopsis. (Score:1, Flamebait)

      by cscx ( 541332 )
      What a crock of uninformed bullshit.

      Although, since you're a BSD nut, I'm not surprised. ;-)

      The IIS server I used to admin had never been hacked. And yes, it was directly on the net and received a fairly high volume of traffic. Personally I would trust someone very knowledgeable with WinNT to put an IIS server up more than I would trust ye olde random Linux elitist to put his "uber-secure" Apache "b0x" on the web.

      • I guess that's why Microsoft says UNIX is better [slashdot.org]. :) Truthfully, they spent US$400M on Hotmail, they have a virtually unlimited budget to make this show-pony dance and they've failed miserably. I know this isn't directly linked to "security", but check out from where that whitepaper originated.. that's right, an unsecured box in Microsoft.
        • First of all, Hotmail was not originally owned by Microsoft -- they were bought out. Hotmail, by that original company, was originally written to run on FreeBSD, but it doesn't anymore. That white paper was written by an original Hotmail UNIX admin -- what do YOU think he'd stand up for? It meant (not) losing his job.

          As for that "unsecured" box -- it's not like it was a compromised box or anything. Some asshat uploaded the white paper to a public FTP server. That's their fault, not an inherent fault in the server software, which to me seems to be what you are implying.

          • As for that "unsecured" box -- it's not like it was a compromised box or anything. Some asshat uploaded the white paper to a public FTP server.

            Interesting, I stand corrected. Thanks.
        • This complaint about MS not securing its own boxen is FUD.

          I enjoy MS bashing as much as any Linux and *BSD (and Solaris) user, but this is wrong. The box this document was on was not cracked -- it was fulfilling its intended purpose as a public file server, allowing uploads to MS support and downloads by customers of files left there by the support staff. The security failing was in policy, not a technical problem.

          The server did what it was supposed to do -- the failure was staff members at MS putting a confidential document on a public server as a way of subverting policies related to internal filesharing.

          The boxes that MS puts on the 'net are generally fairly well secured. Most problems are due to people, not systems. (Yes, even though I am a UN*X bigot, I'm saying that it is possible to secure a Windows box on the big bad Interweb. Possible doesn't mean easy.)

      • Fair enough. But I think you should compare your trust of someone very knowledgeable with WinNT AND Apache AND Linux, to setting up either system.

        For instance, I'd certainly trust my own configuration of my own Apache server before I'd trust my cat's setup of Windows, but that's no more meaningful than your comment.
        • That cat must have one hell of a personality! How does it use the mouse with the paws and all? ;)
          • Cat's preferred use is strictly the keyboard; and her favorite debugging method is to powercycle. Occasionally she will get the ball out of the trackball and get it good and lost.
      • A percentage (I don't know what it is, but I suspect it's fairly large, especially for IIS) of web servers are installed in their default configuration, then quickly have the site replaced. The true test, then, would be to compare a relatively recent Linux distro with bundled Apache against a relatively recent IIS distro (say, Windows 2000 Advanced Server).

        Which would you rather run? Which one, if compromised, poses a greater risk? I don't know about you, but I'd rather have users on _my_ network running Apache (heck, I'd prefer an eggdrop with a hand-written TCL web server :) )
        At the very least, script kiddies seem to gravitate to IIS. Perhaps they just think it's cool to bash Microsoft, perhaps they like the fact that exploits often run on Windows, or perhaps they just like Administrator access better than the non-privileged "nobody" user.
    • Re:Synopsis. (Score:4, Interesting)

      by ceejayoz ( 567949 ) <cj@ceejayoz.com> on Tuesday November 26, 2002 @11:56AM (#4759163) Homepage Journal
      Honestly, if IIS was as insecure as Slashdot likes to think it is, wouldn't the Microsoft site have been hacked more often?
      • Re:Synopsis. (Score:5, Informative)

        by caluml ( 551744 ) <slashdot@spamgoe ... minus herbivore> on Tuesday November 26, 2002 @12:11PM (#4759280) Homepage
        Contrary to popular belief, it isn't impossible to run IIS and not get hacked.
        We ran about 30 of them, and if you are clever about it, you can do all kinds of things to keep the bugs out.

        Step 1. Remove all mappings apart from asp.dll
        Step 2. Keep web content on a different drive to the system (thus negating ../../../cmd.exe stuff)
        Step 3. Disable, and never use the default website.

        With those 3 things, you don't get affected by about 60% of the bugs.

        Add things like making all the static content read-only, and allowing only a certain secured, firewalled server to update the DBs, and you're almost there. Disallow any net connections originated by the webservers (with exceptions, of course) and you rule out strange shells making connections to IRC servers, etc.
        The only other thing is then to STAY PATCHED.

        Having said what I've said, I wouldn't like to do it again. Keeping those things secure took up so much of my time. Should it be a full time job to keep webservers running securely?
        rpm --freshen -vah apache*.rpm anyone?
        Now I have lots more time to do more interesting things ;)
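The checklist above is IIS-specific, but two of its ideas -- read-only static content and staying patched -- carry straight over to a Unix web server. A rough sketch (the document root here is a temporary stand-in, not a real path):

```shell
# Stand-in document root; in real life this would be e.g. /var/www/html.
DOCROOT=$(mktemp -d)
echo '<html>hello</html>' > "$DOCROOT/index.html"
chmod 644 "$DOCROOT/index.html"

# Strip all write bits so a compromised server process can't deface pages.
chmod -R a-w "$DOCROOT"
ls -l "$DOCROOT/index.html"    # permissions now read -r--r--r--

# And stay patched, e.g.: rpm --freshen -vh apache*.rpm
```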
      • Re:Synopsis. (Score:2, Insightful)

        by grub ( 11606 )

        Honestly, if IIS was as insecure as Slashdot likes to think it is, wouldn't the Microsoft site have been hacked more often?

        Microsoft's site isn't the problem. The problem is the hundreds of thousands of sites run by people that don't have unlimited budgets like Microsoft, nor the knowledge of the internals, nor the tweaked OSs. MS tried tweaking the hell out of their Hotmail backend to no avail.

        MS works OK on the desktop, heck I have one Windows machine at home for gaming but that's it. Robustness and security are not on the MS game plan, point-&-clickability and profit are.
  • Big Books (Score:4, Insightful)

    by Gary Franczyk ( 7387 ) on Tuesday November 26, 2002 @11:44AM (#4759059)
    Yes, big books are kind of a trend in the computer-book realm. Where else in the bookstore would you find monstrous tomes like the ones you see on Visual Basic or Java?

    It has a lot to do with how much they charge for these books. You want $50 for a book that costs $3 to print? (I can hear the anti-RIAA trolls now :-) Much like the huge portions that restaurants now serve, people want to feel that they are getting some value for the obviously inflated prices. Even if they were only going to eat a small amount, they are happy to see a huge plate of food for their $12.

    If you were to get a normal sized entree (or book) for the amount of money you are spending, you would feel that something is wrong, that you are getting ripped off somehow. Big books sell more than small books. Even if the content is the same. They will make the typeface larger and include tons of screenshots if that is what it takes to make a massive volume.

    • Lots of the true tech books are cheap (Fowler on UML, for instance, is $35 [amazon.com] and is brilliant).

      I'd say as a rule that, like fine food, the best books are small, compact and just tell you what you need to know.

      The big books are the crap restaurants of the world: lots of food... not much quality.
  • Morris worm (Score:3, Informative)

    by Hulver ( 5850 ) on Tuesday November 26, 2002 @11:45AM (#4759068) Homepage
    Well, here was me thinking that the Morris worm was in 1988, not 1998.
  • by Ektanoor ( 9949 )
    Reading this review, I would prefer to call this book "Canned Internet Security." Maybe the book is good in its generic considerations about security. But I'm pretty sure that one cannot talk about Windows, Linux, the Internet and secure programming and compress all of it into some 400 pages. Well, talking about Windows ACLs alone costs an expert nearly the same number of pages, you know? And that was a book for average to advanced experts. Knowing the subtleties of both Windows and Linux, I am pretty sure that 1000 pages would not be enough to cover 10% of the problems. So ~400 pages?

    Besides, the fact that one chapter is about history leads me to think that this is quite a general thing. However, for people who want an introduction to security, considering the level of the authors, this book may be good stuff.
    • I'd agree with you, but the reviewer makes some mention of this. He says

      the authors take a somewhat different approach, one that is less connected to software versions and that will endure in time

      While a bit melodramatic, the principles of security are apparently the subject of the book, not the specific methods. It's quite possible to discuss what an ACL is from a system-independent standpoint, without getting into the nitty-gritty of how Windows/Linux/OS of choice implements it.

      That's what the other 1400 some page books are for. Specifics.

  • at the end of the review. If you decide to buy this book consider using the /. clickthrough link. It generates revenue for the /. crew, and is a convenient way to shop. This is a new feature outside of the OSDN advertising. So support Rob and the crew directly and click on that link [bfast.com].
  • Make the book big and mostly useless, fill it with so-called "useful information," and sell it at market price. - How to make decent money

    This does 2 things:

    A: It keeps good job security: once an idiot (person with low intellect, for my purposes) sees the 50+ books on the shelf, each over 1k pages, he'll decide that it's a bad idea to go for an IT job. Or better, he shells out $500 for a class and finds out he's expected to read 2000 pages of text, plus do a lab manual, plus tests. Makes more money for those of us who can endure it, or actually figure out how the stuff works.

    B: Provides an invaluable resource for reference. You're in a bind and can't remember a procedure? Dive into that book and you can find what you need to know.

    For example, I have 3 A+ cert books on my shelf. Each of them explains nada to me about the important stuff: how the processor works, how the buses work, basic electronics and binary, etc. I don't know how most of the new stuff works because I never learned how the old stuff works in detail.

    It's also partially dependence. If you get your cert from one line of books, you'll buy the new editions to keep up to date, ensuring the next book will sell well. However, anyone with half a brain knows that you read reviews about the new procedures with the new tech before it comes out. If you don't, you are most likely an idiot technician, or the guy who broke the SIMM slot on my grandma's motherboard, called a better tech over to save your ass, and taped the SIMM into the slot with masking tape... then stole the DIP RAM out of it...

    ....

    I feel the need to go down to best buy and open a can of whup-ass on their "techs"...
  • ...was 1988. I must be old if I remember that. Now someone hand me my walker and Metamucil.

    Path
    • Oh phew. I was sitting here thinking, "Morris Worm, wasn't that bad and famous but really old? I don't recall anything about it personally though, which would be weird if it happened in 98". Its good to be young :)
  • by jb_nizet ( 98713 ) on Tuesday November 26, 2002 @12:04PM (#4759216)
    I've been involved as a developer and/or security consultant in several internet projects, and I've noticed that the main problem, most of the time, is not really technical. If you really want a secure web site, you can have one relatively easily. Of course it might not be absolutely secure, but it's very hard to break by the average hacker.
    The main problem is that security is often developed at the end of the project, or completely forgotten, because it doesn't add any functionality to the application.
    At the beginning of the project, a prototype is developed (without security, because the goal is to show the application functionality), then a first version (still without security, because you're in a hurry), then the whole thing is developed and someone sometimes starts thinking about security.
    Since the application hasn't been designed at the very beginning with security aspects in mind, you end up adding hacks and workarounds to the application to make it a bit more secure, but it's sometimes very hard because it might break the functional spec or make the application look different than in the demo.
    In the end, you often wind up with a solution that relies on security through obscurity: since there is no link to this administration page from the welcome page, users won't find it! BAHAHAHAHA!

    JB.
  • Password generation (Score:5, Interesting)

    by ceswiedler ( 165311 ) <chris@swiedler.org> on Tuesday November 26, 2002 @12:04PM (#4759219)
    My favorite way to generate passwords is to alternate random consonants and vowels. The results are just about as 'secure' as any other randomly generated password (i.e., knowing the pattern won't help very much) and much easier to remember. Social engineering is always the easiest way to break passwords, and people often write down difficult-to-remember passwords.

    Some examples:
    gymolifi
    tosenima
    qopanela

    Because of the alternating pattern, the results are almost always pronounceable, which makes the passwords significantly easier to remember. Digits or symbols can be added to satisfy password requirements and increase security.
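    This alternating scheme fits in a few lines of Python. A minimal sketch (the exact consonant and vowel pools are my assumption; the poster doesn't specify his):

```python
import secrets  # cryptographically strong random choices

CONSONANTS = "bcdfghjklmnpqrstvwz"
VOWELS = "aeiou"

def pronounceable_password(length=8):
    """Alternate randomly chosen consonants and vowels."""
    pools = (CONSONANTS, VOWELS)
    return "".join(secrets.choice(pools[i % 2]) for i in range(length))

print(pronounceable_password())  # e.g. "tosenima"
```

    Appending a digit or symbol afterward handles sites that require them, as the comment suggests.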
    • Ahem, what's your IP address....? ;)

      Of course, capitalise one of the letters and fling in a little dash of punctuation somewhere, and it'll be x^n times harder to guess...
        Alternating 21 consonants and 7 vowels (aeiou + jy) and beginning randomly with either a consonant or a vowel, in upper- or lowercase, you get 1,867,795,524 combinations - that's quite a lot!

        You really want to discover one of those passwords by brute-force guessing it?

        For comparison: using all alphanumeric characters (A-Z, a-z, 0-9), we obtain 218,340,105,584,896 combinations - wow!
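        Those figures check out; assuming 8-character passwords, one reading of the arithmetic that reproduces both numbers is:

```python
# 8 characters alternating 21 consonants and 7 vowels:
# x2 for starting with either class, x2 for the first letter's case.
alternating = 2 * 2 * (21 * 7) ** 4
print(alternating)    # 1867795524

# For comparison: 8 characters drawn freely from A-Z, a-z, 0-9.
alphanumeric = 62 ** 8
print(alphanumeric)   # 218340105584896
```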

        ms
        --

    • by SirSlud ( 67381 )
      I usually select passwords that are patterns on my keyboard (non-trivial patterns).

      People seem to select lexical patterns (words with numbers replacing some of the letters... 1 instead of l, etc). And anybody trying out programmatic attacks on passwords is likely not going to consider the pattern-space of the keyboard (instead favouring dates, names, words, and variations thereof) as a possible source of passwords.

      And I don't have to remember the password proper; only the non-trivial pattern it makes on my qwerty. Results are usually with letters, numbers, and symbols, but its very easy to remember the visual footprint of these passwords.
    • I like to take the initials of a sentence. For example,
      "I like to take the initials of a sentence" -> "iltttioas"

      You can do things like alternate case and add symbols before/after.
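      The initials trick is trivial to script; a sketch in Python (lowercasing each initial, to match the example above):

```python
def initials_password(sentence):
    """First letter of each word, lowercased."""
    return "".join(word[0].lower() for word in sentence.split())

print(initials_password("I like to take the initials of a sentence"))
# iltttioas
```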
    • Another method which works well for generating 'random' passwords is to use Phrases, for example:

      I'd like to fly, Shipwrecked and Comatose, Drinking fresh Mango Juice

      Would yield the password: Il2fSaCdfMJ

      Yes, I am a Red Dwarf Fan, and No, I have never used this as an actual password!!

      The phrase is more easily remembered than the password, and yields a password which is relatively hard to break by brute force. Of course, phrase based passwords can be weak to Social Engineering, but that's another issue entirely!
    • --I've wondered why passwords have to be entered as text. Like, is there a way to use an audio/video clip instead? Perhaps keep it stored on removable media and input it when needed by a direct cable connection, or by inserting a USB device or a PCMCIA card, etc. Is this being done?
    • Lately, I have been puzzling about the security of my web passwords. As a result, I wrote a little perl script [umich.edu] that derives passwords from a strong master password. Something like this:

      $ key.pl
      Password:
      Website: slashdot.org
      Password for slashdot.org:
      llynUngiltBerneLobal

      It's fairly useful on a day-to-day basis.
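      The linked perl script isn't shown, but the idea is easy to sketch: derive each per-site password deterministically from a master secret. Here is one way in Python using HMAC-SHA256 (the derivation and output format are my assumptions, not the script's):

```python
import hashlib
import hmac

def site_password(master, site, length=20):
    """Derive a per-site password from a master password and site name."""
    digest = hmac.new(master.encode(), site.encode(), hashlib.sha256)
    return digest.hexdigest()[:length]

# Same master + site always yields the same password;
# different sites yield unrelated passwords.
print(site_password("my master passphrase", "slashdot.org"))
```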

  • by customiser ( 150740 ) on Tuesday November 26, 2002 @12:05PM (#4759230) Homepage
    >We all want 'big books'

    No we don't. Personally, since reading Kernighan & Ritchie, I've been convinced that good books have to be slim.

    (another precondition is that it doesn't have a "Learn X in x minutes/hours/days" title)
    • (another precondition is that it doesn't have a "Learn X in x minutes/hours/days" title)

      Usually true, but not necessarily. Sams Publishing's "Learn Java 2 in 21 Days" is a wonderful book.
    • by Tet ( 2721 )
      Personally, since reading Kernighan & Ritchie, I've been convinced that good books have to be slim.

      I couldn't agree more. In fact, K&R themselves said it best in the preface to the second edition:

      C is not a big language, and it is not well served by a big book.
  • by warpSpeed ( 67927 ) <slashdot@fredcom.com> on Tuesday November 26, 2002 @12:10PM (#4759272) Homepage Journal
    Rating: 8

    Overall, this is an excellent book, two thumbs up!

    This seems kind of odd. Or is it just "thumb" inflation? Two thumbs up does not mean what it used to...

  • by goonies ( 227194 ) on Tuesday November 26, 2002 @12:19PM (#4759340)
    Did anyone notice that there is a newer book [amazon.com] available on amazon.com [amazon.com] than the book mentioned [bfast.com] in the text above? The publisher is now Addison Wesley Professional and it is also a little bit cheaper. It has the same number of pages and seems to be the same edition.
  • Personally, I am sick of big books. I am starting to suffer from manualitis. I'd rather get a book that is short, comprehensive, and readable than a tome that covers ALL the scenarios.
  • by Anonymous Coward
    Tons of papers on site security
    www.cgisecurity.com/lib [cgisecurity.com]
  • Windows NT and 2000 - authentication, access tokens, security identifiers, object access control lists, tightening Windows users rights, etc.

    Ok, I could have ignored this review until I got to that part.

    Windows 2000 sounds like a complete, powerful system in theory. To salespeople and even project managers, Windows 2000 is packed with all of the buzzwords they want to hear. "Security Reference Monitor", "Policies", etc.

    Of course, anyone with practical, in-the-trenches experience knows that it's an impossible system to keep secure. It is big, it is bloated, it is closed source. It discourages deep understanding and is too complicated and restrictive to allow you to strip down to the parts you only need.

    The best reference manual you can find on Windows 2000 internals is "Inside Windows 2000", which is at best a pedestrian overview of the system. This is not the fault of the authors, it is the fault of the system.

    It is simply pathetic compared to The Design & Implementation of the 4.4BSD Operating System, or Linux Kernel Source Commentary, or UNIX System Internals, et al. You cannot download the source to Windows 2000 and evaluate it for yourself. You cannot compile out support for everything but the core essentials. You cannot identify and understand every process running on the system. You do not have the option of replacing them with specialized alternatives.

    Do not confuse this with Linux being unbreakable. That is not what I'm saying. I'm saying that in the hands of a competent person, one can achieve a far greater degree of security with Linux than they ever can with Windows. To some admins, this is not a big deal. To me, Windows is simply unacceptable.

    The authors appear to be blinded to technical realities by buzzword compliance. Not too surprising, given their background.

    Many professionals are content with Windows, but anyone who is passionate about technology finds it reprehensible. It comes down to who you'd rather deal with: would you rather hire a brain surgeon who is passionate about his craft, or one who simply treats it as a job?

    • Please explain... (Score:3, Insightful)

      by gillbates ( 106458 )

      why the parent post got modded down as a troll. I don't find anything particularly incendiary about what the poster said. Unlike a lot of posters here on Slashdot, he actually took the time to think of something unique to say rather than repeating the same tired old arguments.

      You may not agree with what he said, but I think he made his points rather well. If you disagree with what the poster said, why not post a reply?

      • Most likely it was a politically motivated attack by someone who likes Windows.

        I bought "Inside Windows 2000" a year and a half ago, and it's a good introduction for someone completely unfamiliar with the system, but is sorely lacking for critical details. It also is very politically-correct, in terms of advocating the Microsoft way.

        The icing on the cake was the shrink-wrap EULA on the enclosed CD. After having spent a lot of time working with Linux, I found it positively insulting.

  • Having experienced the Morris Worm first-hand, I'd like to point out that it did not occur in 1998, but in 1988. (Specifically, Nov. 2, 1988.) And, although the worm could only infect certain hardware/OS platforms, it had the effect of knocking out many others. This was the result of a bug in the worm's code -- it would repeatedly infect vulnerable systems until so many instances were running on one box that it brought the system to its knees. As a side effect, its attempts to infect neighboring systems caused a major load on those systems, too. So, in some respects, it was also the first [D]DoS attack.

    See: http://www.wikipedia.org/wiki/Morris+Worm [wikipedia.org]

  • by dwheeler ( 321049 ) on Tuesday November 26, 2002 @02:20PM (#4760410) Homepage Journal
    If you're interested in writing secure applications for Linux/Unix systems, take a look at my free book, Secure Programming for Linux and Unix HOWTO, available at http://www.dwheeler.com/secure-programs [dwheeler.com].
  • Looks like a typo on the date of the Morris Worm. The Morris Worm was November 1988, not 1998. Here's a link [mit.edu] for reference.
  • by Anonymous Coward
    Just use a Mac. Not one exploit ever for Mac web servers, according to the huge BugTraq database.

    You cannot use the Unix-based Mac OS X; it has already had over 30 remote exploit weaknesses that had to be patched.

    I am talking about using a commercial webserver program on Mac OS 8.x or 9.x.

    9.2.2 is the latest Mac OS, yet NO Mac OS from 8.x through 9.2.2 has ever had one remote weakness that did not involve specific user interaction to cause a problem; the defaults are very secure.

    In fact, so secure that not one Mac server has ever been compromised, though at one point a 3rd-party add-on CGI tool was found to be buggy and could cause a problem, back in 1995 I believe. I forget its name.

    I do not care for a reply to this. Therefore, by definition, it is not a troll. I am sick of Linux fans moderating "0" posts to -1 because they cannot handle the fact that NOT ONE REMOTE EXPLOIT FOR MAC OS exists and has ever been published or used.

    And, yes, Macs are fast, and cached static pages are the same speed on almost any OS, and WebStar 4.0 is fully featured.

    This 1400-page book should mention that a Mac would give you 7 years of hassle-free, no-exploit web serving.

    Macs have no filename extensions, no heavy usage of C strings (null-terminated), no root account, no shell, no dangerous way to pass return addresses, and they avoid Intel chips, making hacking a little more difficult; also, Macs require executables to have a second, invisible file associated with them, called a "resource fork," in order to run, and network tools typically will not create these resource forks. There are countless reasons Macs have not been exploited remotely.

  • by Anonymous Coward
    I find it both sad and amusing that people try to publish books about topics without first addressing the fact that there are more secure platforms for web serving. Most of these short-sighted, me-too security bandwagons concentrate on the porous Unix/Linux offerings, or MS weaknesses.

    It is a concrete fact that no MacOS-based webserver has ever been hacked into in the history of the internet.

    The MacOS running WebStar and other webservers has never been exploited or defaced, and is unbreakable based on historical evidence.

    In fact in the entire SecurityFocus (BugTraq) database history there has never been a Mac exploited over the internet remotely.

    That is why the US Army gave up on MS IIS and got a Mac for a web server.

    I am not talking about the FreeBSD-derived MacOS X (which has already had more than 30 exploits and potential exploits); I am talking about current Mac OS 9.x and earlier.

    Why is it hack-proof? These reasons:

    1> No command shell. No shell means no way to hook or intercept the flow of control with many various shell oriented tricks found in Unix or NT. Apple uses an object model for process to process communication that is heavily typed and "pipe-less"

    2> No root user. All Mac developers know their code is always running as root. Nothing is higher (except undocumented microkernel stuff where you pass Gary Davidian's birthday into certain registers and make a special call). By always being root, there is no false sense of security, and programming is done carefully.

    3> Pascal strings. ANSI C strings are the number one way people exploit Linux and Wintel boxes. The Mac historically avoids C strings in most of its OS; in fact, even its ROMs originally used Pascal strings. As you know, Pascal strings (length-prefixed) are faster than C strings (because they carry the length up front and do not have to hunt endlessly for NULL), but the side effect is fewer buffer exploits. Individual 3rd-party products may use C strings and bind to ANSI libraries, but many do not. In case you are not aware, a "Pascal string" has no null byte terminator.

    4> Macs running Webstar have ability to only run CGI placed in correct directory location and correctly file "typed" (not mere file name extension). File types on Macs are not easily settable by users, especially remotely. Apache as you know has had many problems in earlier years preventing wayward execution.

    5> Macs never run code merely based on how a file is named. ".exe" suffixes mean nothing! The file type is 4 characters of user-invisible attributes (along with many other invisible attributes), and these 4 bytes cannot be set by most tool-oriented utilities that work with data files. File-copy utilities preserve launchable file types, but JPEG-, MPEG-, HTML-, and TXT-oriented tools are physically incapable, by design, of creating an executable file; the file type is never set to executable for the hacker's needs. In fact, it's even more secure than that: a Mac cannot run a program unless it has TWO files. The second is an invisible file associated with the data-fork file, called a resource fork. EVERY Mac program has a resource fork containing launch information, and it needs to be present. Typically, JPEG, HTML, MPEG, TXT, ZIP, C, etc. are merely data files and lack resource forks, and even if they had them, they would lack launch information. Best of all, Mac web programs and server tools do not usually create files with resource forks. TOTAL security.

    4> Stack return address positioned in a safer location than on some Intel OSes. Buffer exploits take advantage of sloppy programmers' lack of string-length checking and clobber the return address to run the exploit code instead. Mac compilers usually place the return address in front of, or out of context of, where the buffer would overrun. Much safer.

    7> There are fewer Macs, though there are huge cash prizes for cracking into a MacOS-based WebStar server (typically over $10,000 US). Fewer Macs means less hacker interest, but there are MILLIONS of Macs sold, and some of the most skilled programmers are well versed in systems-level Mac engineering and know of the cash prizes, so it's a moot point; but perhaps Macs are never cracked because there appear to be fewer of them. (Many Macs pretend they are Unix and give false headers to requests to keep up the illusion: ftp, http, finger, etc.) Some huge high-performance sites use load-balancing WebStar. Regardless, no Mac has ever been rooted in the history of the internet, except with a strange 3rd-party tool in 1995.

    8> MacOS source not available traditionally, except within apple, similar to Microsoft source only available to its summer interns and engineers, source is rare to MacOS. This makes it hard to look for programming mistakes, but I feel the restricted source access is not the main reasons the MacOS has never been remotely broken into and exploited.

    Sure a fool can install freeware and shareware server tools and unsecure 3rd party addon tools for e-commerce, but a mac (MacOS 9) running WebStar is the most secure web server possible and webstar offers many services as is.

    One 3rd-party tool created the only known exploit backdoor in Mac history, and that was back in 1995; it is not, nor was it, a widely used tool. I do not even know its name. From 1995 to 2002, not one Macintosh web server on the internet has been broken into or defaced, EVER. Other than that event with a rogue 3rd-party CGI tool ages ago in 1995, no Mac web server has ever been rooted, defaced, owned, scanned, exploited, etc. The few mistaken defacements recently attributed to Mac OS are actually Mac OS X (Unix) events.

    I think it's quite amusing that there are over 200 or 300 known remote exploit vulnerabilities in RedHat over the years and not one MacOS 9.x-or-older remote exploit. There were even vulnerabilities a month ago in OpenBSD! Each month, vulnerabilities in XP arise.

    Not one remote exploit. And that includes Webstar and other web servers on the Mac.

    A rare set of documentation, tutorials, and exercises on rewriting all Linux buffer exploits from Intel to PowerPC was published less than a year ago. The priceless hacker tutorials were by a Linux fanatic, Christopher A. Shepherd, 3036 Foxhill Circle #102, Apopka, FL 32703, and he wrote the tutorials in a context against BSD-Mach Mac OS X, but all of his Unix methods will find little to exploit on a traditional MacOS server.

    BTW this is NOT an ad for WebStar... the recent versions of WebStar sold over the last year are insecure and cannot run on Mac OS 9.x or 8.x; they only run on the repeatedly exploited MacOS X.

    --- too bad the linux community is so stubborn that they refuse to understand that the Mac has always been the most secure OS for servers.

    BugTraq concurs! As does the WWW consortium. So you do not need a book to teach you how to pathetically try to secure a website; just use a Mac, as many colleges and large media sites do, and most commercial airlines for their in-house security.

    • Wow, nice rant! I don't know about your item-numbering system though - what happened to point 6? Must be all that "allergy medication" ;-)

      Anyhoo, I'm sure what you say has some merit. If all you care about is security, and you've got the budget and available manpower of the US military.

      From perspectives other than security, an OS9 web-serving solution strikes me as expensive, not terribly scalable, and - from my admittedly limited experience of OS8 and 9 - possibly not all that stable either.

      <cheap shot>Anyway, if it's so secure, how come you're whining that there's no section in the book for it?</cheap shot>

      Ah well, maybe there just wasn't anything the authors could say that could possibly improve the security of the mighty Mac...
  • I take issue with the statement in the post:

    We all want 'big books'

    My experience has overwhelmingly been that big computer books are a poor investment. I guess big books sell better, which is why one sees so many in the bookstores, but I have learned to skip over them and look for the little books.

    Why?

    Big computer books are often filled with fluff, such as large illustrations, fancy typography or sidebars that don't add much meat.

    What is really worse is that it takes a lot more work for a writer to explain a complex point concisely. I imagine it also takes more work for the editors to cut material that doesn't really add value to the book. My experience is that big computer books are often poorly written.

    Also, I don't have enough time to read all the technical books. It simply takes less time to read a compact, well-written book than a big, verbose tome. There's also the problem of fitting a bunch of big computer books on my limited bookshelf space.

    I have observed that there seem to be an awful lot of big books for windows & web programming, and Java programming. The situation seems quite a bit better for C++ programming, with such slim books as Scott Meyers' "Effective C++".

    There are long books which are worthwhile, but the ones that are both big and good are covering a topic that is very broad and detailed, for example Foley, Van Dam, Feiner and Hughes' "Computer Graphics".

    My wife wanted to learn to use cascading stylesheets for her web design. The bookstore had two books, a small one by O'Reilly and a much larger one that seemed to cover the same material but had lots of fancy typography and illustrations and verbose text. In the end she chose the O'Reilly book, and after using it to design a website for some friends of ours [divinemaggees.com] she said she was very glad to have gotten the O'Reilly book.

    Finally, I'd like to suggest that before you purchase a new technical book, check to see if there is a review of it at the Association of C and C++ Users [accu.org] book reviews section [accu.org].
