Mastering Algorithms with C

The return of Doc Technical is marked by his review of the upcoming O'Reilly book Mastering Algorithms with C. Click below if you wish to learn more about this Kyle Loudon book.
Mastering Algorithms with C, 1st Ed., August 1999
author Kyle Loudon
pages 560
publisher O'Reilly and Associates
rating 6.5/10
reviewer Doc Technical
summary A flawed but wide-ranging book for intermediate to expert programmers.
[Reviewer's Disclaimer: This review is based on a pre-publication copy of the book, so some changes may have occurred in the published copy. Most of my comments relate to structure and content, and I expect they'll remain valid.]

There are many books on programming algorithms, and quite a few of them use C for their examples, but even so, I was looking forward to this new book from O'Reilly.

Most of the existing algorithm books were designed as college textbooks. While they have their place, there is certainly room for a book that approaches the subject matter with the informal, clear writing style and practical focus that is O'Reilly's trademark. Unfortunately, for me at least, Mastering Algorithms with C was not that book.

My Problem with Algorithms

One of O'Reilly's strengths has always been their attention to the structure of their books. O'Reilly books are usually well designed, with logical and reader-friendly layouts. But Mastering Algorithms with C has a layout that I found somewhat vexing.

The chapter on linked lists serves to illustrate. An introductory section describes three kinds of lists, then briefly describes several applications for them (mailing lists, memory management, scrolled lists in GUIs, etc.). This is followed by a brief section describing the basic concepts of linked lists.

From there the chapter moves directly to an implementation of linked lists, starting with a section called "Interface for Linked Lists". This section enumerates each function in a linked list library. For each function we are given its function prototype, followed by its return value, a brief description of the function's purpose, and finally, its algorithmic complexity in O-notation.

The next section is called "Implementation and Analysis of Linked Lists". After a paragraph describing some data structures, we are presented with a listing of the C header file, list.h, followed by a lengthier description of each of the functions described in the previous section. Finally, we are presented with a code listing of list.c.

And this is where I started having a problem.

First, I feel the author plunges much too quickly into implementation before providing a proper context. Let's assume that the reader knows nothing about linked lists (that's why, after all, they bought the book). Before describing a specific implementation, shouldn't the reader be clued in on the basic operations (independent of the implementation language) performed on linked lists: adding a node, deleting a node, etc.? While a little of this is given early on, the author seems in a rush to move on to the implementation.

And once the implementation is described, the information is spread across (at least) three locations. For example, the function list_ins_next is first introduced in the "Interface" section, its prototype is shown in the list.h file, a more detailed description is provided in the "Implementation" section, and finally the actual code is presented in the list.c code listing. This forces the reader to jump around quite a bit to get all the information about each function.

How Much Should the Reader Know?

How much should the writer assume you know about linked lists (or any of the other algorithms described in this book)? If the reader already understands the basic concepts of linked lists, and if they're already a C programmer, they most likely wouldn't need this book. So let's assume the reader is starting at ground zero (or its general vicinity) when it comes to linked lists.

If that's the case, then I would expect the writer to present a relatively "bare bones" implementation, at least for basic algorithms like lists.

Instead, what is presented is an almost object-oriented implementation with init and destroy functions, function pointers, and other details that may make for a clean implementation but may be an impediment to the author's primary duty: teaching the reader the algorithm.
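
A genuinely bare-bones treatment could get the core idea across in a few lines of C. Something like the following (my own illustrative sketch, not code from the book):

    #include <stdlib.h>

    /* A node in a singly linked list: some data plus a pointer to the next node. */
    struct node {
        void        *data;
        struct node *next;
    };

    /* Insert a new node after 'prev', or at the head of the list if 'prev' is
     * NULL.  Returns the new node, or NULL if allocation fails.  O(1). */
    struct node *insert_after(struct node **head, struct node *prev, void *data)
    {
        struct node *n = malloc(sizeof *n);

        if (n == NULL)
            return NULL;
        n->data = data;
        if (prev == NULL) {            /* new head */
            n->next = *head;
            *head = n;
        } else {                       /* splice in after an existing node */
            n->next = prev->next;
            prev->next = n;
        }
        return n;
    }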

For much the same reasons, I found many of the examples out of sync with what was being taught. While virtual memory paging schemes are a lot of fun, they're a hell of a thing to spring on some poor sucker trying to grasp the concept of linked lists.

Finally, I had some problems with the author's prose style, though this may be more a matter of my preconceptions about how an O'Reilly book should read. Luckily, you can (and should) make your own decision: O'Reilly has Chapter 13, Numerical Methods, available at their web site.

But Wait

While I have problems with this book, I don't mean to imply the book is bad. It's a fair book; I just don't think it's a great book. It covers a lot of territory. For me, its main fault is that it fails to explain concepts clearly and simply enough before diving into the code.

On the plus side, I found the figures that illustrate concepts throughout the book to be clear and helpful. The selection of which algorithms to cover is also quite good, as these are all areas that a wide variety of programmers will face at some point in their work.

If you already have a moderate understanding of the algorithms, and want a text that presents a clear and documented implementation of them, then you may want to consider this book.

Pick this book up at Amazon.

Table of Contents:

Preface

  1. Preliminaries
    1. Introduction
    2. Pointer Manipulation
    3. Recursion
    4. Analysis of Algorithms
  2. Data Structures
    1. Linked Lists
    2. Stacks and Queues
    3. Sets
    4. Hash Tables
    5. Trees
    6. Heaps and Priority Queues
    7. Graphs
  3. Algorithms
    1. Sorting and Searching
    2. Numerical Methods
    3. Data Compression
    4. Data Encryption
    5. Graph Algorithms
    6. Geometric Algorithms
This discussion has been archived. No new comments can be posted.

  • Support is another revenue engine that I won't deal with now...

    Money for nothing...

The problem is that the PC is still scary. We need to combat the M$'$ easy-way-out machine.

    I still think we need a computer manual.
  • Check out "Advanced C Programming by Example" [amazon.com]. Very good book about C in general, covers some of the most obscure functions in string.h ...
As the subject says. The STL is a component of the C++ library standard. The standard specifies what is in the headers: the syntax and semantics of the objects (as well as intended asymptotic running times, to somewhat constrain the implementation).

    Several open source implementations of the STL are floating around; the best known one is perhaps the one from SGI. The one used by Microsoft was done by P. J. Plauger.
  • Earlier this year I reviewed a book entitled `Advanced C Programming by Example' on Slashdot. It's a concise little book with a number of useful examples. It's also the only book that provides usable examples of realloc() that I've come across.

    To read a variable length string, give this a whirl:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    char *get_line(FILE *fp)
    {
        char *line;

        if ((line = malloc(BUFSIZ)) == NULL)
            exit(1);

        if (fgets(line, BUFSIZ, fp) == NULL) {
            free(line);                /* don't leak the buffer on EOF or error */
            if (feof(fp))
                return NULL;
            exit(1);
        }

        /* Keep growing the buffer until the line ends in a newline. */
        while (line[strlen(line) - 1] != '\n') {
            char get_buf[BUFSIZ];

            if (fgets(get_buf, BUFSIZ, fp) == NULL)
                break;                 /* EOF before a newline: keep what we have */
            if ((line = realloc(line, strlen(line) + BUFSIZ)) == NULL)
                exit(1);
            strcpy(line + strlen(line), get_buf);
        }

        /* Strip the trailing newline, if there is one. */
        if (line[strlen(line) - 1] == '\n')
            line[strlen(line) - 1] = '\0';

        return line;
    }

    Bear in mind that a malicious user could slurp up large amounts of memory with this function ...

    Chris Wareham
  • Try programming without thinking of semantics. I suppose you can if you use a "graphical" development tool - but I don't believe such things are popular in this forum. It seems "semantics" are much more popular - we don't say "my computer" here; we say "my xyz o/s running on abc hardware."
  • >>I still think we need a computer manual.

    We need computer operator's licenses. (I'm semi-joking, and semi-serious)

    Something similar in difficulty to the A+ certification. People need to have a basic understanding of computers before they begin to get in too deep.

    I've seen people who know ZILCH about computers who drop 3-4 thousand dollars on a machine that they don't understand.

    I'm not saying that everyone has to be an electrical engineer, but come on, you should at least know that...

    1. The mouse is for your hand.
    2. The mouse is NOT a foot pedal.
    3. Everything must be properly plugged in and turned ON to work.
    4. It's not the salesman's/store's/manufacturer's fault if you don't understand something.
    5. You WILL need to spend money again on your computer, no matter how much you paid for it.

    There needs to be a minimum level of understanding of the way the machines and the business work. Manuals, tests, online documentation, I don't care how people get it, as long as they get it.

    LK
  • How about "complete disaster". Or "no apparent understanding of the limitations of finite arithmetic". Or "misleading in the extreme". Yes, the algebra is correct (or seemed to be, at a quick glance). But the implementations are pretty bad.

    Example: the convergence requirement for the Newton's method implementation (the delta argument) is treated as an absolute value, rather than relative to the value of the function. This means that the function will often fail to converge at all, or will return roots that are completely wrong. This is a beginner's mistake, typically covered about 15 minutes or two paragraphs after the initial description of the algorithm.
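
    For what it's worth, a relative test costs only a couple of extra lines. Roughly like this (my own sketch of one common form of a relative criterion, not the book's code):

    #include <math.h>

    /* Newton's method with a relative convergence test: stop when the step
     * is small compared with the current estimate, not small in absolute
     * terms.  f is the function, df its derivative, x the initial guess. */
    int newton(double (*f)(double), double (*df)(double),
               double *x, double delta, int max_iter)
    {
        int i;

        for (i = 0; i < max_iter; i++) {
            double d = df(*x);
            double step;

            if (d == 0.0)
                return -1;                  /* derivative vanished; give up */
            step = f(*x) / d;
            *x -= step;
            if (fabs(step) <= delta * (fabs(*x) + delta))
                return 0;                   /* converged */
        }
        return -1;                          /* did not converge */
    }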

    No checks for overflow or underflow. None of the obvious re-ordering of operations to avoid those faults. No mention of these issues.

    You could argue "well, these are just introductions", but there is no mention that these are pretty much unusable for any real work. Numerical work on computers is a tricky business, and presenting these algorithms and implementations in this form is doing a disservice to anybody interested in the field.

    Rule of Thumb: The straightforward, obvious translation of the algebraic representation of an algorithm is probably wrong, and certainly suspect.
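
    The textbook illustration of that rule is sqrt(a*a + b*b): the obvious form overflows for perfectly representable results, while a trivial reordering does not (my own sketch; in practice you'd reach for the library's hypot()):

    #include <math.h>

    /* Obvious translation of sqrt(a^2 + b^2): a*a overflows when a is large,
     * even though the final result would be representable. */
    double dist_naive(double a, double b)
    {
        return sqrt(a * a + b * b);
    }

    /* Reordered: factor out the larger magnitude so the squared term is a
     * ratio <= 1.  No spurious overflow; underflow of r*r is harmless. */
    double dist_safe(double a, double b)
    {
        double big = fabs(a), small = fabs(b), r;

        if (big < small) {
            r = big;
            big = small;
            small = r;
        }
        if (big == 0.0)
            return 0.0;
        r = small / big;
        return big * sqrt(1.0 + r * r);
    }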

    It's unfortunate that ORA chose that chapter to post, because it may well be that the rest of the algorithms and implementations are reasonably good.
  • Well, what about Practical Algorithms for Programmers, by Andrew Binstock and John Rex?
    I have found this book to be an invaluable tool for my job as a C programmer. While it is geared more towards an experienced C programmer, it does provide a good explanation of how the algorithms work and does offer a good discussion of what they are suited for.
    And it offers full implementations of all the algorithms discussed. Quite useful when you need a B-tree at 3am and all you're thinking is "mmmm, sleep".
  • Sedgewick's books (Algorithms in C, Pascal, C++ -- the same book translated to different languages) are very good.

    Slightly off-topic, but anyone interested in bread-and-butter algorithms should definitely check out the STL (Standard Template Library, part of the C++ standard library) for a brilliant, original and incredibly useful algorithms library.

    ---------------------------------------
  • by stanis ( 7549 ) on Tuesday September 14, 1999 @03:33AM (#1683430) Homepage
    This book review was a waste of time for me. It told me nothing about the contents of the book and how relevant they might be to me. The author spends most of his time complaining about structure and little time explaining the material covered.

    Are there any insightful ideas or algorithms discussed in this book? What topics are covered and how well? Is this book simply a rehashing of a college algorithms class, or does it discuss issues of implementation complexity and efficiency? Is the process of algorithm construction addressed by the book?

    I picked up The Practice of Programming a while ago, and have found it to be an interesting treatise on the major problems and solutions in programming. If this book is anything like it I might consider reading it, but this review gave me little insight.

    --Tom

  • by Anonymous Coward
    I've browsed and read many C books but all of them fall short when demonstrating how to make practical use of string manipulation functions in C. They usually take the lazy approach by showing string manipulation examples on fixed-length buffers, which is a very bad habit for real world programmers.

    So which C books out there are not afraid to demonstrate some real practical, safe, and efficient string parsing, reading, and writing in C? For example, how to safely parse variable length fields from a text file?

    Personally I like "Pointers on C" by Kenneth Reek (Addison Wesley, 1998) ... very detailed explanation of pointer-based programming.

  • My complaint is not with the fact that the language changes. I probably should have left out the line you quoted from my original post. My mistake.

    My beef with the book is that it uses a horrible coding style that makes the examples hard to understand. That and the fact that many examples are just plain wrong (and not just in the corner cases, either).

    I didn't particularly care for the writing style, as I found the explanations to be unnecessarily confusing. But that's really a matter of personal preference. The Cormen book, I think, is a much better introduction (for me, anyway).

    --

  • I've got the 'C' version. Some of the examples do have minor errors in them, as arrays in C are 0-based and the original in Pascal was 1-based. But it's not a book about 'C' - it's about algorithms. As such, it's very good and approachable (more so than Knuth, IMHO.)

    Nowhere is a 1 (one) used as a variable - they're all lowercase L. It's unfortunate that the typeface for the examples makes it hard to distinguish l from 1. The (in)correctness of single-letter variable names, I'll let someone else argue, but it allows the examples to take up far less space (and not span lines), so I would maintain it's mostly irrelevant.

    There are many other books that are better at data structures, as Sedgewick barely devotes 10 pages to them. For algorithms, in my estimation very few are as good and possibly only 1 better. (Knuth, of course.)
  • For C/C++ books, I'm afraid Addison-Wesley kicks O'Reilly's ass every time. That's okay, though... O'Reilly has great books in their catalog, and I'm happy to purchase from either.

    So for C/C++, be sure to get "The C Programming Language (Second edition)" and "The C++ Programming Language (Third edition)."

    For algorithms, I'd recommend Cormen et al "Introduction to Algorithms". It's *not* an intro, it's a 1000+ page behemoth textbook, very thorough. We used it in university, covering half the material. We also had more specialised texts on numerical methods, etc. Most of these books are available on Dr. Dobb's CDROMs, check their web site.

    Also, I have the Knuths and have read some of them. While that's clearly beyond the average programmer, I have to say I find his wit, style, and attention to detail inspiring.

    Of course, if you just want to *use* algorithms, you use the C++ standard template library, or whatever. I've heard Numerical Recipes has some problems and bugs.
  • The "wolf" is wonderful. It's the first book of algorithms I've actually used- at least for more than a paperweight.

    I look forward to reading the review on that one.

    -MVK
  • Nowhere is a 1 (one) used as a variable - they're all lowercase L.

    Which illustrates my point beautifully. I did not write "one," I wrote "ell." This is exactly why the use of such variable names is terrible practice. Any example code using it is essentially useless. It's much more confusing than it ought to be.

    It worries me when books on algorithms and data structures can't present clear code (pseudo or otherwise).

    --

  • I agree and also I feel your response should be "up-moderated."
  • You'd think that Numerical Recipes would be an ideal argument for giving away code and selling the documentation. That's the way it worked with the first edition. But someone must have convinced the authors to make the second edition more restrictive, to make money directly from the code. The subsection "License Information" in the section "Legal Matters" explains: "We want to make it easy for you to use the programs in this book on your computer. But to do so legally, you will need a license." I found the details fascinating. The authors list six different ways to get the code:

    1. Buy the book and you can make one copy of each program for personal and non-commercial use.
    2. You can use the code on one machine for each diskette purchased directly from Cambridge University Press.
    3. Commercial licenses are negotiated, with discounts for universities.
    4. Licenses can be requested for incorporation into products, with some restrictions, at nominal cost. (Like ObjectSpace, they sell the privilege of looking at the code, not compiling it.)
    5. An instructor can mail $5 per student per license, if the software is used in a course.
    6. A one-time license can be obtained for $50 for use on a single workstation at a not-for-profit organization.

    Obsessively wrongheaded, isn't it?
  • by Aaron M. Renn ( 539 ) <arenn@urbanophile.com> on Tuesday September 14, 1999 @01:23AM (#1683445) Homepage
    My recommendation is Algorithms in C [amazon.com] by Robert Sedgewick. It's one of the classic texts of the field. I think it's a good text for a motivated beginner to learn algorithms and some data structures. It's not as watered down as a lot of intro texts. But it's not as overwhelming as Knuth or something like that. Plus the examples are written in C, if not particularly readable C. (Be careful, there is another edition that does not use C. Buyer beware.)

  • Just to let you know, that run-on link was correct in my preview and in my HTML, which I verified after I saw what got posted. Very curious.

  • O'Reilly just can't make enough great books on programming. This book is a must-read for intermediate programmers who wish to learn more about programming wicked fast and clean algorithms. In college, out of all the C and C++ programming classes I have taken, no professor ever went that in depth on the art of the algorithm.
  • Amen!

    This is a very well-written text with clear and detailed explanations. The pseudocode takes a bit of getting used to, but I like the fact that the algorithms are presented in a language-neutral fashion. This is the book that helped me understand algorithms, O notation and the tradeoffs between data structures.

    Plus, it's a big book and covers pretty much everything your average hacker would ever want, and more.

    --

  • by vyesue ( 76216 ) on Tuesday September 14, 1999 @01:27AM (#1683449)
    I haven't read the book being reviewed here, but if anyone is interested in a very exhaustive introductory algorithms book, I'd recommend Introduction to Algorithms from the MIT Electrical Engineering and Computer Science series. Very, very intelligent treatment of any algorithm-related subject you would want to read about.

    here it is at Amazon [amazon.com]
  • From the contents, it seems like a basic book on data-structures and algorithms.

    To get into more depth, it is best to learn from the masters: Knuth obviously, and Tarjan's book on data structures. For algorithms, probably Kozen's book is the best. These books are, of course, at a higher level compared to Sedgewick etc ...
  • Hmmmm ... looking at the chapter, I'd say that it's a bit lightweight. You don't need to be a maths purist to think that it's a bit poor to teach people to do least-squares estimation without calculating the standard errors as they go. Or at least without even mentioning that there are such things as standard errors.
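
    Reporting a standard error alongside the fit is only a handful of extra lines; something like this (my own sketch, not code from the chapter):

    #include <math.h>
    #include <stddef.h>

    /* Fit y = a + b*x by least squares and also report the standard error of
     * the slope.  Assumes n >= 3 and that the x values are not all equal. */
    void fit_line(const double *x, const double *y, size_t n,
                  double *a, double *b, double *se_b)
    {
        double xbar = 0.0, ybar = 0.0, sxx = 0.0, sxy = 0.0, sse = 0.0;
        size_t i;

        for (i = 0; i < n; i++) {
            xbar += x[i];
            ybar += y[i];
        }
        xbar /= n;
        ybar /= n;

        for (i = 0; i < n; i++) {
            sxx += (x[i] - xbar) * (x[i] - xbar);
            sxy += (x[i] - xbar) * (y[i] - ybar);
        }
        *b = sxy / sxx;                 /* slope */
        *a = ybar - *b * xbar;          /* intercept */

        for (i = 0; i < n; i++) {
            double r = y[i] - (*a + *b * x[i]);
            sse += r * r;
        }
        /* s^2 = SSE/(n-2);  SE(slope) = sqrt(s^2 / Sxx) */
        *se_b = sqrt(sse / (double)(n - 2) / sxx);
    }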

    And polynomial interpolation is only useful when you have a very good idea of what the data's going to give you -- it can go very badly wrong otherwise. For a small step up in complexity, they could have covered cubic splines, which are much more generally useful.

    Numerical Recipes, now there was a book ...

    jsm
  • by Anonymous Coward
    Unfortunately, Algorithms in C by Robert Sedgewick is probably still the best algorithms book out there. (Unfortunate because it's extremely academic and difficult to understand without the proper background or sheer will to learn.)

    The ironic thing is that most computer algorithms are not very difficult to understand at all, but since they are described in such difficult language, many people who are learning them eventually give up and complain "computer programming is just too difficult for me".

    It's a shame, but I guess it does keep the number of programmers down, and thus allows me to charge more money for my skills... :)
  • If you're looking for an algorithm book, have a look at:

    Algorithms and Data Structures in C++
    Author: Ammeraal, Leendert
    Publisher: Wiley
    ISBN: 0471963550
    Price: £29.95
    I've got the C version of this and it was worth looking at.

    Algorithms in C++
    Author: Sedgewick, Robert
    Publisher: Addison-Wesley
    ISBN: 0201510596
    Price: £32.99
    I've got the older version of this and it was worth looking at.

    You can get other people's opinions from ACCU's book reviews:
    http://www.accu.org/bookreviews/public/index.htm
  • by David Greene ( 463 ) on Tuesday September 14, 1999 @01:41AM (#1683454)
    Ugh! You're joking, right?

    Sedgewick basically takes a generic template and substitutes the language of the day, producing Algorithms in [Language].

    The examples in this book are horrible. It is the only textbook I've seen that uses "l" (the letter) as a variable. Many of the examples are just plain broken.

    To its credit the book does cover a wide range of topics, but I've read much better texts on algorithms and data structures.

    --

  • by Anonymous Coward
    My understanding is that a Linked List is a data structure (like an array). An abstract data type is more like a specific customized data structure and algorithm based on what you intend to accomplish. From Sedgewick: "A data structure is not a passive object: We also must consider the operations to be performed on it (and the algorithms used for these operations). This concept is formalized in the notion of a data type."
  • by jks ( 269 ) on Tuesday September 14, 1999 @04:01AM (#1683456) Homepage
    There are several problems with Numerical Recipes. Don't take my word for it; see what the people at NASA's Jet Propulsion Lab have to say: http://math.jpl.nasa.gov/nr/ [nasa.gov] They also list several alternatives.
  • I would second the previous recommendations of two books.

    Introduction to Algorithms by Cormen, Leiserson, and Rivest, MIT Press

    Numerical Recipes in (various languages) by Press et al., Cambridge University Press

    The main difficulty with the Cormen textbook is that it is too analytical for a lot of people. You really do need the math in the first part of the book to understand the rest of it (so it's your fault if you skip those chapters). Also, I hate red-black trees; AVL trees are simpler to understand, simpler to implement, and perform about as well. Otherwise, very nice pseudocode and analysis.

    The Knuth books are classic, but the choice of MIX for programming is dreadful.

    For numerical algorithms, the Numerical Recipes books are superb. I wish the style of analysis in the Cormen book was more like this.

  • After reading this review I'd like to clarify one thing. A linked list is not an algorithm, it is an abstract data type. The algorithm part is the description of how operations on the list are carried out.
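    In code, the split looks something like this (a made-up sketch, not from any of the books mentioned): the list type and the operations promised on it are the abstract data type; how an operation is carried out is the algorithm, and there can be more than one.

    #include <stddef.h>

    /* "The number of elements in the list" is part of the abstract data
     * type's contract; *how* we answer it is the algorithm. */
    struct node {
        void        *data;
        struct node *next;
    };

    /* Algorithm 1: walk the chain and count -- O(n), no extra bookkeeping. */
    size_t list_size_by_walking(const struct node *head)
    {
        size_t n = 0;

        while (head != NULL) {
            n++;
            head = head->next;
        }
        return n;
    }

    /* Algorithm 2: keep a counter in a list header that every insert and
     * delete updates -- O(1) queries at the cost of extra bookkeeping. */
    struct list {
        struct node *head;
        size_t       size;
    };

    size_t list_size_by_counter(const struct list *list)
    {
        return list->size;
    }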
    For those interested in the subject I'd recommend "Data Structures & Their Algorithms" by Harry R. Lewis and Larry Denenberg. It may not be very well suited for the novice programmer as the content is rather mathematical and abstract but it discusses algorithmic complexity to some degree. It also contains implementations for important algorithms in pseudo-code.
    Just my 0.02 sek
  • But really the definitive book on algorithms (not data structures!) is that horribly huge expensive white book with the spiral on the cover. While we were studying it I found it really horrible, but I've since learned that it's an excellent reference to have handy.

    Plus you get a good upper-body workout thrown in for free.
  • I third that. (someone already seconded it ;)

    Cormen's book lacks some not so often found data structures (skew heaps, tries, leftist heaps, and splay trees for instance). It has a fairly good treatment of miscellaneous well-known algorithms (FFT, RSA, string matching, ...)

    I'm about to get my M.Sc. in Computer Science and I haven't read TAOCP (it's a deep personal dishonor I have to live with, until I have the time to go through it). The CS degrees are too shallow nowadays; knowing how to program is almost enough. To get a decent all-round knowledge of the field I've had to read the old classics in my spare time, since _none_ of them are a part of the curriculum. The degrees are geared towards providing IT companies with employees, and IMO that is not what CS is all about. Ever notice the 'science' part in the name of the discipline? That's why I'm studying for a Ph.D. in CS. Anyway, I wonder how TAOCP compares to this book.

    AC
  • And indeed, here it is at Amazon too:

    Numerical Recipes in C: The Art of Scientific Computing [amazon.com]

    Amazon have a lot of hostile comments up, it seems, but ignore them. They're mainly from people who've used the example routines in mission-critical systems and are whining about the odd funny result. The guy who claims that the routines aren't robust is daft in my book -- they're examples from a textbook on numerical algorithms, not super-powered black boxes with a million special cases.

    Of course, if you're feeling cheap, pick it up at http://www.nr.com/ [nr.com]

    jsm

    (who once won a pint for managing to sneak a reference to "Tukey's biweight" into a memorandum)
  • Perhaps have it finished before we end up having to rent Microsoft C++ 2005 because we ignored our customers. The only reason the PC is threatened by NCs is because we let Microsoft scare the hell out of new users. It is really our own fault.

  • by Anonymous Coward
    What I like the most about "Introduction to Algorithms" is that it covers a lot of material that is usually omitted (or at best given a cursory overview) from algorithm books. For example, the section about NP-completeness was by far the best that I've seen in nontheoretical books (actually, even many introductory computation theory books omit NP-completeness proofs).

    To date I've read Sedgewick, Weiss, "Introduction to...", and approximately one twelfth of Knuth. Of these I'd say that "Introduction" is the best general purpose reference on algorithms. Knuth is certainly impressive and useful, but it is a little too heavy to be a good introduction to algorithms.
  • Sedgewick basically takes a generic template and substitutes the language of the day, producing Algorithms in [Language].

    Sedgewick's book is about algorithms, not languages. The book is valuable because of the explanations and diagrams, and because the code is presented in very clear, short snippets. I have the Pascal version of his book, and it makes no difference to me, as someone who programs in C and C++ professionally, that the code samples are given in Pascal.

    A lot of people apparently don't want a book unless it is specifically focused on their favorite language, so Sedgewick has to keep releasing new versions of his book as language fashions change (Pascal was the teaching language of choice when the original edition was published). Shortly we'll probably see "Algorithms in Java" for just that reason.

  • >>The only reason the PC is threatened by NCs is because we let Microsoft scare the hell out of new users. It is really our own fault.

    The NC is just Larry Ellison's pipe dream for the foreseeable future. Until we all have broadband internet access the NC will only have a place in corporate environments, and then it'll just be for the bean counters and pencil pushers.

    The NC is not a viable solution for anyone who is in any way a part of a creative process. If they tried to force artists and programmers to use NCs, production, efficiency, and therefore profits would go down.

    The NC is for Joe Schmuckateli who just needs to be able to get e-mail from his boss and maybe have a look at a networked database.

    As for whose fault it is, I say that it lies with business. M$ doesn't make 34 million dollars per day because of what we as consumers buy. The biggest cash cow of the M$ empire is and has always been the corporate market. When XYZ Consulting needs new computers for the Miami branch and gets 400 new Win9X boxen, M$ makes about 400 x $70. Then they need 400 new copies of Office 2k. Then they need 400 client licenses for Win NT Server.

    Since I don't have access to the privileged knowledge of how much M$ makes on each copy of Office, I'm going to make a few assumptions.

    Let's just say that they make $100 in profit for each copy of Office. $70 for each copy of Win9x. Finally $40 for each WinNT client license.

    That's $40,000 in profit for Office alone.
    That's $28,000 for Win 9x
    That's $16,000 for Client licenses

    $84,000 in profit for M$ on one sale. Support is another revenue engine that I won't deal with now...

    They'd have to move 1200 "Presarios" out of Best Buy for them to get that kind of return.

    In the late 80s and early 90s the consumer market was crap. You didn't have the "Computer Superstores" on every block moving 2 dozen computers per hour. The money that put M$ on top came from the business community. It really is NOT our own fault.

    LK
  • They've also prevented translations to other languages. Someone wanted to translate the code into Modula-3, keeping equivalent function signatures; he was told that the authors considered the function signatures to be a copyrighted interface, and would not permit the translation. (This argument seems bogus; if it was correct, AT&T/Novell/whoever could sue the authors of Linux and *BSD for providing similar interfaces to their product.)
  • I also recommend Introduction to Algorithms by Cormen, Leiserson, and Rivest. If you don't have the CS background for this book, Addison Wesley published a book called "Practical Algorithms" that contains a lot of code; I think the author's name is Binstock.

    TedC

  • The opinions on Sedgewick are similar to those for Walter Rudin's Principles of Mathematical Analysis. Namely, you get a bi-modal distribution of people who love it and hate it. I agree that the code could be better, but I found the difficulty level and subject area breadth ideal for a motivated beginning student. (BTW: Sedgewick is far more suitable as a beginning text than Rudin, for those who've encountered that book).

  • My understanding is this is all a bunch of silly semantic quibbling.
