Linux and the Unix Philosophy

limbo_14 writes "Mike Gancarz takes his oft-quoted original book, The Unix Philosophy, and spruces it up for the Brave New World of Linux with Linux and the Unix Philosophy. Since The Unix Philosophy was written, Unix has undergone many changes and evolutions. Now with Linux emerging as the new face of Unix, he has revisited the same philosophy and tenets, updating the book to include considerations for the Open Source community and the new world of operating systems in which we live." Even the old version of The Unix Philosophy is worth finding; it may remind you of Neal Stephenson's In the Beginning Was the Command Line. Read on for the rest of limbo_14's review.
Linux and the Unix Philosophy
author: Mike Gancarz
pages: 200
publisher: Butterworth-Heinemann
rating: Recommended
reviewer: limbo_14
ISBN: 1555582737
summary: An updated and expanded version of Gancarz's original book, The Unix Philosophy.

The good stuff...

I greatly enjoyed Mike Gancarz's first book, The Unix Philosophy, when I was first getting into the Unix world, and was hoping for an updated version. What makes this book stand out on shelves full of How-To, Dummies, and Administrator guides is that it covers the What and Why of Unix/Linux rather than the How. I am constantly amazed at Unix books that are mostly printed man files, and things that can easily be googled. This book explains with great precision why Unix is the way it is, and what separates it from other OS paradigms.

I realized the importance of this book after reading it, and being forced to do interviews for a Unix Engineer at my office. Of the 7 candidates, 6 seemed to know the textbook stuff. They knew the commands, they knew vi and a handful of scripting languages to a degree of proficiency. Alas, that is what it takes to become a Unix Administrator, not an Engineer who needs to see the whole picture. In this world of "puppy mill" Unix admins who hold certifications and know one or two flavors of Unix/Linux, this book teaches the core of why Unix/Linux is the way it is, and why it is so attractive to those who really care about which OS they use.

The last chapter -- "Brave New (Unix) World" -- is the real kicker. Gancarz really drives it home, showing how the Unix/Linux philosophy has made its way into other aspects of technology and into the world we live in.

The not-so-good stuff ...

With every good book, there must be some bad, although this one's errors are quite forgivable. Although I appreciate any book that loosens the RFC style nature of so many technical books, sometimes it can go a little too far. This, however, is for each reader to judge. Some of the puns made me squirm, but for the most part they added a nice touch of levity to the book. So, depending on your threshold for python-esque puns or corny Elvis jokes, the book may not be for you, but knowing the /. Crowd, I don't think it will cause anything more than some groans and giggles.

All in All...

This is a quality book. It is one that should be re-read every now and then to make sure you do not stray from the Tenets that Gancarz drives home throughout the book via anecdotal evidence. This book can and should be read by anyone from a newbie hacker to a corporate CEO. It is just technical enough that one doesn't feel patronized, and it eases you in with general concepts so that it never feels like reading IETF standards. Here are the chapters, whose titles give a good overview of what the book covers:

  • The Unix Philosophy: A Cast of Thousands
  • One Small Step for Humankind
  • Rapid Prototyping for Fun and Profit
  • The Portability Priority
  • Now THAT'S Leverage!
  • The Perils of Interactive Programs
  • More Unix Philosophy: Ten Lesser Tenets
  • Making Unix Do One Thing Well
  • Unix and Other Operating System Philosophies
  • Through the Glass Darkly: Linux vs. Windows
  • A Cathedral? How Bizarre!
  • Brave New (Unix) World

Although this is not the cheapest book on the rack, it packs more of a punch than half of the books on my shelf, so I think it is worth it. It made a great read on the metro on the way to work in the morning, and I finished it well within a week. At 200 pages, and fun to read, Linux and the Unix Philosophy breezes by.


You can purchase Linux and the Unix Philosophy from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

  • Tragically, (Score:5, Funny)

    by burgburgburg ( 574866 ) <splisken06&email,com> on Tuesday August 12, 2003 @10:42AM (#6675874)
    SCO has determined that this book has violated its intellectual property. While you can buy the book, you require an additional SCO license to read the book. That'll be $699.
  • by jared_hanson ( 514797 ) on Tuesday August 12, 2003 @10:45AM (#6675918) Homepage Journal
    1. Thou shalt end your italic tags.
    • If you're going to use archaic English, at least do so properly! The archaic second-person-singular nominative pronoun is spelled thou; and it takes a corresponding singular possessive: thy (or thine before a vowel, as in this case). Not 'your'. (Half a mark for correctly using the corresponding verb form, 'shalt', though.)

      And anyway, what's the problem? All the italics seem properly localised. Has the article been edited since your post?

  • by Lord_Slepnir ( 585350 ) on Tuesday August 12, 2003 @10:48AM (#6675958) Journal
    First, you have a book about the philosophy of an operating system. Then I noticed that there was a chapter called "Rapid Prototyping for Fun and Profit". This guy is either brilliant, or burned out his brain on LSD in the 70s.

    Now that I think of it, the two things Berkeley is famous for are UNIX and LSD, and I don't think it's a coincidence.

  • by Anonymous Coward on Tuesday August 12, 2003 @10:49AM (#6675965)
    Some of the major tenets of the original UNIX philosophy were:

    - Small is beautiful.

    - Make each program do one thing well.

    Why is it then that there are people out there who spend their entire lives with UNIX/Linux, and who ignore this?

    Some of the best examples are sendmail and emacs. And no, this isn't a troll. But I just don't understand why such people just don't get it. Clearly it isn't a lack of intelligence.

    But this paradox is something which I've never been able to figure out.
    • Why is it then that there are people out there who spend their entire lives with UNIX/Linux, and who ignore this?

      Because it's old and out of date. The new ideals should be:

      - Stable is beautiful

      - Make each program do what it does completely.

      Few folks enjoy puny programs that do one thing very well--when they want to do something slightly different, they wind up using a bunch of different tools.

      Also, the advent of windowing, color, and richly formatted files has made "small simple programs" too hard to
      • by aussersterne ( 212916 ) on Tuesday August 12, 2003 @11:44AM (#6676634) Homepage
        when they want to do something slightly different, they wind up using a bunch of different tools

        Ah-ha! A light bulb has gone off over your head. That is the point of the Unix philosophy.

        Thanks to the Unix philosophy, when someone wants to do something slightly different, they won't need to install a new, slightly different application; they can select different arrangements of tools from their existing collection to solve almost any problem imaginable.

        With the "one complete tool, one complete problem" philosophy, every I have a new problem, I must acquire an entirely new tool. If I have a problem no one else has ever had before, no tool will exist yet, and I must either create it myself or pay someone to create it for me. I must then learn and/or be trained to use it.

        By attacking problems with small, specialized tools that I am very familiar with, (i.e. by splitting a problem into many smaller, more specialized sub-problems), I can use the same basic Unix toolset to solve arbitrary problems of almost arbitrary complexity.

        Almost anything can be accomplished with bash, perl, awk, and the collection of shell tools and command-line networking tools on a Linux/Unix system. And if you need some functionality that the tools can't provide, you don't need to install a 100% beginning-to-end solution complete with duplicate functionality and learning curve, you install a new small, specialized tool (i.e. gphoto2 to fetch photos from a camera) to solve the 1% of the problem that needed a new tool and use the tools you already know and have for the other 99% of the problem.
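
        To make that concrete, here is a minimal sketch of the gphoto2 case (the date-based layout and directory names are just an illustration, not anything prescribed by the book or by gphoto2): the camera-specific tool covers its 1%, and plain shell covers the rest.

        # hypothetical workflow: gphoto2 handles the camera-specific part,
        # familiar tools handle the filing
        mkdir -p ~/photos/incoming
        cd ~/photos/incoming
        gphoto2 --get-all-files           # fetch everything from the camera
        for f in *.jpg; do
            d=$(date -r "$f" +%Y-%m-%d)   # GNU date: -r prints the file's mtime
            mkdir -p ~/photos/"$d"
            mv "$f" ~/photos/"$d"/
        done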

        Efficiency and flexibility are key in Unix.

        Some people argue (as you have) that these come at the expense of ease of use, but that is only true for small (admittedly often consumer-oriented) problems. Without doubt, however, past certain threshold values of problem size and complexity, efficiency and flexibility (i.e. small and specialized teams of scriptable tools) drastically increase ease of use relative to other methods, not to mention time and expense to deployment.
        • That is the point of the Unix philosophy.

          Sort of. The "UNIX philosophy" has some very good elements in it--but it's not without its flaws.

          The biggest one, IMO, is the difficulty of getting a tool changed, and of automating multiple computer tasks into one human task.

          And if you need some functionality that the tools can't provide, you don't need to install a 100% beginning-to-end solution complete with duplicate functionality and learning curve, you install a new small, specialized tool (i.e. gphoto2 to fetch photos from a camera) to solve the 1% of the problem that needed a new tool and use the tools you already know and have for the other 99% of the problem.

          Depends on the problem, and on what your current state of learning is. (Don't toss around statistics that you don't have--we have perfectly good words for "majority" and "minority".)

          As I said, the problem is the advent of the GUI and the commonality of richly formatted documents--two major elements of modern computer usage that weren't really dealt with when the original UNIX system was created. For some modern applications, such as working with pictures, small apps added to the old apps can work. For other applications, such as 3D design, the current suite of small apps don't work.

          Some people argue (as you have) that these come at the expense of ease of use, but that is only true for small (admittedly often consumer-oriented) problems.

          A minority ("1%") of computers are used to solve problems. Most computers in service today are used to replace pen & paper, communicate, or play games. And most of these computers are used in a GUI environment, never (or almost-never) using the command line.

          I would love to see a removal of the division between "command line" and "GUI", just as I would love to see a removal of the user-distinction between "programs" and "tasks" et al.
          • A minority ("1%") of computers are used to solve problems.

            What problems are you referring to?

            Most computers in service today are used to replace pen & paper,

            Problem solved: Inefficiency and inaccessibility of information.

            communicate,

            Problems solved: In a business environment: Easy flow of information, shortening effective physical distance between parties, etc.

            or play games.

            Not sure if this is solving or creating a problem ;-)

            And most of these computers are used in a GUI environment, n
          • For some modern applications, such as working with pictures, small apps added to the old apps can work. For other applications, such as 3D design, the current suite of small apps don't work.

            This is because 3D design (e.g., CAD) is perhaps one of the most complex problem domains I can imagine. Capturing, dimensioning, tolerancing, and finishing a part in Pro/ENGINEER, for example, requires calling hundreds of different aspects of the UI into action. Pro/ENGINEER itself--in all its massive glory--is the s
        • Almost anything can be accomplished with bash, perl, awk, and the collection of shell tools and command-line networking tools on a Linux/Unix system

          What bash, perl and awk scripts did you use to post this comment with?

          • Looks like someone hasn't visited CPAN [cpan.org] in a while...
            #!/usr/bin/perl
            use HTTP::Request::Common qw(POST);
            use LWP::UserAgent;

            my $ua = LWP::UserAgent->new();
            my $req = POST 'http://books.slashdot.org', [ sid => '74463', cid => '6677408' ];   # etc...
            my $content = $ua->request($req)->as_string;
            ...and that's all without telnet to port 80!
        • Actually, virtually anything can be done using just Perl - the whole point of creating it was to have a single tool that could do what people did with awk, shell, sed, and many C programs, without the impedance mismatch problem. Shell scripting is great for whipping up a very quick solution, but you usually reach a point at which Perl/Python would be better. I once wrote a simple 4GL compiler entirely in shell and sed, so I think I went well beyond that point - my only defence is that Perl wasn't availabl
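
          A small illustration, with made-up file names: the same two everyday chores done the sed/awk way and as Perl one-liners.

          sed 's/foo/bar/g' config.in > config.out        # stream edit with sed
          awk -F: '{ print $1 }' /etc/passwd              # field extraction with awk
          perl -pe 's/foo/bar/g' config.in > config.out   # the same edit in Perl
          perl -F: -lane 'print $F[0]' /etc/passwd        # the same extraction in Perl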

        • Some people argue (as you have) that these come at the expense of ease of use, but that is only true for small (admittedly often consumer-oriented) problems. Without doubt, however, past certain threshold values of problem size and complexity, efficiency and flexibility (i.e. small and specialized teams of scriptable tools) drastically increase ease of use relative to other methods, not to mention time and expense to deployment.


          I could not have put it better myself. Have you ever really tried to learn U
      • Because it's old and out of date. The new ideals should be:

        • Stable is beautiful
        • Make each program do what it does completely.

        Few folks enjoy puny programs that do one thing very well--when they want to do something slightly different, they wind up using a bunch of different tools.

        Think of it this way. A carpenter's workshop is packed with 100s of small tools that do small and simple things. A hammer to drive nails. A saw to cut wood. A chisel to shape joints. No single tool will "build a box" or

    • Some of the best examples are sendmail and emacs.

      I'd agree about emacs, and indeed tried to convince ESR that emacs goes against the Unix philosophy for his TAOUP book, although he wasn't having any of it. However, I disagree about sendmail. Sendmail isn't huge, and it only does one thing (email routing), and it does it very well. I'm not going to argue about its ease of configuration, or its historical security problems. But in terms of doing what it's designed for, you can't fault it.

      In general, thou


      • I'd agree about emacs, and indeed tried to convince ESR that emacs goes against the Unix philosophy for his TAOUP book, although he wasn't having any of it.


        Well, he at least has a section on that discussion (here [catb.org]) but, in line with ESR's other writings, what it lacks in content it makes up for in verbosity and self-promotion.
      • But I find that sendmail is easy to configure, it has a very small (1-2 page) m4 configuration file (sendmail.mc), and it's incredibly powerful. Most people just don't understand that they will in all likelihood never have to edit the sendmail.cf file. It's only there to support exotic configurations. Buy the bat book and read a couple of chapters. After that, you can sell the book, because you will know enough to set up 99.99% of all mail servers.

        As for security issues, it seems to have fewer cert's than the Li
      • In general, though, I agree with your sentiments. One only has to look at GNOME and KDE to see how much the Unix landscape has been infiltrated by people that just don't get it.

        I don't really agree... For any UNIX system to perform well as a desktop system, it needs a powerful desktop environment. It doesn't have to be big, but that helps.

        For a lot of people, probably most of them, a really powerful desktop environment must be able to do what they want transparently. MacOS and yes, Windows, can do th
      • by crucini ( 98210 ) on Tuesday August 12, 2003 @04:15PM (#6679786)
        One only has to look at GNOME and KDE to see how much the Unix landscape has been infiltrated by people that just don't get it.

        While I understand where you're coming from, that isn't quite fair. KDE and Gnome appear to be attempts to produce a Windows-style GUI atop Linux. This doesn't mean the developers don't get it - it could mean that they get it and yet prefer a Windows-style GUI for some or all tasks. Or that they see a need for such a GUI to let "normal" people use Linux.

        Subjectively, I feel the same alienation you are expressing from KDE and Gnome.
    • There are tradeoffs to using the make each program do one thing vs. make each program do everything technique.

      I am currently working on a small web content management CGI in C, based on the "small is beautiful / make each program do one thing" paradigm. It is made of several smaller programs, each of which does one thing (one part is the CGI, and receives the request and returns the output, one part is an XSLT transform engine, one part is a caching engine that will cache the XSLT output and only rerun the XSL t
    • Well, the easiest answer is GNU is not UNIX. Perhaps that's what it means. I always thought the UNIX philosophy was keep it simple, stupid, and only use open standards and protocols, if possible.

      Make everything small, easy and well documented. But this does not mean each application is limited to 50 lines of code. It just means if you have an application it should not call on 50 libraries to pop up a window or make function calls like myFunkyFunction_GTK++--::_GNOME_blahBlahBLAH(). Again, keep it si
    • Some of the best examples are sendmail and emacs.

      <FLAMEBAIT>Emacs would be a great OS if someone would just write a decent text editor for it</FLAMEBAIT>

      Jokes aside, I agree with this point, in that it defies the "smaller is better" UNIX philosophy. I never use it, but I'm a pseudo-admin, and the last install I did of XEmacs was over 100 MB; I still remember 20 MB hard drives and single 800 KB floppy Macs. But there is an overarching "the right tool for the job" philosophy, and for certain people, the
  • by garcia ( 6573 ) * on Tuesday August 12, 2003 @10:50AM (#6675981)
    Some of the puns made me squirm, but for the most part they added a nice touch of levity to the book. So, depending on your threshold for python-esque puns or corny Elvis jokes, the book may not be for you, but knowing the /. Crowd, I don't think it will cause anything more than some groans and giggles.

    This is a quality book. It is one that should be re-read every now and then to make sure you do not stray from the Tenets that Gancarz drives home throughout the book via anecdotal evidence.

    Are these two items REQUIRED for book reviews on Slashdot? The words "anecdotal" and "puns"?

    Doesn't [slashdot.org] seem like it. :)
  • Where's the Beef? (Score:5, Informative)

    by TTop ( 160446 ) on Tuesday August 12, 2003 @10:51AM (#6675991)
    This review basically consisted of one paragraph describing the book and a table of contents. I didn't get a really good feel for what to expect from the book. Why is it like In the Beginning?

    I guess I was hoping for a little more detail about why this book is good other than "it's not man pages or RFCs."
  • EveryThing2 (Score:4, Informative)

    by vasqzr ( 619165 ) <vasqzr@ne[ ]ape.net ['tsc' in gap]> on Tuesday August 12, 2003 @10:53AM (#6676015)
    Mike Gancarz's book The UNIX Philosophy (Digital Press, 1995) describes many of the ideas and conventions that have made Unix a great system. It starts with a short rundown of the history, quickly getting to the meat of things: discussion of the major ideas of Unixdom and illustrations of why they are such good ideas. While many of the ideas may seem relatively obvious to anyone who's worked with the system before, it makes an excellent introduction to the traditions of the Unix world, as well as an excellent bit of advocacy for why the Unix way is the Right Way.

    Listed in the first chapter, the following nine points are the key tenets:

    Small is beautiful
    Make each program do one thing
    Build a prototype as soon as possible
    Choose portability over efficiency
    Store numerical data in flat ASCII files
    Use software leverage to your advantage
    Use shell scripts to increase leverage and portability
    Avoid captive user interfaces
    Make every program a filter

    ...and the ten lesser points:

    Allow the user to tailor the environment
    Make operating system kernels small and lightweight
    Use lower case and keep it short
    Save trees
    Silence is golden
    Think parallel
    The sum of the parts is greater than the whole
    Look for the 90 percent solution
    Worse is better
    Think hierarchically
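
    As a tiny sketch of the "make every program a filter" and "use shell scripts to increase leverage" tenets, here is the stock word-frequency pipeline wrapped as a script (an illustration of the idea, not an example taken from the book; the script name is made up):

    #!/bin/sh
    # wordfreq: read text on stdin, write a ranked word count on stdout.
    # The script is itself nothing but smaller filters glued together with pipes.
    tr -cs 'A-Za-z' '\n' | tr 'A-Z' 'a-z' | sort | uniq -c | sort -rn

    Because it is a filter, it composes with everything else: wordfreq < somefile | head, or man tar | wordfreq.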

    • by Jon Peterson ( 1443 ) <jonNO@SPAMsnowdrift.org> on Tuesday August 12, 2003 @11:08AM (#6676210) Homepage
      Small is beautiful
      All program names 4 chars please

      Make each program do one thing
      But provide for it to do that thing in 52 different ways.

      Build a prototype as soon as possible
      And then stop.

      Choose portability over efficiency
      Remember that you are only interested in porting to other Unix systems.

      Store numerical data in flat ASCII files
      So much for small being beautiful

      Use software leverage to your advantage
      Err, whatever.

      Use shell scripts to increase leverage and portability
      While simultaneously decreasing maintainability!

      Avoid captive user interfaces
      Preferably by not having any user interfaces.

      Make every program a filter
      Especially the 'shutdown' command.

      ...and the ten lesser points:

      Allow the user to tailor the environment
      Sure saves you having to figure out what works.

      Make operating system kernels small and lightweight
      But keep them monolithic, Linus!

      Use lower case and keep it short
      Keep those commands under 5 characters!

      Save trees
      And don't bother with manuals!

      Silence is golden
      Don't waste time with error output. Or other human beings.

      Think parallel
      Yeah, don't chain those commands together with pipes, run them all at once. Oh, hang on...

      Look for the 90 percent solution
      And then quit your job. Heh, let the next guy finish it.

      Worse is better
      0 is 1, too.

      • Make each program do one thing
        ... But provide for it to do that thing in 52 different ways.

        Or as many ways as that one thing can be done.

        Choose portability over efficiency
        ... Remember that you are only interested in porting to other Unix systems.

        Systems with available Unix interfaces, e.g. Linux, Windows/Cygwin, OS X, ...

        Store numerical data in flat ASCII files
        ... So much for small being beautiful

        Small scope/complexity.

        Use shell scripts to increase leverage and portability
        .

  • All of us old-timer Unix folks love to fuss over the worst transgression a commercial Unix or Linux distro or project makes when it comes to Unix philosophy.

    Whether it is Sun throwing things in /opt that should be in /usr or the other way around, or maybe it's a Linux distro dumping everything in /usr, or a project creating these huge programs that should be split into smaller utilities, maybe with a unifying GUI.

    We all have our complaints.

    What are yours?
  • by cperciva ( 102828 ) on Tuesday August 12, 2003 @11:00AM (#6676105) Homepage
    The Unix Philosophy can be stated in several ways:

    "Small interconnecting components"
    "Never use one program where you could use several"
    "Plumbing is good"

    It is a continual source of amazement to me that GNU tools (e.g., tar -AcdrtuxbBCfFGhijykKlLmMnNoOpPRsSTIUvVwWXZz7) are widely used despite this.
    • by AveryT ( 148004 )
      Tell me about it. I remember practically being flamed on a mailing list for proposing the use of find and grep to recursively search for a pattern in a hierarchy of files. Sure, grep -r would have done the job in that particular case, but once you add the slightest variation you have to go to find.

      I definitely feel that GNU/Linux has moved away from the "true" Unix philosophy with this kitchen sink mentality.
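
      For what it's worth, a sketch of that trade-off (the pattern and paths are invented): GNU grep's -r handles the simple case, while the composed find/xargs/grep version absorbs the "slightest variation", such as restricting the search by file type or age.

      grep -r 'init_card' drivers/                    # the built-in, one-tool way
      find drivers/ -name '*.c' -mtime -7 -print0 \
          | xargs -0 grep -l 'init_card'              # composed: only recent .c files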
    • It's quite easily explainable: GNU's Not Unix.
    • It is a continual source of amazement to me that GNU tools (e.g., tar -AcdrtuxbBCfFGhijykKlLmMnNoOpPRsSTIUvVwWXZz7) are widely used despite this.

      Agreed. However, it isn't just GNU who are guilty of violating UNIX philosophy, but also commands like rpm, from Red Hat.

      The title of this article should be "Linux and the UNIX Philosophy: where did Linux go awry?".

      I think it stems from something a wise man said once: "Why? Well, because!".

  • by GGardner ( 97375 ) on Tuesday August 12, 2003 @11:01AM (#6676118)
    I've been using Unix for 20 years, and can write some mean shell scripts. I've read all the original papers which talk about how great it is to have many small tools and little languages which you can hook up via pipes. That sounds great on paper, but I've never really seen it work out in practice.

    Sure, we've all written massive pipeline shell one-liners to do day-to-day tasks, but these are just one-time, throw-away code. All of my real Unix apps that I use every day are huge monolithic applications, not a composition of many tiny apps connected by pipes. My web browser is a monolithic app, not connected by pipes. GCC is a couple of monolithic applications, optionally connected by pipes, but never reconnected in any useful way (cpp notwithstanding). My newsreader and mailreader again, monolithic applications. My MTA, again, a monolithic application. Not one large program I use is a shell script, or collection of small, interchangeable programs.

    So, is this Unix tool philosophy useful for real applications, or just for little shell scripts?

    • by dodell ( 83471 ) <{moc.scinortetis} {ta} {lledod}> on Tuesday August 12, 2003 @11:14AM (#6676278) Homepage
      Does IPC count as a pipe? Do you run any filters on your email? How many times have you run | grep (granted that this is a one-time thing)? I have a good number of shell scripts I use (for sed scripts, for instance) that make use of pipes (not extensively, though). I'd suggest that pipes are used much more than you give them credit for here.
    • So, is this Unix tool philosophy useful for real applications, or just for little shell scripts?

      qmail and cbb come immediately to mind. Small apps, all do one thing well, communicate with others via pipes.

      • Qmail is a great example of the Unix philosophy. It is a large, powerful system built with many smaller programs which do one thing well, and communicate via pipes (or the filesystem). I'm not familiar with cbb. But Qmail is almost the exception that proves the rule -- I can't think of any other large application that is built with this Unix philosophy, almost every other is one big program with IPC done via function calls.
    • by Ed Avis ( 5917 ) <ed@membled.com> on Tuesday August 12, 2003 @11:40AM (#6676563) Homepage
      Your post sounds a lot like Miguel's 'Unix sucks' talk where he explains that Unix does not (or did not) have much reusable code and components at a level finer than whole processes - apart from a few libraries like libc and libX11 which are mostly static. Microsoft Office, sometimes the canonical Slashdot example of an ugly monolithic application, is in fact built from many small components (though for legal reasons it is hard to reuse them). The GNOME project set out to change this, although nowadays the emphasis is more on the GUI than on the component architecture.

      Part of the problem is that nobody can agree on anything. So a MIME parsing library has to be written once in Emacs Lisp for Emacs, once as a Perl module, once in Python, once as a C library (probably several C libraries), once in Java, etc. Nobody can agree to write a reusable component once to a common interface (the implementation could be in any language, as long as the interface is usable from others) and just wrap that. On the other hand some libraries, often those coming later in Unix history, do have a single shared implementation, e.g. libpng.
      • Your post sounds a lot like Miguel's 'Unix sucks' talk where he explains that Unix does not (or did not) have much reusable code and components at a level finer than whole processes

        Indeed. But this isn't an accident -- the Unix Philosophy (such as it is) is that the level of granularity of a reusable component is a process, connected via a pipe.

        Interestingly, I recently attended a lecture by a Plan 9 luminary who railed against shared libraries. His view is that there should be no shared libraries, r

        • Then I think there should be a distinction between UNIX and GNU/Linux here. Sharing of more than binary algorithms via pipes is possible when all the libraries are open and available.

          But Miguel's 'Unix sucks' talk makes me question why he'd use so many different libraries to develop applications like Evolution, if they suck so much. Isn't what sucks the dependency nightmare we've created out of GNOME? That could have been prevented if we used pipes instead of shared libraries and morons who don't under
      • Agreed. While Miguel was in some ways right, I don't think CORBA/Bonobo (or KParts for that matter) really solves the problem. It's too heavy an API. A better solution would be to provide an API to every (large/GUI) application that could be used by any scripting language to control the program. In other words, a universal scripting API to any language.

        You could script things inside or outside of the application. So you could write a script something like this:

        aw = start AbiWord
        aw.load("file1.abw")
        aw.sele
    • All those little pipeline shell one-liners are inconsequential precisely because the UNIX philosophy works. Some applications (the monolithic ones) don't fit that ideal very well, but many others (all those little one-liners you use, without thinking much about it) do.

      If text manipulation and piping didn't work well in UNIX, you'd know about it -- all those tasks would be a real thorn in your side. As it is, you have the right tools, so they're no big deal.

    • My web browser is a monolithic app, not connected by pipes.

      Your web browser probably wasn't implemented by people following the unix philosophy. The Unix GUI environment doesn't support small tools as well as the command line environment.

      GCC is a couple of monolithic applications, optionally connected by pipes, but never reconnected in any useful way (cpp
      notwithstanding).


      GCC is GNU software. GNU software does not follow the unix philosophy. GNU software follows the add a command line option for everyt
      • Your web browser probably wasn't implemented by people following the unix philosophy.

        The Unix GUI environment doesn't support small tools as well as the command line environment.

        GCC is GNU software. GNU software does not follow the unix philosophy. GNU software follows the add a command line option for everything and the kitchen sink philosophy.

        This is exactly my point. There are many web browsers out there, and not one is built from program components connected linearly via pipes. Is this because ev

        • This is exactly my point. There are many web browsers out there, and not one is built from program components connected linearly via pipes. Is this because everyone building a web browser is an idiot who just doesn't understand this magical Unix Philosophy? Or is it because this design strategy of small commands interconnected by linear pipes does not scale to the implementation of modern, large programs?

          I don't know of a GUI environment that supports such development. Web browsers use plugins, those plug
    • My MTA, again, a monolithic application.

      Then switch to qmail, an MTA that follows the UNIX philosophy. Every part of qmail is a separate program that does one task and has a well documented interface. As such, you can easily replace a single component without changing anything else.
    • That sounds great on paper, but I've never really seen it work out in practice.

      Sure, we've all written massive pipeline shell one-liners to do day-to-day tasks, but these are just one-time, throw-away code.


      You just contradicted yourself here. The fact that a quick one-liner can solve a modest problem in short order is a testament to the success of UNIX. For example, I once needed a one-time list of all calls to a particular API in a particular program (find, egrep, and sed saved the day).
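
      A throwaway pipeline along these lines (the function name is invented) would list how many lines in each source file call the API in question:

      find src -name '*.c' -print0 \
          | xargs -0 egrep -n 'frob_widget *\(' \
          | sed 's/:.*//' | sort | uniq -c    # strip ":line:match", count per file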

      My news
  • by peter303 ( 12292 ) on Tuesday August 12, 2003 @11:04AM (#6676155)
    Avoiding Stallman's pun, what exactly is UNIX? Does Linux qualify? Apple OS X+? Exotic OS's like Mach?

    In the mid-1980s an industrial/government consortium tried to define a unified UNIX API, called POSIX. Then it would be straightforward to implement the UNIX utilities, command user interfaces, and apps on top of this. I recall some companies layering POSIX on top of VMS, MVS, and other non-UNIX kernels. Are these UNIX?

    Another approach was extending the UNIX philosophy of a simple machine image to more modern computers than those in the early 1970s. Mach assumed a computer model with multiple CPUs and memory subsystems. BeOS assumed a computer model where multimedia was the norm. So are these OS's "more UNIX-like than UNIX" then?
  • by dodell ( 83471 ) <{moc.scinortetis} {ta} {lledod}> on Tuesday August 12, 2003 @11:09AM (#6676222) Homepage
    I found this review to be lacking in content. It doesn't discuss the content of the book to any extent; instead it talks about how it got him a job promotion to UNIX Engineer. How did it do this? What did you learn from the book that gave you such an additional skillset to be promoted to UNIX Engineer? What are the differences between the UNIX Administrator and the UNIX Engineer you are referring to?

    I am constantly amazed at Unix books that are mostly printed man files, and things that can easily be googled. This book explains with great precision why Unix is the way it is, and what separates it from other OS paradigms.

    I've not found any books that are mostly printed man pages. Nor have I found any circumstances where the man pages don't cover things I need to know. In any case, what parts of UNIX does it explain? Is it explaining Linux or UNIX? What OS "paradigms" are you referring to? You are going by this definition [reference.com] aren't you?

    I realized the importance of this book after reading it, and being forced to do interviews for a Unix Engineer at my office.

    What importance did you realize this book serving after you had read it? Are you sure this gave you applicable knowledge to separate "UNIX Administrators" from "UNIX Engineers"? What is the difference here?

    Although I appreciate any book that loosens the RFC style nature of so many technical books, sometimes it can go a little too far.

    Why? If it's discussing that you need to know an RFC to understand why something works the way it does (you've stated that this book talks more about the why than how), how does it make it "not-so-good"?

    So, depending on your threshold for python-esque puns or corny Elvis jokes, the book may not be for you...

    Do the few puns in the book really take that much of the quality away?

    I don't think that this book should be re-read from time to time. I think new editions should be published as UNIX and Linux continue to evolve in their own separate directions (yes, they're going in somewhat separate directions).

    Your listing of the TOC didn't give me any idea about what was covered. WTF is "Now THAT'S Leverage" about? What "Lesser Tenets" are being referred to? What "One Thing" does UNIX do well?

    You've left me with more questions about this book than I would have had otherwise. Please try to do a more thorough review next time.

    And, to get on a technicality that will probably cost me this comment as a Troll, Linux IS NOT THE NEW FACE OF UNIX. Most distributions also don't even come close to being something that would compare to a UNIX certified system.

    Finally, please excuse my harshness. I just feel you could have done a better, much more descriptive job. Don't take it personally.
  • Does Gancarz think the advent of linux and large open source projects provides anything for the UNIX philosophers to learn from, or is his book meant to be old wisdom for the new kids? It's hard to tell from the review.
    • Not sure what Gancarz says (haven't read the book) but UNIX philosophy, to a great extent, has consistently been open source. Much of the coolness in UNIX today came out of Berkeley. Termcap, vi, virtual memory, a lot of the coolness that makes up UNIX today was created at Bezerkely and distributed by tape to the world. Sun made SunOS from BSD and later released NFS to an unsuspecting public. Major changes in the philosophy now aren't so much changes that open source is new, but much easier to get becau
  • Linux and the UNIX philosophy [barnesandnoble.com]

    Thought I'd let you know ...
    B0mbtruck, one base at a time

  • by Junks Jerzey ( 54586 ) on Tuesday August 12, 2003 @11:30AM (#6676427)
    One of the core tenets of UNIX was that you have small, simple tools and you glue them together. But now the popular programming languages under Linux are C++, Python, and Perl, none of which follow this philosophy. And to get to the point where Linux is a true alternative to Windows on the desktop, you have to put a massive X server on top of the kernel, and put a massive window manager and desktop environment on top of that. In the end, "Linux" is not a simple thing (and arguably even the kernel is not simple, but the API is), because you are looking at the combination of X+Qt+KDE, and that pretty much throws all philosophy out the Window. (Yes, I know you can use Blackbox or something else instead, but then don't go arguing that it's a suitable replacement for Windows.)

    Have you ever read code from some of the original UNIX team, such as Kernighan's Software Tools? Wow, can that man write clean and clear code. The original C compiler is similarly concise. But then look at the sources of just about any Open Source project and see that (1) there's a massive amount of code, and (2) it's mostly very ugly. Unfortunately, even though it's illogical, "open source" and "simplicity" aren't as intimately tied together as one would expect.
    • And to get to the point where Linux is a true alternative to Windows on the desktop, you have to put a massive X server on top of the kernel, and put a massive window manager and desktop environment on top of that. In the end, "Linux" is not a simple thing (and arguably even the kernel is not simple, but the API is), because you are looking at the combination of X+Qt+KDE, and that pretty much throws all philosophy out the Window. (Yes, I know you can use Blackbox or something else instead, but then don't go

  • Think Unix [tux.org], by Jon Lasser, is intended for people with computer experience who are new to Unix (no "For Dummies" nonsense). I found it illuminating when I first started using Linux.

    JP

  • by bryane ( 614590 ) on Tuesday August 12, 2003 @01:43PM (#6678030)
    The text for Neal Stephenson's book "In the Beginning..." can be downloaded here [cryptonomicon.com]. Haven't read it yet, but when did that stop anyone here :-)
  • Sure, some will consider this a flame/troll.

    ...Now with Linux emerging as the new face of Unix...

    I disagree. Linux (the kernel) has been available since about 1992. (GNU much earlier, circa 1984.)

    From the few reports I see online (like Linux Counter, Red Hat, etc.), one can guess there are about 22 million active GNU/Linux installs out there. One could also guess 100 million or 50 thousand; it's a guesser's market out there. In any case, let's go with 22 million GNU/Linux installations over 10 years fo

  • ...but is Linus a eunuch?

    ??
