Unix Hints and Hacks
author | Kirk Waingrow
pages | 479
publisher | Que
rating | 6.5/10
reviewer | Doc Technical
ISBN |
summary | This is a somewhat flawed book that contains some useful unix information.
According to the introduction, the book's examples were tested with five versions of unix, including Linux 2.x. I'm a programmer by profession, but our group maintains 70+ linux servers and workstations, and we spread the sysadmin work around. So I'm always looking for hints on how to keep the machines humming.
The book is divided into ten chapters, four appendices and a glossary. Of the ten chapters, the first eight are on the "hard" unix subjects like File Management, Networking, and Security, while the last two chapters focus on "soft" subjects: "Users" and "System Administration: The Occupation".
The book is on firmer ground with the former than it is with the latter.
Hints and Hacks
The first seven chapters feature a collection of hints/hacks with one or more examples for each. A typical example is Section 6.2, "Copy Files Remotely". The author describes the problem in a brief paragraph ("...there are times when migrating files over from one host to another is necessary"). This is followed by four examples showing different techniques (using rcp, tar/rsh, nfs mount points, and ftp). Finally, the section ends with three brief subsections: "Reasons", "Real World Experiences", and "Other Resources".

According to the back cover, the book is aimed at the intermediate to advanced reader (although the introduction rather contradicts this by claiming that it will be useful to beginners as well). I consider myself comfortably within the intermediate to advanced range, and I honestly did find some useful tips in this book. Most subjects are covered by several examples showing alternative approaches, and this is consistent with the unix "toolkit" philosophy. There's always more than one way to flay a feline.
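To give a flavour of what those alternatives amount to, here is roughly what the four approaches in 6.2 boil down to (the hostnames and paths are mine, not the book's):

rcp -r data remotehost:/dest                            # 1. rcp, copying the whole tree
tar cf - data | rsh remotehost 'cd /dest && tar xf -'   # 2. tar piped through rsh
cp -r data /mnt/remotehost/dest                         # 3. plain cp onto an nfs mount point
ftp remotehost                                          # 4. interactive ftp, then mput/mget

These days most of us would reach for scp or rsync first, but the "several tools, one job" pattern is the same.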
While aimed at intermediate and advanced users, I suspect most of that audience will be familiar with one or more solutions to most of the problems presented in the book. Where the reader will occasionally benefit is in discovering a third or fourth alternative to a problem.
Some of the example problems had me shaking my head. In a section called "Remove the --- Dashes ---", the author presents the legitimate problem of removing a file that begins with a dash. If you don't believe this can be a problem, run these commands:
touch /tmp/-wow; cd /tmp; rm -wow
The obvious solution to this problem is to prepend the directory path to the file name so that the argument no longer starts with a dash:
rm /tmp/-wow
The author provides this solution, along with three alternatives, but the last one raised more questions for me than it answered. The author's last example suggests "rm -r directory". Remove the entire directory. In the author's defence, he does provide numerous warnings in this example, and says it "should be used only as a last resort." But he never did explain why this method would be needed. Under what circumstances would the other methods not work?
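For the record, the standard idioms (which may or may not be among the book's other alternatives) are these:

rm ./-wow     # a relative path works just as well as an absolute one
rm -- -wow    # "--" tells rm to stop looking for options

Either one sidesteps option parsing entirely, which makes it hard to imagine when the rm -r escape hatch would ever be needed.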
There are some factual problems in the book, as in the appendix on "Basic Scripting Concepts" where a section titled "Recursive Scripts" has nothing to do with recursion.
In the "Real World Experience" subsection of the section "Redirecting Output to Null", the author recounts how he once crashed a database application by deleting its log files with rm, and how he quickly learned to redirect /dev/null to the log files to zero them out ("cat /dev/null > logfile"). But he doesn't take the extra step of explaining why rm'ing an open log file might cause an application to crash. Understanding why something happened in the first place is as important as knowing how to fix it.
What's Left Out
This book looks at solutions to specific problems, but never really attempts to teach what I'd call the hacking mentality.

In a section called "Find the Disk Hog", Waingrow shows how to use the du, sort, and head commands to find the top ten disk users like this:
du -s * | sort -rn | head -10
He explains how these commands work, and he uses similar command-line examples throughout the book. This is great. Stringing together existing unix commands using pipes and filters is one of the biggest strengths of the operating system. But he never covers these meta-topics, except for a brief mention of pipes in the introduction.
Readers at a beginning level would benefit from an overview of pipes and filters, as well as such essential programs as cut, grep, egrep, head, tail, sort, wc, and tr. Many of these programs are used in examples, and some are explained in context, but a "concepts" chapter would have broadened the book's usefulness.
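Even a single worked one-liner would have gone a long way toward that. Something like this (my example, not the book's) chains cut, sort, and uniq to show which login shells are in use and how often:

cut -d: -f7 /etc/passwd | sort | uniq -c | sort -rn

Each stage does one small job and hands its output to the next, which is the whole pipes-and-filters idea in one line.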
Perhaps the author assumes that everyone already knows these things, but I suspect many unix and linux administrators have never been explicitly taught them.
The Soft Chapters
The last two chapters of the book, "Users" and "System Administration: The Occupation", are aimed at professional sysadmins. These were, for me, the least useful, but I only perform sysadmin work as an adjunct to my programming.

There are sections on topics like "The Six Types of Users", "Public Relations", and "Preparing for an Interview". These are soft subjects not related to unix per se, but they may be useful to administrators.
Some of the advice in these last two chapters will be of most benefit to those just starting out in the field. Much of the advice seemed (to me) to be common sense. In the section "Being Interviewed", Waingrow suggests "Don't ever cut the interviewer off in the middle of a sentence", and "Don't lie", and "Don't belittle, debate, disagree, or tell interviewers that they are wrong". Geez, that really screws up my interviewing strategy.
Structural Problems
For me, the book suffered from several irritating structural and organizational problems. The first eight chapters are each divided into roughly a dozen sections. For example, chapter 5, "Account Management", is divided into 16 sections, starting with 5.1, "User Account Names", and running through section 5.16, "Nulling the Root Password Without vi". This is fine. But then each section is broken down again into subsections. How many? Exactly one. It may seem like a minor aesthetic point, but it makes the pages harder to scan visually, and it makes the Table of Contents much harder to read.

Another annoying point is that the book is typeset so that there is less space between the period (.) and the start of a new sentence than there is between individual words. Sentences sometimes seem to crash into one another. Again, a minor point, not the author's fault, but these things add up.
And I have more fundamental problems with the way the book is organized. Each chapter section tackles a single subject, and many of these sections make their intent quite clear: "Dealing with Unwanted Daemons", or "Monitoring at Boot Time". But other section headings are more ambiguous: "Clear and Lock" and "File Collecting" might have been better named as "Clearing and Locking the Screen" and "Finding Suspicious Files".
As a benchmark, I looked at the organization of the recent "Perl Cookbook" by Christiansen and Torkington. This book is a similar collection of brief solutions to specific problems. The Perl Cookbook avoids many of the problems that harm "Unix Hints and Hacks". Section headings are descriptive and unambiguous, and they use a consistent verb tense. The layout is clean and readable. The publishers could take some lessons from O'Reilly.
Editing
In places, a little editing would have helped the prose. Here's an example from page 144:

"When a program is sent a QUIT signal, it writes out what was in memory at the time the signal was sent to disk."
I'm pretty sure the signal was not sent to disk, but this misplaced modifier could easily have been recast:
"When a program is sent a QUIT signal, it writes to disk what was in memory at the time the signal was sent."
Writers are going to make mistakes like this; it's the editor's job to find them. This editor too often didn't.
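As an aside, what that sentence describes is an ordinary core dump. A quick way to see it for yourself, assuming core dumps are enabled and end up in the working directory as a file named core:

ulimit -c unlimited   # allow core files in this shell
sleep 600 &           # a throwaway process to experiment on
kill -QUIT $!         # SIGQUIT: the process terminates and dumps its memory...
ls core*              # ...to a core file on disk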
Bottom Line
The weaknesses of this book get in the way of its strengths. Tighter writing, tighter editing, and a tighter focus would help this book enormously. A focus on the hard-core unix topics presented in the first seven chapters, with some additional material on the unix toolkit mentality, could be the basis for a good second edition of this book.
I would recommend skimming through this book at your local bookstore. If, after a few minutes, you find more than two or three examples that are useful, you'll likely benefit from this book.
If so, pick it up at Amazon.
an alternative SA book (Score:1)
I highly recommend it.
books versus experience (Score:1)
Re:Stupid UNIX Tricks (Score:1)
Books (Score:1)
What's needed is a machine on which (at first, all alone) you can try things. The more you try, the more you learn. If it takes two hours to solve a problem, you'll remember it far better than if you had simply looked up the solution in a book. Once you think you've got the hang of it in "single-user" mode, test and experiment in multiuser mode, but not on a production machine!
Re:O'Reilly's "Unix Power Tools" Still Best (Score:1)
It certainly is a very fat book, yes. 8-)
I still prefer the original Bourne book, or Nemeth's tome on Unix System Admin.
Re:Stupid UNIX Tricks (Score:1)
If I know all the answers to those, do I win anything?
Well said. (Score:1)
Mmm ... but it's not a sysadmin book (Score:1)
The Opposite Sex in a Nutshell (Score:1)
echo>-rf (Score:1)
Reading books can be good too (Score:5)
While I agree partly with this philosophy, I think there's a market even for books such as this. As the reviewer points out, the Unix philosophy has been that there is more than one way to do a specific task. The "danger" for a new sysadmin is that you learn to do a task one way and then never consider the alternatives. This book can show you other ways to do the same thing which you might never have thought of yourself.
It's basically about learning to think in other ways, much the same as I learned to think in other ways when I learned Perl or read the GEB.
Let me show you an example. Some time ago a machine that I use was suddenly refusing logins because of an error. The admins knew what had to be done but couldn't be bothered to drive to the machine's physical location to fix it. Now, I don't know about you, but had I encountered this, I probably would have gone over to the console right away. This was also what others at the scene thought about doing, until one of them realised that a certain directory on the machine was NFS-mounted on another machine. In that directory were several files run by cron every now and then. The solution was obvious: replace one of the files with a new shell script and let cron hatch the egg. Solutions such as this have been featured in books like "The Cuckoo's Egg" by Cliff Stoll, but I never thought about it.
Why didn't I think of that, when the solution should have been obvious all along? Because I was thinking about the lossage as an administrative problem. I never considered the option of tricking the computer into healing itself. Books and experience in a well balanced combination can make your mind more open to solutions that might not seem very obvious at first.
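The trick itself is only a few lines. A sketch, with made-up paths and a made-up repair step, since the post doesn't give the real ones:

# on a healthy machine that NFS-mounts the broken host's cron directory
cd /net/brokenhost/usr/local/cronjobs
cp cleanup.sh cleanup.sh.orig            # keep the real job safe
cat > cleanup.sh <<'EOF'
#!/bin/sh
# one-shot repair, executed by the broken host's own cron at its next pass
/usr/local/bin/fix_logins                # stands in for whatever the real fix was
mv /usr/local/cronjobs/cleanup.sh.orig /usr/local/cronjobs/cleanup.sh
EOF

Because "cat >" rewrites the existing file, cleanup.sh keeps its execute bit, and cron on the broken host does the rest.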
Stupid UNIX Tricks (Score:3)
Why would a df -k /var differ in size used from a du -k /var?
When would you find the Solaris pstop command useful?
When would I want to use FDDI instead of 100BaseT?
What is mutex contention!!?!
What are the most useful snoop or tcpdump parameters?
Why, when you rm a large file, don't you gain any free space?
How to stripe for success!
How does swapping differ from paging?
Where did 10% of my brand new filesystem disappear to?
Then there are the generic topics...
How to identify the major system bottlenecks.
How to tune a cache.
How to manage users.
And then there are a few vendor-specific oddities that would be good. Like that HP (forgot the model, DOH!) where, for every four physical processors in the machine, you only get the power of one. But the upside is that it is redundant and somewhat auto-fault detecting, and you can live-yank out components without warning the OS. That and the Sun E10000, which can transform from eight individual systems into a mighty 64-way machine. MEKK-KA-SPARKY-SO-LAR-IS-A!
Forgive my Solaris-centric self. I need to pick up a few more OSs.
Books are ESSENTIAL to Survival! (Score:1)
For simple tasks on a non-critical (home) machine, spending hours to learn one command may be acceptable. But in a business environment, taking that kind of action when numerous manuals are available is irresponsible! It makes people question your level of competence, and it encourages the faulty thinking that UNIX is too complicated (compared to Windows NT).
The "UNIX Way of Thinking" offers a variety of tools and encourages you to use the right tool(s) for the job. I think of a good UNIX manual as simply another tool to add to my toolkit. It's another weapon in my admin arsenal. To deny the value of books is to potentially waste several productive hours! If I follow your suggestions, I will also remember "in a much better way" how I was fired as the secretary simply picked up an O'Reilly manual and solved the problem!
O'Reilly's "Unix Power Tools" Still Best (Score:2)
Couple this book with O'Reilly's Essential System Administration and you pretty much cannot go wrong.
Re:Stupid UNIX Tricks (Score:1)
I'm personally waiting for 'The Opposite Sex in a Nutshell' series.