Systemantics

daltonlp writes with the review below of John Gall's 1977 work Systemantics: "Most of the systems described by the author are societal or economic systems (governments, corporations, universities). Computer programs are mentioned, but they aren't the primary focus. But Systemantics doesn't distinguish between types of systems. In fact, its theories and arguments seem especially applicable to computer systems." (Read more below.)
Systemantics
author: John Gall
pages: 111
publisher: Quadrangle / The New York Times Book Company (1977)
rating: Insightful +5
reviewer: Lloyd Dalton
ISBN: 0812906748

"A complex system that works is invariably found to have evolved from a simple system that works." Years ago, I saw this quote and committed it to memory. I've finally had the pleasure of reading the book it comes from. I was amazed that Systemantics was written in 1977. It's far more relevant today than it was then, because more people write more software today.

That means theories like the following are more relevant than ever:
Systems in general work poorly or not at all.

Some might question whether this is really true for computer systems built with modern technology. After all, for a computer to function, millions of microscopic parts must act in perfect synchronicity at superhuman speed.

But in reality, computers fail much more frequently than we notice. A large chunk of their innards is dedicated to failing gracefully: there's ECC (error-correcting code) in just about every piece of hardware, and without it, computer hardware would fail too often to be usable. Software is no different--it can fail sooner or later, gracefully or catastrophically, but it's going to fail. Overall, computers work poorly, but they work.
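
To make "failing gracefully" concrete, here is a minimal sketch -- my own illustration, not something from the book -- of the simplest possible error-correcting scheme, a 3x repetition code with majority voting. Real ECC in RAM and disks uses far more efficient codes (Hamming, Reed-Solomon), but the principle is the same: spend extra bits so that an inevitable bit-flip is absorbed instead of propagated.

# Toy illustration (my own, not from the book): a 3x repetition code,
# the simplest form of error correction. Real ECC hardware uses Hamming
# or Reed-Solomon codes, but the idea is identical: store redundant bits
# so a single flipped bit is corrected instead of becoming a failure.

def encode(bits):
    """Store every bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(stored):
    """Recover each bit by majority vote; one flip per triple is corrected."""
    return [1 if sum(stored[i:i + 3]) >= 2 else 0
            for i in range(0, len(stored), 3)]

data = [1, 0, 1, 1]
stored = encode(data)
stored[4] ^= 1                    # one stored bit silently flips
assert decode(stored) == data     # the failure is absorbed, unnoticed

Flip two bits in the same triple and the vote quietly goes the wrong way; error correction raises the threshold for failure, it doesn't abolish it.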

Complex systems usually operate in failure mode.

In other words, something's always broken at any point in time. The measure of a complex system's quality is how drastically a particular failure impacts the rest of the system.
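
A rough sketch of that measure in code (a hypothetical example of mine, not the author's): the same broken component, polled two ways, with very different blast radii.

# Hypothetical example (mine, not the author's): the same broken component,
# polled two ways, with very different blast radii.

def sensor_a():
    return 21.5

def sensor_b():
    raise RuntimeError("sensor offline")     # something is always broken

def read_all_fragile(sensors):
    # Tightly coupled: the first failure aborts every reading.
    return [s() for s in sensors]

def read_all_isolated(sensors):
    # The failure is contained to the component that failed.
    readings = []
    for s in sensors:
        try:
            readings.append(s())
        except Exception:
            readings.append(None)            # degraded, but still running
    return readings

print(read_all_isolated([sensor_a, sensor_b]))    # prints [21.5, None]
# read_all_fragile([sensor_a, sensor_b]) raises and yields nothing at all

Both versions contain the same failure; only the second keeps operating in failure mode.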

Loose systems last longer and work better.

Most Slashdot readers probably read the above and think either "Hallelujah!" or "Duh." But it's a small example of something I liked a lot about Systemantics. Buried under several layers of satire and pessimism is a genuine desire to help the reader avoid the mistakes of past systems designers and managers. There's more to this book than just pessimism.

What's Bad:

Systemantics suffers a little from being a quarter-century old. Several references to Watergate and a few other cultural nods may be a bit lost on anyone under 40.

But the book's only real flaw is the author's occasionally condescending tone. Every dozen pages or so, Gall takes the opportunity to criticize a real-world example. Some of these anecdotes serve as supporting evidence for an argument. Others are genuinely entertaining (the section on Job Goals and Objectives is outstanding). But the author sometimes tries too hard to be satirical, and comes across as flat or patronizing, or departs on tangents unrelated to the book's central ideas.

Summary:

Despite small imperfections, there's a wealth of real knowledge in this small volume. The author helpfully outlines the main points at the book's end (some of which I've quoted above). The book's overall message couldn't be clearer if it summarized itself -- which it nicely does:

It is hardly necessary to state that the very first principle of Systems design is a negative one: Do it without a system if you can.
Systems are seductive. They promise to do a hard job faster, better, and more easily than you could do it by yourself. But if you set up a system, you are likely to find your time and effort now being consumed in the care and feeding of the system itself.
  • New problems are created by its very presence.
  • Once set up, it won't go away, it grows and encroaches.
  • It begins to do strange and wonderful things.
  • It breaks down in ways you never thought possible.
  • It kicks back, gets in the way, and opposes its own proper function.
  • Your own perspective becomes distorted by being in the system.
  • You become anxious and push on it to make it work.
Eventually you come to believe that the misbegotten product it so grudgingly delivers is what you really wanted all the time. You are now a Systems-person.


You can find used copies of Systemantics from bn.com and other online sources, though good-condition copies fetch high prices. Slashdot welcomes readers' book reviews -- to submit a review for consideration, read the book review guidelines, then visit the submission page.

Comments:
  • It's only $27.95 from here [generalsystemantics.com].

    John.
  • by Raindance ( 680694 ) * <johnsonmxNO@SPAMgmail.com> on Wednesday December 24, 2003 @01:14PM (#7803245) Homepage Journal
    "Systemantics" is a work in the field of General Systems Theory, pioneered by Ludwig von Bertalanffy and developed by the philosopher Ervin Laszlo.

    General Systems Theory says that "invariances of organization" exist -- that certain things enable complex organization and will be found throughout organized systems -- and that we can meaningfully study systems by studying these invariances, and by drawing analogies between systems (e.g., an ant colony and a communist society). We must also look at the parts of a system in a holistic setting, i.e., examine not only the parts and their properties but also their relationships to other parts. And so on. It's good. Check out The Systems View of the World [amazon.com] if you're interested.

    Systemantics seems to be a work aimed at discovering and exploring these "invariances of organization".

    RD
  • Re:Quick review. (Score:2, Informative)

    by Liselle ( 684663 ) * <slashdot@NoSPAm.liselle.net> on Wednesday December 24, 2003 @01:16PM (#7803252) Journal
    Funny, I remember seeing this review somewhere [amazon.com] before (third review down). Is someone at Amazon going to be cheesed you stole their review?
  • by 110010001000 ( 697113 ) on Wednesday December 24, 2003 @01:23PM (#7803288) Homepage Journal
    Here is my take on the book. It is in general excellent, and it is one of those books that should have become required reading but, possibly because it is too thought-provoking, never became prominent. A great pity. It is as entertaining as Parkinson's works on his famous laws, and to me personally it has proven a good deal more valuable in practice. (Parkinson himself reviewed it and liked it!) It is a pity it is out of print. I hope that its follow-up (which I have not yet read) is as good.

    Though jocularly written, this is really valuable, stimulating material. Its aphorisms may read like jokes, but they are all the more valuable for being quotable and easy to remember in context. Thinking back on all the godawful systems that I have seen -- political, management, engineering, and computer -- there is not one that could not have been mitigated by intelligent anticipatory digestion of this book.

    Unfortunately, mentalities prominent among power-seekers, control freaks, and grandiose designers, not to mention outright dishonesty among managers with conflicts of interest, cause considerable resistance to the ideas and attitudes that Gall promotes. If you are one such, I have nothing to say to you. If, on the other hand, you enjoy a bit of thoughtful and edifying entertainment, do your best to read this book.
  • by drlock ( 210002 ) on Wednesday December 24, 2003 @02:16PM (#7803567) Homepage
    For those who want to know more about the book, I found the following list over at ERN [interbiznet.com]. (These are actually from Systemantics: The Underground Text of Systems Lore, which I guess is an expanded version of the book reviewed.) Gall's Basic Systems Principles:
    • Systems in general work poorly or not at all.
    • New systems generate new problems.
    • Systems operate by redistributing energy into different forms and into accumulations of different sizes.
    • Systems tend to grow, and as they grow, they encroach.
    • Complex systems exhibit unpredictable behavior.
    • Complex systems tend to oppose their own proper function.
    • People in systems do not do what the system says they are doing.
    • A function performed by a larger system is not operationally identical to the function of the same name performed by a smaller system.
    • The real world is whatever is reported to the system.
    • Systems attract systems people.
    • The bigger the system, the narrower and more specialized the interface with individuals.
    • A complex system cannot be "made" to work; it either works or it doesn't.
    • A simple system may or may not work.
    • If a system is working, leave it alone.
    • A complex system that works is invariably found to have evolved from a simple system that works.
    • Complex systems designed from scratch never work and cannot be patched to make them work; you have to start over, beginning with a working simple system.
    • In complex systems, malfunction and even total nonfunction may not be detectable for long periods, if ever.
    • Large complex systems are beyond human capacity to evaluate.
    • A system that performs a certain way will continue to operate in that way regardless of the need or of changed conditions.
    • Systems develop goals of their own the instant they come into being.
    • Intrasystem goals come first.
    • Complex systems usually operate in failure mode.
    • A complex system can fail in an infinite number of ways.
    • The mode of failure of a complex system cannot ordinarily be predicted.
    • The crucial variables are discovered by accident.
    • The larger the system, the greater the possibility of unexpected failure.
    • "Success" or "function" in any system may be failure in the larger or smaller systems to which it is connected.
    • When a fail-safe system fails, it fails by failing to fail safe.
    • Complex systems tend to produce complex responses (not solutions) to problems.
    • Great advances are not produced by systems designed to produce great advances.
    • Systems aligned with human motivational vectors will sometimes work; systems opposing such vectors work poorly or not at all.
    • Loose systems last longer and work better.
