News

Stephenson Counter Rant 96

A while ago we ran a link to Neal Stephenson's essay In the Beginning... Was the Command Line. Nick Arnett has written a counter-rant to that piece, which you will probably find worth reading if you enjoyed Neal's original.
  • by Anonymous Coward
    "But if the deeper complexity is totally hidden from the viewer (GUI user), then they will never have the chance to learn more, because they won't even know it is there to be learned. "

    Not quite.
    You are automatically assuming everyone wants to learn. I can show you a whole crowd who don't.
    Not because they're ignorant or clueless, but because they have more important things to do with life.

    I use Notepad to create a document, drag & drop it into WinZip, drag & drop the zip file into Outlook Express, & off it goes. I don't give a damn what the technical issues are behind these drags & drops... I'd rather focus on the content of the document I wrote, 'cause I'm a writer.

    Most people are like that... they use computers to empower themselves, not to learn their innards.

    I don't want to know how the microwave works... I just want to nuke the ham sandwich for lunch.
    That doesn't make me ignorant...just hungry.

  • Luckily he was using a unix machine and a quick tr -d "\r" < file.dat > file.nocr.dat and everything was sorted out.

    Yeah. That sort of thing is what we love command lines for :-)

    Let's see who can figure out this one-liner:

    find .|sort -r|gawk 'BEGIN {FS="/"} {printf "mv %s %s\n", $0, gensub($NF"$", tolower($NF), "g")}'|less

    I _know_ some of you guys will figure it out. If you can't, hang around for someone to post an answer, you will get a great example of the power of command lines.

    And, if I screwed up, go ahead and point it out.
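
    (Spoiler, for the impatient -- one reading of it, taken apart piece by piece; untested, so caveat lector:)

    find . | # list every file and directory under here, one path per line
    sort -r | # reverse order, so children get renamed before their parents
    gawk 'BEGIN {FS="/"} # split on "/" so $NF is the last path component
    {printf "mv %s %s\n", $0, gensub($NF"$", tolower($NF), "g")}' | # print a mv command renaming each path to the same path with a lowercased basename
    less # page through the generated commands for review; nothing is executed

    (It will get confused by basenames containing spaces or regex metacharacters, since $NF is used as a pattern.)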

    ---


  • Having finally gotten through slashdot effect and seen the article, I have some comments.

    First off, I'd recommend a course in logic. If you assume both A and not-A, _anything_ can be proven with perfect logic. And yet, in the same sentence, he says both that Windows has a monopoly and that Windows does not have a monopoly. From that starting point I can prove that cheddar is better than swiss, with perfect logic.

    Next, if literacy is in such flower, why is requiring literacy to use a computer a bad thing? The reason is- as Stephenson was trying to point out- that literacy is in flower only among a _subset_ of our society. He admits the command line is a better tool for some jobs than the GUI- so what is the problem with using the right tool for the job? Because that would require _learning_- reading documentation (possibly as much as several books- dear me!) and learning an arcane syntax! That's much too difficult!

    And don't give me that "I don't have time to learn" argument. Using the right tool for the job saves time in the long run. This is like arguing, while discussing a hundred-mile trip, that it would take longer to learn to drive than it would to walk the hundred miles; therefore, it's not worth it to learn to drive. For a single hundred-mile trip, this logic probably applies. But on the _second_ hundred-mile trip... Besides, who needs to take hundred-mile trips anyway? Most people never travel much farther than 20 miles from their place of birth, so being able to do a hundred-mile trip in 90 minutes is not all that important, right?

    Third: the metaphor is not the reality, the map is not the territory, the menu is not the meal, and MICROSOFT IS NOT A WHIRLPOOL. It's a company. Monopolies can be, and have been, broken up to the benefit of the competition and the customers. I offer as proof Standard Oil and AT&T.
  • In a command line, you are given no information about what is possible, but all operations are available instantly if you learn their names and options, all without searching the screen with your eyes.

    And if you don't know, there's ls, man -k (or man, if you're looking for particular options), and automatic completion (with e.g. tcsh or 4dos)... So there are definitely memory aids in CLIs as well.
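
    For instance, with nothing exotic installed:

    man -k copy # search the manual-page summaries (same idea as: apropos copy)
    man tr # forgot tr's options? the page lists them all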
  • Aha! As of a couple of months ago, the world is officially urban. Over 50% of the world's population now lives in cities.
  • I own a NeXT color turbo, and it's an excellent piece of kit, but...

    the original cube was an '030 with an optical disk, designed to be sold to education... for $10,000. And they wondered why they didn't sell. It's Jobs and his up-his-own-arse elitist pretensions again; he doesn't *want* to be popular, he just wants adoration from a fanatical minority.

    They couldn't find a niche because they didn't bother to study the market beforehand. The targeting of NeXT towards the mission-critical custom app (MCCA) space saved them, but by then it was too late... NeXT could have made an absolute killing in the scientific, financial and DTP/pre-press markets if they'd been clued up, and today you'd have a dual-PPC NeXT slab on your desk instead of a SPARCstation.

    NeXTStep is some of the most advanced software ever written, even today, but it was left to wither and die by Jobs.
  • Oh, but that's part of the point. Neal S indulges in some very sloppy argument, ironically while attacking Disney for interpreting culture in a way that avoids the "precision" of words.

    The twist that allows him to argue all the way from start to finish is summed up by this phrase:
    "Words are the only immutable medium we have."

    With respect, and as anyone who's ever studied philosophy in depth will tell you, that's rubbish. Words, and the interpretation of words, are among the most mutable things in the whole of human history. I'm surprised that a storyteller, who knows how to make words dance, could make such an elementary error.

    Then, of course, there's the false analogies he uses. He happily compares Disney's "interfaces" on cultural history to GUIs. Unfortunately, that ignores the fundamental fact that culture, for most of us, is something you *observe*. A computer, on the other hand, is something you *use* - a tool. The purpose of the "Disney interface" is to entertain. The purpose of a GUI is to make a computer easier to use, and to ensure that more people can use it to do stuff.

    There are plenty of other errors like that, all of which undermine Neal's argument.
  • It wasn't Jobs' "infinite wisdom" that scrapped the Intel port... by 1990, Jobs was long gone from Apple, had nothing to do with its decisions, and owned only one share of stock. Those were the Sculley years, when Apple was being run into the ground by a marketing hack from Pepsi.

    At that point, Jobs was busy with a little project called NeXT, which was producing almost-affordable workstations with the best GUI ever sold commercially (some of you might use AfterStep for a WM... that's a NeXTStep knockoff), a BSD Unix base, 17+" monitors, CD-quality stereo sound, Display PostScript (true wysiwyg)... in other words, a decade ahead of their time. If the NeXT had ever found a big enough market niche, the world would be very different today.

    Actually, though, I don't fault the lack of an Intel port for the Mac's failure then. Intel hardware was not (IS not) sufficiently mature for MacOS. I personally lay the blame on the failure to provide protected memory and preemptive multitasking in System 7, at a time when Apple was abandoning the older hardware that couldn't support them. Of course, Apple dedicated its next-generation resources to Pink (Taligent), which suffered Death By Design.
  • Care to back this up? It sure doesn't jibe with what I've seen from Alan Cox on the topic.

    Well, since you asked:

    1) Inferior processors. Let's face it: Intel makes among the slowest chips on the market today that people actually use in desktop and laptop machines. PPCs beat it, Sparcs beat it, Alphas beat the hell out of it... simply put, it's an old and outdated architecture.
    2) Lack of standards. I'm serious. Sure, they've got some: PCI, USB, that sort of thing. But those standards don't go far enough; they define the hardware interface but not the software interface, at least not to a sufficient degree. That's why you still need all those little patches to the OS that you call "drivers" just to make a device work. Before you flame me, yes, I know that Macs have drivers too. But there's a key difference: most Mac peripherals can still work without drivers; the drivers are there for optimizations and such (there's the deflation for the argument that standards "disallow for third-party optimizations"). But I dare you to run Windows or Linux without the driver for the video card in your machine.
    3) What standards are there are often inferior. Case in point: AGP. Did you know that the average AGP slot/card in a machine today is actually no faster than a card in a fast PCI slot (yes, these do come in different speeds)? Moreover, AGP works by kludging the video into main memory. Sure, you save a hundred bucks. But you get what you pay for. Another inferior standard: IDE. Again, cheaper than its competitors, but for a reason: it's slower and less reliable (which is why I'm still wondering as to why Apple switched to it; they cut corners to save a buck).
    Then, of course, there's MMX and KNI. Have you ever read up on how those two actually work? It's very interesting reading, and proof that these "great multimedia standards" are really little more than kludges to keep the tired Pentium architecture at a point where it's only a little behind the more advanced technology in every other desktop machine currently being sold.

    The point: The average PC sacrifices performance and reliability just to save a few bucks. That is why it isn't mature enough.
  • Actually, Apple did port MacOS to PC hardware around 1990. Back then MacOS was a bit simpler than System 7 (the one which was ported to PPC, with an emulation step). Apple's engineers managed to port it to Intel hardware, and created a demo (which may have even been shown outside of Apple). Jobs, in his infinite wisdom, ordered it scrapped. Shortly afterwards, Microsoft brought out Windows 3.0, and the rest, as they say, is history.
  • Sorry... late night, brain not working.
  • "Arnett's example of weather tries to set up Stephenson as saying something he's not. It would be a valid analogy if looking at a weather report somehow prevented you from looking at the indivdual molecules in the atmosphere. Stephenson is not against GUIs. He's against them replacing CLIs."

    Another way to look at it: even on the most bubble-headed TV weather forecast, they usually show the charts and graphs with the technical data in the background, and even explain what those funny isobar things are every once in a while. The viewer who just wants to see "rain tomorrow" gets that, but the viewer who becomes interested in the technical matter can find some of that also. And maybe go beyond that: check out the weather page in the 'Chicago Tribune' sometime - detailed discussions of the five major computer models in use by US meteorologists and how they differ(!).

    But if the deeper complexity is totally hidden from the viewer (GUI user), then they will never have the chance to learn more, because they won't even know it is there to be learned.

    And eventually, this knowledge disappears totally, leaving a big hole. If you don't think this is happening today, put out an ad for a 2nd level support technician with four years' experience and interview the people who respond. MCSEs who don't know the first thing about how a computer actually works are a little scary to me, personally.

    sPh
  • "Not quite. You are automatically assuming everyone wants to learn. I can show you a whole crowd who don't. Not because they're ignorant or clueless, but because they have more important things to do with life."

    Interestingly, that turns out not to be the case. I have done end user tech support for more than 10 years now, and have also acted as an end user representative to a centralized IT organization. So I have made this very argument to computer professionals many times.

    However, I have also seen many cases over the last two years (and their frequency is increasing), where people damage their own work and/or their organization with computerized tools. This damage is caused primarily by lack of fundamental knowledge of the tools they are using. OK - not necessarily their fault. Maybe they need better training/education. But it is very hard to give them that training, because the tools beneath the tools (the machine tools, if you will) aren't there to explain what is going on and build understanding.

    OTOH, I have basically concluded that I can no longer teach beginners, because I have too much knowledge and am not able to empathize with their level of knowledge. So maybe I am just talking through my hat.

    sPh
  • by sphealey ( 2855 ) on Tuesday April 20, 1999 @12:32PM (#1924790)
    Milwaukee, Bosch, Makita, etc. do now sell the low end of their contractor-grade equipment through homeowner hells. They also sell consumer-grade products with the same _name_ as the high end stuff. I would imagine the margin on those sales props up entire product lines quite nicely. But they also have true high end products that are sold only through contractor supply stores and a few catalogs.

    As with just about everything in modern society, the aura of mystery and power about this type of equipment is gradually fading as more information becomes available and more channels open up to the average joe, and the Internet is playing a large part in that. But I believe Stephenson's description of the Hole Hawg was and remains pretty accurate.

    I have also known a few kids who have grown up with contractor (or auto mechanic) parents, and in the cases where the parents were actively engaged in passing on knowledge to their kids, the result was pretty much as Stephenson imagines. The accuracy of that vignette was fairly telling, I thought.

    It remains to be seen, though, whether this applies to computers or not. My 7 y.o. is pretty active with the PC, and he showed me some tricks I didn't know in Win95 the other day. But what I do at work is so far beyond what we do on the Wintel box at home, and so abstract, that I doubt I am passing much/any of my accumulated knowledge on to him. Anyone else have a different experience?

    sPh
  • damn beginners.

    geez. i have to write something like 20pp of documentation of our current system so that the next braindead sysadmin (and before that my braindead boss!!!) can manage the system after i leave.

    1. print first draft. ~5pp.
    2. she reads a little bit of the details.
    3. this is too hard; i don't understand this;etc.
    4. translation: i don't care about this. write a g-zillion more pp with details like "press the start button. now click run. type in 'telnet 123.45.67.89'." wait, i don't understand the concept of "telnet". explain it, though i won't actually read it and will bother you at your new job instead, anyway.

    ARGH! any linux ppl in college in austin, tx who don't mind working part-time (officially; normally ~35hrs/week; the position will be converted to fulltime in september) for $10.49 should email me and i'll give you my boss's email, and you can keep the ACC RGC universe from falling to pieces.
  • a mirror?

    mmmmm, /. effect.
  • It's a while since I read the original article now, but it really didn't give me the impression that he was trying to universally bash GUIs -- quite the opposite, he freely admitted that he ran X on his Unix machines. What he was trying to emphasise was something that most Unix users take for granted, while others have forgotten it: the command line still has its own unique power, and plenty of users.

    As just a simple example, today a friend complained to me that a fairly sophisticated program was refusing to import his data. I eventually tracked the problem down to lines being terminated with CRLF, when the program was just expecting LF. Luckily he was using a unix machine and a quick

    tr -d "\r" < file.dat > file.nocr.dat
    and everything was sorted out. That kind of simple file manipulation can be really awkward on a pure GUI. Of course, I realise that the program should have been more tolerant about its input data. But it wasn't, and there wasn't anything we could do about it. The command line saved hours of worry and hassle.
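
    (For what it's worth, the command line makes the problem easy to spot, too, before any guessing:

    od -c file.dat | head # stray \r's are plainly visible in the dump
    file file.dat # newer versions will even report "with CRLF line terminators"

    Both are bog-standard tools.)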

    Of course, the converse is also true. There are many kinds of applications which do require a GUI. Fortunately, Unix users can get the best of both worlds. Let's just enjoy that position.

  • Okay, it probably wasn't the best example, I know there are plenty of GUI programs that can do the particular manipulation I described, and do it very well. On the other hand, GUI systems which can perform arbitrary complex file manipulations are much rarer.

    People keep talking about GUI data-manipulation systems with the power of command pipelines started from the Unix shell. Once someone has got something working that's half as flexible as the Unix command line (and as easy to write new programs which integrate seamlessly into the system) I'll be more than glad to give it a try.

    For now, however, I firmly believe that the command line has its place, and I'm sure there'll be at least a few xterms on my desktop for a long time to come.

  • What the heck. I don't care if xoom.com dies.
    http://members.xoom.com/_XOOM/jwynia/counterrant.htm
  • A couple of months ago, I was watching C-SPAN while staying home from work sick, and they had a thing from a book trade show. Ingram (or somebody like them) was demonstrating a process they had for printing one copy of a book pretty much as cheaply as a bunch. They stored the book digitally, and the machine printed, cut, and prepared it for the cover, which was printed on a different machine a few feet away. The whole thing was done pretty quickly and looked pretty good. Ingram was touting it as a solution for out-of-print books. They were saying that when the system was entirely in place you could order an OOP book the same way you'd order any out-of-stock book from Ingram. For the customer it would be transparent. You order your book and either they find a copy or they print a new one. Forgive the errors in my memory of the show, but I found it interesting.
  • 2. In most cases, selling stock is really selling equity, but this is a technology stock we're talking about. With these things, the price of the stock is several orders of magnitude times earnings. Usually you would be selling equity, but with high-tech stocks (especially now with e-trading), it really is like financing a loan.

    I know you have recanted this, but for the benefit of people who don't know the difference but want to, I still want to talk about it.

    When you make a loan you secure a promise that you will be repaid the loan amount, and, for your trouble, interest. Unless the loan-ee goes bankrupt (or is a thief) you get your money back, plus the profit. In other words, the value of holding the loan is almost always more than the amount it was made for, and less than a fixed value (ignoring adjustable-rate loans, high-risk loans, and even the time value of money). Concrete example: I loan AT&T $100, and they promise to pay me $105 at the end of 12 months. During that time I should be able to sell that loan to someone else for more than $100, but less than $105. At the end I'll get $105. I'll make up to $5, and have almost no chance of losing money. (If I made this loan to someone who has been in business less time than AT&T, or who is in trouble, the risks would be different, but so would the reward.)

    When you buy stock you secure a promise that the company will try to make a profit, and that it will obey your will in proportion to your stock ownership (not very important for this discussion, so I won't comment on it further). In other words, you have no idea how much it will be worth in the future. Concrete example: I buy a share of AT&T for $55. Based on recent history I may be able to sell it for $65 next month (it was in the mid-$60s a month or two ago), but I may also only be able to get $38 for it (it was in the high $30s last May). More potential for profit, more risk.

    Owning someone else's debt (say, AT&T bonds) is an OK way to make a little money fairly safely. It's generally a poor way to make fairly large amounts of money. Owning stock is an OK way to make lots of money with a high degree of risk.

    Please don't rush out and start buying and selling stocks or bonds (or making loans directly) based on this advice. It is very general, and may even be wrong. I'm not a financial advisor. Please see one before you do anything like this.

  • I agree, most of the response is meaningless nit-picking. Who cares who invented the GUI? I'm sure Stephenson knows; he was simplifying things a little. It was a well-written piece; the response sounds more like yet another incensed and bitter Mac user.
  • It's gone. Anybody got a mirror?
  • Stephenson is not a hacker, but he's a good and insightful writer. I'd say that his command line essay is about 80% wheat, 20% chaff. I think it would profit him to run his tech writings past someone with a clue before he shares them, but even so, the command line essay is full of good ideas. A grown-up hacker should be able to read it critically and come away with some fresh perspectives.
  • "The point of chaos theory", he says, "is that interactions that appear to be highly random or complex can be collapsed into simple patterns", and then he proceeds to use it as the foundation for many of the arguments in his essay.
    Those who know any thing Chaos Theory will recognize that Mr. Arnett has -missed the point- entirely.
    The point of Chaos Theory, in short, is that simple, deterministic systems can produce highly complex, rather-random-looking patterns.
    Of course it goes on to say some things about such systems.
    But Chaos Theory, more than nearly any other he could have picked, is a pro-reductionist theory, despite its caveats regarding sensitive dependence on initial conditions and its implications regarding prediction. In regards to the latter, Dijkstra's work on formalization (see: _A Discipline of Programming_, Edsger W. Dijkstra, 1976) should be sufficient to convince any reasonable person of the predictive powers we can use with confidence regarding computers.
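    The textbook example (mine, not Arnett's) is the logistic map, x_{n+1} = r*x_n*(1 - x_n): a single line of deterministic arithmetic, yet for r near 4 its orbits look statistically random. A simple rule producing complex behavior; not complex behavior collapsing into a simple pattern.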

    My essay on the disadvantages of GUIs and their relation to the nature of computing is forthcoming, as is my essay on why it is crucial to being a good user that one understand the "deep workings of the computer", as Mr. Arnett would have them.
  • yarg! "any thing Chaos Theory" indeed. hopefully my essays will be better proofread than my /. posts.
  • While studying the emergent properties of strange attractors can be interesting, little has come of it. Most of the worthwhile work in Chaos Theory has been done in demonstrating the properties of deterministic systems that might lead to chaos (period-three behavior, etc.). Although I see your point, I don't think it can be characterized as "the point of Chaos Theory". Also, Chaos Theory is all about chaos in deterministic systems. Modern interrupt-based computers are nondeterministic. The concept you're after is that of emergent properties, and isn't much related to Chaos Theory, save via a good metaphor or two. I would argue that Stephenson's reductionist viewpoint is equally valid, though I am with you regarding his poor definition of the "real job" of a computer, though for different reasons (i.e. that a computer need not be based on bit-manipulation. Bit-manipulation is just a metaphor that happens to be true to the workings of the modern computer.).
    Also, my plan for my essay is to use a broad definition of the purpose of a computer (what a computer is good at as a tool) to show how a lack of understanding of the basic nature of computers and what they are good at leads to (and has lead to) ineffective usage. I'm very interested in computers being good tools, but no-one is going to say that a hammer is a good tool if they're trying to use it as a saw.
  • You are automatically assuming everyone wants to learn. I can show you a whole crowd who don't.

    You are correct. Many people don't want to understand how their box works. The question that remains is, should they want to know?

    I say yes. At least a little.

    Take for example a car. I would venture to guess you bought a car to get around--to empower yourself.

    But owning a car has many pitfalls. Not the least of which is being ripped off by someone who knows more than you--a nasty mechanic, a dirty used car salesman, etc. If you know nothing about the inner workings of a car, you are at their complete mercy. If the mechanic tells you something is wrong, how are you to argue?

    I've seen far too many people get ripped off by dirty mechanics (and dirty computer salesmen). It doesn't take too much knowledge to protect yourself. Just a bit. (I really hope you didn't go from using 95 to 98--that is what we call a ripoff.)

    It comes down to this. Knowing nothing about your box is not empowerment. It puts you at the mercy of others.

    But that is just me.

  • "But if the deeper complexity is totally hidden from the viewer (GUI user), then they will never have the chance to learn more, because they won't even know it is there to be learned. "

    Not quite.
    You are automatically assuming everyone wants to learn. I can show you a whole crowd who don't. Not because they're ignorant or clueless, but because they have more important things to do with life.


    He isn't assuming any such thing.
    He is stating that the information should be there for those of us who want to learn. He's not talking about the whole crowd who don't.

    I don't know why I'm posting this. It seemed pretty simple to me. Maybe I'm just one of those kind of freaks.

    I don't want to know how the microwave works... I just want to nuke the ham sandwich for lunch. That doesn't make me ignorant...just hungry.

    There's a part inside (a magnetron) which emits microwaves. The microwaves cause the water molecules inside the food to vibrate, careening into the other molecules, thus heating up the food. It's not a precise description, but it's good enough.

    As a question, would you rather nuke a ham sandwich, or give it to me, and wait for me to nuke it for you, when I get around to it? This whole argument is a matter of where you choose to draw certain lines, and what you consider important. I think that it's important to know what happens when I interact with the things around me, but that might be because I make my living writing software which has to work around various undocumented "features".

    Later,
    Blake.

    I speak for PCDocs
  • These days, you don't have to understand, remember, or type find . -type f -exec grep -l -i "meeting notes" '{}' ';' to find a file containing "meeting notes" on a *nix machine, but you CAN'T get the same kind of interaction from MacOS or Windows without a third-party utility.

    find "meeting notes" *.*

    Found this out a while ago.
    Used it a lot since then.

    And using the WinNT (and 95/98?) "Find Files" application, you can specify "contains the text..."

    Later,
    Blake.

    I speak for PCDocs
  • I was born in '78, so I guess I'm right in that area, but luckily I was poor when I was young and we couldn't afford a computer, so I spent my time studying the old technical books at the library and reading sci-fi. Then it went to hacking my Atari 2600, and then playing w/ cast-off TRS-80s, C64s, ancient Apples, Texas Instruments, and 8088/8086 machines. Finally I got a 486 and was in computer heaven, so much power! *laughs* It wasn't really until I got to college that I realized how useful this offroad track was to me. A lot of those people had no knowledge of anything except the latest and greatest Windows interfaces and PC hardware. IMO all comp sci students should have to take a semester of computer history. It's sure a lot more useful than those lame computer ethics classes.
  • -Jobs- canned the Intel Port of the mac os in -1990-?

    Perhaps you should go read some timelines.
  • One really interesting point in the rebuttal is in reference to Microsoft advancing in the market place by the mutation of acquired technologies.

    This flies directly in the face of M$'s primary criticism of Linux, that Linux developers are 'chasing tail-lights' and that this is why they're doing so well...

    Just thought I'd set that one aside by itself, as a point to ponder.

    Another interesting point is the Steve Jobs vs stepfather dynamic. Wow! Here I thought business was cold and heartless - and heart is what's driving Jobs??

    Makes me wonder if Gates is just trying to get back at all those people who kicked sand in his face when he was little.
  • > Both theories show that seemingly random systems
    > are not nearly as random as we had imagined.

    Sure, but they don't show you how to extract simplicity from seeming randomness/chaos; they explain how you can start with something really simple and end up with really complex resulting behavior. Neither field implies that complicated things have visible simple patterns - only that there may be underlying simple rules.

    Anyway, the response rant seemed to totally miss the point of Stephenson's original. He wasn't saying that GUIs are evil and the command line totally rules everything. In fact, he said his current choice of operating system (BeOS) was determined at least in part by its GUI. His point (among many other semi-random things) was that the command line interface, at least a good powerful one (like Un*x, unlike DOS) provides a unique and valuable way of looking at computing that a GUI can't really provide.

    As a heavy user of both command-line and GUI tools, I have to agree. It makes you think in algorithmic and linguistic terms instead of spatial terms.

    For some people this isn't really a value - a GUI provides all they really want out of their computing experience. But I have also known people whose linguistic reasoning was much better than their spatial reasoning, and they had a much harder time dealing with a pure GUI interface than with one that provides a command line.

  • Exactly.

    Consider print-on-demand the mp3 of the publishing industry. Whereas mp3 has a built-in distribution medium, the internet, hardcopy books cannot magically propagate from bookstore to bookstore, and you generally can't thumb through a book online.
  • Jeez, I think someone was taking the metaphors a little too seriously. I mean, when someone says "Love is a red red rose" you don't go asking him what part of love the petals or anthers represent, or complaining that the metaphor shows a lack of knowledge about botany.

    I agree that Neal Stephenson's essay was not perfect, but the fact that it clicked with so many people on the internet reveals that it speaks to us in a way we can all relate to, about subjects that we all share.

    Bottom line: this guy either has a big stick up his ass or he was just aiming for some quick publicity.
  • all too true....and if everyone learned to drive in a '62 Chevy pickup w/ three-on-the-column and no power steering they would never be in a vehicle that they could not operate.....but nobody wants to take the time or effort necessary to accomplish this, because the likelihood that they will ever be stuck in a car with a standard transmission (or a CLI) is (seemingly) so small that it does not appear to be a worthwhile enterprise.
  • But Microsoft tries to make one OS for everybody, for both those who seek the workings and those who don't care. They could have done it the Linux+X+WM+GNOME/KDE way, where it's pretty seamless once set up but the underlying mechanics are still available. Instead, they did it the Microsoft way, where your best bet for understanding how things work is to play with regedit (Fun!)
  • Ok, you're right. In my haste, I didn't really think it through. I, an econ major, am ashamed. But those stock prices are still way too high!
  • by Fizgig ( 16368 ) on Tuesday April 20, 1999 @12:21PM (#1924818)
    I know he's not a real hacker (I'm not either, for that matter), but he seems a WHOLE lot more hackerish than any other sci-fi writer or tech-reporter I've ever read. From the back of Snow Crash (which I bought because of In the Beginning):

    [Stephenson] began his higher education as a physics major, then switched to geography when it appeared that this would enable him to scam more free time on his university's mainframe computer.

    If that's not an old-school hacker, what is?
  • by Fizgig ( 16368 ) on Tuesday April 20, 1999 @01:06PM (#1924819)
    Ok, I loved Neal Stephenson's essay. Sure, it had problems, but that's not going to stop me from fanatically defending it :)

    1. Stephenson's point was not that GUIs are bad. Maybe he got the history wrong, but he got the important part. The people at PARC were not trying to hide the workings of the computer behind a graphical interface. They were trying to enhance the computer through a graphical interface. This is not what Apple and Microsoft do, though they may pretend to. What they do is hide the computer behind the interface. This is what Stephenson says is bad. Arnett's example of weather tries to set up Stephenson as saying something he's not. It would be a valid analogy if looking at a weather report somehow prevented you from looking at the individual molecules in the atmosphere. Stephenson is not against GUIs. He's against them replacing CLIs. Come on, can anybody who likes Enlightenment be against GUIs? Oh, the irony that Arnett mentions the straw-man argument in his essay!

    2. In most cases, selling stock is really selling equity, but this is a technology stock we're talking about. With these things, the price of the stock is several orders of magnitude times earnings. Usually you would be selling equity, but with high-tech stocks (especially now with e-trading), it really is like financing a loan. Microsoft does not have that much equity. You would have to be pretty naive to believe that the stock market (especially the high-tech stuff) can sustain itself at that level. It's unsustainable, therefore it's not really selling equity, and can be seen, as Stephenson chooses, as financing a loan against the future burst of the bubble.

    3. What, pray tell, is the difference between Bill Gates "winning" and Microsoft having bigger profits? Is there such an important distinction? His obligation is not short-term profits but long-term profits, which seems to be the same thing as "winning" in most cases.

    4. Arnett says that it's not price or features that OSes compete on, but momentum. He then says it's phenomenal that Linux has such momentum so young. But what about when Linux first started out? It had zero momentum. According to this theory, it should still have zero momentum, but we all know that's false. It has competed on features and price. Also look at Windows 98. It's not necessary for anything. You can still use 95. Did people buy it? Yes. Why? Not because they had to. They did it because OSes compete on price (ok, 98 loses there), features (yeah, they seem to the average consumer to do well there), and marketing (which Stephenson mentions, but not in the same place). And I'm willing to bet that if Hurd shapes up, people will use it. If it shapes up well, people will use it instead of Linux, or *BSD, or whatever. Open source reduces the impact of momentum, as we've seen many times.

    5. Stephenson was not entirely against the Disneyfication process. I recall him saying it was better that a million people see the Disneyfied temple than that 10,000 cardiovascular surgeons go see the real thing for the same amount of money.

    The whirlpool/chaos metaphor is at least as confusing/misleading as any metaphor Stephenson uses. "What do you get if you break a whirlpool into pieces?" This is misleading. Breaking up Microsoft would have an impact. The point of the antitrust case is that they not be allowed to expand their monopoly into other areas, causing things to stagnate. Arnett says Microsoft does not own the internet. True, it doesn't now. That's the point. We shouldn't let them own the Internet. Breaking the division of Microsoft that deals with OSes from the part that deals with the Internet would help (look back to Arnett's argument about momentum; without the OS division behind it, MS-Internet would not have the unfair momentum).

    The use of the chaos example is particularly interesting. Arnett claims that we can find simple things from the general in a chaotic system. Ok, I'll buy that. But the reverse in a chaotic system is that you can't understand what the whole will look like from some of the pieces. This is not what a computer should be like, and that's part of Stephenson's argument. It should be bottom-up, not top-down. We should be understanding computers from the simple parts and then extrapolating the big picture (with the help of a GUI if necessary). Arnett says that we should be understanding the big picture (through the GUI) and from that inferring the underlying system. Me, I'll take the former system. It gives me more information. It's my computer, and I don't want to have to guess what's going on. True, dragging an icon may represent mounds of complexity, but I prefer the GNOME/KDE dragging, where I can know what the dragging means, to the Microsoft/Apple dragging, where it's all abstracted behind recognition.

    There, I'm sure I have lots of holes. I like this. Arnett wrote a good piece in response to (what I believe to be) a better piece. Now I've written a worse piece than Arnett. Now someone reply to me. I like this.
  • I do this sort of thing with UNIX/PeeCee .html files all the time on the Mac.
    Open the file in BBEdit. "Save As..." with the appropriate linefeed option.
    Mmm...mmm... GUI goodness!

    Then again, I am a designer, so I fit Apple's demographic to a T!

    DaveY GraveY
  • when i use a GUI, i do different things than when i use a cmd-line UI. over time these different things accrete, changing the interconnects in my mind.

    i started out in '73 using the same lash-up Neal describes in his essay. it had an effect on my mind. it evoked certain patterns of thinking and approaches toward problems. my mind adapted to the UI available to it.

    years later, i got an Amiga and its UI was different, and i interacted with it differently. in some ways more effectively, in others, less. nevertheless, the GUI evoked a different pattern of thinking, and different interconnects were forged.

    the only way i could observe this effect on my thinking was to experience at an expert level *both* command-line UI and GUI.

    the two ways are different, but to say one is good and the other is evil is to miss the point. one way does some things effectively and has some consequences, and the other way has different effectivenesses (sic) and different cognitive consequences. neither way is "better"; both ways are "different."
  • "You are automatically assuming everyone wants to learn..."

    Even if only a tiny fraction want to learn, better that the "ugly guts of how things really are" be available (but not necessarily in the way).

    "I don't want to know how the microwave works... I just want to nuke the ham sandwich for lunch.
    That doesn't make me ignorant...just hungry."


    There's a difference between allowing you to know how it works and requiring you to. Early cars and computers demanded an understanding of the mechanics of how they worked if you wanted to get anything done. As they develop, this is less necessary.

    These days, you don't have to understand, remember, or type
    find . -type f -exec grep -l -i "meeting notes" '{}' ';'
    to find a file containing "meeting notes" on a *nix machine, but you CAN'T get the same kind of interaction from MacOS or Windows without a third-party utility. More to the point, you can't alter or amend how a Mac or Windows performs these kinds of searches. Why would I want to, you ask? That's my business. Its ease of use doesn't stem from the fact that I can't change it.
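
    For example, a hypothetical amendment (the one-week cutoff is arbitrary):

    # same search, but only in files touched within the last seven days
    find . -type f -mtime -7 -exec grep -l -i "meeting notes" '{}' ';'

    Try coaxing that out of a canned "Find Files" dialog that wasn't built for it.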

    As I see it, the age-old argument has always been about whether ease-of-use and freedom-of-use are mutually exclusive. I never thought the argument was about whether the user was ignorant or hungry. I also never thought freedom-of-use meant requiring the user to learn how their tool works.
  • I think Stephenson's point about the Hole Hawg was that it is specialized. It is made to twist a big chuck with great torque. It is made to twist that chuck all day long, day after day. It is made to be repairable and maintainable so it can keep twisting month after month and year after year. It is designed for a niche market, "drill big holes with a portable (sort of) drill". And it does that job very well.

    There may be others that are better, but I doubt they are a lot better. If you are buying a drill in this class, you are probably more concerned with the dealer and service than the specific brand name.

    Also, I doubt there are many other drills out there that would fit in the niche that the Hole Hawg has staked out. There aren't many people who need that class of drill.
    ************************************
  • Plus they were BLACK. Took IBM how many years to figure that one out? Took how long for a major OEM to start making systems available in something other than beige or white?

    Of course the point that Jobs killed NeXT is also true. I wonder what life would have been like if only little Stevie had had his ego kicked down a couple of notches...

  • Intel hardware was not (IS not) sufficiently mature for MacOS

    Care to back this up? It sure doesn't jibe with what I've seen from Alan Cox on the topic.
  • there is a link to it in the counter-rant; I was just too quick to post
  • the home despot owns...of course they have it there, they even have the kitchen sink
  • "But if the deeper complexity is totally hidden from the viewer (GUI user), then they will never have the chance to learn more, because they won't even know it is there to be learned."

    Not quite.
    You are automatically assuming everyone wants to learn. I can show you a whole crowd who don't.
    Not because they're ignorant or clueless, but because they have more important things to do with life.


    Actually, just to share my personal experience, which most of you could probably give a rat's ass about: when I first started using computers it was on an MS-DOS system (on which I spent an amazing amount of time in BASIC; hey, I was 10, this was cool stuff for me), and during this period I learned the basic ideas of the command line. I later applied these to learning unix on my local freenet, and on nethernet (is that system still up?). Anyways, I haven't used either of those systems for years, on the command line side of them, and for the past 4 years have been pretty much strictly using Win 95/98. I have always been interested in the inner workings of computers and really do like learning, but have been shielded from that side of it for way too long (mostly my own fault, but anyways). Beginning a few months ago, I've become reinfatuated with linux. I keep downloading it and installing it into another partition, use it for a few weeks, and each and every time go back to my Win 95. Not because I like Win 95; I'm absolutely disgusted by it. But I can't for the life of me really and truly figure out the command interface. Sure, I know how to ls -al and ps and kill -9. But that doesn't help me set up and administer my linux box like I want to. Sure, I can get X working; I even got ppp working once. But I still feel that I can't take advantage of the immersive power linux gives me. It's sad to say I can get more power (as in being able to configure) out of my Win 95 than I still can in linux. I know this is my own fault, but for some reason I keep burying my head in shame and scrambling back to Win 95, usually after I get disgusted at the inability to get my sound right, or to install icq (can anyone do this??? I heard there is a linux clone out there somewhere?)
    Anyways, I partially blame this on Win 95 dumbing down my knowledge and my ability to manipulate my computer the way I really want. I honestly can't do without my icq/slashdot/email/minesweeper (cry, yes, it's true) long enough to figure out how to get it all working in linux.
    ~testimony of an addict at his last thread
  • I have to admit that I admire anyone with the cojones to step up and refute Mr. Stephenson, a well-respected author, popular guy, and arguably competent user. To do so is to risk death by a thousand emails, and to risk having your web server fall over under a crush of curious rubberneckers.

    On the other hand, I don't feel that there was anything in "Beginning" that called for refutation. It was a collection of Stephenson's musings on a whole slew of issues, and outlined the structure of those thoughts. It was not gospel truth (although I found myself nodding in agreement throughout the entire read), nor was it intended to be.

    Unfortunately, I can't get to what's-his-name's article, since it actually *has* fallen down under the crush of slashdotters, but I hope his rant is in the same spirit as Stephenson's...
  • I don't think Stephenson was attacking Disney. I suspect that to merely be a poor choice of words on your part, because it's fairly obvious in his text that he admires the "Disney interface" for its amazing power, for its ability to perform the tasks for which it is designed.

    And you seem to take his comparisons of interfaces out of context. The purposes of the interfaces are practically irrelevant to the discussion: they're meta-protocols by which some form of communication takes place. Perhaps he is unfair in comparing a nearly all-downstream "Disney interface" to a mostly-upstream GUI, but I think the parallels he draws are valid.

    I think you have missed the point that Stephenson wasn't writing a critical piece. It was almost purely an illustrative work, and while you or I may or may not agree with his perceptions, it's not arguable that they are his perceptions.

    At no point in his document does he indicate that only he speaks the gospel truth.
  • At first I thought he was going on and on about this drill as being some sort of uber-drill and a symbol for industrial-strength OSes and such.

    However, I was at Home Depot, and realized that you can get that drill there for $250; there are other drills that are more expensive, and even better. Michigan Tools had a little stand there that talked about their different drills; apparently this one was designed in the 1960s as part of the "prefab" boom. They have newer drills, and less specialized ones. The Hole Hawg was kinda neat, but there were just as many other drills there that seemed to be as good or perhaps better.
  • I know what you mean - I have a DeWalt drill, mostly because of the yellow color and the fact that it seemed to be fairly decent for the price. It's not a Makita, though, although I doubt I use the DeWalt to anywhere near its potential.

    as far as the computer transfer of information goes - I don't know how to pass on the accumulated store of knowledge that's in my head to my fellow IT workers, much less friends or relatives. I mean, I remember when the TRS-80 was new, linux .9, etc., etc. We have interns here who never used anything but Windows 95 before starting here. Sometimes I don't even know where to start when trying to answer "WHY is there twisted-pair ethernet and 10base2?" Sometimes the answer is "no good reason - now, but back in the day...". I mean, it's weird when you run into someone who was born after Star Wars came out :-) and I was born in the '70s!
  • He was pulling up a personal memory, not picking a carefully researched example.

    I thought it was a nice metaphor: if the safety is provided by and reliant on the skill and intelligence of the user, rather than built-in idiot checks, you can get more done (of course, if you screw up...). How much time do you waste in Windows reading and checking confirmations, or doing things manually that you would script at the command line?
  • So who needs publishers?

    There's no major expense to adding a title to a print-on-demand system, so there's no reason to give sole printing rights to one publisher.

    Expect little printing shops to pop up all over the place; there should be plenty of competition.

    It won't be long before authors set their per-copy license price, and all the printing shops compete in speed, service, and mark-up.

    For mass-printed paper books, the author could negotiate lower fees with a particular publisher, similar to the traditional model.

    But authors should get involved! Make sure it happens this way and not the way in the message I'm replying to!
  • Why not? I see it all the time. And print-on-demand book stores could be filled with browsing kiosks which navigate through the local web catalog.
  • And if you don't know, there's ls, man -k (or man, if you're looking for particular options), and automatic completion (with e.g. tcsh or 4dos)... So there are definitely memory aids in CLIs as well.

    Not inherently. These are commands which you must learn before using, not a fundamental part of the interface. It's a little like saying that a car's interface has memory aids because there's a manual in the glove compartment (ignoring, for the moment, the real inherent aids such as the markings on the gear shift).
  • by TheDullBlade ( 28998 ) on Tuesday April 20, 1999 @12:26PM (#1924838)
    I doubt Neal Stephenson would argue against the value of a GUI for a 3D-modeling program, or for a web-browser. I imagine he uses X all the time. Rather, he argues against having just a GUI, particularly for system management and development.

    In a sense, every control in a GUI is a menu: you are presented options of what to do and you pick one. You don't need much experience but you are limited to those options which are presented. The interface does the work of your memory, but less efficiently. In a command line, you are given no information about what is possible, but all operations are available instantly if you learn their names and options, all without searching the screen with your eyes.

    I do agree with some points in the counter-rant, but some of the metaphors were more appropriate than Stephenson probably intended. You can't drive a tank through town, and while towns and cities are a small percentage of the world, they are where most people spend most of their time. Most weekend handymen wouldn't want a Hole Hawg.

    IMHO, the average user won't want Linux until you hide the most powerful tools from him, including the command line. I still think most business users would be better off with a black-box system where the 3-6 applications they regularly run are offered in a menu which is the whole interface to the system, or better yet, integrated into one application that automatically starts when they turn on the computer and saves everything (without overwriting the originals) when they turn it off.
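
    Something like this toy sketch would do as the whole interface (the three application commands are placeholders, not a real product):

    #!/bin/bash
    # A "menu is the whole interface" login shell, repeated until shutdown.
    while true; do
        echo "1) Write a document"
        echo "2) Do the accounts"
        echo "3) Read mail"
        echo "q) Turn off"
        read -r -p "> " choice
        case "$choice" in
            1) wordprocessor ;;   # placeholder application commands
            2) spreadsheet ;;
            3) mailreader ;;
            q) exit 0 ;;
            *) echo "Please pick 1, 2, 3, or q." ;;
        esac
    done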
  • Everybody overlooks Apple's major contribution to GUI over PARC: not menus, single button mouse, etc. It was the programming model.

    PARC said "use Smalltalk and get GUI". Smalltalk supported a form of closure for their blocks, which is how you attached things like buttons or menu items to code behavior. Unfortunately, the number of people willing to develop applications in Smalltalk was limited, esp. for commercial use.

    Apple's major contribution was the window, grafport, and event loop based programming model. There was related research work elsewhere, but exposing the GUI programming model the way they did was commercially novel. Most importantly, it enabled a commercial third-party application market using languages like assembler, Pascal and C.

    Tying an OS programming model to an interpreted managed memory language has historically been the kiss of death for a system (Smalltalk, Lisp, Telescript, Dylan). I believe this is because those interpreted languages were designed without sufficient attention to commercial application issues. In addition, memory and CPU were more expensive, and applications were simple enough that the overhead of managed memory was deemed excessive (you could conceivably track all of your objects without excessive effort).

    Consequently, Java is the first interpreted system that can succeed. Better attention to these issues, and better timing...

  • """
    Stephenson's sub-rant on the use of "document" and "save" in GUIs to mean the opposite of everyday meanings is not unusual in the least. A word is often used to mean opposite things.
    """

    The problem I see here is that we are not just using "document" and "save" as words, but as parts of a metaphor. As such, using the words to mean the opposite breaks the metaphor that the Desktop is supposed to be presenting.

    See _About Face_ by Alan Cooper for a good deconstruction of the 'File' menu.
  • Neal Stephenson sure seems like a hacker to me. He describes himself in the "Command Line" essay as a long-time Mac programmer, and writes that he's been a Linux user since 1995. (Not to mention his mention that he acquired one of the first BeBoxen, which we can assume he didn't buy under the illusion that he was getting a better television-typewriter on which to peck out his novels.)

    He wrote the best article ever to appear in Wired Magazine (okay, so maybe that's a low bar; how about "probably the best major-publication tech article I've read"), which is about the whos/hows/whys of large-scale fiber projects. That article demonstrated Stephenson's firm grasp of the technical -- in addition to the social and historical -- issues at hand. ("Mother Earth Mother Board." Cover story for December 1996, http://www.wired.com/wired/archive/4.12/ffglass.html)

    Finally, and to me most importantly, Stephenson's two novels that revolve around digital technologies and information manipulation ring unfailingly true, from the throwaway descriptions of how programming works (Hiro in the lifeboat building a motorcycle mostly by taking bits and pieces of old code and repurposing them), to effortless explanation of means and mechanism (the dog with an atavistic protect-the-boy impulse inside a crenellated, heat-diffusing body; Nell's book creating Turing World for her), to large-scale examination of the idea that computers offer us powerful new ways to conceptualize that activity which most defines and shapes us as humans, the manipulation of abstract symbols (the Babylonian/ur-language theme of _Snow Crash_; the info/bio/nano-tech merging conceit in _The Diamond Age_.)

    All three of the above strengths were sadly lacking in, for example, "The Matrix," which was much and loudly praised here.

    As eagerly as I await my copy of _Cryptonomicon_, I would be even more interested to learn that a collection of Stephenson non-fiction was in the offing. My favorite part of "Command Line" was actually the short digression about Disney World. I'm betting there's a draft of something as amazing as the DFW "E Unibus Pluram" essay that Stephenson cites somewhere on one of his (many) hard drives.
  • He totally misses the point.


    Similarly, among the responses to the original article there was the one comment along the lines of "gee, this [Stephenson article] is really long -- it would be nice if someone would put together a summary". I laughed out loud at that comment, and laughed harder when the clueless flamed him. None of those people understood the humor of such a comment, because they read the Stephenson piece as being only about literal computers, rather than about the buffering of experience by some artificial interface. Look again at the piece -- there are all sorts of interesting little tidbits in there.


    Regarding the counterpoint's author, perhaps he has spent too much time reading RFCs...

    "With patience and plenty of saliva,
    the elephant deflowered the mosquito."
  • Maybe this isn't a nit, but rather a strong disagreement. Apple could easily have ported the MacOS to the x86 architecture. All they had to do was reject their traditional "not invented here, not interested" attitude and make a deal with ARDI (http://www.ardi.com) to license the 68000 engine that runs Executor. It's a plain lie (or a distortion) to claim otherwise. Granted, there's no PPC emulator that I've heard of, but we're discussing history here, not the present.

    Executor (a Macintosh emulator for the PC, including an x86 Linux version) is cool stuff, by the way. A very small, tight team of coders developed one heck of a product that it seems NOBODY pays much attention to.
  • ... here or there [cryptonomicon.com]
    well, whatever... just read it!

    nmarshall
    #include STD_DISCLAMER.H
    R.U. SIRIUS: THE ONLY POSSIBLE RESPONSE
  • by Lionette ( 33558 )
    Publishers are running to reprint their backlists? Since when? I'd submit that the opinion of a published author might be more informed on that matter. I, for one, am not seeing publishers eager to publish new material and out-of-print material... only works they're absolutely certain will sell. Quite a different matter from the proliferation of new, old, and everything-in-between material Arnett seems to suggest.

  • It's called print-on-demand, something publishers are beginning to toy with.

    Of course, the state of the industry now grants publishers print-on-demand as a subsidiary right, which means the author gets paid a lot less for it. Not only that, but having a book constantly in some publisher's backlist and theoretically available for purchase prevents an author from getting her rights back from the publisher so that she can re-sell them to another publisher for a better deal or a new, more publicized print run!

    It might be good for publishers, and it might be good for consumers, but the authors have to eat too.

  • by Lionette ( 33558 )
    If there comes a day when no book need ever be out of print, then something radical is going to happen to the way authors are paid. There's value in reversion clauses; it's one thing for a publisher to be willing to keep a book in print forever, but it's another thing entirely whether a publisher is willing to give its infinitely-retained backlist the kind of publicity it would need to keep an author even moderately compensated.

    Additionally, I have to wonder how having an eternal backlist will affect what authors are sought after.

    I don't have a problem with the idea of books never going out of print; I'm just skeptical. I don't see the publishing industry taking many chances on new authors. I see massive consolidation of publishing houses, with growing slushpiles and decreasing interest in radical books. Unless something like the aforementioned print-on-demand makes book publishing cheaper, I can only imagine that eternal backlists will make it more difficult for new authors to break into the business.

  • The problem with this is distribution. That's the second of the two major reasons we have publishers (the first being packaging and printing, something that a print-on-demand solution would theoretically solve). Your aim as an author is to make money off every copy you sell... and it's much harder to get copies sold if you don't have a physical presence that people can handle, point to, have signed, have placed on large kiosks for them to see. It's far more enticing to a buyer to see an interesting cover and title and pick it up than it is to peruse endless text lists of titles.

    Authorial presences on the web are beginning to make a difference, but they still don't compare to physical print runs, keeping in mind that books are sold to far more people than are on the net.

  • The problem is that that sort of thing has nothing to do with a command-line interface vs. a GUI. File manipulation of that sort can be done just as well in a GUI as in a CLI. The real difference is between simple, pandering GUIs (Windows, anyone?) and ones that can handle complex operations.

    However, a complex GUI is more difficult to learn, just as a complex CLI is difficult to learn. (If Windows is a simple GUI, then the command-line equivalent might be something like a simple menu system.) It may not be AS difficult to learn, but it is certainly more difficult than the simple GUI.

    Unix with X isn't quite there as a "complex GUI", but it's almost acceptable.

  • Absolutely right. That's a mistake that I'll definitely fix. Can't believe I wrapped it all under PARC...

    BTW, my DSL line is swamped; I'll be mirroring the piece at my ISP shortly and setting up a redirect.
  • Yes, publishers are eager to reprint their backlists, even though they aren't doing it today -- and this is the opinion of a published author and publisher! I'm doing consulting in this very area these days. The problem is the cost of the publishing supply chain. It is too expensive to print a few copies of any book, but startups such as Lightning Printing and supply chain automation developers are changing the economics considerably. It is generally believed in the publishing industry that there will come a day when no book need ever be out of print.
  • I've put a copy of the essay on angelfire and my server is set to redirect you there. The collision LEDs on my router and switch are breathing much easier now -- they were working awfully hard most of the morning...

    Nick
  • My agency (Waterside Associates) has been revising their contracts to deal with the possibility of books that never go out of print, since that has been the main test for rights reverting to the author. Contracts are already starting to require X sales per month or the rights revert (I had a prescient lawyer put such a clause in a contract for a book of his I published six years or so ago).

    The main roles of publishing companies have been marketing and assuming the risks of a very inefficient distribution system. As technology increases efficiency, I think we'll see more publishers overall. The net offers a lot more opportunities for promotion. Anyone who thinks I offered this essay to slashdot out of pure altruism is naive, for example. I have a new book coming up, which will deal with some of the issues in the essay.

    By the way, there are about 55,000 publishers in the United States. Of course, I would guess that the biggest few have most of the market (and are no longer owned by U.S. companies!) -- the old 80/20 rule.
  • The e-mails have barely begun, but that's probably because my connection was crushed by the aforementioned crush of rubberneckers.

    I feel like this is a good moment to add that I really enjoyed "Snow Crash." Read it twice.

    Further, I'm not so unhappy with Stephenson's conclusions, just with how he gets to them. I was learning SPURT (Sperry Univac machine language) in 1972 (and Algol, good heavens, before that), and I do understand the virtues of deep access to computers. When I use my Macs, I sometimes get very annoyed that there's no easy way built into the OS to change a bunch of file names, or to select them with wildcards -- just random examples of my own occasional CLI lust (a sketch of the sort of thing I mean follows this comment).

    But I also find it a bit weird when a great bit-twiddler bemoans the fact that compilers have all the fun these days. That was something that Andy Hertzfeld said back in the early life of General Magic. And I'm not criticizing Andy with that -- he's a friend and a wonderful person. He's not normal, though, which is good.
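
    To illustrate that CLI lust: where an OS exposes the filesystem calls, the batch rename is only a few lines. Here's a rough sketch using POSIX glob() and rename() -- so Unix rather than MacOS, and the wildcard pattern and suffix are arbitrary examples:

      #include <glob.h>
      #include <stdio.h>

      int main(void) {
          const char *pattern = "*.txt"; /* example wildcard */
          const char *suffix  = ".bak";  /* example change: append a suffix */
          glob_t g;

          /* Expand the wildcard against the current directory. */
          if (glob(pattern, 0, NULL, &g) != 0) {
              fprintf(stderr, "no matches for %s\n", pattern);
              return 1;
          }

          /* Rename every match in one pass. */
          for (size_t i = 0; i < g.gl_pathc; i++) {
              char newname[4096];
              snprintf(newname, sizeof newname, "%s%s", g.gl_pathv[i], suffix);
              if (rename(g.gl_pathv[i], newname) != 0)
                  perror(g.gl_pathv[i]);
              else
                  printf("%s -> %s\n", g.gl_pathv[i], newname);
          }

          globfree(&g);
          return 0;
      }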
  • >The point of Chaos Theory, in short, is that simple, deterministic systems can produce highly complex rather-random-looking patterns.

    A pattern *is* the collapse of something complex into something simple. One can see both complexity and simplicity in chaotic systems; the beauty of chaos theory is that it offers a glimpse of how the chaos can be understood through simple principles -- how simple features emerge from chaotic systems. Your statement turned that idea around while saying the same thing. (The sketch at the end of this comment makes the point concrete.)

    I see the goal of being a good user as rather secondary to the computer being a good tool. Not that that's an argument against command line interfaces, but neither was the essay.
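
    For what it's worth, the "simple, deterministic, random-looking" claim is easy to demonstrate with the standard textbook example, the logistic map (the starting values and iteration count here are arbitrary):

      #include <stdio.h>

      int main(void) {
          const double r = 4.0;  /* fully chaotic parameter */
          double x = 0.2;        /* one starting point in (0,1) */
          double y = 0.2000001;  /* a nearly identical neighbor */

          /* Iterate x' = r*x*(1-x): one line of deterministic
             arithmetic, yet the orbit looks random, and the two nearby
             starting points diverge completely within a few dozen steps. */
          for (int n = 0; n < 30; n++) {
              printf("%2d  x=%.6f  y=%.6f  diff=%+.6f\n", n, x, y, x - y);
              x = r * x * (1.0 - x);
              y = r * y * (1.0 - y);
          }
          return 0;
      }

    The two columns track each other briefly, then decorrelate completely: simple rules, random-looking output.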
  • You're right. I was thinking "complexity" when I read "chaos." I've stuck a note to that effect in the essay itself.

    I don't think I can articulate very well the intuition that chaos and complexity are related. Not that that'll stop me from trying. Both theories show that seemingly random systems are not nearly as random as we had imagined. How's that as a start?

    However, I should acknowledge immediately that this is a step or two away from the point I was making in the essay about context and content.
  • You know, for easy reading and referral on your Palm Pilot. (I also removed some unnecessary &nbsp; tags.)

    Bookmarked, too:
    http://www.darryl.com/command.bm.prc [darryl.com]

    Or without:
    http://www.darryl.com/command.prc [darryl.com]
