
Electronic Life

It is difficult today to remember how intimidating computers were for non-technical people in the early 1980s. In previous decades, the machines had been corralled into computer departments at universities and large businesses, where they were the responsibility of trained personnel. However, in the early 1980s, people who might have been perfectly happy never getting closer to a computer than a Star Trek re-run were told that personal computers would soon be on their desks at work. This created demand for books that introduced computers, defined basic jargon, and reassured American readers that they could master the machine when it inevitably arrived. The panic probably reached its peak in 1983, as did the response. In January 1983, Time Magazine named the personal computer Man of the Year ("Machine of the Year," under the circumstances). The year before, Tracy Kidder's The Soul of a New Machine had won the Pulitzer Prize. And in 1983, Michael Crichton published Electronic Life: How to Think About Computers.
Electronic Life: How to Think About Computers
author Michael Crichton
pages 209
publisher Ballantine Books
rating 4
reviewer stern
ISBN 0394534069
summary May be worth thumbing through for a glimpse of what the future was supposed to have been.

Crichton was already successful as a novelist, having published The Andromeda Strain, The Great Train Robbery, Congo, and other books. Several of these had already been made into movies. Of course, he would become vastly more famous later, with Jurassic Park and the television show E.R.

Electronic Life is written as a glossary, with entries like "Afraid of Computers (everybody is)," "Buying a Computer," "Computer Crime," and so forth. The book shows signs of being hurriedly written, as few of the entries reflect any research. The computer crime entry, for example, is three pages long and contains only four hard facts -- specifically, that institutions were then losing $5 billion to $30 billion a year to computer crime, that Citibank processed $30 billion a day in customer transactions using computers, that American banks as a whole were moving $400 billion a year in the U.S., and that the Stanford public key code (not otherwise described) was broken in 1982. No examples of computer crime are given, though by 1983 such accounts were appearing in the mainstream press, and dedicated books on the topic had been around for at least a decade (I own one British example dating to 1973). Detailed descriptions of such capers make for good reading, so Crichton's failure to include any tells us that he did not take the time to visit the library when he wrote this book.

Electronic Life is of interest to modern readers in only two respects: first, Crichton's descriptions of then-current technology provide an amusing reminder of how far we have come. Second, and more significantly, Crichton's predictions for the future are worth comparing with what has actually developed.

As an example of the first sort of passage, on page 140 he points out that if you ask your computer to compute 5.01*5.02-5.03/2.04*100.5+3.06+20.07-200.08+300.09/1.10, there will be a noticeable delay as it works out the answer. Later he suggests that a user would do well to buy a CP/M-based system, because of all the excellent applications for that platform.
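
For perspective, here is that expression dropped into a few lines of C -- my own throwaway sketch, not anything from the book. On any modern machine, the "noticeable delay" rounds to zero:

    #include <stdio.h>

    int main(void) {
        /* the page-140 expression, evaluated in double precision */
        double x = 5.01*5.02 - 5.03/2.04*100.5 + 3.06 + 20.07
                 - 200.08 + 300.09/1.10;
        printf("%.5f\n", x);   /* prints -126.79218 */
        return 0;
    }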

Crichton writes science fiction, and he knew very well that computers would soon do more than was possible in 1983. Such predictions are largely absent from this book, but a few entries do let us see what he expected for the future (other than resurrecting dinosaurs, I mean). First, Crichton correctly expected that computer networks would increase in importance. He saw this as a matter of convenience -- computers can share pictures, which you can't do with a verbal phone call, and computer networks can operate asynchronously, so you can leave information for somebody and have them pick it up at their convenience.

He also makes predictions for computer games, first explaining that there are several types of games:

  1. Arcade Games (which are in turn split into 'invader games', 'defender games', and 'eating games')
  2. Strategy Games (chess, backgammon, etc.)
  3. Adventure Games (text-based interactive fiction)
Crichton dismisses computer games as "the hula hoops of the '80s," saying "already there are indications that the mania for twitch games may be fading." He thinks that parents should not worry about their children playing games because "it's a way of making friends with the machine." (That's not how I think about Tomb Raider 3, but to each his own.) He was wrong here, of course, and missed entirely how games would eventually drive the high end of the home computer market.

Most interestingly in his predictions, Crichton clearly expected that computers would soon be as normal as home appliances like washing machines. He never anticipated that, through vastly increased numbers and reduced cost, they would become omnipresent and perhaps invisible.

The book is little more than a collection of off-the-cuff musings, and as such the most interesting entry is probably "Microprocessors, or how I flunked biostatistics at Harvard" in which Crichton lashes out at a medical school teacher who had given him a 'D' fifteen years earlier.

This book is a curiosity, not worth buying at a garage sale unless you are a Crichton completist.


Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

Comments Filter:
  • Nostalgia (Score:1, Interesting)

    by Slashdotess ( 605550 )
    This brings back memories of the old days when I was a UNIX sysadmin. Back when we used to write pacman for the terminals and run it at 2 in the morning.
    I read Michael Crichton's book a few years ago and I'd just like to share my memories.
    • Re:Nostalgia (Score:2, Interesting)

      by ackthpt ( 218170 )
      Back when we used to write pacman for the terminals and run it at 2 in the morning.

      I recall playing VT100 invaders on a PDP 11/50, which was pretty darn cool. Among other games, all we lacked was 3D and third-dimension thinking. The logic behind the games still remains pretty much the same. (The fellow who wrote NBA JAM originally cut his teeth writing squash and tennis games to play on VT52 terminals.)

      Still, those behemoth VAXen, PDPs, System 370s, etc. did introduce those of us who tried to writing multiplayer/multi-access applications.

      • dang! You just brought forth great memories of visiting Pop's office at DEC in the early 80's and playing VT invaders... I'd totally forgotten about that ;)
    • Re:Nostalgia -- (Score:2, Interesting)

      by rupes99 ( 605886 )
      Brought to mind one thought... I got into computing in that era (built my own Sinclair ZX80), when you *had* to program yourself. That probably isn't a good thing... But somewhere along the way it has become harder & harder to actually program computers (not in the sense that it is more difficult as such, just that you now have to try and search it out). When did they stop supplying BASIC as standard with computers? Do kids who are interested download free compilers these days instead? Or do we encourage passive consumers?
  • I dunno (Score:2, Interesting)

    As an example of the first sort of passage, on page 140 he points out that if you ask your computer to compute 5.01*5.02-5.03/2.04*100.5+3.06+20.07-200.08+300.09/1.10, there will be a noticeable delay as it works out the answer.

    Considering I got my first computer in 1980 (a 4 MHz Z80-based TRS-80), I think I can say with some credibility that there would not have been a delay computing that, even using interpreted BASIC.

    On the other hand, those systems were amusingly slow by today's standards. As evidence, I submit that under interpreted BASIC, I had memorized how to produce a 1-second delay loop:

    FOR I = 1 TO 500:NEXT I

    Yes, 500 empty loops took 1 second.
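
    (Back-of-the-envelope, for the curious: if those numbers are right, a 4 MHz Z80 was spending roughly

    4,000,000 cycles/sec / 500 iterations/sec = ~8,000 cycles

    per FOR/NEXT pass -- plausible for an interpreter that re-interprets the loop's tokens on every iteration.)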

    • Re:I dunno (Score:5, Funny)

      by dubstop ( 136484 ) on Thursday November 21, 2002 @01:11PM (#4724442)
      FOR I = 1 TO 500:NEXT I

      Yes, 500 empty loops took 1 second.


      With most of the old BASIC interpreters, FOR/NEXT loops were slightly faster if the loop variable wasn't given after the NEXT statement. Therefore, if your code had looked like this:

      FOR I = 1 TO 500:NEXT

      You might have been able to squeeze a 1 second delay into 0.9 seconds.
    • Re:I dunno (Score:2, Interesting)

      by suitti ( 447395 )
      BASIC... FOR I = 1 TO 500:NEXT I
      500 empty loops took 1 second.

      A modern Palm OS machine running Cbasic computes this loop about 20 times faster (10,000 loops per second). On a 1 GHz desktop, the compiled C loop

      long i = 1000000000;   /* declare the counter: one billion iterations */
      do {
      } while (--i);

      executes in one second.

      Byte magazine ran an April 1st article where they predicted that by 2000, PCs would be 107 MHz, switch-selectable down to 4.77 MHz. The joke part was the switch-selectable feature, which was there to allow games that depended on the original PC's performance to continue to run. Naturally, a 486 clocked at 4.77 MHz was still much faster than an 8088 clocked at 4.77, so the switch was already an anachronism. But by 2000, a cheap PC was much faster than predicted.

      An interesting question is: how much performance is enough? In 1981, we'd run 30+ users on a 1 MIPS machine with 4 MB RAM. 1 MIPS is enough to do word processing and surf the web, but not enough to do full-screen video. A PII/350 is enough to do full-screen video. Is it enough to cope with current bloatware?

      Most of my cycles go to searching for aliens.

    • You're making a false assertion.

      This operation contains floating-point arithmetic. Just about all computers of the day lacked an FP unit; floating point was emulated in software.

      Now, think about how many dozens of cycles an FP divide STILL takes today with pipelined FP units. Then consider that these processors had 1/10000 the processing power of today's chips, and FP numbers had to be emulated by the ALU.

      We're talking TENS OF THOUSANDS if not HUNDREDS OF THOUSANDS of cycles for a single FP divide.

      The FP multiplications also take a sizeable amount of time, but nowhere near the time the divide takes. The additions/subtractions all require emulated FP as well. Considering a speed of only 4 MHz, the above would definitely take a noticeable chunk of a second, maybe even more than a second if my efficiency guesses are too optimistic.
      • We're talking TENS OF THOUSANDS if not HUNDREDS OF THOUSANDS of cycles for a single FP divide.

        What the hell are you talking about? It's not done using repeated subtraction, you know. As someone who has written a floating point package, I can say with certainty that it's not even near that (I can't give you a number because it depends heavily on what the assembly language has for operations).

        Put it this way: does it take you thousands of steps to do a long division? No? Then what makes you think a computer needs thousands of steps?

        Hint: it's a matter of shifting and subtracting. What makes it slower than multiplication is that you have to do test-and-borrow stuff.

        Hell, my little FOR loop above is doing 1000 floating-point adds/subtracts per second on top of the overhead of the BASIC interpreter (which is by far the most significant), although adds/subtracts are obviously faster than multiplies/divides.
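
        To make it concrete, here is the shift-and-subtract idea sketched in C. This is purely illustrative -- a real FP package does this on mantissas, usually in assembly -- but the step count is the point: one quotient bit per pass, 32 passes, a few operations each. Hundreds of steps, not tens of thousands.

        #include <stdint.h>
        #include <stdio.h>

        /* binary long division by shift and test-and-subtract
           (assumes d != 0) */
        uint32_t div_shift_sub(uint32_t n, uint32_t d, uint32_t *rem) {
            uint32_t q = 0, r = 0;
            for (int i = 31; i >= 0; i--) {
                r = (r << 1) | ((n >> i) & 1);  /* bring down the next bit */
                q <<= 1;
                if (r >= d) {                   /* test-and-subtract */
                    r -= d;
                    q |= 1;
                }
            }
            *rem = r;
            return q;
        }

        int main(void) {
            uint32_t r, q = div_shift_sub(100, 7, &r);
            printf("100 / 7 = %u remainder %u\n", q, r);  /* 14 remainder 2 */
            return 0;
        }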

    • Re:I dunno (Score:3, Funny)

      by mccrew ( 62494 )
      > Yes, 500 empty loops took 1 second.

      Sounds about right. You're talking about Java, right?

      Oh, BASIC.

      Nevermind...

  • by hermescom ( 624888 ) on Thursday November 21, 2002 @12:56PM (#4724285) Homepage
    Yeah, I remember back in the day, our computers were powered by squirrels, and you had to keep feeding them every five minutes, and every other Monday we would send somebody to the market to buy new squirrels. (Squirrels were 5 for $4.99 back then.)

    Also, our computer only had three bits of memory, so we really had to write everything down on little bits of paper, which was a problem because our spare squirrels kept carrying them away and hiding them.

    THOSE were the days...

  • by lamz ( 60321 ) on Thursday November 21, 2002 @12:56PM (#4724287) Homepage Journal
    I can understand why Crichton predicted that video games were a fad. Around that time, Intel had lost pots of money on Intellivision, Coleco was on its way to going broke because of Colecovision (and was only saved, incidentally, by the later success of Cabbage Patch dolls), and Atari had started its long slide into the ground. Many arcades started to move the video games to the back and the pinball machines to the front. Nintendo and Sega weren't on the radar yet, so it really seemed to a lot of people like video games were fading away. And as for PCs, it would be years before they had arcade-quality games which surpassed the Atari and Commodore lines of personal computers. PCs didn't typically have colour screens until the late 80s.

    • Intel had nothing to do with Intellivision. Mattel made it, based on a chipset made by (I think) GI. And Coleco wasn't going broke because of Colecovision; it was going broke because of the Adam. It took Nintendo (the only company "stupid" enough to believe there was still a market for video games, and to sell one) to bring the market back to life.

      As for Atari (Computer, not Games), if they hadn't shelved the 7800 for two years (it was manufactured and ready to ship, but they warehoused it at the last minute), or even if they hadn't refused the option on the NES, they might still be around today, and not just a name that changed owners twice so far.

    • However, Clive Sinclair had shipped the ZX81, Commodore had its PET going and the VIC-20 was just on its way in, and small 8-bit home computers were multiplying like rabbits! And the primary use of all of these was to play games.

      Sure, hardware where the only means of interaction was a joystick was well on its way out - it's only in the relatively recent past that consoles have made a return. The 80s and 90s were absolutely the realm of the home computer (as defined by including a keyboard), due to the fact that a few hackers could program them, and the rest of the world could play the games they created. The keyboard was less important for the many, but without it, the few couldn't produce the stuff that the many used.

      With that in mind, Crichton's guess was pretty short-sighted even at the time.

      It's rather the same way that after the crashes of Pets.com and other similar crap ideas, ppl spouted a load of nonsense about "the web is bad for business". The problem isn't the medium, the problem is that a bunch of business majors went hog-wild, didn't think the problem through, and consequently cratered their companies. Early-80s consoles had exactly the same problem due to the management of the various companies all doing *really* dumb things.

      Grab.

  • by Anonymous Coward on Thursday November 21, 2002 @12:57PM (#4724294)
    It is difficult today to remember how intimidating computers were for non-technical people in the early 1980s

    I guess my office is stuck in the 1980s...
  • Crichton & Timeline (Score:3, Interesting)

    by Dugsmyname ( 451987 ) <thegenericgeek AT gmail DOT com> on Thursday November 21, 2002 @12:58PM (#4724307) Homepage
    It's not surprising that Crichton did his homework before writing the novel "Timeline". Timeline is a great novel that involves the mechanics of quantum computing. He does a great job of breaking down how a quantum computer (of more than 5 atoms) could work. It's also worth a read.
  • by krinsh ( 94283 ) on Thursday November 21, 2002 @12:59PM (#4724315)
    I know several 'non-technical' people, and dozens more, technical and not, with computers on their desks, and they are still intimidated by them. That is one of the prime reasons [desktop] tech support people have some job security, and why most of the industry rags' "job market predictions" claim that tech support is the way to go if you want to keep feeding your kids with an IT-related salary. On the other hand, this book reminded me of the video game display I saw in the Baltimore Science Center (Museum?) at the Inner Harbor [my wife and I used to frequently trek there for weekend mini-vacations if we didn't have time for AC with her parents]. They had all sorts of old cabinets, beginning with a copy of the original Pong, and of course history and some video on how a lot of these were developed, and a few transparent ones so that you could see the boards and ROM and so forth. I wish I remembered the name, or that they had a book to go with that.
  • by j_kenpo ( 571930 ) on Thursday November 21, 2002 @12:59PM (#4724316)
    As the poster notes, this may not be a technically sound book, may not be worth owning, and shows signs of little research and quick writing, but I think it's still worth it. To compare and contrast predictions and attitudes from the past with the now is always interesting. It could have been anyone and it still would have been interesting. I remember my thoughts about computers at that time. I was fascinated by them, yet everyone was so paranoid because of the high cost that no one wanted to touch them. I always got the impression of "We need to have a PC, but don't touch it, it's too expensive." I'd be interested to read it just to see what others thought at that time about PCs. It'd make an even more interesting read, though, if it included that and then his opinions today on PCs.
  • Congo (Score:5, Informative)

    by Doctor Faustus ( 127273 ) <Slashdot@@@WilliamCleveland...Org> on Thursday November 21, 2002 @12:59PM (#4724322) Homepage
    That's scary that Crichton wrote a book about computers in 1983. In 1980, he wrote Congo (of his books, only Sphere is better, and Jurassic Park is about on par), which is a great book, but it demonstrated he didn't really understand computers terribly well.

    The expedition team is in the Congo in Africa, using satellite communication to the United States. Because bandwidth was so limited, their messages were abbreviated to IM-speak and beyond ("HLO. HW R U DOIN? MY NAM IS MKL CRITN.", etc.). Makes sense, I suppose. Along with this, though, they needed to do digital cleanup of a moss-covered wall, so they took digital video of the wall, sent it via satellite to the U.S. where it was processed, and received the results back in Africa, in real time. Right.
    • I have read the book, _Congo_, but I do have to say that the movie is one of the worst ever made. It contains every jungle cliche known to lifekind and is almost as bad as "Plan 9 from Outer Space", but not as bad as "Courtney and Kurt".
      • Movies (Score:2, Interesting)

        by Amata ( 554796 )
        All of the movie recreations of his books suck worse than most movie recreations of books. Somehow in JP2: The Lost World, the black boy and the white girl got condensed into a black girl who has some relationship with Malcolm that I never fully understood, because I was on a bus down to Fla when I was "watching" it.

        Re Congo specifically: it has been said that the only way to enjoy that movie is, with a group of friends, to have everyone pick a character. If your character survives, you "win". Win what, I don't know, but it at least keeps you paying attention.

        All in all, I suppose this is why I own a crudload of books and about 5 movies, and the movies were gifts.
    • Surely you jest.

      Have you read _Andromeda Strain_ or _The Great Train Robbery_?
    • by Fear the Clam ( 230933 ) on Thursday November 21, 2002 @04:01PM (#4725974)
      Disclosure is all about hammering out the problems in CD-ROM drive production, but meanwhile, the fact that the same company has solved virtual reality doesn't get a comment.

      Jurassic Park: Yeah, we can re-create extinct animals and basically fuck with genetics all we want. So what do we do with it? Open a zoo.

      Having "solved" time travel, in Timeline, what's it going to be used for? Stock market speculation? Changing history in a big land grab? No, an amusement park.

      Crichton stories are all about getting super powers, and then using them to order a pizza.
  • by craenor ( 623901 ) on Thursday November 21, 2002 @01:01PM (#4724333) Homepage
    When will people realize that as a society and a species, we are driven by our need to entertain ourselves?

    News, TV, Movies, Sports, Games...most all consumer products in some way pander to our need to make ourselves happy and distract us from the day to day.

    Face it folks...immersive games are here to stay. They are the electronic crack of the 21st century.
    • As animals we are driven by our need to reproduce. Games are one method of deferring that need into another accomplishment structure. I play a LOT fewer games when I have a girlfriend. And before you make the obvious joke, this isn't THAT rare of an occurrence for geeks.
    • These fads come and go. During the renaissance, curiosity and culture drove people. Before that, the Greeks were inventing hedonism. Still further back arts, crafts and skills were all important.

      The "Don't worry, be happy" phase we're going through now will eventually turn around.
  • by The Grassy Knoll ( 112931 ) on Thursday November 21, 2002 @01:03PM (#4724362)
    >It is difficult today to remember how intimidating computers were for non-technical people in the early 1980s

    Hello? Computers are still pretty intimidating for non-technical people in the early 2000s!

    That's why Code Red/[insert name of favourite virus here], etc. proliferated so widely. Most people don't understand computers even to the level where they know how (or why) to install security patches.

    • Security patches? I think you're shooting a little high. I doubt the technologically uninitiated even understand what a security patch is. Most of those email viruses were easily avoided by simply using an email program besides Outlook Express. Unfortunately, most people don't understand computers even to the level of being able to install and run programs besides the ones that came on their machines.
  • computers or VCRs for that matter, my 18-year-old niece says to me. I'm feeling pretty old, but it's fun to reminisce and see how far things have come in such a short time.
    • by Anonymous Coward
      It's only going to get worse. In a short span of time, I've seen home computers rise from Commodore 64s to massive gigahertz-driven monstrosities.

      We live in interesting times. Where other sciences have reached maturity, and breakthroughs come slow and are profound, computer science is still young. There's so much to learn, so many ideas that haven't even been touched upon. Our breakthroughs will not change the world overnight, but they have and will continue to come much quicker.

      In a best case scenario, I've only lived a quarter of my life. And I'm astonished. I sit back after playing the latest game and think, "Man, I remember back when everything was all pixellized.. And we thought the graphics were so realistic!"

      I look at Windows XP, and think, "We wouldn't have dreamed some of these things were possible back then, now we take them for granted."

      On a tangent, I think that's one of the lures of Linux. People who missed the original revolution due to being too young/too technically inept/etc., can take part in building something from the ground up.

      Anyway, ten years ago, who the hell would've dreamed we would be able to watch movies on our PC?

      Staggering.
  • "He was wrong here, of course, and missed entirely how games would eventually drive the high end of the home computer market.
    Most interestingly in his predictions, Crichton clearly expected that computers would soon be as normal as home appliances like washing machines."

    It's amusing to look back at how wrong he or others (Bill Gates with "nobody will ever need more than 64Kb" (paraphrase)) have been about their predictions, but it all goes under the heading of "Hindsight is 20/20", and I don't think we can fault Crichton for that.
    • I like an even better quote:

      ``I think there is a world market for maybe five computers''
      Thomas Watson, IBM, 1943

      and all Asimov books with their 'computers as big as a city'... those were the days...


      • I like an even better quote:

        ``I think there is a world market for maybe five computers''
        Thomas Watson, IBM, 1943


        That might have been true, for computers with the price/performance ratio of 1943.
    • I believe the quote was 640k (Gates later said that he never said it). 640k makes sense because that was the limit of memory a program could use (the rest of the 1 meg was reserved for the system and the BIOS, if I recall).
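
      For reference, the real-mode memory map being described looks roughly like this (the usual layout -- exact adapter ranges varied from machine to machine):

      0x00000 - 0x9FFFF   conventional memory, 640k (DOS and programs)
      0xA0000 - 0xBFFFF   video memory
      0xC0000 - 0xEFFFF   adapter ROMs / upper memory area
      0xF0000 - 0xFFFFF   system BIOS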
  • Since when were computers in corporate data centers or universities _not_ the responsibility of trained personnel???

    I didn't think they hired just anyone off the street to run these things.

    Oh wait... I forgot about those MCSE's.

    Perhaps they DO just hire any bum off the streets after all...
  • by Elwood P Dowd ( 16933 ) <judgmentalist@gmail.com> on Thursday November 21, 2002 @01:07PM (#4724401) Journal
    My grandma suffers from a number of ailments that restrict her movement. For a while my dad kept suggesting that she get a Mac to play around with. My mom's mom got one, and loved using it.

    Anyway. My grandma's problem wasn't that she was scared of using a computer. She'd say, "You don't know what you're talking about. I used to *run* a computer. I know all *about* computers. What the hell do I need a computer for?"

    She used to be the administrator in charge of the computer for the Grand Rapids Police Department. In the 1950s. Punch cards. Hehe. Old people are funny.
    • One question pops into my head after reading this: does your grandmother watch TV? If so, she must realize that what your dad is talking about is vastly different than what she remembers. At least your grandmother knows what a computer is. I haven't even tried to introduce my grandmother to a PC for fear that she will have an aneurysm trying to understand it (I am not being (completely) sarcastic).
      • My grandparents don't have a phone, cable tv, or even an ATM card. I couldn't imagine trying to explain what I do from day to day at work.

        I imagine that someday my grandchildren will have jobs that I am unable to understand the need for.
        • My parents are now in their mid-50's, and they've managed to cope with computers and stuff just fine. However, my grandparents were never really able to get the hang of new technology. Funny thing is, according to my parents this had been the case since they were children.

          I think the main difference is that my parents received a much more comprehensive education than my grandparents, and consequently didn't treat new technology as some sort of voodoo.

    • Punch cards. Hehe. Old people are funny.

      Heh, heh......I ain't laughing. I still find rubber bands in my moving boxes from the days of punch cards.

    • My co-worker is better...

      She's this lovely lady in her sixties. She's doing office admin work here in our small office, just for spending money for when she flies to Palm Springs every winter. Until this job, she hadn't touched a computer in literally decades! It took her a while to get used to the idea of a mouse and a GUI, not to mention the flakiness of Office, but she's learning great. However, she used to be a crackerjack COBOL and FORTRAN programmer, so once in a while she still amazes me. She says she misses the old UNIX command line, and all that came with it.

      Boy was she mad when she found out she couldn't have a look at the source code for MS Word, to see where the bugs were!

      Bork!
    • It's like my dad... he worked as a datacomm manager for 25 years and through that time worked on some amazing systems (for their periods: the 1401, System/360, etc.).

      To this day he does all of his taxes with a pencil and accounting paper and when my mother bought him a calculator in the early 80s it collected dust as he said, "What do I need a calculator for? I have the most powerful calculator in the world on the top of my head!"

      He says that today about PCs :).
    • That's nothing. My great aunt used to hang a towel over the TV whenever anyone was getting dressed in the house.

      There's wisdom in age, but there's a lot of madness too.
  • funny book (Score:4, Interesting)

    by iocat ( 572367 ) on Thursday November 21, 2002 @01:09PM (#4724423) Homepage Journal
    I have the book; it's okay. It has a lot of BASIC listings in the back. I love the way older media on computers just assumed that you'd need to be able to program, and to know how a microprocessor works to get any value out of the machine: I only wish it was still like that today.
    • "I love the way older media on computers just assumed that you'd need to be able to program, and to know how a microprocessor works to get any value out of the machine: I only wish it was still like that today."

      Why would you want that? Do you not remember how in the 80's, every brand of computer (Apple, Atari, Commodore, etc) was incompatible? Often older models weren't compatible with newer ones -- C64 and VIC-20, Apple 2 and GS, etc.

      Even now, architectures change quickly -- look at the changes in video cards and CPUs over the past 5-10 years. If you write code for a specific graphics card or CPU, it will be "out of date" in just a few years. Consider this: in 20 years, we've gone from 8-bit CPUs on the desktop to 32-bit CPUs. AMD's Hammer has the potential to make 64-bit desktop computers common. For most people, knowing the specifics of the CPU is more info than is needed -- let the compiler do the optimizations.
  • >Crichton dismisses computer games as "the hula hoops of the '80s", saying "already
    >there are indications that the mania for twitch games may be fading."

    To be fair to Crichton, he was basically right for about 10 years. It wasn't until Doom and the original PlayStation that computer games were noticed by the mass market and became more than children's toys or a specialist niche hobby.

    I haven't read the book, but I'd like to know how long the reviewer thinks the book remained relevant. Anything over a decade would be pretty impressive.

    Simon Hibbs
    • I don't know. During your dark age of gaming, the Amiga was one heck of a great games machine.
    • To be fair to Crichton, he was basically right for about 10 years. It wasn't until Doom and the original PlayStation that computer games were noticed by the mass market [more]

      I guess you are right, if you choose to completely ignore the millions of dollars that went into Amiga games. Thousands were made; some sold a million. In fact, in the late 80's/early 90's the PC was the complete laughing stock of the gaming community (4096-color Amiga or 16 (fixed) color EGA PC?). It wasn't until Doom that people even started switching.

    • Eh?

      Spectrum, C64, Amiga, Atari ST - they all had a totally mass market for games.

      Then as now, the core market was the same - mostly teenage boys. Sure, there's plenty of adults today with PS2s and hardcore gaming rigs, but there were similar adults back then with the latest home computer (e.g. the Amiga A2000). This is especially the case for the Amiga, which was *way* ahead of the PC but got slammed by Commodore screwing up and going bankrupt by investing in too many shit projects. For example, StarGlider 2 was 5 years ahead of X-Wing, and conventional flight sims were ahead by a similar amount. The area which saw particularly impressive advances was the graphical adventure, which was the precursor to all the Tomb Raider stuff - the Amiga and Atari ST were the prime movers in this.

      Remember that Doom was a 2-D hack which just looked like 3-D. It wasn't until Quake that 3-D cards became required. The Amiga had 3-D games throughout though - the only thing that wasn't invented was the first-person shooter. Had some bright spark come up with that (and they certainly had platforms that'd support it!) then things would have been quite different. The only issue was that one type of game, it was nothing to do with the platform.

      Grab.
  • on my P4 1.6GHz... (Score:1, Interesting)

    by Balerion ( 25115 )
    ben@mojo ~/test$ time echo '5.01*5.02-5.03/2.04*100.5+3.06+20.07-200.08+300.09/1.10' | bc
    -80.80

    real 0m0.006s
    user 0m0.010s
    sys 0m0.000s
    • bc
      5.01*5.02-5.03/2.04*100.5+3.06+20.07-200.08+300.09/1.10
      -80.80
      scale=5
      5.01*5.02-5.03/2.04*100.5+3.06+20.07-200.08+300.09/1.10
      -126.79155

      It'd be nice to get something like the right answer.
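
      (The culprit is bc's scale variable: division results are truncated to scale decimal places, and scale defaults to 0, so both divisions above are truncated to integers before anything else happens. Here is a rough C illustration of the effect -- my own sketch, which truncates only the divisions, the part that dominates here:

      #include <stdio.h>
      #include <math.h>

      /* truncate a/b to 'scale' decimal digits, as bc does for division */
      double bc_div(double a, double b, int scale) {
          double p = pow(10.0, scale);
          return trunc(a / b * p) / p;
      }

      int main(void) {
          double bc_style = 5.01*5.02 - bc_div(5.03, 2.04, 0)*100.5
                          + 3.06 + 20.07 - 200.08 + bc_div(300.09, 1.10, 0);
          double full     = 5.01*5.02 - 5.03/2.04*100.5
                          + 3.06 + 20.07 - 200.08 + 300.09/1.10;
          printf("scale=0 divisions: %.2f\n", bc_style);  /* about -80.80 */
          printf("full precision:    %.5f\n", full);      /* -126.79218  */
          return 0;
      }

      Even at scale=5, bc's -126.79155 differs from the full-precision -126.79218 because the truncation still happens, just five digits further out.)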
  • Now, it's the robots (Score:1, Interesting)

    by erixtark ( 413840 )
    The development of robots is very similar to that of the personal computer in the 70s and the 80s. In 20 years we will remember this time as the last years without a robot assistant.
  • by ackthpt ( 218170 ) on Thursday November 21, 2002 @01:15PM (#4724475) Homepage Journal
    Pick on Crichton if you must, but even Bill Gates failed to anticipate the impact of the internet in "The Road Ahead". They're not alone. The textbooks I was assigned to buy for college classes, back in 1983, were as fanciful and completely off target as Crichton and Gates.

    A more enlightened approach would have been to observe what people were actually doing and how a vastly faster computer of small size might be usable to them, in ways other than balancing their checkbook.

    Science fiction and some really old comic books are frequently amazingly on target, although they still depicted computers as being massive things.

    • Oddly enough, one of the few people I've read who was making an awful lot of future-oriented predictions about media, telecommunications, and the increasing importance of networks was Marshall McLuhan. It's really amazing how much McLuhan got right, considering that he died in 1980, and attained the height of his fame in the late 1960s. Good places to start for your elementary McLuhan education are Understanding Media and Counterblast, which seems gimmicky, but manages to reduce McLuhan's prolix yet lucid prose into catchy aphorisms and illustrative graphics (thanks to the design work of Harley Parker).

      Relatedly, I have an issue of Amazing Stories with prognostications on "Life In The Year 2000." I'm still waiting for the 4-day, 20-hour, full-pay workweek, the spray-washable house, and the cheap energy [publicpower.ca]. The moral of the story is, you may get some of it right some of the time, but you're never going to be 100% on the money.
  • by i_want_you_to_throw_ ( 559379 ) on Thursday November 21, 2002 @01:15PM (#4724479) Journal
    We were in high school in the early-mid 80s.
    Just the perfect time for that expert blend of
    1. Low self esteem
    2. Teenage years
    3. Dawn of the PC.
    to bring us to where we are now...still dateless and coding.

    Geek used to be a 4 letter word, now it's a six figure one.
    • 4. Period of time when you could still get at the guts of the software. Apple IIs had a nice built-in debugger, a programming environment, and you could see the source for a tremendous number of programs. Same for similar computers of the time...

      Now, there's so many layers that it's kind of intimidating to a new developer...
    • It is difficult today to remember how intimidating computers were for non-technical people in the early 1980s [...]
    ... this reminds me of one of my favourite bits of "latter-day arcana": Back in the olden days, when small computers were bigger than large cars, managers of large companies introducing the behemoths were on occasion given (children's) Ladybird [kinetonbooks.co.uk] books, specifically the one [digibarn.com] titled "The Computer [commscene.com]", as a primer.
    (Come to think of it - I wonder who had the means to put them through the embarrassment ...)
  • by VikingBerserker ( 546589 ) on Thursday November 21, 2002 @01:21PM (#4724523)

    20 years ago: Prepare now, for computers will become an important part of your life.

    Today: Those of us who use computers most often today tend to have no life.

  • by circletimessquare ( 444983 ) <circletimessquar ... m ['gma' in gap]> on Thursday November 21, 2002 @01:22PM (#4724539) Homepage Journal
    games won't last... heh.

    crichton's take on video games reminds me of what some futurists said around the birth of the television.

    they said that the television was going to be a great instrument of education, and bring thousands into enlightenment.

    yeah, right. -insert ironic tv laugh track here-

    i guess crichton fell into the same trap as many futurists: technology as savior. a lot of us see new technology and envision how it will improve us all.

    meanwhile, some guy somewhere is writing the first donkey kong game. somewhere there must be a graph comparing how many cpu cycles of all of the processors ever made have been spent playing games versus other computer-related exploits. it would be an interesting comparison. as a victim of civilization iii, i can attest to the fact that a lot of the good electronic life is spent taking a lot of digital crack. ;-P
  • Cut and paste it into CALC.exe (yes, windows),
    press enter.

    -126.79217967914438502673796791444

    I love fast machines.
    Cool :-)
  • by mttlg ( 174815 ) on Thursday November 21, 2002 @01:30PM (#4724607) Homepage Journal

    Perhaps the most interesting part of this book is where Crichton discusses copyright. He takes the opinion that copyright will need serious reform as the amount of electronic content increases because of the simple fact that people want to copy (he cited the success of VHS over laserdisc to support this position). This jumped out at me because I read the book back when Napster was at its peak. Unfortunately, Crichton seems to have underestimated the power of the entertainment industry - the DMCA is almost the exact opposite of what he envisioned as the future of digital content. Maybe Crichton's next novel will be about a group of people who narrowly escape death while attempting to view copyrighted material they legally purchased...

  • I have not yet read Electronic Life, but I have always been impressed with the level of research Crichton's narratives display, and while this was one of his earlier works, I wouldn't be surprised if he decided not to include all the material his research revealed for the sake of readability.

    Jurassic Park stands out in my mind as Crichton's most well-researched work, so much so that most people I know who didn't enjoy reading it say it was because he dwelled on the science at the expense of the narrative too often. I like Crichton's approach personally, because for me it grounds the story better in the science of today, and shows the reader how his world developed out of our own without fantastic leaps of science. Authors like Gibson, who write great narrative and have turned out to be prescient about technology but never dwelled on how their world came to be out of the one we live in, have always felt more like fantasy than science fiction to me. This is because I don't feel as connected to their world without them illustrating a plausible course of events that could lead society from where it is down a path to the world that they envision.

    In any case, I was already planning to purchase Crichton's newest work, Prey [amazon.com], but now I'll have to go grab this one as well.

    • I can't decide if this is a well executed troll, or just someone who really believes what he is saying. I certainly find it goes against everything *I* know about Crichton's work...

      If it is a troll, comparing Crichton to Gibson is a masterstroke!
      • It is not meant as a troll. Don't get me wrong, I like Gibson. But he tends to do a lot of hand-waving - how AI, surgical and cybernetic augmentation, private space stations, and VR so immersive that it can kill you came to be commonplace is all glossed over. It is left as an exercise for the reader's imagination how all this came to be. Crichton, on the other hand, ties all of his science fiction to science fact: insects in amber to DNA to supercomputing gene sequencers to overambitious developers and their patented living creations.

        I don't fault Gibson at all, because the world he created was so far removed from the one that he actually lived in, but for me the suspension of disbelief is much more easily conjured when I start in the concrete and fact-based and am led to the what-if through the narrative. Gibson had no choice but to start with the what-if, and for that reason I could never feel as immersed in his world as I can in Crichton's. At the same time, Gibson scores more points than Crichton for his prophetic prediction of the 'Net, and as of yet no T-Rexes or Compys have shown up on the mainland ;)

        As much catching up as reality has done since Neuromancer and its sequels were conceived, the reader has to invent for himself the paths that lead from the world as we know it to the world in which Gibson's characters operate. Crichton draws that path much more clearly, and for that reason I find it much more plausible. I will be the first to admit that plausible doesn't directly equate to enjoyable, as I enjoy George Lucas' work to no end, but in my mind:

        Star Wars and Neuromancer are, in my opinion, great works of technologically themed fantasy, whereas I think of Jurassic Park and Andromeda Strain as great science fiction. The distinction between science fiction and fantasy has been hotly debated on /. before. I don't consider myself an expert on either, but perhaps this better states my opinion on the matter.

        • Well, Gibson and Crichton are doing different things, aren't they? Crichton focuses on tracing a line from Today to Tomorrow, with a plot about hubris as a substrate in which to suspend the results of his research. He is also obviously weak at character development, and it's generally quite obvious that the plot (man vs. technology or whatever) is subordinate to the story (how today's technology came to be, or how today's technology becomes tomorrow's technology). Airframe is an excellent example of his preference for research data over character development.

          Gibson, meanwhile, belongs to that group of SciFi writers who prefers to focus on how individuals and societies might be changed by some future technology, and never mind how the tech got there or how it works. For my taste, Gibson is also much better at developing interesting characters--so much so that the "Virtual Light" trilogy took me a little while to warm up to. It was very character focused, and the blinkenlights were kept more in the background than I was used to. Nowadays, though, my fave "SciFi" author is Iain (M.) Banks, whose stories are all people/societies, and when the tech comes up at all it's usually just handwaving on the order of "here's a glossy brochure about technobabble, but let's set that aside for now and talk about the characters some more".

      • I never understood the draw of Gibson. Yes, he contributed a great deal to later authors, and his books are at least enjoyable, but IIRC from reading a HyperCard version of some of his earlier books, they tended to be confusing...
  • by garyrich ( 30652 ) on Thursday November 21, 2002 @01:38PM (#4724690) Homepage Journal
    "Crichton writes science fiction"

    He does? I've never seen any. He writes technological thrillers. From Andromeda Strain (a good one) to ER (a mediocre one).
    • He does? I've never seen any. He writes technological thrillers. From Andromeda Strain (a good one) to ER (a mediocre one).

      Eh? Most of Crichton's works are based on science that either doesn't exist yet, or hasn't been used in the ways he's conjuring it up... that's what defines science fiction.

      Jurassic Park (dinosaurs haven't been cloned), Andromeda Strain (more back then, before we saw ebola), Sphere (alien object that can manifest our subconscious thoughts) - all great sci-fi works, even though they aren't set in a futuristic outer-space setting.

      I kinda wish ER were sci-fi - it'd make it more interesting. "Patient is a three-year-old android with a possibly faulty BIOS. Somebody get me some nanoprobes!"
      • Well, veering off topic, but since two people asked:
        His work usually has science or technology *in* it, but it takes more than just that before I'd call him a science fiction writer. It's hard to write contemporary fiction that doesn't, after all.

        Jurassic Park is a good example. It's related to science fiction, but I don't think it really is science fiction. It's King Kong updated with some technospeak wrapped around it to make it seem plausible. The fact that the technospeak is fairly good makes it a better giant monster movie, but not science fiction. The science/technology is there to support the giant monster story. A science fiction story would start the other way around, with a "suppose that you could harvest DNA from extinct species - what would happen?" There are lots of good science fiction stories that could come from that "what if", but none of them are Jurassic Park. Going from zero to a giant dino amusement park in one go just doesn't make sense. By the time the tech was advanced enough for that, it would have crept out and shown up in a million subtle and non-subtle ways through the society. A science fiction writer would have tried to deal with all those things. JP doesn't, because the author isn't interested - he's just trying to make a monster book/movie.

        The science/tech is like Batman's utility belt on the old TV series - it just has whatever is needed to get out of the current plothole. I don't consider Batman to be science fiction either, or King Kong, or James Bond. Star Wars gets in, I guess, because "everyone" considers it science fiction - even though it is devoid of science or the social effects of science.

        Further, I don't think Crichton considers himself a science fiction writer. He writes mainstream fiction that tends to use current or very near-future technology. If nothing else, he makes *far* too much money to be a science fiction writer :-)
        • The fact that the technospeak is fairly good makes it a better giant monster movie, but not science fiction.

          You're right: I think most of Crichton's works would fall more under the 'monster' genre, and I would never think of James Bond as sci-fi, despite the definition I gave. Just hadn't thought about that when I fired off my original post.
  • Jargon (Score:5, Funny)

    by Insightfill ( 554828 ) on Thursday November 21, 2002 @02:00PM (#4724887) Homepage
    defined basic jargon

    When I was in mainframes in the early 80's, the mainframe repair guy had a good one.

    He was on the phone talking to the refrigerator repair guy and told him:
    Tech: "My refrigerator is down."
    Repair Guy: (longish pause) "'Down?' where?"

    Today, that probably wouldn't have been a big deal.

    OTOH, that was also a job that had so conditioned me that I started to type a "9" to get an outside line on my home phone.

    (good grief: I'm 34 and talking about the "good old days")

    • When I was in mainframes in the early 80's...

      (good grief: I'm 34 and talking about the "good old days")

      Good grief, you're referring to working on mainframes (when you were 14!) as the good ol' days :-).

  • It seems clear to me that Crichton's 'Non-fiction' on technical subjects is even worse than his 'Science Fiction' is when it comes to science.

    Glenn Reynolds [instapundit.com] has been wondering just how much Crichton's new novel (on Nanotechnology) will get wrong or sensationalize [instapundit.com]. The worry being that Crichton could easily cause an anti-nano-science backlash by putting the phear of grey goo into Joe Sixpack...
  • by Tri0de ( 182282 ) <dpreynld@pacbell.net> on Thursday November 21, 2002 @02:12PM (#4724996) Journal
    The first practical book on computers I ever saw: "Peter Norton's Guide to DOS". I still remember his premise:
    you show up to work one day and there is a computer sitting at your desk; you haven't seen one before, don't want one, AND the boss is expecting you to become vastly MORE productive. Now.

    Anyway, that is the supposition he started the book from. Good book as I recall, no BS. So where some people saw panic, or hyped everything up, others saw and seized opportunity.
  • ...starring russell crowe as 'everyman' with special appearances by tron's master control program, and an original soundtrack by thomas dolby...
  • Digital - Feh Feh I say !!!!

    We used to race computers to a solution using a circular slide rule. We won about half the time.

    Oh did I mention it was mostly solid analytic geometry problems?
  • by meehawl ( 73285 ) <meehawl...spam+slashdot@@@gmail...com> on Thursday November 21, 2002 @03:45PM (#4725842) Homepage Journal
    My favourite book from this era is definitely Steven Levy's Hackers: Heroes of the Computer Revolution [amazon.com]. Worth it for the hottub tales alone.
