Human-Computer Interaction in the New Millenium

Long-time reviewer clampe writes with this piece on Human-Computer Interaction in the New Millenium. This is not a book you're likely to find at the corner bookshop, but if you're serious about keeping track of goings-on in the field of HCI, Cliff argues this one is worth seeking out.
Human-Computer Interaction in the New Millenium
author: John M. Carroll (Editor)
pages: 703
publisher: Addison-Wesley
rating: 9
reviewer: Cliff Lampe
ISBN: 0-201-70447-1
summary: Academic HCI lovefest.

Reviewer's Note:

Most of the people in the book I'm reviewing could crush me beneath their heels, given that I'm a lowly doctoral student in the HCI field. However, it's not a simple question of whether the collection is good or bad, but whether it will be good for the reader in their context. Besides, I can give you good inside information on lots of the authors. For instance, George Furnas, as cool a cat as you'll meet, gets nervous when he does magic tricks, and Paul Resnick picks a mean fiddle. Yep, I got tons of dirt.

The Scenario

Anyone who has taken an HCI class has probably come across a gigantic blue paperback called Human-Computer Interaction: Toward the Year 2000, which has acted as a de facto text in HCI classes in the past. In 1998, leaders in the HCI field realized that this book would soon be obsolete, and started organizing the players who would contribute to a worthy successor. This book is a collection of 29 articles by leading researchers in the HCI academic research community, and it attempts to outline the research programs that will dominate the HCI field, if not for the next millennium as advertised, then at least for the next 10 years. The book is divided into seven sections:

  1. Models, Theories, and Frameworks
  2. Usability Engineering Methods and Concepts
  3. User Interface Software and Tools
  4. Groupware and Cooperative Activity
  5. Media and Information
  6. Integrating Computation and Real Environments
  7. HCI and Society

Each section has three to five articles on its topic. Examples of the research include:

  • Terry Winograd proposes a conceptual framework for the design of interactive spaces, or, more basically, computing environments built into the architecture of a space and seamlessly integrated with personal context.
  • Hollan, Hutchins and Kirsh follow up some of Hutchins's work on distributed cognition as an HCI research area, including a call for more ethnographic studies in the area and a better understanding of how people and tools interact.
  • Olson and Olson outline the problems of collaboration over distance, and describe situations in which distant work makes more sense than not.
  • Terveen and Hill give a great review of work in collaborative filtering, and then outline several approaches to making recommender systems better able to return positive hits (a toy sketch of the idea follows this list).
  • Doug Schuler in one article and Paul Resnick in another argue that HCI issues go beyond desktop computing or small groups and can be applied to larger groups, including communities both online and off.
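
To make the recommender idea concrete, here is a toy sketch of user-based collaborative filtering in Python. It is my own illustration, not Terveen and Hill's method or anything from the book; all names and data are invented. The gist: score items a user hasn't seen by weighting other users' ratings by how similarly those users rate.

    import math

    # toy user -> {item: rating} data, purely illustrative
    ratings = {
        "ann": {"a": 5, "b": 3, "c": 4},
        "bob": {"a": 4, "b": 2, "c": 4, "d": 5},
        "cam": {"a": 1, "b": 5, "d": 2},
    }

    def similarity(u, v):
        # cosine similarity over the items both users have rated
        common = ratings[u].keys() & ratings[v].keys()
        if not common:
            return 0.0
        dot = sum(ratings[u][i] * ratings[v][i] for i in common)
        nu = math.sqrt(sum(ratings[u][i] ** 2 for i in common))
        nv = math.sqrt(sum(ratings[v][i] ** 2 for i in common))
        return dot / (nu * nv)

    def recommend(user):
        # predict a score for each item the user hasn't rated yet
        scores = {}
        for other in ratings:
            if other == user:
                continue
            w = similarity(user, other)
            for item, r in ratings[other].items():
                if item not in ratings[user]:
                    scores[item] = scores.get(item, 0.0) + w * r
        return max(scores, key=scores.get) if scores else None

    print(recommend("ann"))  # -> "d": bob rates much like ann and loved it

Real systems refine this endlessly (normalizing for rating scales, coping with sparse data, and so on), which is exactly the territory the Terveen and Hill chapter surveys.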

Other topics include situated computing, participatory design, new user interfaces like tangible user interfaces or gesture recognition, cognitive modelling, and so on. Some common themes emerge: the expectation that user interfaces need to go beyond the desktop environment, the application of HCI principles to units larger than the individual or small group, the importance of groupware, and the development of a unifying theory for the field.

Really, one could write a pretty long review of any of the 29 chapters, since each one carries serious weight, as well as an innovative edge, as these investigators attempt to outline directions for the next several years. Some of the articles included here have already struck a chord in this research community and have become widely cited in their draft forms, or from appearances in special journal issues. Most chapters originally appeared as articles in the journal Human-Computer Interaction, or were specifically solicited by John Carroll.

The Good and the Bad

These are some heavy hitters. The authors list reads like my general prelims, and it takes someone like Carroll to pull together a group like this. Each of the 29 articles stands strong on its own, and though one may quibble with claims here and there, together they paint a remarkably cohesive picture of the area as a whole. This book contains serious research in a single bound volume that should grace the desk of any person interested in HCI issues. It is simply unarguable that this is going to be the HCI book for the foreseeable future.

The book bears one of the problems of the field: it comes from a specific set of disciplines, like cognitive psychology and computer science, and so may overlook applicable theories from other disciplines. That is the nature of academic boundary making, and is not the specific fault of the book. Just so you are aware of it.

And speaking of academics, some readers may be turned off by the academic edge of this book. HCI in general has always had a foot in both the university and the corporate sector, as evinced by the list of speakers at this year's ACM-SIGCHI conference, but this book tends towards the academic side. Although specific applications get mentioned here, large parts of the book may be a turn-off to people like my brother-in-law, a sysadmin who is definitely not interested in new macrotheory for HCI research. Or shaving.

This book takes commitment. It is not for lily-livered pedants who want something to fill the space until the next Harry Potter book comes out. That's neither good nor bad, just fair warning. Don't expect this to be as eminently accessible as a Don Norman book. Still, as with most things, the work is very worthwhile.

So What's In It For Me?

It seems that in every field there is That One Book that people will point you to as the ultimate source for quickly getting a sense of what it is all about. This book plays that role for the HCI field. If you are at all interested in the state of HCI research (mostly in the U.S., of course), then this is the book you should get. Even if you are already some tricked-out super-HCI guru, there is likely to be research in here from outside your specific area that you will get value from.

This is not a book for someone who has to do a usability test for the boss next week and needs to know how to conduct one. Nor will this book tell you how to make your website look really cool. What it will do is give you incredible insight into the history and future of an exceedingly interesting field of endeavor.


Cliff is a doctoral student at the University of Michigan School of Information, studying in their Human-Computer Interaction program. He plans to be a contributing author in the next version of this book. You can purchase Human-Computer Interaction in the New Millenium from bn.com. Want to see your own review here? Just read the book review guidelines, then use Slashdot's handy submission form.

  • by yatest5 ( 455123 ) on Wednesday April 24, 2002 @10:10AM (#3401026) Homepage
    Why don't they write a book on Human-Human interaction for Computer Science students ;-)?
    • No book is required. Anyone who's seen Monty Python and the Holy Grail three times is standards compliant (they understand references to the film). By complying with this standard, you are able to interact with anyone worthwhile. Since the rest of the human race cannot agree to watch the film more than once, interaction with them is impossible. You can't interact if people can't agree on a protocol!

      Seriously, many books have been written on the subject of enabling human-human interaction. A number specifically target CS students. Reading them will slowly transform you into an MBA, killing you on the inside. IS THAT WHAT YOU WANT?
    • Re:Why don't they? (Score:1, Informative)

      by Anonymous Coward
      No, no, it won't turn you into an MBA. You're reading the wrong book.

      What it *will* do is teach you that software is only as good as its interface, which can be frustrating. You can't have a good interface to bad software, but you can have a bad interface to good software, and then it's unusable.

      Some handy books:
      • Tog on Interface (out of print but findable)
      • Edward Tufte's trilogy: The Visual Display of Quantitative Information, Envisioning Information, and Visual Explanations

      Interface design is abstract; the issue isn't how to make a button pretty, but how to make it appear in the right place at the right time, in the right color, and at the right size, so your audience understands what they need to understand and isn't distracted by irrelevant details.

      -Niko
    • Why don't they write a book on Human-Human interaction for Computer Science students ;-)?

      They have [oreilly.com].
  • In a nutshell... (Score:3, Interesting)

    by NetRanger ( 5584 ) on Wednesday April 24, 2002 @10:13AM (#3401042) Homepage
    I think the debate really comes down to a simple question -- will computers be integrated into existing appliances, or will they remain their own class of appliance?

    The answer to that question will mold how computers are made. To some degree you can see this train of thought happening on both sides already. Apple's styling, for instance, suggests the computer as its own appliance. It's friendly, but obviously its own class of household machine. And it's made to integrate into the household environment (in other words, their computers don't look ugly).

    On the other hand, you have the ideas of appliances like the Maytag Neptune, for instance. The on-board computer can solve any of your stain dilemmas, but lacks the capabilities of a full-blown desktop computer.

    In the end I think the school of thought which advocates molding the computer into more traditional appliances will ultimately become a niche market, and the computer will remain its own appliance, with the learning curve becoming shallower as interface design advances.
    • Re:In a nutshell... (Score:2, Interesting)

      by Anonymous Coward
      The days of the computer as its own class of device are almost over for all but the highest-end users.

      Low-end computing (web browsing, email, word processing) will become the domain of new appliances. While these machines will become easier to use, they will severely restrict what a user can do with them (RIAA, DMCA, etc.).

      The all-purpose devices we currently own will live on in the hands of power users. However, they will not see significant improvements in what is traditionally considered 'usability'. If anything, the learning curve will get steeper. The problem is that truly complex tasks cannot be made self-evident. No amount of HCI research will make them simpler. The best we can hope for is increased consistency and predictability.

    • If it's easier to use and quick to learn, it'll be adopted. If not, it'll be consigned to the junk drawer.

      Remember when people said we'd all be wearing digital watches and looking at digital clocks and kids today wouldn't know how to read time from a clock with hands? Snicker. Score one for a tried and true analog design. :)

      • Bad analogy... clock choice is often as much an aesthetic consideration as a practical one. I don't care how cool or usable your digital watch is, an analog watch _looks_ better. Same goes even more for the clock on the wall.

        I also recall in 8th grade (about 24 years ago) that there was a kid who couldn't read analog clocks, because there were none in his home.

        Here's a better analogy: how many analog radio tuners are being made these days? If there are any, it's strictly a function of cost.

      • The clock thing may also be skewed because some states include analog clock reading in mathematics curricula.

        "Easier is better" doesn't always work. For example, we aren't using the metric system, and it certainly facilitates simple calculations (six feet seven inches minus fourteen inches) as well as complex conversions between units (mass, volume, etc).

        thanks,
        cbd.
        • My observation is this: associating a digital readout with a time of day takes more mental effort than reading the relative position of hands on a clock face. E.g., if the hour hand is near 12 and the minute hand is between 6 and 12, it's getting near noon.

          The effect is clearer when you compare reading 3 raw numbers against seeing their relative shares of a whole in a pie chart. It's comparative, rather than exact. A digital stopwatch may be the best tool for timing at the track, but for relative judgments we can place the time within the day from one glance at a clock face faster than we can translate digits from a digital clock.

          It's not just style (though an analog clock is more easily made stylish than the attempts at gold digital watches; my dad has a gold TI LED watch from the '70s); it's how easily the interface tells us what we want to know that attracts us.

          Lord knows, I have enough on my mind without trying to watch the rate of change on a digital display to work out my acceleration, instead of just glancing at an analog speedometer.
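
          For the curious, the translation the analog face does for you is simple arithmetic (a quick sketch of my own, not anything rigorous):

              def hand_angles(hour, minute):
                  # degrees clockwise from 12 o'clock for each hand
                  minute_angle = minute * 6.0                     # 360/60
                  hour_angle = (hour % 12) * 30.0 + minute * 0.5  # 360/12, plus drift
                  return hour_angle, minute_angle

              print(hand_angles(11, 40))  # (350.0, 240.0): hour hand near 12,
                                          # minute hand between 6 and 12 -> near noon

          The analog face computes this mapping for you; the digital display makes your head do it.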

      • A friend of mine who joined the US Army a couple of years back was unable to read a non-digital clock when I began associating with him after high school.

        I clearly recall my bafflement the day I had to teach him this simple task. Though I have to admit, after a while my amazement became acceptance of the inevitable...

        I hope he's done well in the army - having someone paid to think for him is what he desperately needs.
    • Appliance evolution (Score:5, Informative)

      by Nooface ( 526234 ) on Wednesday April 24, 2002 @10:51AM (#3401222) Homepage
      The problem with computing appliances right now is that human-factors engineering has been largely neglected by the computer industry over the past few years. Most computer suppliers have focused on lowering costs rather than pushing the envelope on hardware design and really improving usability. At the same time, software usability improvements have been slowed by the lock-in of the WIMP model (windows, icons, menus, pointer) imposed by Windows and other systems, which has frozen the UI state of the art at 1984 levels.

      If you are interested in the evolution of appliances, this summary in MIT Technology Review [technologyreview.com] provides an interesting glimpse of how handheld computing could evolve in the future. It questions the assumption that cell-phone Web browsers and pen-based computing will be the dominant paradigms, betting instead on thumb keyboards [nooface.com] and portable hard drives. Some interesting market statistics are revealed, such as that 52 percent of households in the 25 largest urban markets in the United States have cell phones (compared to more than 75 percent in some European countries such as Iceland and Finland), and that worldwide sales of digital organizers were 12 million units, while digital camera sales were 6.4 million units last year (3 million MP3 players are expected to be sold this year). These kinds of numbers show that a breakthrough for computing appliances may be near.
    • First: love the sig.

      I see a future where there are powerful computers which are used just like our computers today.

      There will be devices throughout your house which may connect to this main computer (so you don't have to get up to see what is in the fridge) - but they will stand on their own as well.

      As high-speed internet trickles into every home, the reality of these devices will become clearer.

      Maybe your toaster is a computer, but simply a dumb terminal, as your 10GHz desktop does all the work.

      We are already seeing devices spin off of the PC: MP3 players, both for your component system and elsewhere, and console video games which are tiny PCs (think Dreamcast and up).

      Hopefully the future will be voice-based interaction with these machines. My hands hurt now.
    • The problem with some computers-in-traditional-appliances (like washing machines) is that they provide very little added functionality and cost a lot to repair. I spent A$500 to have the onboard computer in our just-out-of-warranty washing machine repaired, only to have it break again within six months. We traded it in on a good old mechanical model, which cost less than the original repair.

  • Arthur C. Clarke wrote a book [barnesandnoble.com] on this subject, didn't he?
    • Clarke wrote a number of books which discuss aspects of living with electronic machinery. _Imperial Earth_ (I think that's the one) included PDAs and pervasive IR networking long before anyone seriously considered building such stuff. (Not to mention a planetary public data network used routinely by billions. Can you say, "Internet"?)
  • Or is it the other way around? When we humans get to incorporate computers into our minds, this subject area will keep being researched, though not by many, because of the dearth of knowledge yet available.

    BTW, where is the mind located? I seem to remember in Philosophy we had a long discussion on this question.
  • by morbid ( 4258 )
    But computers haven't been around for a thousand years yet!
  • Alice [alice.org] is a 3D Interactive Graphics Programming Environment for Windows 95/98/NT built by the Stage 3 Research Group at Carnegie Mellon University [cmu.edu]. The goal is to make it easy for novice programmers to develop 3D environments and to explore the medium of interactive 3D graphics. The current version of the Alice authoring tool is free (as in beer).

    Alice is primarily a scripting and prototyping environment for 3D object behavior, not a 3D modeler, so Alice is much more like LOGO than AutoCAD. By writing simple scripts, Alice users can control object appearance and behavior, and while the scripts are executing, objects respond to user input via mouse and keyboard.

    I see an opportunity here for a free (as in speech) version. It could go a long way toward the acceptance of Linux at the elementary school level.
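
    Since the post mentions scripting: early Alice used Python as its scripting language, if I recall correctly. The snippet below is only a flavor sketch; Bunny and its methods are invented stand-ins, not Alice's real API.

        # invented Alice-flavored object; NOT the real Alice API
        class Bunny:
            def __init__(self):
                self.height = 0.0
                self.heading = 0.0

            def move_up(self, meters):
                # raise the object in the scene
                self.height += meters
                print(f"bunny is {self.height} m off the ground")

            def turn_right(self, degrees):
                # rotate about the vertical axis
                self.heading = (self.heading + degrees) % 360
                print(f"bunny faces {self.heading} degrees")

        bunny = Bunny()
        bunny.move_up(1)
        bunny.turn_right(90)

    The point of the style is exactly the LOGO comparison above: short imperative commands with an immediate, visible effect in the 3D scene.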

    • by mwood ( 25379 ) on Wednesday April 24, 2002 @10:36AM (#3401152)
      If only someone could explain to me *why* we should want 3d UIs for general use. We have enough clutter and irrelevant detail in two dimensions. I've been computing for a quarter of a century and I am still more productive with a 1d UI (a command line) than with any of its modern rivals.

      Let me know when the neurocouplers are ready, so I can try out a 0d UI. It sounds wonderful.
      • I think you've really hit the nail on the head here. We're still looking for that 'killer app' or that field in which using a 3D interface increases productivity 300%, or whatever.

        Anyone remember all those cool 3D images the Pathfinder/Sojourner robots returned from Mars? [nasa.gov] My lab participated in a NASA field test for future Mars robotic explorers, trying to decide things like what tools should be included on the robot, what the command structures should be, and how to do distributed science with a 24-hour turnaround time. It turns out the geologists were MUCH more interested in the high-res 2D pictures than in the interactive-immersive-photorealistic-insert-buzzword VR environment created from the robot's data.

        The technology was awesome. The computer churns for a couple hours, you put on your 3D goggles, or step into the Cave, and you can look around and see the remote environment from the vantage of the camera (in the field test, the camera was placed about average human eye height). I think distances/measurements were accurate to 5mm at 5m.

        But the geologists only used the technology like 0.5% of the time. When the data first came back, they'd look at the VR and say, "whoa...cool!" but then go do their 'real work' from the 2d images. The people who really did use it were the people who planned the path of the robot (how deep is that ridge? how far to that rock?).

        I think there were a lot of factors that contributed to that. First, the resolution wasn't that great, or, more accurately, it was still light-years from the resolution of the eye. We're a LONG way from telepresence, so the geologists weren't motivated to act like it was available. Second, there was a high learning curve to the user interface. Not for someone with VR experience, but if you've never put on 3D goggles, or used a Magellan device, or thought about how to 'rotate the world' so you can see what you want, it's hard. So they had to find someone to 'drive' the interface for them, and they weren't motivated to do that either.
      • The human brain is built to analyze 3d information. By presenting things in 2d, we are wasting this marvelous ability.

        The fact is, most 3d interfaces in existence just plain suck. The problems are with implementation and design details, though... It's very hard to take advantage of spatial 3d information display without forcing awkward interfaces that are easy to get lost in and difficult to understand. That doesn't mean that the theory I've stated above isn't worth pursuing, though... just that making use of the theory in a workable way is VERY HARD. This is why we study it.

        There is progress being made. I got the chance to paint in 3d with a very simple 3d interface in the CAVE at Brown University (it was a paintbrush with a button on it and some virtual pots I could dip it into). It was simple and usable, and trust me, it's much easier to paint a 3d object with a 3d cursor than with a 2d cursor.

        -Erik
        • The human brain is built to analyze 3d information. By presenting things in 2d, we are wasting this marvelous ability.

          This much is true.

          The fact is, most 3d interfaces in existence just plain suck.

          This is very true.

          The problems are with implementation and design details, though

          There is also the minor point that there's no such thing as a 3D monitor.

          Your average geek may be able to fly around Quake 3 with the greatest of ease, but my girlfriend gets motion sick watching me play. When she isn't getting ready to hurl, she's completely lost (which corridor is that one?). These are not desirable characteristics when trying to build an easy-to-use UI.

          Face it: 2D projections of 3D are a poor substitute for actual 3D objects. Monitors are 2D (I might even say 2.5D if you want to count some degree of layering); trying to get them to represent 3D information (and using that representation) requires a strong capacity for spatial maths, which is not present in the vast majority of the population.

          Until the basic hardware hurdle is overcome, I doubt we will see any major improvements in 3D UIs.
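
          The projection itself is trivial, which is part of the problem: the computer throws depth away cheaply and the viewer has to reconstruct it. A minimal pinhole-projection sketch (my own illustration; f is an assumed focal length, camera looking down +z):

              def project(x, y, z, f=1.0):
                  # map a 3D point onto the z = f image plane;
                  # depth collapses into scale, which the eye must undo
                  if z <= 0:
                      raise ValueError("point is behind the camera")
                  return (f * x / z, f * y / z)

              print(project(1, 1, 2))  # (0.5, 0.5)
              print(project(2, 2, 4))  # (0.5, 0.5): different point, same pixel

          Two different points, one pixel. That ambiguity is the spatial maths I'm talking about.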

          Russ %-)

          • 2D projections of 3D are a poor substitute for actual 3D objects

            Tell that to the animation world. Motion and other cues give us a lot of information about an object. Stereo vision is *not* the only way we perceive depth.

            I think you hit the nail on the head when you talk about Quake, though. I doubt any worthwhile 2d-projection 3d interface is going to have a fast-moving camera (or a moving camera at all). It is indeed disorienting and often nauseating. Of course, this falls under the category of poor implementation. There are many other ways to tackle the problem of movement besides free first-person movement. Guided navigation seems like a good idea to me.

            -Erik
    • Interesting; the last time I looked at Alice, it was still at the University of Virginia.

      There is a free (as in speech) version of Alice in Squeak Smalltalk [squeak.org], which runs not only on Linux, but also on Windows, Macs, PDAs and so on.

      The above site seems to be having some problems, so check SqueakLand [squeakland.org] instead.

  • I'm kind of surprised they said you probably won't see it in your local bookstore. Do people still go to the bookstore? I only go there to look at the bargain books. I found the book by its ISBN on Amazon.com [amazon.com], here: HCI Book [amazon.com].
  • I bet this book doesn't cover the dirtier side of future Human-Computer Interaction ;)
  • by count0 ( 28810 ) on Wednesday April 24, 2002 @10:49AM (#3401206)
    HCI suffers in real-world situations because tomes like Carroll's collection are of interest to academics but are often hard to apply to the day-to-day problem solving most development teams need. Here's a list of books I'd recommend before buying HCI for the New Millenium:
    • The Humane Interface - Raskin
    • Contextual Design - Beyer and Holtzblatt
    • Design of Everyday Things - Don Norman
    • Usability Engineering Lifecycle - Deb Mayhew
    • User and Task Analysis for Interface Design - Hackos and Redish
    • About Face - Alan Cooper
    • Information Architecture for the WWW - Rosenfeld and Morville (2nd edition coming from O'Reilly in July)
    • O'Reilly should also have a book coming out in the fall from Mike Kuniavsky (OpenOffice contributor) on user research that should be good.
    • Norman's The Design of Everyday Things is a great way to prep your mind for HCI. It's on my desk right now.

      However, I have to say that Raskin's The Humane Interface is relatively... alien. I don't agree with some of his tenets, most notably his aversion to modality. Even so, it's worth reading. Ironically, the cover of the edition I read had a tendency to curl annoyingly, and was one of the most inhumane book interfaces I've ever seen...
    • by gwernol ( 167574 ) on Wednesday April 24, 2002 @11:27AM (#3401486)
      HCI suffers in real-world situations because tomes like Carroll's collection are of interest to academics but are often hard to apply to the day-to-day problem solving most development teams need. Here's a list of books I'd recommend before buying HCI for the New Millenium:

      This is to some extent fair, but academic researchers are usually well aware of this issue. Their work is (usually) not intended to be applicable to today's day-to-day problems. It either provides the theory you transform into day-to-day practice or it is work that will solve problems that will occur ten years from now.

      Most importantly, if you understand the scientific underpinnings of the field, you will get a lot more from the "practitioner" books you mention (which are all good recommendations).

      I spent 7 years as an academic HCI/AI researcher. I have spent the last 10 years as a commercial software developer. Having an academic understanding of the theory of HCI makes me a profoundly better user interaction designer in the "real world".
      • I totally agree with you (and am a closet academic at heart). It's just that if you're a developer looking to make your project more usable and relevant, academic papers are overkill. If I were picking between an interaction designer with an academic HCI background plus commercial development experience and one with just commercial development experience, I'd prefer the academic background sight unseen (but then I'd ask for a portfolio; having an HCI degree doesn't make you a great designer, and the lack of one doesn't mean you can't design).
    • I confess that HCI does not hold much interest for me personally, but I would like to see a very simple "popular science" treatment of what types of HCIs are being considered for future generations. Not completely content-free, but something very basic that would communicate to the public that HCI is an active field of research, and that they shouldn't expect tomorrow's UI to look anything like today's.

      The public at large thinks that the way we interact with computers today is representative of how we will tomorrow, and so parents are forking over big bucks to make sure their kids get "computer training". It always breaks my heart when I see one of those news stories where they interview some low-income single mother of three who is wasting good money on computer classes for her 10-year-old kids. She makes the sacrifice because she thinks she's giving her kids a head start on the supposedly valuable computer skills they'll need to compete in the job market. What she doesn't realize is that by the time her kids enter the workforce, most of the stuff they learned will be obsolete.

      A book that makes parents think twice about the value of modern-day computer "training" and its applicability to future systems is seriously needed, in my view.

      GMD

      • If your argument were true, then I should not have learned about computers when I was 4, and should have just waited until I went to university.
        The problem is, how would you know what to do if you weren't used to what computers can do?
        So please tell me: who is more competent, the person who has used computers all their life, or the recent graduate in computer science?
    • Design of Everyday Things rocked; it was one of my more enjoyable textbooks for a class.

      I also had User and Task Analysis for Interface Design as a book for a class. It was pretty good.
    • Couple of quibbles.

      About Face is not very useful, due to its lack of rigor, IMHO. It's basically a list of the author's pet peeves. While he does present some useful advice, he doesn't back it up with any research to prove that he's right.

      Also, if you can find it, Tog on Interface (Bruce Tognazzini) is an essential seminal book on HCI by one of the earliest graphical interface designers. You may not agree with him either, but he's at least done the research. Tog on Software Design is also good.

    • I agree with most of the books on this list, and would have a few to add at some point. This would make a good Ask Slashdot, so I'll see if I can get that done. I agree that this text isn't going to please someone who wants either a specific view of HCI or to apply HCI concepts to specific problems.
    • I asked Jef Raskin about the proposed new features of the GUI in my Kaos [BSD] operating system; he wants some stupid CLI inside all applications.
      This may sound reasonable to certain Slashdot readers, but it's a mongrel of an idea: does anyone need to type in caps to execute commands while working in any program?
      He would prefer an unintuitive multiple-command sequence to a separate GUI window.
  • This book is a collection of 29 articles by leading researchers in the HCI academic research community, and it attempts to outline the research programs that will dominate the HCI field

    If the reviewer had gone into detail about a couple of the 29 individual articles in the book, a lot of us newbies would get a much better picture of what to expect from it. Five of the articles were given a very brief one-sentence description, which doesn't help as much. I'm sure there must have been one or two articles the reviewer found exceptionally good. He should tell us about them.

    Disclaimer: I'm not trying to be a troll here. The review did stimulate enough interest that I'll probably end up purchasing the book.

  • by Anonymous Coward
    www.asktog.com

    Bruce Tognazzini ... did a lot of work for Apple in the early years. He wrote a great book, "Tog on Interface".
  • I just hope in the new millennium we all learn how to spell it.

  • The basic problem as I see it is too much information. The only way to find anything is to go through a lot of it manually. We need to bridge the gap between "this is what I really want" and "this is what this information is about".

    What we need is more context information. The computer needs to start understanding what we are saying. Something more than blind keyword searches.

    We also need more in the way of information anywhere: I want to be able to ask my phone, my PDA, my computer, or my refrigerator 'what do I need to do now?' or tell it 'make sure I buy more beer' and have it do the right thing.

    Danny

  • Okay folks, it's been what, two years now since 2000 hit. I don't care whether the new millennium hit in 2000 or 2001, but please, DOUBLE N!

    MillenNium.

    Hell, even the bn.com link has it with two n's.
  • I guess I'll have to read the book to find out what they mean by HCI and society... I wonder if it takes into account the internet and the advent of human-computer-human interaction (HCHI). This is rarely debated, even though the way we interact with the computer can also shape the way we interact with other people.

    If a poor interface somehow limits someone's sense of self, or their ability to make themselves known in a real and/or personal way, perhaps it can be said that it thereby limits that person's ability to communicate with others.

    Most scientific studies of human-computer interfaces have avoided some very real facts about human beings in society: we believe ourselves to be different IRL, and in social contexts we perpetuate these differences in a clear and noticeable way of which we are very aware. It is said of the internet that there is a total lack of identity, that there is just text on the screen; but when we think of the original users and perpetuators of the internet, and the way things are arranged, we can sometimes see that the situation is more homogeneous than obscure.

    It's the developers of the internet who are shaping the ways we interact, and many times theirs is a one-faceted design based on how they interact with their computer.

    You may downplay the sense of self online and say that you can pretend to be anyone and be anything, but you still have to take into account that although you may be pretending, you are doing so from within the mindset of your real-life self... please don't try to tell me that you somehow become detached from your lifetime of memory and experience and become a totally different person, as I'll find that hard to believe.

    ...by the very nature of the privacy of the internet you can see how the question of my identity can change your opinion of my opinion...seeing as how you probably don't know me outside this post:)

    I recommend people read Race and Gender Online, and Wired Women, if they would like insight into this rarely researched phenomenon... yes, at times the authors might seem like they're complaining ;)

    Nothing ever gets worked out by leaving it be...

  • hmmmmm (Score:2, Interesting)

    by Jacer ( 574383 )
    all your base; are belong to us....
  • Everything else is video game fluff. Batch processing forces you to THINK and get it right the first time. This realtime interactive crap is responsible for all the lousy 'trial-and-error' code we have to live with today.

  • This is a field? Where can I get more info on this? I may have just found my calling. Thank you /.!
  • I'm sure there are many wonderful HCI ideas floating around in academic departments, but perhaps this is a field where real breakthroughs can still be made by creative people outside of universities who are willing to give some serious thought to the issues. Formal course work is nice, but not essential.

    Anyway, a plug for some ideas I wrote up a few weeks ago: The Voice/Hand Motion Interface. [m3peeps.org]
