Become a fan of Slashdot on Facebook

 



Forgot your password?
typodupeerror
×
Movies Media

Review: A.I. 390

As you might have expected, several of the slashdot folks went to see A.I. this weekend. Jon Katz and I were brave enough to write about it. In case you've been dead for the past six months, there's a huge game being run to promote the movie (though the plot of the game apparently has little to do with the plot of the movie). Read on for a thorough dissection of this much-hyped tale of the robot boy who can (sniff, sniff) love. (Usual warnings about spoilers apply.)

michael: Looks like I get to go first. Let's get some basics out of the way. Some reviews by others: Slate, Salon, Wired. You may want to read the short story that started it all. But if you see the movie, you'll find that the short story has less influence on the movie than a famous and beautiful poem by W. B. Yeats, The Stolen Child. Since it's out of copyright, and since it happens to be one of my favorite poems, and since you uncouth heathens could use some exposure to beauty, I'm going to reproduce it here.

The Stolen Child

Where dips the rocky highland
Of Sleuth Wood in the Lake,
There lies a leafy island
Where flapping herons wake
The drowsy water-rats
There we've hid our faery vats,
Full of berries
And of reddest stolen cherries.
Come away, O human child!
To the waters and the wild
With a faery, hand in hand,
For the world's more full of weeping than you can understand

Where the wave of moonlight glosses
The dim grey sands with light
Far off by furthest rosses
We foot it all the night,
Weaving olden dances,
Mingling hands and mingling glances
Till the moon has taken flight,
To and fro we leap
And chase the frothy bubbles,
While the world is full of troubles
And is anxious in its sleep
Come away, O human child!
To the waters and the wild
With a faery, hand in hand,
For the world's more full of weeping than you can understand

Where the wandering water gushes
From the hills above Glen-Car,
In pools among the rushes
That scarce could bathe a star,
We seek for slumbering trout
And whispering in their ears
Give them unquiet dreams,
Leaning softly out
From ferns that drop their tears
Over the young streams
Come away, O human child!
To the waters and the wild
With a faery, hand in hand,
For the world's more full of weeping than you can understand

Away with us he's going,
The solemn-eyed
He'll hear no more the lowing
Of the calves on the warm hillside
Or the kettle on the hob
Sing peace into his breast
Or see the brown mice bob
Round and round the oatmeal chest
For he comes, the human child
To the waters and the wild
With a faery, hand in hand,
From a world more full of weeping then he can understand

--W.B. Yeats, 1889

The poem itself in is in the movie in two places, and crops up in several other places as well - "Till the moon has taken flight" takes on literal meaning, for example. Faeries, yep, we got faeries. And there's no one more solemn-eyed than a kid who sees dead people.

I'm sure one of the other slashdot authors will go into the whole Kubrick/Spielberg deal so I'll skip it. The movie is slow, light on dialogue, heavy on music and long meaningful camera shots. (It reminded me of The Thin Red Line several times.) The audience didn't particularly appreciate the slower scenes (one anonymous coward in the back row shouted out "Boring!" at one point), which makes me think this isn't going to be a box-office smash. The acting is superior - a great deal of effort has been expended in having the mechanicals show a consistent face to the world - they don't break character in the slightest, not even an extraneous eye-twitch. Special effects are also superior - rarely in your face, but always there, and entirely realistic. (I'm going to ignore the aliens.)

One area I kept looking for was hard-coded limits on robotic behavior. These robots have neither the First, nor the Second, nor the Third Laws of Robotics, which seems like a foolish design oversight. Several major plot points would been eliminated if the robots were obedient ... but why would humans make disobedient robots? At the very least, it seems like emotion would come well before disobedience on the robot evolutionary scale.

Anyway, A.I. is well worth seeing, at least once. I don't know if time will call this a masterwork or not. It's certainly a fine piece, worthy of respect, and it will certainly be referenced in the many future movies about artificial intelligence (just wait and see), but it seemed to fall a bit short of master-level.



Jon Katz: In A.I., Steven Spielberg (and the ghostly spirit of Stanley Kubrick) has made one of the most astonishing and original scientific fairy tales of all time. The movie is unlike anything you've ever seen, visually or conceptually. Like so many Hollywood movies of the past decade or two, it doesn't quite know how to end, but that's a minor squawk against the backdrop of a masterpiece of story-telling genius and moral power. Through the life of a lost boy -- an artificially engineered one -- Spielberg has brought a fresh, contemporary eye to enduring questions of moral responsibility and technology, and their impact on human life. Be prepared: this is a very disturbing movie. In cinematic terms, Spielberg has chillingly evoked Mary Shelley. He combines his dramatic flair and his acute sensibilities about childhood with fantabulistic animation and design. Spoilage warning: Plot is discussed, no endings.

This is the story of David (played wonderfully by Haley Joel Osment), a robotic boy sent out into a world ravaged by ecological catastrophe (global warming has submerged the great coastal cities of the earth). Although the future is filled with mechanical beings, David is the first child programmable to feel and need love, and to dream his own dreams. His desire to love a mother deeply, once activatd by a spoken imprint sequence, is irreversible. If the relationship doesn't work, David must be destroyed.

Osment's tormented robot-kid is disturbingly convincing, especially his transformation from a machine trying to learn about emotions into a sentient being overwhelmed and consumed by them. Alternately predictable and inappropriate, endearing and creepy, he struggles to fit into a conventional family. Henry and Monica, the parents who take him in (Frances O'Connor and Sam Robards) have accepted that their biological son, who is in a coma, will never awaken.

Already, the moral lines are drawn powerfully around this family, a stand-in for our morally obtuse society. Henry agrees to bring a robot child into his home as a surrogate kid without even telling his wife, to help assuage her grief. Monica, mourning her stricken offspring, is a sucker for a loving kid, even a programmed one. David is used in the most profoundly unthinking way. At first, Monica is unnerved by this alien creature, then succumbs to his unequivocal affection.

But their son Martin does recover, and comes home angry and jealous. Here, the movie moves directly into Frankenstein territory. In one powerful scene David is so anxious to be like Martin, whom his new mother loves so deeply, that he starts wolfing down food, which nearly destroys his delicate circuitry. Goaded by their manipulative and somewhat unpleasant natural son, Henry and Monica come to believe they have a monster in their home rather than a loving child, and are overwhelmed by what they've done. Just like Victor Frankenstein, they take no responsibility whatsoever for this creature, sending him away into the dark woods.

David's "mother," to whom he is now forever devoted, takes him out for a drive and abandons him -- an echo of countless fairy tales -- rather than return him to the cybernetic firm that will destroy him. The film's lively middle section depicts a world in which thugs roam the countryside looking to torture and hunt down "mechas," capturing them for a "Flesh Fair," a carnival billed as a celebration of life devoted to "demolishing artificiality" and securing a truly human future.

David's creator Professor Hobby (William Hurt), also stands back as this tragedy unfolds, more curious about his experiment and its commercial possibilities than he is concerned for its consequences. It's a scathing rendition of America's ostrich-like attitudes about technology, as it unleashes AI, fertility, genetic and other technologies on an unprepared world, all in the name of progress, health, or convenience.

In fact, as in The Matrix and almost every other movie which deals with AI, the film delineates a world already sliding into civil war: humans ("orgas," for organic) caught between technological and environmental issues, feel increasingly endangered by the intelligent machines that are more adaptable than they are. It's interesting that almost no artist or futurist looks at AI and the future and sees much good.

As a renegade sex robot called Gigolo Joe (the phenomenal Jude Law) explains to David, whom he's befriended, humanity has belatedly come to regret devloping AI machines unthinkingly. "They made us too smart, too fast, too many," Joe says, perhaps presciently.

Dark and ominous from the beginning, the movie now turns wrenching. Wickedly, Martin has urged his mother to read aloud the story of Pinocchio, with which David becomes obsessed. He sees the parallels between his own story and the wooden puppet's, and he sets out at all costs to find the Blue Fairy who will transform him into a real boy so that his missing "mother" will love him as much as she loves her biological son. But by now, David is no witless, gullible Pinocchio. He is obsessed and resourceful, and has evolved in decidedly non-Disney ways.

The shadow of Stanley Kubrick, who conceived the movie based on a short story by Brian Aldriss, falls darkly across this ground-breakingly inventive tale. There are embedded visual and thematic references to A Clockwork Orange, and 2001: A Space Odyssey, along with Star Wars and E.T. There's even a sly homage to Pinocchio's "Pleasure Island." And the story draws heavily from the fairy tale genre, especially all those Grimm's fables about kids being abandoned in dark and menacing woods. Kubrick apparently spent many hours talking with Spielberg about the movie, but died before he could tackle it.

But Spielberg really honors him here. This movie is as disquieting as it is eerie, gorgeous and thoughtful; it dares to take on the serious issue of humanity's pell-mell rush to fiddle with human life -- from AI to robotics to genomics -- without realistically or carefully considering the consequences. You can almost hear the technologists of the future explaining why they couldn't possibly have foreseen the impact of the forces their predecessors unleashed.

When Mary Shelley sounded this warning in Frankenstein, technology was primitive and noninvasive, still a somewhat abstract fear. The world in whic David "lives" is not only imaginable but, by many accounts, is almost upon us, at least in terms of the possibilities of AI and the rapid evolution of computer systems into a sort of species.

Speielberg reminds us that we aren't ready. Not only may many humans get hurt, but so may the new machines, along with nature itself. It's a provocative twist on a big and powerful premise. What are we? What are we going to be?

There's a Freudian twist or two as well. What David yearns for is what the shrinks tell us we all want at some point -- pure, undiluted love from and time with Mom. David's fight for that is heroic, down to a shocking and unexpected series of endings, certain to be controversial and upsetting to many. (Parents who bring little kids to what they think is just another Spielberg yarn will be in for an unpleasant surprise). David develops some less attractive human qualities as well. Spielberg seems to be suggesting that it's all too easy to ultimately create machines that behave like humans, but we might not like the results.

This ability, he seems to warn, distracts us, lets us off the hook, prevents us from asking the most signficant question: What does it mean to be human, and what kind of humans do we want to be? That question doesn't often come up when it comes to technology, where the question is more apt to be: how can we create more cool stuff?

A.I. is shocking and haunting, beautiful and unique. For all his sometimes icky Boomer sentimentality, Spielberg's ability to grow artistically, to make deeper, richer, more inventive movies, qualifies in my book as an epic acheivement. When it comes to science, this movie begins where 2001 leaves off, and then goes a galaxy or two farther.

This discussion has been archived. No new comments can be posted.

Review: A.I.

Comments Filter:
  • by Anonymous Coward on Sunday July 01, 2001 @08:20AM (#116316)
    This movie will make you laugh and cry, provided you have nitrous oxide, severe allergies, or a deep regard for sentimental middle brow rubbish. I stayed to the end by an act of great will, irrationally expecting to find something that would justify the rest of the time I had spent on watching this testament to the director's wish to be a real, live artist.

    First of all, pacing. There isn't any. The movie drags on, and on, and just when you thought it was done, it drags on some more. I would have been been fidgeting if I'd been an immortal robot programmed to simulate engagement with crap movies.

    The soundtrack is obtrusive. Its forever telling you exactly how you should be feeling. (David's in trouble! Sad! The yokels at the demolition derby are throwing rotten fruit at a bad guy! Happy!)

    The children are one dimensionally malevolent. Its a common-place that children are monsters, but they're complex monsters. These kids were apparently from the Cybertronics "Damian - finally, a child you feel good about starving and beating!" product line.

    There isn't really a clear rendering of how David's mind works. He's emotionally needy, and well-behaved, and, um, hmmn. The movie's vague with regard to where he resembles humans and where he is other.

    Cybertronics sensibly keeps its main R&D office in a half-submerged skyscraper in a drowned city. No doubt this makes it easier to attract and retain employees of a certain cast of mind (ie, romantics and those on the run from the law) but I wonder if its really logistically practical.

    Spoiler warning:

    I had high hopes for the aliens. I thought it would be a good ending if they set David up with a simulacrum of his mother with whom he could spend the rest of eternity, oblivious to the strangeness of his situation. I thought it would be good if the aliens remained remote, curious archaeologists. But no... they turn out to have soft english voices chock full of cloying world-sadness. They're just awfully impressed with humanity and wish they could be like us. Using their essentially unlimited technology they can resurrect the dead but, um, they time-expire faster than a big mac.

    In short, as far a golem stories go, rank this not with Frankenstein, Pinochio, or Golem XIV (great novella by Stanislaw Lem), but with "Hawkman versus the Death Droids!!" or the manual for your autonomous robotic lawnmower.

    -Zachary Mason

  • by Anonymous Coward on Sunday July 01, 2001 @07:53AM (#116317)
    I always wonder why people bring up Asimov's three laws of robotics as though they had anything to do with reality. I mean, when you really study robots, you find that robots programmed with safeguards will malfunction and cause horrible, painful death. A good example was a robot made for hospitals to deliver radiation treatments to people. Unfortunately, I can't remember the name of the robot, but what I do remember is that the original model robot had mechanical safeguards that would shut off the machine if the radiation levels got to be dangerously high. Unfortunately, these mechanical safeguards were considered too expensive and it was decided in later models to use only software based safeguards. Well, the software based safeguards, of course, failed miserably. The software was incredibly buggy, and the robot ended up horrible burning or radiation poisoning many of the people who came in for treatment. It was a big scandal.

    Now, this was a _really_ simple robot, designed to deliver a few different kind of radiation pulses as cancer treatments. How are we going to be able to program safeguards into super advanced robots with emotions and human level intelligence?

    Isaac Asimov wrote a lot of entertaining stories about intelligent robots that had to obey people, couldn't kill, and the like. These were fun stories, and I recommend I, Robot to anyone who wants some light reading (though Foundation is better.). Just don't treat it as gospel truth, and remember, other writers have had completely different views of Artificial Intelligence. (My favorite is, Fondly Farenheit by Alfred Bester.)

  • The story featured in Wired is the original one--it was first published in 1969. He wrote two sequels, "Supertoys When Winter Comes" and "Supertoys in Other Seasons" thirty years later. AI incorporates at least one element from the latter.
  • by Indomitus ( 578 ) on Sunday July 01, 2001 @08:56AM (#116321) Homepage Journal


    The ending (after he goes into the ocean) has apparentlly been in the script since Kubrick started developing it. It went through various revisions but it's not a Spielberg add-on, as much as it might seem so.

    Also, they're not aliens. They're super-advanced robots. Spielberg does a horrible job of communicating that but it's a fact.


  • You're full of it. So you did CS and philosophy at uni, now you're qualified to make a bunch of blanket statements about what is and is not possible 200 hundred or so years from now, based on how you think AI works ? I did a degree in AI, and all I really learnt was the depth of our ignorance. AI doesn't work at all yet, almost everything known in AI today is useless at predicting how "real" AI will work.

    Your arguments are like some caveman saying "hah, they could never make flying machines that go over 200 miles an hour - if you tried flapping a wing that fast it would break, unless it was too heavy to even get in the air in the first place".

    You're "facts" are too stupid to delve into in any depth, but just as an example, doesn't it bother you that your arguments to support #1 and #4 are completely contradictorary. If an AI would have to be based on a neural net, then maybe it would be hard to make it "love" someone.
  • you silly fuck. they used the word "mecha", not "robot". go fuck yourself.
    ------------
    a funny comment: 1 karma
    an insightful comment: 1 karma
    a good old-fashioned flame: priceless
  • AGHHHHHHHHH!

    I checked the f--king box and that means no Jon Katz! Ever! Not even when that commie pinhead michael sneaks in on weekends! Got it?

    No pinko whining about technology.

    No global warming sky is falling rubbish.

    No false maudlin techno worry-warting that implies that if we don't have pseuds like Katz fretting about it it will soon destroy the ecology, all public schools, Salon (whoops)... and we'll all be eating red meat and GM fries served by a single large MicroMcDonaldsLockheedSoft conglomerate run by Newt Gingrich.

  • by warmcat ( 3545 ) on Sunday July 01, 2001 @07:44AM (#116327)
    ...if I ticked the box I would NEVER see anything by Jon Katz again!!!!
  • Well, the future civilization is neat, and not entirely happy, if you think about whose civilization it is. I agree that there was no point to giving David what he wants. In fact, realizing that you can't always have what you want would have been a good moral to illustrate.

  • Before reading the comments here, I never thought the thought. They just scream 'greys' in form. But I agree that the ending makes more sense when you think of them as future robots. So if that was the intent, then it was just a horrible mistake in visuals. I mean Really Really Horrible.


    I guess there's no point in not talking directly -- anyone reading the thread has seen the movie or doesn't care about spoilers.

    Why do the robots look like stereotypical aliens? For the same reason why aliens are depicted that way -- the slender body and big head are signs of beings that are specialized for thinking rather than physical activity. With their antigravity tech or whatever it is, there is no need for them to do physical work.
  • by nedron ( 5294 ) on Sunday July 01, 2001 @07:57AM (#116334) Homepage
    Spoiler.... Michael apparently didn't pay a whole lot of attention to the movie. The "aliens" he mentions were actually robots! I'm not sure how he missed it, but they even refer to themselves as such in the movie.

    As an append, the movie should be watched as a >robot's< fairy tale. It makes much more sense and is thoroughly enjoyable in that context.

  • I'm about to reveal my stupidity: I missed the fact that the beings at the end of the story were evolved from (indigenous?) A.I.s and were not extraterrestrial in origin, if indeed that's the case. And that distinction certainly puts a different spin on the story. So take anything else I have to say in such context as you deem appropriate. One of the problems with telling this kind of story is pacing -- sometimes the dialogue and the underlying ideas are carrying the freight, not action and noise. If you think you're suddenly about to see the pace pick up and, in fact, it winds up slowing down even more (as happens in this movie toward the end), it's distracting and disappointing to have mentally prepared yourself for one type of situation only to be confronted with the opposite. I think this shortcoming is probably a first for Spielberg, who maintains pretty even pacing through a story but has never tried to follow through on another director's creative vision so thoroughly. For me this movie has proven very provocative intellectually in a way Spielberg has never managed before, and after I chew on it a while longer, I'll almost certainly go back and see it again.
  • There's another sci-fi story out there somewhere, not necessarily by Aldiss, but possibly, it's been a long time since I read it, featuring a boy (a real one) and his robot teddy bear, but its plot goes in an entirely different direction.
  • So if aliens make robots, they aren't robots? Then what are they?
  • If they'd stayed true to the original story there wouldn't have been a character for Jude Law to play, or at least not that character.
  • by HEbGb ( 6544 ) on Sunday July 01, 2001 @07:48AM (#116340)
    I found the movie enjoyable, but not nearly as thought provoking or as well executed as it could have been.

    Despite the strong presence of Kubrick's influence (the movie would have been horrible otherwise), there were countless episodes of Spielberg-isms akin to those things that made me dislike Jurassic Park so much. Gratuitious tear-jerkers, cutsey-laughs, and all of the other crap that's thrown in to make the movie more marketable to the typical McDonald's customer and general-purpose merchandisers.

    I was also disappointed with the trivialization of Kubrick's role in forming this - it's quite clear from the movie that his role was much more than "talking [about it] with Spielberg".

    And I totally disagree with michael and JK's conclusion at the end that this is any indication of "Spielberg's ability to grow artistically .." and I am horrified that they think "this movie begins where 2001 left off, and then goes a galaxy or two farther."

    No way. The movie was decent, and I'm glad I saw it, but to even compare this to achievements like 2001, or even Speilberg's real achievements like Jaws, Close Encounters or E.T. is nonsense.

    It's a good flick, but it's no epic. Get over it, boys.
  • by Goonie ( 8651 ) <robert,merkel&benambra,org> on Sunday July 01, 2001 @04:51PM (#116341) Homepage
    Starship Troopers - Almost nothing remains true to the Grand Master's plotline. The characters are switched around, one even has a sex change from the book to the story so (s)he can be a love interest. The original POINT of the book, to be R.A.H.'s dissertation on war and government, is completely ignored in favor of changing it into gore-splattering CGI fest. An utter disappointment, in every conceivable way.

    I gotta say that you missed the entire point of Starship Troopers (the movie) - as I viewed it anyway. It's not a retelling of the book, it's a parody of it. Heinlein's book, at one level anyway, is an advertisement for an anti-democratic, military dominated state, and the film neatly skewers this with such deadpan subtlety that I'm still not convinced that the actors were in on the joke, let alone the studio execs that funded the film.

    I found Heinlein's book repulsive, myself, but I'm aware that this isn't a universally held opinion. Paul Verhoeven, the movie's director, certainly seemed to think so.

    Go you big red fire engine!

  • A.I. - Ho hum, Asimov's Frankenstein complex is in full force. Despite nearly every robotics (a word Asimov coined), despite every robotics major ANYwhere having read his robot novels, somehow they forgot the laws of robotics. D'OH!!

    You know, this constant harping on the "Laws of Robotics" every time someone writes a story about robots really bugs me.

    First off, Asimov wasn't doing research into robotics, he was writing stories. FICTION stories. His conclusions shouldn't be the be-all and end-all of artificial intelligence research. The three laws are flawed, as even Asimov himself admitted when he was forced to create a Zeroeth Law for his own stories.

    Secondly, were we to decide that the Three Laws were indeed necessary and sufficient, that doesn't guarantee that we could implement them in any meaningful way, or that we'd do so bug-free. A robot's program is going to be incredibly complex, and no human endeavor that complex will be free of mistakes.

    Hell, if nothing else, can't a story be good because it serves as a cautionary tale about the dangers of NOT following Asimov's train of thought? We made them too smart, too fast, too many to properly restrict them? We needed them *NOW*, not after perfecting Asimov circuits?

    David was obviously built to not deliberately hurt humans, he says that he'll "get in trouble". When he pulls Martin into the pool, it doesn't look like he's trying to hurt him; it looks like he's unaware that doing so WILL hurt him. Asimov circuits won't help a robot not cause inadvertent harm, even if implemented perfectly.

    -
  • by Hiawatha ( 13285 ) on Sunday July 01, 2001 @07:40AM (#116349)
    Are you kidding? The ending, though a bit too long, is way cool, and absolutely heartbreaking. Besides, if they leave at that point, they'll miss the stunning Coney Island sequence, as impressive a bit of filmmaking as Spielberg's ever done.
  • Precisely. They talked about David as an "original" robot, which made it seem to me that the robots we were seeing weren't created directly by humans, but instead were created by other robots who were created by other robots, who somewhere down the line had been created by humans, but none of these original robots had survived, so there wasn't any lasting direct contact or knowledge of humans.
  • Totally off topic, but just in case you want to hear a beautiful rendition of "The Stolen Child" check out Heather Alexander [heatherlands.com]'s album Wanderlust Some very nice fiddle work on it as well.
    rark!
  • C'mon, Michael, don't try to pretend you knew that the "aliens" were descended from mecha, but nevertheless chose to use some weird semantic distinction. I missed it too, but at least I can admit it.
  • by alienmole ( 15522 ) on Sunday July 01, 2001 @01:05PM (#116359)
    We all know Katz is an AI, or at the very least a rather messy and verbose Perl script. Sure, with Katz there's something of an emphasis on the A, not so much on the I, but still. Having the Katz AI review a movie about AI is too perfect an opportunity to pass up, so they sneaked it onto the main page. I can live with that, can't you?
  • The last peice, 2000 years ahead, was one of the most emotionally charged and draining points of the entire movie. That was the ending that Kubrik wanted, though he probably would have made the new mechas a bit different.

    I was really astonished by Speilbergs ability to imitate Kubrik's style. The entire movie looks and feels like a Kubrik, from the sudden 270 degree plot twists, the slow, long visuals, and the entertwined plots.

    I won't argue that the movie could have ended with the Blue Faerie and been great, but the last fifteen minutes was an emotional rollercoaster that made it hard for me to get out of my seat.

    Chris
  • "What's with that garbage about space/time, not being able to clone for more than one day, etc. hooey? Couldn't they clone her every day? It's not like they had her memories anyway."

    the whole point of the space/time hooey was to explain why his mother _did_ obviously have memories of David and the house, etc. A mere clone wouldn't.

    "David was poorly designed."

    _exactly_. he was a prototype. A test. David was an extremely flawed model because he is so dependant on his emotions. Think about how devoted he would have to be given the transition to the third act. His need for his mother's love became so crippling to his electronic psyche that it precluded every single other concern.

  • by Marooned ( 21804 ) <toxygen AT vink DOT org> on Sunday July 01, 2001 @08:25AM (#116372) Homepage
    I would pick when the narrator comes again, and you zoom out/in on the ice. Man, the first thought that crossed my mind when the next scene began was "Oh jesus what the f*ck is this, the borg?!"
    first words out of my friend's mouth: "Oh great.. f*cking aliens"
    Commentary heard from the people in front of me (a bunch of 10 year ol' boys): "finally something cool happens!"

    honestly, I've never had such a good movie ruined completely by its ending.. I mean, there's been some terrible endings in a lot of movies, but nothing where the ending just destroyed the whole movie, even the good parts, because it makes you focus on itself so much.

    - Marooned
  • And they were indeed big ETs running around.

    The Spielberg ending of the movie was pathetic, and if it weren't specifically explained (in long, boring, overwrought detail) that they weren't magical ET's and were instead magical robots, well then you wouldn't have posted that meager defense of the worst hollywood ending I've seen in a long time. But they were obviosly ET/close encounters of the 3rd kind rip-offs. What does it matter if Spielberg calls them robots??? He could have called them ducks or goats, but THEY WERE ALIEN RIP OFFS!

    And the pseudo-psychological-metaphysical-nonsensical bullsh*t where it's explained that a day and/or that sleep has some connection with the universe at a fundamental level... Blah Blah F*cking Blah...

    The movie was good, and had a decent ending. I just should have walked out when the aliens showed up.

    -Ben
  • Because you are a computer scientist, you think David is a "computer" in a robot suit. You also think that David has been "programmed" to fall in love with his "mother" and that "mother" is a variable which can be set at a specific point in the program.

    But David is robot, not a computer, and is obviously part evolving hardware. Just like humans and ducks. And just like humans and ducks, they fall in love with a certain thing at a certain time (first living thing after they hatch or person who takes care of them and whose voice they recognize before they are born.) There's no set list of actions which David must do, just a couple guidelines to follow (don't do this, it isn't safe/right/ etc...) ,and an important point, "Love" is his motivation (computers don't have motivation currently, but could be programmed to have one and to self-evolve to this goal).

    David is more like a hard-wired machine in this respect than like a general purpose computer. It's like a continuously re-written imprint, that's evolving as david learns. Which is why David cannot be re-written... without being effectively destroyed. The analogy is to ducks, but there's no reason a robot couldn't be this way. It's just that general purpose computers aren't built that way NOW.

    -Ben
  • Facts of the movie are:
    1. They were robots from 2000 years into the future, who had inherited the Earth from humans, but humans no longer existed (for one reason or another).
    2. This is supposed to be a deep part of the plot, telling us that in the future, robots will be the only things left of human origin on the Earth. And they will look nothing like what we would have built.

    But goddamn, they looked like close encounters mixed with ET, acted like them (notice the strange way they "touched" each other like ET) and they were magical (notice how they talked about the psycho-consiousness-space-time sh*t).

    Just like ET.

    Thus, look like a duck, act like a duck, quack like a duck, and you can call it a robot all you want, but they were aliens indeed.

    -Ben
  • I think I had the same reaction to A.I. that I had with Eyes Wide Shut in the theatre. I had high hopes. I watched carefully for messages and symbolism. And the longer it went, the more I started thinking "My god, when does it end?" And when it finally did, I walked out thinking "Ug, that sucked."

    But then I go home thinking. And continue thinking all through the next day. And at least at one point I think about going to see it again. Because I realize that taken as a whole (and not as beginning/middle/end), it is a complete picture, and it is very good.

    The best part of the movie is the supertoy. I think every geek will want one.


  • David: "Please make me a real boy"
    Teddy: "Shut up"
    David: "Please make me a real boy"
    Teddy: "Oh God, just shut up"
    David: "Please make me a real boy"
    Teddy: "I'm going to kick your f*cking ass if you don't shut up"
    David: "Please make me a real boy"

  • by spectecjr ( 31235 ) on Sunday July 01, 2001 @01:22PM (#116385) Homepage
    The ending (after he goes into the ocean) has apparentlly been in the script since Kubrick started developing it. It went through various revisions but it's not a Spielberg add-on, as much as it might seem so

    A variant of the ending was in the original Aldiss stories... (note the plural; there's three of them - Super-Toys Last All Summer Long is just the *first*)

    Simon
  • by spectecjr ( 31235 ) on Sunday July 01, 2001 @07:50PM (#116386) Homepage
    Precisely. They talked about David as an "original" robot, which made it seem to me that the robots we were seeing weren't created directly by humans, but instead were created by other robots who were created by other robots, who somewhere down the line had been created by humans, but none of these original robots had survived, so there wasn't any lasting direct contact or knowledge of humans.

    Everyone should take the time to run, not walk, to Amazon and pick up a copy of Brian Aldiss's short story collection "Super-Toys Last All Summer Long" (taking the name of the 'story' that AI took its plot from - it's actually three stories). Other stories in the trilogy actually deal with this - IIRC, they were indeed aliens, picking through the ruins of human (and robotic) society, after the robots had died out (after they had built a society after the *humans* had died out).

    The 'original' reference refers to the fact that this was a robot made by *humans*, not by other robots.

    Simon
  • by jmauro ( 32523 ) on Sunday July 01, 2001 @07:48AM (#116388)
    If it ended before the second half of the end it would of been cool. There is just something about a movie that goes on a little too long for it's own good. This is one of them. It would of been more "Kubrick" if it ended before the second half. But the second half was cool in and of its self, but it just really didn't add anything to the movie I though.
  • It was not the worst film I've seen in a while. My first comment after viewing was "Wow, that was the best android-centric movie with Robin Williams that I've seen all year!".

    The good:
    Well, I did get to see Ministry in a Speilberg flick. Never expected that one in a million years (confirmed in the credits: Even if it wasn't 'officially' Ministry, it was Al Jorgensen and Paul Barker, which is darn near close enough. )
    The character I found the most believable was the ueber Teddy Ruxpin. It was the only one with believable lines.
    I really enjoyed the score. It didn't carry the film, no matter how hard Speilberg hoped.

    The bad:
    I found most of the dialogue to be downright lousy: So the mecha know that they've got some serious advantages over the orga. Note that they do absolutely nothing about it other than be able to hang around in an icecube until the really lousy cgi aliens show up. It surprised me to see that every single android shown had no qualities that put them above humanity. No speed advantage. No memory advantage. No strength, no brains, nada. (exceptions: 1) the aliens harvested all of david's memories out, so the retention was there, but david never used any of it 2) Gigalo Joe and spasm radio)

    The ugly:
    The aliens. Awful. Lousy. Everything about them was worthless.
    Chris Rock as the comedian android. At least it only survived for a minute or so of the film.
    Robin Williams as Dr. Know. Horrible.
    I finally got to see the LoTR trailer, but the exciter bulb went out on the projector, so no audio. I still blame Speilberg for that one =)

    -transiit
  • well, I must've dozed off during the line that explained what they were. Maybe it was one of those "you had to have read the short story first" moments. Either way, I interpreted them as aliens. Would machines care if another machine is happy? Would machines say things like "These robots must have been around when the living humans existed"?

    I'll refer to them as the Jello Automatons then. They were still lousy and detracted from the quality of the film.

    -transiit
  • Demand your money back. Clearly you have been defrauded.

    Seriously, it's not that hard to just not read Katz if you don't want to.

    --
    gnfnrf
  • If all the icecaps melted, sea levels would rise about 90 meters. In the worst case that is remotely plausible (the Greenland and other Arctic island icecaps melt; the Antarctic ice cap shrinks a bit), sea levels would rise 5-10 meters.
    /.
  • by thing12 ( 45050 ) on Sunday July 01, 2001 @10:17AM (#116397) Homepage
    I wouldn't say he does a horrible job communicating it - he's just not having them say 'Look at me! I'm a super advanced robot!'

    They are portrayed as archeologists, that much was obvious. Then they reactivate david using some magical power transfer -- that made me wonder what they were. Then the sequence moves to all of the robots downloading David's life experience - that was a big clue. And then there was the conversation with David before his mother is brought back. It left me with no question that they were robots that wanted to know the humans who are ultimately their forebearers.


  • by weave ( 48069 ) on Sunday July 01, 2001 @03:19PM (#116398) Journal
    If all the icecaps melted, would NYC really be flooded to those depths? It looked like it covered about 30-40 stories, or 600 feet or so. That's a helluva lot of sea level rise.

    But my big question is, the ice. It couldn't of just froze like that. If the planet cooled itself over time, the poles would start to freeze again and weather patterns would slowly drop more precipitation on the poles where it would then refreeze again. All this meaning the water levels should go down first and then an ice age would begin and the ice flows would descend down from the poles. Correct?

    And of course, if so, none of those skyscrapers would still be standing. If an ice flow can sheer off a mountain, the World Trade Center isn't going to be able to resist it!

  • by sometwo ( 53041 ) on Sunday July 01, 2001 @11:46AM (#116400)
    Yes I saw AI this afternoon. Combine pinnochio with bicentennial man, close encounters, and an oedipus complex and you've got yourself this movie.

    There are 2 things that I will mainly comment on: the ending and the plot holes. How many endings were there to this movie anyway? It could have ended when he reaches Manhatten and finds his creator. It could have ended after he jumps into the ocean. It could have ended, albeit sadly, during the 2000 years he spent watching "the blue fairy." How many themes was Speilberg/Kubrik really supposed to bring out in this thing?

    The plot holes are many. What the heck happened to his father? His mother is all he talks about. It is so oedipal, it's rediculous. The movie even ends with his mother and him in bed. He calls his mother "Mommy" after he is imprinted and completely ignores the father. He pays more attention to his mean brother. Wouldn't his programmers make it so he was imprinted to 2 people? Thus his father becomes a static character that is quite flat. David calls his father "Henry" the whole movie.

    David breaks easily from a little spinach and yet he lasts 2000 years frozen in the water. He dives into water twice and that doesn't hurt his circuits at all. Also how, exactly, are the robots powered? This is a small issue because this is sci-fi and you have to suspend your belief but come on!

    Also, when his creator tells him to wait while he rounds up the people, he never hears from "the creator" again. Why isn't there a big search for David?

    Why do they bother leading David to New York anyway? Who would expect that he would steal a helicopter and make his way to a particular building in the abondoned Manhatten. If you're gonna lead him somewhere, lead him someplace easily accesible at least.

    The last thing is with the aliens. I can't believe that David is the only "living" remnant of humans. If one robot survived, couldn't others? Sure I could suspend my belief that aliens came to study Earth in the future. But how would they detect him anyway. They went directly to him like they knew he was there. And they can bring somebody back but for only one day and only once. How stupid is that? Exactly one day. No more, no less. So some law of nature depends on when somebody goes to sleep? Crazy.

    Other than that allusion to the poem, why would anyone fly a ship that everyone could see and run away from? That giant balloon was so bright that the robots could run away from it. There is a full moon 2-3 days per month. I think somebody would get the idea that it wasn't a moon if it was 10x as big as a real one anyway.

    Little things: What engineer thought a 3-wheeled car would work? 4 wheels is the most efficient and stable design. 3 wheels looks cool and futuristic though. Also, if half the world has been engulfed, what is David's family doing living in such a luxurious and large house? Wouldn't there be space limits?

    I suppose it does takes more than a little talent for a little kid to hold an entire audience's attention for 2.5 hours.

    Yes, it is something you will want to see but it will never have any replay value. I won't buy the DVD. I can suspend my belief for a lot of it but there are just so many things wrong with that movie.
  • The ending reminded me of a (sniff sniffle, nostalgia...) SimEarth game I played once, where after I had a civilization evolve to the point of Nanotech and mass exodus of the Earth, a population of ROBOTS overtook the entire world, driving everything else to extinction.

    It only happened once.
  • Riiight ... recommend fast & the furious -- a movie with bad dialog, bad plot, bad fx, really innacurate technical data -- because you couldn't get over the pinnochio parallels and the possibility that mankind might become extinct.
  • You must have missed the whole "let's see what would happen if we give robots emotions" premise of the movie. Those robots had evolved over the course of 2000 years. They're the only thing left on the planet. They were designed to mimic human emotions. Is it so strange that they would share our fascination with the past?
  • of this movie. Who cares whether or not they were robots or aliens at the end? Who cares about Asimov's rules? Who cares about the societal implications of AI on humanity?

    The point that both Kubrick and Spielberg were trying to make is this: in the future, you won't have to put up with your bitchy girlfriend anymore, because there will be a "Lover Series" of robots.

    You guys go ahead and argue about Robotics Laws. Go ahead and spend money on DVD burners and 1.7 GHz Athlons. From this moment on I'm saving up my money in hopes that before I die the Lover Series will hit the streets. You guys feel free to invite each other over and show off your latest tech toys.

    Meanwhile, I plan on being the first on my block with his own robo-harem.

  • I agree... at least, I feel that where Kubrick would have ended it. The lasst 10 minutes were neat from a sci-fi and special effects point of view, but just seemed way to cheesy in the context of the rest of the movie.

    ----
  • I say the Three Laws Of Robotics cannot be implemented as written. Period.


    From a fictional point of view, it's not out of bounds obviously.

    I can't believe nobody has introduced RoboCop into the discussion (at least the first one). Here was an attempt to integrate the human psyche with computerized control (his prime directives). The parallel is in having human emotion while having a glass ceiling. They did a half decent job in that movie exploring the complications involved. I don't believe that this movie really wanted to explore these complexities. As justification, the makers were "trying so hard to see if they could, that they never stopped to see if they should". In fact, all the emotional trauma that was caused was ultimately encouraged.

    Thus, in agreement with you and in start contrast to the previous poster, asimov's laws had no place in this movie.

    -Michael
  • maintanence? He definately has air-passages and vocal devices. Even if the food didn't go to his stomach, certainly these devices would be vulnerable.

    -Michael
  • haha.. laughable. We're hard-wired for sexual desire, for fear of danger, for blinking when things approach the eye, or for mothering rage. So why don't we react like a programmed interrupt controller when certain conditions arise? Because the main point of our nuerons is that they're programmable. We have both actuators and inhibitors. They both battle over the initiation of events. Yes there's a hard-wiring to put into motion a response for a given perceived action. BUT, we also have the ability to surpress those hard-wired reactions.

    You could go to great lengths to encode the reward / punishment / reaction system within a neural net, but the fundamental nature of neurons is that they can be super-ceeded (and the super-ceedings can become the new nature, only later to be super-ceeded by something else). A tough man learns to not blink, and to restrain his anger. Most men learn to supress their sexual urges. Most people learn to control their need to excrete.

    The only way they could do it would be to have an external response mechanism that doesn't allow over-riding (which would defeat most of the point of being alive and having an adaptive neural-net).

    Obviously they do this, because of the activation code; that's something that is rather important to not 'over-ride'. But notice how little of that sort of activity is actually used.

    -Michael
  • She's very uncurious about why

    there are no other members of the family around, for instance. As I've thought about it,
    though, the reason for this is that David's Mommy is having a dream.


    I'm not really going to try and explain such an open-ended cop-out, except to say that most recounted experiences with "time-travel" or more simply temporal e.s.p. leave one in a sence of a dream-state.

    From what I got, everything in the universe leaves it's imprint on the analog universe, much like ripples in a zero-resistance ocean, or what-ever analogy floats your boat. They speculate that the physical piece of matter (possibly the complex DNA strand, but also possibly the matter itself) is like a finger-print that can be used to searched in the cosmic ocean to reach-back for the rest of it's constituent parts. It does suggests that time-travel isn't possible (or they wouldn't really need to ecscavate, nor would it be a problem to bring her here).

    I don't quite know if they're mearly finding her personality / her essence, or if they're projecting esp to or fro.

    As a good master, you don't explain the details, but try and find some high-level analogy that describes the functional parameters. Details can be depressing. Something Lucas should be-retaut.

    -Michael
  • In your otherwise admirable rush to correct the impression that the entities at the end of the film are aliens (of course they're robots), you've all missed the way the film picks up on themes from the first third of the plot from that point on. David's situation in this part of the film parallels Martin's toward the beginning: he's a broken little boy in frozed suspended animation who's revived and brought home to his family. The film's ending takes on much more significance if you take a few moments to think about it in this light. Those of you who are calling it tacked-on and superfluous are missing the boat completely.
  • I'm glad that at least some of the /.ers were clueful enough to realize that those are mechas, not aliens. If there is any doubt, read this:

    http://www.nytimes.com/aponline/arts/AP-WKD-Film -R eview-AI.html

    This was written before Spielberg ever got his hands on the movie. It's interesting how much of Kubrick's vision actually made it into the movie, including a world populated only by machines.
  • by gorsh ( 75930 ) on Sunday July 01, 2001 @08:00AM (#116424)
    I got to see A.I. Thursday night at a preview screening here in Chicago. As someone who's been following the project since Kubrick was going to do it, I've posted some of my initial impressions below, with no spoilers:

    I think A.I. would have been a brilliant film had Kubrick been able to produce it, but in Spielberg's hands the results are mixed. You get
    the feeling that Spielberg understood about 90% of the story, but there's still another 10% there that he didn't know what to do with, particularly in the film's third act.

    Several film critics have talked about Kubrick's use of "non-submersible" units - constructing a movie out of five or six sequences that support the argument of the film and tying them together with narrative links. In most of Kubrick's films, he ties together these units with such skill that a casual viewer doesn't notice that they're there. In A.I., which follows this (for lack of a better world) "Kubrickean" narrative structure the bits are all there, but they feel disjointed and clumsily put together. The transition to the third act in particular, is particularly clumsy, and it becomes clear that Spielberg doesn't completely understand all the
    ramifications of the final scenes, because they aren't thematically consistent with the rest of the film.

    I think one of the problems is that Spielberg is not an experienced screenwriter, and has trouble with some of the finer points of narrative storytelling. Additionally, his films tend to fall more into the traditional Hollywood narrative structure, so making something outside of that is a challenge for him, especially when
    working a much faster schedule than Kubrick would have.

    The other thing I missed was the acute sense of irony that fills Kubrick's films. The story of A.I. is really one of a huge cosmic joke, and I didn't get the feeling that Spielberg got it. There is certainly humor in the story (a welcome diversion from some of the film's emotional intensity), but it's "cute" humor, rather than
    satire.

    Spielberg does get credit for capturing the look that Kubrick probably intended for the film - no doubt the numerous storyboards provided by the Kubrick Estate helped. Also, the performances by all of the lead actors are fantastic, particularly Haley Joel Osment.

    The John Williams score is very overbearing in parts - one of the great things about Kubrick's films was the economy with which he used music - here it's a constant presence, and when Spielberg is trying to make a point, he just cranks up the volume.

    Despite it's flaws, I think it's a movie worth watching, however, if only for the little nuggets that shine through. It's one of Spielberg's most ambitious films, and I think he did very well with it in parts.

    Interestingly, I actually think Kubrick may have been on to something by proposing that Spielberg direct and he produce. It's well known that Spielberg has no patience for post-production, and leaves most of those duties to his long-time editor, Michael Kahn. Had Kubrick been in charge of post-production on this film, and taken the time to get it absolutely right, I think it could have been a masterpiece, even with Spielberg directing.

    Anyway, those are just my initial impressions - I will probably see it again, although I don't plan on paying more than matinee prices....
  • by mcarbone ( 78119 ) on Sunday July 01, 2001 @12:56PM (#116428) Homepage
    This is true (about Kubrick devising the 2000 years later bit), but let's be honest, he would have done it differently and more enigmatically.

    I think one of the worst moments in the film is when David answers the door in the fake house, lets in the advanced AI, and we cut to them sitting on a bed, legs crossed, discussing this awful new-agey science crap that reminded me a little too much of metachloreans. What's with that garbage about space/time, not being able to clone for more than one day, etc. hooey? Couldn't they clone her every day? It's not like they had her memories anyway.

    I didn't mind the concept of the ending so much as the execution. Kubrick wouldn't have had that awful conversation. Kubrick was known for his scientific accuracy (just watch 2001), and as a computer scientist, I was plainly ashamed at the pseudo-science that Spielberg was spinning.

    Anybody with the tiniest bit of common sense would not program a robot that could harm, that would eat spinach, that would even have an esophagus whose sole purpose is apparently to deliver damaging edibles into its most valuable circuitry.

    David was poorly designed.

    And I'd like to think that Kubrick wouldn't have made the same mistakes.

    If Kubrick had made this movie, people would still be angry at the film (like Eyes Wide Shut, which I think is an excellent movie with some flaws possibly due to his untimely death), but twenty years from now it would be greatly respected. Now, except for the visual effects, it will mostly be mocked.
  • I'm not going to talk about my opinion of how good the film was (I seem to be on par with a lot of people in here in my views).

    But did this movie remind anyone else of Blade Runner _in atmosphere_? I'm not talking about plot (someone mentions that up above, but I don't really agree other than one general parallel between the two). But from the commercials on through the actual movie, I found myself thinking of Ridley Scott's film.

    I wish I was more awake so I could provide examples better, but once you get past the beginning (which was anti-blade-runnerish and closer to Hollywood than Kubrick) to "David in the woods", the atmosphere suddenly changes to one which reminds me of the dark, sometimes-overwhelming, sometimes-desolate future world of where Deckard lives.

    Anyway, just a thought -- keep in mind this was just an oberservation, and not a criticism. I liked the atmosphere of the film, and a great fraction of the film itself. :)

    -Puk
  • I'm not sure an AI could be programmed with these and stay sane. Existence means that at some point someone's going to get hurt. Besides, if you're going for a purist's viewpoint, such rules conflict with free will, which is what any artistic programmer is going to go for. So the bots have to be able to kill the meat monkeys, then they can decide not to kill the meat monkeys (Er... or not, in which case you have a bit of a problem, but no guts no glory, right?)
  • through a sort of peaceful and very gradual transition

    Or maybe a quick and violent transition, with the robots wiping out all traces of the humans. Then, a thousand years later, they are embarrassed by what their forefathers (umm... forerobots?) have done, and with no real humans around they start to idolize them, and believe that humans (as their original Creators) must have all the answers they seek about the meaning of existence.
  • To understand the movie, just imagine asking William Gibson to rewrite Pinocchio. The movie started very Spielberg, started to get interesting, then went Spielberg to the end. Too sappy, too much of the kid doing his patented look from The Sixth Sense. Like E.T., you have the kid and a sidekick robot teddy bear. The beginning of the movie was too short and choppy, with the parents loving the kid then suddenly turning on him, especially the father. The movie might have been better if it were about thirty minutes shorter. The movie was like this review: disjointed, with no flow.

    enjoy
  • For those of you who didn't turn on the Sci-Fi news slashbox, here's a link [bureau42.com] to another review.
  • That wasn't the only possible good ending - I think the movie also could have successfully ended in the forest (when they walked towards the moon). An open ending like that would have been good... Or in the sea, just in front of his target. So close, yet so unreachable.

    Unfortunately, it ended later - it almost looked like a test audience had demanded a somewhat happy looking ending. Bah.

  • by pjdoland ( 99640 ) <`pjdoland' `at' `pjdoland.com'> on Sunday July 01, 2001 @07:36AM (#116461) Homepage

    If and when you decide to see this movie, walk out of the theater at the precise moment when David (Haley Joel Osment) jumps off the building into the ocean. Trust me on this. I won't spoil the ending because I've forced myself to repress it.

    I'm really hoping someone pulls a "Phantom Edit" with this film..

  • Finally, an insightful comment.

  • I am very split on this movie in a few areas:

    1) I also was expecting an Artificial Intelligence movie. The theme of "love" was a perverted twist to the plot. Instead of actually wanting a mecha who *could* love, we are bombarded with pictures of the dead children whom these mechas will be replacing. I will not credit Kubrick or Spielberg for this, but it certainly seems Kubrick-ish: humanity is so overwrought with its own desire to love/be loved that it creates the perfect robot to fulfill the need: Nanny/Caretaker, Perfect Amorous Lover, Perfect Adoring Child. Thus, I found it fitting that the "Flesh Fair" would exist. It is important (to me) that the opposition be shown in stories, and the "Flesh Fair" was extreme enough to get across a few points. Firstly, not everyone enjoys mechas. Secondly, people are willing to destroy them just as long as no pleas emerge from the mecha; if the darn thing gets "too" human, then what distinguishes it from humanity but its hardware (or other fruity ideas)? Thirdly, when David smashes "David," I was pleased. It is extreme enough to get across the point that a robot can snap and destroy even other mechas. This, also, is a very human trait; however, I saw it as necessary given the task set for David. He was programmed to "love." His mother told him he was "unique" and "one of a kind." He saw another of his kind, which would undoubtedly alter the "unique" love his mother has for him, and therefore the obstacle had to be removed. Perhaps this was too extreme, but I thought it was perfect.

    2) I thought it was a pretty good movie. I'd like to buy it and analyze the heck out of it someday. Of course there are "Frankenstein" references, which I was very happy to read about in other replies. I thought it was interesting that there is no middle ground for creating artificial life: in "Frankenstein," the doctor created his monster and deserted him, unloved and uneducated. In "A.I.," David was left unloved and uneducated, but with a mission. The Monster also had a mission, but it was one of his own free will (work with me here): to kill Frankenstein's family. Contrarily, David only desired to be loved. Let me change that: David was only programmed to be loved, and he failed. He waited 2000 years (and didn't know it?!) to fulfill this task. Do I think he really loved his mother? NO. NONONO. That was his planned mission, and although I see Dr. Whatshisface's point that "no other robot has made decisions on its own," and David was again "unique and one of a kind," I find it hard to believe that something programmed could somehow break away and *actually* love. This, my friends, is impossible, but good for a sci-fi movie.

    3) The Ending: Ohmygosh. OK, I thought David "became a real boy" when he gave up his desire to live and fell off the building. I also thought that this was a good marker for the end. But a prophecy by Whatta-ya-know-Joe earlier in the film prevented the credits from rolling: "in the end, all that will be left...is US." So, 2000 years later, apparently "the end," robots have taken over the earth, and some archaeologists have found David and have a party. The ending was a closer for a few things, however lame it was that David "went where dreams are made": 1. He found the Blue Fairy and became a real boy; 2. He fulfilled his programming needs, and was loved by his mother; 3. The world was taken over by freaky robots; 4. We entered an unexpected ice age, but instead of the waters receding back to the poles, they just stayed in New York.

    I ultimately found David going "to sleep" to be a silly ending. He never desired to sleep before, so it is an unfounded conclusion to David's quasi-epic. Water won't destroy him (but spinach will...), time won't kill him... how exactly does David go? ... Good movie? Bad movie? I say "interesting, but more Kubrick would have been better." It was a little too fuzzy for sci-fi. Reminds me of another fuzzy sci-fi movie... "E.T." Hmmmmm....
  • However, I can't remember the exact quote, but they mention something to the effect that humans were the most important race for some reason, which could make you assume that there were more races of intelligence out there. They could be referring to all races of life on Earth, too. But it wasn't THAT obvious that they were aliens, nor was it obvious that they weren't. There are many reasons to support both viewpoints, and it would be best to ask Spielberg himself what they were rather than to assume or bicker over it.
  • Seven!
  • delete "mecha" from the following and i'm describing human experience. that kubrick definitely has a way of expressing complex, subtle inner emotions.
    1. mecha boy's parents are neurotic messes who create mecha child for their own benefit, not that of the mecha boy
    2. mother inevitably creates unhealthy, unbreakable bond between her and mecha boy
    3. father and brother obstruct mecha boy's relationship with mother, causing mecha boy's suffering to increase
    4. mecha boy's desire for greater closeness as well as fear of being hurt leads him to inadvertently cause harm to others
    5. mother inevitably abandons mecha boy for selfish reasons, but leaves him a chance for survival and thus a faint but real chance to find happiness
    6. mecha boy immediately finds his own kind: other damaged mecha people fumbling around for their missing pieces in the dark, neurotically stuck in their own self-limiting programming/conditioning
    7. mecha boy finds the strength to continue by blindly holding onto his idealized notion of mother, as well as his own childlike perceptions of his uniqueness and special qualities
    8. unable to get what he needs in the real world, mecha boy develops magical thinking, believing a fictitious supernatural force will help him meet his needs
    9. mecha boy ultimately finds that he is neither special nor unique and will never get the love he needs from his mother--then he tries to kill himself
    i left out "mecha boy gets therapy and feels better" because that doesn't always happen, but i thought it could have worked.
  • I personally believe people went into this movie with a lot of expectations which were not met. Did Spielberg try to become Kubrick to pay homage to his close friend? I think the answer would be a resounding yes, but people need to look past that and many other "plot" points and look at a few ideas which have never been wrestled with very well in the past. Such as: when a robot becomes sentient (define it as you must), what are society's moral obligations to it? Perhaps David surpassed what a robot truly is and in some way had a "soul." Also, I thought the robots (loose definition) of 2000 years in the future revering humans as gods was quite interesting. Perhaps this is similar to how man views god. Despite everyone saying they did not like this plot line or that scene, we must look at the movie as a whole. To look at A Clockwork Orange or 2001 and not consider the ethical questions tackled is quite a shame. See past the idea that "Spielberg is not Kubrick and he was horrid in his attempt." The movie is much more than what is on the screen. Kubrick has always asked us to think outside the film, and if we do not do that with AI we are just dismissing what Kubrick's true legacy has been.

    Bingeldac denies any responsibility for the
    spelling and/or grammatical errors above.
  • That's because Spielberg can't do anything else.

    One of the best lines I ever read about Schindler's List was that "only Spielberg could make a feel-good movie about the biggest feel-bad event of the 20th century."

  • This movie was based on "SuperToys Last All Summer Long" which, if I'm not mistaken, was written in the late 1960s. So the movie is all about what people thought of the possibilities of AI back then. We know a lot more now about AI, mainly that the theories presented here are bunk.

    Bad Science #1: It is hard to make an AI "love". On the contrary, this is the easiest thing to do. In its simplest state, it's just a variable that you set, e.g., Emotion = CONSTANT.LOVE, or Emotion = CONSTANT.HATE. The real tricky part is giving that constant meaning. But to build a robot with the capabilities of A.I.'s David, love would be one of the first things you program. Without love, hate, and a host of other emotions, it would be impossible to make a robot learn all the things it needs to know to be human. Emotions, along with basic needs for survival, are the building blocks of motivation. Without motivation, nothing has a reason to learn. (There's a rough sketch of what I mean at the end of this comment.)

    Bad Science #2: It would be easy to create innumerable copies of David. False. David isn't something that you could just straight-out program. There is simply so much information that goes into being a human that it would be impossible to list it all. Even if you could, it's too dynamic to represent with simplistic "if then" clauses. No, a being such as David would have to be programmed to program itself, either through evolutionary or neural programming. This process of programming would not be able to start at 5 or 10 years of age. It would have to start from birth. Think of how long it took you to learn the basic functions of society. It was a very long time. I'm 23 and still learning. There may be things we could do to speed this up, but it would not change the basic process. Furthermore, this process would not be replicable: each "David" would have its own unique personality based on its experiences. Personality similarities would be about what we see between human twins, no more than about 50%.

    Bad Science #3: David would get stuck in a rut and sit there by the Blue Fairy for 2000 years. Bzzt. For the reasons mentioned above, the sheer amount of intelligence that went into David would keep him from getting stuck in a "mental rut" like this.

    Bad Science #4: Irreversible imprinting. It would be impossible to program something like David to magically alter its mind to fit our primitive notions of folk psychology. You wouldn't just be able to open up an AI's mind and cause it to "love" someone, any more than you could open up George W. Bush's mind and cause him to be a Democrat. The representation at that point in David's life would be too complex to even understand. Our scientists would only know enough to get the mental learning process going, not to alter it once it's long on its way. Thoughts such as "love for mommy" are not a switch in the brain that can be turned on or off. Similarly, to program something of this complexity would require at least a roughly isometric representation, which implies the same thing: that you could not alter something as simple as "love for mommy" with a switch, or a sequence of random words.

    Bad Science #5: "We can only bring a human back for a day, because once a space-time path is explored, it can never be explored again." This is just pure bullshit, intended for a tidy ending. I did like the robots at the end however, they were really cool.

    But overall, the movie was good. The acting was realistic enough to make me suspend disbelief, despite all of the above.
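
    To make the motivation point concrete, here's a minimal toy sketch (my own illustration in Python, with made-up names; it's not how the movie's mechas or any real robot works): setting the "emotion" constant is the trivial part, and the agent only learns anything at all because that signal is wired in as its motivation.

    import random

    LOVE, HATE = +1.0, -1.0                # the constant is the easy part

    def mother_reacts(action):
        # Stand-in environment: hugs feel like love, hitting earns the opposite.
        return LOVE if action == "hug" else HATE

    def learn(episodes=200, seed=1):
        random.seed(seed)
        value = {"hug": 0.0, "hit": 0.0}   # learned preferences, initially blank
        for _ in range(episodes):
            if random.random() < 0.1:      # explore a little...
                action = random.choice(list(value))
            else:                          # ...otherwise do what currently feels best
                action = max(value, key=value.get)
            emotion = mother_reacts(action)              # emotion as the motivation signal
            value[action] += 0.1 * (emotion - value[action])
        return value

    print(learn())   # "hug" climbs toward +1, "hit" sinks below zero

    Wire the emotion signal to a constant zero instead and the values never move: the agent has no reason to prefer anything, which is exactly the point about motivation.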

  • Your a moron pretend you know what they could do or how AI works.

    Actually, I have degrees in philosophy and computer science, which is about as close as you can get to being knowledgeable in a field that is far from perfected.

    the imprint just told the program that she was his mom and his name was david.

    Explain to me how you would alter a human brain to do these "simple" changes. You have no idea how much processing goes into such "simple" things as recognizing your mother. This is not something you can just turn off and on. You're thinking in folk-psychology terms.

    We communicate to each other ideas such as:

    1. Bob loves Mary.
    2. Bob is jealous of Mary's love for Todd.
    3. Mary does not desire Bob's affection.
    But there isn't a chalk board in our brain where we can just change these things around. The information is stored in many places, and in many pieces.

    Developing a robot to think like a human would almost certainly require some isometry to the way that humans think. If you start from the basics of human thought, the idea of information stored in neural nets, you end up with something that is not easy to manipulate, except through its inputs and outputs (i.e., talking to it).

    So I'm not really against the "irreversible" part of the imprinting idea, I'm against the idea that you could imprint at all. If you disagree, tell me how it could be done! There's a lot that we don't know about the human brain, and what it would take to replicate it, but we do know that it wouldn't be this easy. (There's a toy illustration of the "many places, many pieces" idea just below.)
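
    Here's the toy illustration I promised (purely my own sketch in Python/NumPy, reusing the Bob/Mary names above; it makes no claim about how David or any real brain is built): a tiny associative memory stores a couple of facts as distributed patterns, so no single weight is a "Bob loves Mary" switch, and the only clean way to change the fact is through its inputs and outputs, i.e., retraining on new experience.

    import numpy as np

    rng = np.random.default_rng(0)
    DIM = 64

    # Distributed codes: every concept is a pattern spread across all 64 units.
    codes = {name: rng.standard_normal(DIM) / np.sqrt(DIM)
             for name in ["Bob", "Mary", "Todd", "loves", "jealous_of"]}

    def cue(subject, relation):
        # Input pattern for a (subject, relation) query.
        return np.concatenate([codes[subject], codes[relation]])

    # Facts from above: Bob loves Mary; Bob is jealous of Todd.
    facts = [(("Bob", "loves"), "Mary"),
             (("Bob", "jealous_of"), "Todd")]

    def train(facts):
        # Least-squares weight matrix mapping cue -> object code.
        X = np.stack([cue(*q) for q, _ in facts])
        Y = np.stack([codes[obj] for _, obj in facts])
        return np.linalg.lstsq(X, Y, rcond=None)[0]

    def recall(W, subject, relation):
        # Return whichever name's code best matches the memory's output.
        out = cue(subject, relation) @ W
        return max(["Bob", "Mary", "Todd"], key=lambda n: float(out @ codes[n]))

    W = train(facts)
    print(recall(W, "Bob", "loves"))       # -> Mary

    # "Flip the switch": kill the single largest weight in W.
    i, j = np.unravel_index(np.abs(W).argmax(), W.shape)
    W[i, j] = 0.0
    print(recall(W, "Bob", "loves"))       # -> almost certainly still Mary

    # To actually change the fact, you retrain on new experience (inputs/outputs).
    W = train([(("Bob", "loves"), "Todd"),
               (("Bob", "jealous_of"), "Todd")])
    print(recall(W, "Bob", "loves"))       # -> Todd

    The graceful degradation when one weight is knocked out is the whole point: the fact lives in the pattern of many small weights, not in any single one of them.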

  • There could have been a major disaster, such as an asteroid crashing into the planet, kicking up enough dust to make the planet cold.

    As for the 2000 years of discovery, look at how much advancement there has been since the birth of Christ until now; that's about 2000 years.
  • Just checking - we all do realize those were robots at the end and not aliens. The concept of our own creations outliving us seems pretty cool to me. When the Earth gets back to a good climate for humans it seems they would be able to clone us and humanity can survive its own extinction. While the ending was drawn out the ideas are worth it.
  • Oh, my humble pacemaker,
    one heartbeat from the throne,
    keep that son-of-a-bush healthy,
    my battery is running low.

    My heart companion runs java,
    version one-point-oh-dot-two,
    i can feel the pressure building,
    cuz there's memory management to do.

    the doctor say i'm healthy,
    my pounding friend beats true,
    he checks on it remotely,
    using linux network tools.

    But what's that sinking feeling?
    could my worst nightmares be true?
    it's those russians and the chinese,
    hacking in to turn me blue.

    They got my damn IP,
    from the whitehouse tour bathrooms,
    well, that's it for your VeePee,
    the floor approaches...boom.


    Treatment, not tyranny. End the drug war and free our American POWs.
  • by _Bean_ ( 128235 ) on Sunday July 01, 2001 @10:50AM (#116490)
    More proof that opt out doesn't work
  • just saw A.I. Call me insensitive, but it took all my willpower not to shout "Use the Force!" during that one scene. You'll know the one. It's really quiet, too. I probably would have, if it weren't for the higher-than-usual odds that, if I were thrown out, I'd be recognized and denied admittance to that theater in the future. And while there are probably ample opportunities to yell "I see dead people," wait until the very end.

    Also, the theatrical trailer is right about the initial premise of an artificial child. He doesn't age. He is an artificial son. Now, call me insensitive a second time, but that sounds like the worst idea ever. Every parent would tell you that the best thing about having a child is that you're raising him, watching him grow, and building a future for him. Take all that away, and all you have left is a four-foot-tall Tamagotchi. Instead of growing old and dying after six weeks even with careful attention, he lives forever and apparently runs on a perpetual-motion device. Imagine a future where there are 80-year-old couples who have had an 8-year-old son for fifty years. There is only one reason a corporation would spend millions of dollars to develop such a contraption: to lure millions of moviegoers.

    The child has still more examples of bad design. The imprinting being irreversible, for one thing. Suppose that after 20 years, having a child running around the house with its emotional neediness begins to wear on you. There is no painless way to end the relationship. Even a pet dies on its own, but this child must be driven to the nearest Robot Shack to be destroyed. I'll bet they even have a little observation window where you can watch them put him in the guillotine. Were this robot released to consumers, this is probably the first question people would raise.

    The question of love was raised at the beginning of the movie, but the Tamagotchi already answered it. The board member serving as the conscience of the meeting asked: if the robot could love, what responsibility would the parent have to love the robot back? At least some Tamagotchi owners loved their pets, and many of them mourn their deaths [aol.com]. But won't there also be people who bring the robot home, imprint him, and then abuse him? The Sims and Black and White allow you to abuse the residents of their particular dollhouses, and many people do.

    And when Monica imprinted David, he started calling her "Mommy." Did he start calling his father "Daddy" and asking to play catch with him? How long was it before anyone in the household referred to Martin as his brother? Did anyone raise the question of whether our robo-child would go to school, or whether it would make any of its own friends independently? I suppose it works as a metaphor for how people bring children into the world without any consideration for the consequences.
  • by istartedi ( 132515 ) on Sunday July 01, 2001 @08:01AM (#116494) Journal

    but why would humans make disobedient robots?

    D$%* it! Why won't this print???

  • Well yes, for robots he is.

    When you speak the word "robot" you automatically speak the invocation that calls up the God, like it or not.

    KFG
  • by _xeno_ ( 155264 ) on Sunday July 01, 2001 @08:04AM (#116510) Homepage Journal
    I have to post a "me too" to this, but really - the movie past that point is just... too weird.

    I think that Spielberg wanted to take some of the edge off the movie, and so he tacked on a poorly written ending that tries to solve David's desires as listed in the review above.

    It doesn't work though!

    Although I would recommend sitting through the entire movie anyway just to look at the cool visuals after his plunge into the ocean, ignore everything thereafter and just be amazed by the pretty visual effects.

    (Mini-spoiler below; it shouldn't really affect anything, but be warned...)

    It really does come off as a masterpiece until the narrator mentioned in the parent moves the story an additional 2000 years into the future. At that point any vision about the movie is lost and it just stops making sense. Although it eventually brings everything into an almost-OK, wrapped-up ending, nothing really gets satisfied.

    The movie could have ended when David jumps into the ocean - or it could have ended again when the narrator pipes up again after his plunge. But it doesn't, and it loses its vision and direction.

    --

  • by tapin ( 157076 ) on Sunday July 01, 2001 @07:46AM (#116511)
    The movie was excellent; there were a few absolutely top-notch scenes, from both acting and sfx standpoints. However, if you haven't gone and seen it yet, leave when the narrator kicks in.

    It's about forty-five minutes too long; I'm convinced Spielberg simply wanted to emulate Kubrick as much as possible, and therefore threw a nearly nonsensical, completely gratuitous, and most especially pointless ending on the movie -- never mind that it takes up nearly a third of the running time. I would've considered it a masterpiece if they'd just rolled the credits after Joe hit the "submerge" button.

    Does anyone know much about the "Supertoys" short story? I figure I'll go snooping with Google in a bit; but the short story that Wired reprinted at the link in this article doesn't seem complete. A recent issue of Playboy had two short stories by Brian Aldiss that had "Supertoys" names -- did he just write a whole bunch of short stories about David-the-neurotic-robot, or are all of these excerpts from a novel?

  • by Ars-Fartsica ( 166957 ) on Sunday July 01, 2001 @07:52AM (#116522)
    After seeing AI I can say I was very interested in the film and the plot, although I wouldn't recommend it as entertainment per se.

    Like Clockwork Orange and 2001, this film is more about exploration than entertainment.

    And yes, I realize Spielberg directed it, but it is Kubrick's vision.

  • Why do story-tellers always portray emotions (especially love) as the test of humanity, and then equate humanity with personhood? No love == no personhood? But I always thought Spock was a great guy!

    An idea flowing out of the cognition field is that a human brain is mainly one big collection of pre-computed things to spare the mind from having to deal with everything.

    Imagine you are in a dark cave and you hear a growl. You could determine logically that it's unsafe and leave, or you could become afraid and leave. The result of emotion is often the same as the logic--just pre-computed and generalized. Every emotion from love to greed to lust to charity can have this generalized logic applied to it--so it's possible that's all it really is.

    But if you could create a faster mind capable of logically deducing whether it's unsafe in a cave, you don't need emotion -- and not only that, its superior logic may often cause it to act in ways consistent with those of emotional beings. You might call logic "real-time emotion" instead of "pre-computed emotion"? (There's a tiny sketch of this below.)

    I'll take Spock over Kirk any day!
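
    Here's that tiny sketch (my own toy framing in Python, not anything from the film): one agent carries the pre-computed reaction, the other derives the same answer from its knowledge every time. Same output, different amount of work.

    def emotional_agent(percepts):
        # Fear as a pre-computed shortcut: growl -> flee, no deliberation.
        if "growl" in percepts:
            return "leave cave"
        return "stay"

    def logical_agent(percepts, knowledge):
        # Deliberation: derive the threat from the evidence each time.
        threats = {p for p in percepts if knowledge.get(p) == "predator_sign"}
        return "leave cave" if threats else "stay"

    knowledge = {"growl": "predator_sign", "dripping water": "harmless"}
    percepts = {"growl", "dripping water", "darkness"}

    print(emotional_agent(percepts))            # leave cave
    print(logical_agent(percepts, knowledge))   # leave cave -- same action, more work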
  • Not spinach, evidently. Say what you like about dodges like "fairy tale" or "symbolic," but the idea that a robot boy could neither eat spinach nor get wet without having some kind of short-circuit is simply stupid.

    I've seen a lot of this lately. Filmmakers and artists have this tendency to overrate the big picture and forget that the details are also part of the big picture. When I talk about the can't-eat-spinach scene and otherwise intelligent people snarl, "Don't get so hung up on the details!" I feel like I'm the one talking to robots.

    Sure the details matter. The details make a lot of difference, even in a story that's supposed to be a fable. Calling your story a fable does not mean you have the license to cavalierly ignore things when they don't suit you. If you want to make your characters fly and dodge bullets, then you come up with a story that supports those things.

    "The Matrix" built nicely up to its allegorial rebirth ending (trying not to give any spoilers here). "A.I." just has senseless stuff like the above dropped in all along the way with no real explanation. The "fairy tale" credentials seemed largely due to it simply quoting "Pinnochio" directly (the "Blue Fairy") -- and for that matter, not quoting it with a great deal of insight or intelligence.

    Try this experiment. If the names "Steven Spielberg" or "Stanley Kubrick" didn't appear anywhere in this movie, would it have been anywhere near as interesting? I asked friends of mine to try the same experiment with "Episode 1" and they responded by merely getting angry. Most of the reason for the interest in the film is that it's a story that was never completed by one very famous director and has since been completed by another. For that matter, "After the Rain" was pretty mediocre, too.

    At best, the movie is a failed experiment. At worst, it lapses into the kind of precious, pretentious sentimentalism that passes for emotions these days.
  • From a programming perspective, I didn't like the fact that he was homicidally dangerous. I mean, by the time kids are that age, they are sensitive to a sibling's drowning death throes, if not completely aware that 'Mommy wouldn't approve of me killing my brother.' And then toward the end, when he was walking amongst all his 'clones' (an excellent, eerie scene reminiscent of the Shining kitchen scenes, and not the Jurassic Park kitchen scenes, btw) -- um, just before that he gets into a rage and beheads one of them??? I agree: Isaac Asimov should have shown up in a cop's uniform and written David a ticket for violating some principle of the Robots' Creed. That would have been a funny scene...
  • The movie contained enough plot for approx. 15 minutes of lame TV crap. I feel as though 2.5 hours of my precious youth have been taken away forever!!

    This is remarkably similar to some student comments I read in a college newspaper. The comments were about the movie 2001 when it first came out, and the newspaper was found in a box of old stuff. Fascinating how shallow that comment seems with 20/20 hindsight and the passage of a couple of decades.

    It is always fascinating to go back through the old newspapers and read what regular folks felt about stuff, and how the perception has changed.

    Check out the Vinny the Vampire [eplugz.com] comic strip

  • Like Clockwork Orange and 2001, this film is more about exploration than entertainment

    Agreed. But I'm glad for it. We really need more opportunity to think, lest we entertain ourselves to death.

    See:

    "Amusing Ourselves to Death: Public Discourse in the Age of Show Business", by Neil Postman, Methuen, f.p.1985, new imprint 1998, 184pp., paper. http://chapters.indigo.ca/books/details/default.as p?ISBN=0140094385&mscssid=WM6H569WPWT99MM6MS06WHN2 55SADMJF&WSID=1307FEBA8D114200466C8A9EFCA0DB221A35 1201
  • Gratuitious tear-jerkers, cutsey-laughs, and all of the other crap that's thrown in to make the movie more marketable to the typical McDonald's customer and general-purpose merchandisers.

    I can't see any fast-food joint wanting to pick up the merchandise rights to this film. Besides the Teddy, and maybe the cars, what is there to market? And the film is unquestionably not intended for children -- "killing" robots with cannons, "parental" abandonment, red-light districts, gigolo robots, murder and death. There was far more non-cutesy than cutesy in this movie.

    It's a good flick, but it's no epic. Get over it, boys.

    While I agree that there wasn't anything groundbreakingly "first" about this movie, that doesn't mean it's not a great tale. Summer-blockbuster addicts who went to this film expecting lots of action, adventure, and eye candy are going to walk away disappointed, and there's nothing anyone can (or should) do for them. But anyone who wanted to see an intriguing story in the true science-fiction vein -- not Hollywood sci-fi, but Isaac Asimov sci-fi -- will walk away pleased.

    I dislike the closed-minded idea that only films like Spielberg's Jaws or Close Encounters or E.T. will be remembered in the decades to come. Each of those films stood out from their contemporaries because of their F/X as well as their stories (well, except Jaws, which was all effects around an overused monster story). A.I. has outstanding effects, but to be honest, they're nothing the audience isn't used to seeing these days. However, they deserve respect for the way they were so seamlessly blended into the movie. Very few effects stood out. The Teddy looked like a toy, the car looked like a car. Everything looked wonderfully, invisibly real.

    The story, meanwhile, is very different from your typical summer fare, and that's probably throwing everyone for a loop. The thrust of the film is a philosophical question: what makes humans "alive", what gives us a soul, that the robots lack? The professor at the beginning posits that the missing element is the abstract quantity of Love. The rest of the movie explores whether or not this is true.

    There's no "bang" in this film, and I'll agree that there's nothing too terribly novel about the story. Nevertheless, it's a rare story well-told, and deserves recognition for that alone. Hollywood is so packed full of high-adrenaline monsters and spaceships that everyone's forgotten what science fiction is really about.

  • I watched it with high expectations. At the end, at best, I could say this was a good movie, but not a great movie.

    The part that got me was in the first few minutes: we can design a robot that loves, but can people love him back? That's what the film was all about. More specifically: we can make a movie about a robot that loves, but will the audience love him?

    You've got to give kudos to the filmmakers here. David was a robot. He is irrational, he doesn't follow logic, and ultimately, he NEVER strays from his programming. All he cares about is getting his mother's love (even at her expense). But it still makes him lovable.

    I hated the plot holes. I hated the fact that any rational person watching this film has to, at least a few times, go 'uh... what?' with the way the characters interact. But, I still like the movie.

    On a side note: man, the 'supertoy' Teddy was the coolest sci-fi sidekick in years. If only George Lucas would watch this and give us the personality of Teddy instead of Jar Jar, I'd be in heaven.

  • Most people seem to be 'missing' this movie. Which isn't too surprising, since most people are idiots who need things spoon-fed to them, and this movie is quite a bit more demanding than that.

    - Spryguy
  • I went into this movie expecting something more along the lines of hard science fiction, with robots, computer-science jargon, and some kind of Frankenstein scenario. Sure, the movie does contain stuff like that, but not a lot. It really is more like a modern Pinocchio. Don't go into this movie expecting something very techno-ish like The Matrix. Some have told me how genius this film is and how profound Kubrick is, but this movie simply wasn't entertaining. I was upset that it explicitly drilled the Pinocchio theme into the minds of the viewers. The people at my showing were laughing because the movie was so corny, babyish in its desire to showcase a modern e-Pinocchio, if you will. I guess I would say that everyone should see this movie at least once. Some think it's a masterpiece; others simply think it's garbage.

    By the way, the thing that irked me most was its construction. (Spoiler?) ... It has three distinct parts to it. When the movie transitions from one part to another, it practically severs any relation between them other than the main character and his sidekick. It made the movie seem like three distinct stories sewn together at their tangents. Maybe they could have served as three different storylines aching to be completed. One last thing... the third part: the characters it involves are simply ridiculous (back to that childish idea).

    So, all in all... maybe it is deep. Maybe I'm just not seeing it. But it wasn't entertaining. Haley Joel Osment was great; he's an excellent actor. But I don't see this being a box-office smash. Try The Fast and the Furious if you'd like an entertaining movie with import cars.
  • Uhh... I know this is going to hurt... But for all the people confused, such as me, would someone please list some hard core evidence that the excavators at the end of the movie were indeed robots?

    I always tend to make my initial reaction a conclusion, and I only saw the movie once, so I assumed they were aliens. They looked like the stereotypical kind of aliens. What's more, it's hard to believe that they didn't coexist during the age of humanity or have some kind of record of humanity. If they really were the creation of the first-generation AI, then this second generation should have all the knowledge of the first generation. The first generation coexisted with the humans (that's obvious). If I remember correctly, the alien commented on the importance of David as one of the few links back to humanity. This made it seem like they know little to nothing about humanity.

    Btw, I've been told that Joe's only-robots-will-exist line foreshadows the end of humanity and the reign of robots, so that might help the they-are-robots side of things...

    So, I'm willing to believe anything, but I'd like to have some concrete list of why they aren't aliens. (I'm aware someone gave a link to a NYTimes article supposedly shedding light on this issue, but the NYTimes server isn't playing nice w/ my box).
  • by closedpegasus ( 212610 ) on Sunday July 01, 2001 @07:58AM (#116574)
    There's another review here [kurzweilai.net] by Ray Kurzweil, a guy who has been around real-world AI for a while. Possibly a few plot spoilers, but mostly about the feasibility of stuff done in the movie.
  • Congrats go to Spielberg for what I felt was the best film I have seen in a looong time. (I do agree it would have been a far more powerful ending if it had ended when he plunged into the ocean: the ultimate humanity in a machine, the acceptance of a harsh truth and a (metaphorical) suicide, but that's neither here nor there. And yet, those are cyborgs, not aliens, at the end.) Congrats for an equally good review go to Jon Katz, who did the film justice. I guess the only thing I really wish was different is the title (and I suppose the way they have marketed it). I was expecting a crazy sci-fi action thriller, but instead I got a far more amazing, intellectual experience. I am glad I paid the $8.50 for this one.
  • As Michael points out with his "eye twitch" comment, the robots don't blink. Haley Joel O. was on a talk show (can't recall which one) the other night and stated he doesn't blink during the whole movie. I'm reasonably certain none of the mechas do either. (Something interesting/stupid/odd/freaky/etc. to look for...)
  • Coming Attractions has a page up regarding the possibility of AI2. Apparently Spielberg has bought the rights for the sequel short story and wanted/wants the rights to a single sentence for the "third idea of the saga". Check it out...

    Coming Attractions on AI2 [corona.bc.ca]

  • by account_deleted ( 4530225 ) on Sunday July 01, 2001 @03:51PM (#116609)
    Comment removed based on user account deletion
  • by Demerara ( 256642 ) on Sunday July 01, 2001 @08:34AM (#116618) Homepage
    Wow. Dubbya B (as we Irish like to call W.B.Yeats) quoted - in full - on /.

    I'm originally from Sligo, some 8km down the road from Glencar Lake (where Yeats is said to have written this poem).

    In an 1888 letter to Katherine Tynan, Yeats said 'my poetry...is almost all a flight into fairy land, from the real world...The chorus to the "stollen child" sums it up - That it is not the poetry of insight and knowledge but of longing and complaint - the cry of the heart against necessity. I hope some day to alter that and write poetry of insight and knowledge'

    This he did indeed go on to do. Here's a poem which, though I have not seen AI yet, may address the movie's theme:

    He wishes for the Cloths of Heaven

    Had I the heavens embroidered cloths
    Enwrought with golden and silver light
    The blue and the dim and the dark cloths
    Of night and light and the half-light,
    I would spread the cloths under your feet:
    But I, being poor, have only my dreams;
    I have spread my dreams under your feet;
    Tread softly because you tread on my dreams.

    You can see the waterfall in the hills above the lake here [merenda.com]

  • by Jormundgard ( 260749 ) on Sunday July 01, 2001 @07:55AM (#116627)

    I really loved the movie. I know that others will disagree and will nitpick at the flaws (which there were), but I think the great scenes made it worth it.

    But the reason I posted -- those weren't aliens at the end, they were robots! And the narrator was referring to his fascination with his human creators. Didn't you guys love the symmetry of the robot's "human" desire to understand humanity? That he already had what he was looking for but didn't even know it? Well, it was probably just me :).

  • by Salieri ( 308060 ) on Sunday July 01, 2001 @07:48AM (#116639)
    How can I best summarize AI?

    Part Close Encounters in the wonder of its visuals;
    Part A Clockwork Orange (Kubrick) in its pessimism of human nature;
    Part 2001 in its glacial pacing and technology plot;
    Part Hook and E.T. in its gushy family sentimentalism with otherworlders.

    Naturally, Kubrick and Spielberg don't mix well, so AI sort of splices these together end to end.

    Did I enjoy it? Yes. Do I recommend it? Yes, if you like said movies. I really enjoyed Jude Law as a robotic gigolo.

    The computer science part of me screamed the whole way through... basic CS punches huge plot holes. The biggest is this: the linchpin of the entire plot is that the robotic boy can never stop loving and longing for its "mother" owner, despite being destined to outlive her. This is absurd -- not being able to reboot his software, or at least reinstall it, is really contrived.

    But the photography and special effects are amazing, especially given Spielberg's admirable ability to have the effects serve the plot and not the other way around.

    And if you have any doubts in your mind that John Williams is the most versatile composer working today, this movie will put them to rest. Line up the soundtracks to "Star Wars," "Schindler's List," "Saving Private Ryan," "Seven Years in Tibet," and "AI" and you'll see what I mean.

    A final spoiler note: Despite what critics and IMDB commenters say, I'm absolutely against the notion that the beings at the end are aliens. They may be shaped like the "Close Encounters" creatures, but please! "Artificial Intelligence" is the name of the friggin' movie.
  • [ sorry, plot spoilers discussed ]
    Hello all. I rarely find myself needing to respond to Slashdot posts, but the sheer number of mass attacks against the ending of this film truly disturbs me. Now, everyone has the privilege to share an opinion, but I've seen far fewer true opinions and far more instinctual herd-whinings about the non-Kubrickness of the ending and how sap-happy it seems. Well, let's just think about it for a moment.

    For the moralists who claim the ending gives no resolution to the human moral issues of the film, I say you are wrong. The opening argument of the film is "what responsibility does a human have to a loving, emotionally-unique robot" (pardon my paraphrasing). Beyond giving resolution and meaning to the desires of David's life, the ending and the temporary resurrection of the Mom answer this question, at least in part if not ultimately. The mother truly loves David, giving the answer to the philosophical moral issue of the film: David is every bit a human child, so the human mother must owe him the same responsibility as she does her own flesh-born child. This is what the ending does: it allows humanity to be redeemed, or at least not damned as unempathetic and purely nihilistic. If the film ends with David's forlorn pleas to be a real boy, then the revelation of the film is a nihilistic and meaningless moral destiny for humankind. My feelings and beliefs aside (damn, I'm even discussing morality, me of all people), it is a much less powerful and meaningful ending to just damn humanity to unfettered destruction and emotional isolation from each other.

    One of the central issues of the film, especially where the son is involved, is that the family and humanity cannot and will not accept David as the human he so desires to be. This smacks of Asimov's Bicentennial Man, Card's piggies, Heinlein's Mycroft Holmes, and countless other varelse who know they are ramen. In fact, the evolutionary path forged by David's emotional capacity is evident in the mere presence of the future machines. They are uncovering a past that enthralls them because they have lost its memory. They are resurrecting the memories and legacies of their creators, and they KNOW this. They do not discard David as just another inferior relic of the ancient past, as happens in so much technology-based fiction; they are actually quite impressed with this little boy robot, who "knew actual, living people." They seek to cater to his desires and wishes because he is their link to a forgotten past. By humoring David, they find a form of cathartic relief in reliving and seeing the forms of their past open up. It's the legacy of humanity passed on to our silicon offspring.

    I think the ending has its usefulness and provides a much more profound and necessary conclusion to this story than the senseless waste of a human life, David's. Of course, I think of David as a fellow human, albeit of different origin and design. So, is the ending perfect, or the only possibility? No, but it is possibly much more insightful and meaningful than 99.9% of my peers are giving it credit for. Think about it for a while. Let it eat you up inside a little. I mean, what have you got to lose in learning to be emotional about a fictional robot boy turned human?
  • by geno523 ( 463998 ) on Sunday July 01, 2001 @08:14AM (#116691)
    I think it's safe to say that Kubrick's film ended with David praying to the Blue Fairy; Spielberg's begins with David's resurrection. If the film had ended with David's endless, unanswered prayers, I don't think that anyone would be able to deny that it was a serious, provocative film. Most commentators have taken the view that Spielberg's ending was a cheap attempt at tacking on a happy, emotionally satisfying ending that would make the film more of a crowd-pleaser. I question this view. The ending was far from happy. David is delivered into the hands of grey beings whose interest in him is identical to the interest that Dr. Hobby's team had in him: how can he help us solve our problems? They offer David the choice that Monica had. He is given the chance to satisfy his emotional need, at the expense of another. Just as Monica "activated" David, placing him in a world where he was meant to have neither autonomy nor freedom, David chooses to have Monica resurrected, so that she may die again, all for his selfish desire. For the Monica at the end is surely not the "real" Monica; she is the idealized Oedipal mother, just as David was the idealized, perfect son. Did Spielberg actually believe that the ending was a happy one? If so, he is a bigger fool than most take him to be.
