IBM Patenting HAL-Like Stuffed Animal Toys

theodp writes "'Look, Dave,' said HAL. 'I can see you're really upset about this. I honestly think you ought to sit down calmly, take a stress pill and think things over.' Put a HAL 9000 in a baby's stuffed animal toy, a toddler's EEG-equipped knit cap, or other interactive monitoring device, and you've got the gist of IBM Research's just-published patent application for its Adaptive System for Real-Time Behavioral Coaching and Command Intermediation. 'For example,' explains Big Blue, 'to help a child who plays rough with other children, the interaction data can include multiple interaction operations that can be performed by the interactive device for helping the child play less rough with other children. For example, one interaction operation can include an audible warning telling the child 'to play nice' in a strict tone of voice, whereas another interaction operation can include an audible warning that asks the child 'would you like someone to do that to you' in a softer tone of voice along with a visual cue as well.'"
  • You know, Teddy [youtube.com], from Spielberg's underrated but still crappier than usual AI

    • I was about to say "Super toys last all summer long", but you beat me to it.
      • by unitron ( 5733 )

        I thought of that title, but it turns out that it isn't the title of the story of which I was thinking, I think.

        In the one I'm thinking of, every kid has a teddy-bear-type toy with an AI built in that's programmed to program the child as it grows up, so that he or she turns out psychologically well-adjusted. All of the children are programmed to be unable to kill, except for one child, whose "toy" has had its programming slightly altered for a purpose I won't reveal here as I've already

        • Re:What about Teddy? (Score:5, Informative)

          by deimtee ( 762122 ) on Sunday March 06, 2011 @12:36AM (#35394578) Journal
          "I always do what Teddy says." - Harry Harrison
        • by Raenex ( 947668 )

          I won't reveal here as I've already probably spoiled the story for anyone who hasn't already read it

          I'm not going to read it, but would like a spoiler.

          • by unitron ( 5733 )

            I'm not going to read it, but would like a spoiler.

            Han shoots first, Lando betrays them to the Empire, the weird little creature on the swamp planet is actually a Jedi Master, Luke and Leia are brother and sister, and Darth Vader is their father.

            And people who read a Harry Harrison story are glad they didn't choose not to.

            • by Jezza ( 39441 )

              Do we have an ISBN for that? (Or a link to the Amazon page?)

              Ta :-)

              I think the spoiler was when Han didn't shoot first...

        • by Meski ( 774546 )
          One who'll grow up to follow HAL's wonderful example.
        • That's the story I was thinking of also. Some men doctor the 'teddy' of one of their children to remove an important part of the device's social conditioning protocols. This gives the child certain advantages in the conditioned society. For example (non-spoiler), the ability to create graffiti in an obsessively clean and organized society would be quite useful to some. Just have the 'teddy' stop admonishing the kid for drawing on walls and sidewalks as a child, turning its advice towards discretion of act

    • My daughter and I watched that movie a couple days ago. She wants a Teddy! I especially liked him insisting, "I am NOT a toy!" (The first time I watched the movie, I thought it sucked. The second time, I realized that the "aliens" at the end were not aliens -- they were what the robots had evolved into. In general, Spielberg did a bad job of explaining things; I assume the book was much better.)
      • Huh, I thought it was very clear the advanced beings in the end were androids.
        I recall this being explained quite clearly, with detail on how they cloned humans and brought them back to life because they had hoped to rebuild humanity, but the humans died after a single day.

        • The 'future robots' were a colossal fuckup on the part of the production designers, and a surprising one as well, considering who was helming the film. They didn't look anything like mecha. They looked completely organic, just like cliched "grey aliens" [forgetomori.com] from countless cheesy SF movies.

          Fooled me completely... I snorted in disgust (as did half the theater) and almost walked out.

      • Weren't they evolved humans? Anyway, it was still a retarded ending.

    • The thing about Teddy is that it clearly demonstrated the emergence which the engineers were trying to build into other robots.

  • by Anonymous Coward

    "Johnny, you shouldn't run with scissors."

    "I noticed you've fallen."

    "Your mother will be upset if you get blood on the carpet."

    "Would you like me to call the ambulance?"

    " I can sing you a song until they arrive?"

  • by theodp ( 442580 ) on Saturday March 05, 2011 @11:45PM (#35394362)

    University of Tennessee DCN Lab [dcnlab.com]: We currently use a very safe, comfortable 128 channel cap(high density EEG sensor array) to collect the infant EEG/ERP. The EEG cap contains sponge discs and there is no risk to the infant wearing the cap.

    • Simply remarkable..
    • Of the several times I had to wear a sensor cap, they had to wet my head and constantly shift the nodes to make sure they were picking up signals correctly. How accurate are caps like this?
      • Maybe they're emphasizing the comfort; after all, it sounds way more comfortable than when they stuck pushpins in your head.
  • by msobkow ( 48369 ) on Saturday March 05, 2011 @11:46PM (#35394368) Homepage Journal

    Isn't it the parent's responsibility to "coach" their child? Maybe if more parents did their job properly there wouldn't be a perceived need for IBM's technology.

    • Re: (Score:3, Informative)

      Parents? Coach their children? Where have you been the last 10 years? It's the school's job to make kids productive members of society, to teach children morals (that the parents don't like) and how to behave (that the parents don't reinforce).

      • Re: (Score:2, Funny)

        by Anonymous Coward

        They don't reinforce spelling skills either.

    • Maybe the parents will see this as a tool to help them do their job. Who else would buy it? (I mean, other than guilt-ridden professionals with more money than time to spend on their kids?)
      • by Anonymous Coward

        I believe a "bark collar" at an early age can do wonders. The technology is already here, people!

    • I understand why you might be concerned about too much technology in our children's development. However, I think that this isn't going to be one of those problematic technologies... because, honestly, it's IBM; they just pay their employees special bonuses to patent whatever they've brainstormed. I'm sure the patent will have expired by the time the toy actually exists. :b
    • by EdIII ( 1114411 )

      This will just be a tool like anything else. It's easier. At least this tool might have a chance to actually teach a child something positive. The previous tool parents used to raise children was called TV, and I don't think it had a very positive impact at all.

      I think after a couple of generations it will be like Idiocracy and they won't know why they use the technology in the first place:

      Pvt. Joe Bowers: What *are* these electrolytes? Do you even know?
      Secretary of State: They're... what they use to make Brawndo!
      Pvt. Joe Bowers: But *why* do they use them to make Brawndo?
      Secretary of Defense: [raises hand after a pause] Because Brawndo's got electrolytes.

    • How about parents just do their job?

      Well, it looks like IBM just patented our jobs :(

    • Heck with properly... How about at all? Everyone loses when parenting is outsourced. By everyone, I mean the entire population of Homo sapiens sapiens on terra firma. Heck, I could make an argument that includes all of Earth's flora and fauna.
  • We should give them up for 'droids. Much easier to control.

  • by k2backhoe ( 1092067 ) on Sunday March 06, 2011 @12:07AM (#35394454)
    Claim 12: A system as in claim 1 where, if the audible warning telling the child 'to play nice' in a strict tone of voice and the audible warning that asks the child 'would you like someone to do that to you' in a softer tone of voice along with a visual cue as well are not effective, then a small correctional current is applied through EEG electrodes 1 and 2, inducing the desired behavior or a peaceful coma.
  • Open Source (Score:3, Funny)

    by sodafox ( 1135849 ) on Sunday March 06, 2011 @12:10AM (#35394470)
    Things like this really need to be open source. I'm not just talking about the source code, but the dialog too. Parents need to know what sort of things it's going to say. The last thing I'd want to hear coming out of its mouth is "IBM is the Light".
    • by Anonymous Coward

      "My name is Talking Tina, and I'm going to kill you..."

  • by Culture20 ( 968837 ) on Sunday March 06, 2011 @12:10AM (#35394478)
    Please don't teach kids that it's okay to receive moral instruction from an AI (or worse, a mere expert system). Kids are insidious rules lawyers who will bend and twist words/actions to fit what they want. Human guardians need to be there to make them understand that rules lawyering is not socially acceptable. An expert system will be just as easily beaten as the end-game boss monster in Mega Man XXXVI.
    • But this may prove to be a useful skill in a world with lots of AIs: q.v., Star Trek Liar Paradox [youtube.com]

    • by Anonymous Coward

      rules lawyering is not socially acceptable

      Try telling that to the lawyers.

    • by Anonymous Coward

      People are insidious rules lawyers who will bend and twist words/actions to fit what they want.

      Human guardians need to be there to make them understand that rules lawyering is not socially acceptable, lest they become politicians.

    • Kids beating the AI is the least of our problems. Teaching kids it's okay to obey AIs is just the first step to creating the first generation of human slaves...
  • A 6ft length of rattan cane will do a good job of enforcing the rules and is much cheaper than any AI. Plus it can be fun for at least one participant!
  • IBM always tries to capture every corner of the market.
  • by Brett Buck ( 811747 ) on Sunday March 06, 2011 @12:42AM (#35394608)

    Have you ever considered turning off the TV, sitting down with your child, and hitting them?

    • Also:

      Roberto: "Death to the 1X Robots!!"
      *Electric arc through his head*
      "I love those magnificent 1X Robots! The 1X Robots are my Friends."

      Bender: "What happened to your previous enthusiasm for stabbing them?"

      Roberto: "I'm past that, man. Later, blood."

      • Bender: Your son plays the holophoner beautifully. How hard did you have to hit him?

          High Society Dowager: Fairly Hard.

    • No, if you're earning $$, why bother? Like a dictator or mob boss, have somebody else do the dirty deed for you. That way you don't get sued for whatever happens to the child. Let the manufacturer think of the children.
  • 'For example,' explains Big Blue, 'to help a child who plays rough with other children the interaction data can include multiple interaction operations that can be performed by the interactive device for helping the child play less rough with other children.

    Just so long as they're not going to be teaching kids English....

  • I want an IBM Watson. :P

  • Awesome those teddy bears from the movie Screamers will finally become reality!
  • Methinks the inventors have dramatically underestimated a) the learning capabilities of 2-to-5-year-olds, b) the social intelligence of same, and c) the destructive potential of same, and dramatically overestimated a) the everyday authority these toys will have in the eyes of 2-to-5-year-olds, b) the electronics' ability to differentiate between, and c) the willingness and/or ability of parents to feed the toy with the behaviors mentioned. In short, this is a disaster waiting to happen. Woe to the parent that relies on
  • I am afraid these toys will lead to more extreme forms of misbehaviour: a child trying to pry open another child to remove the batteries.
  • Or a manager size . . . ?

    "Dave, I think that your employees deserve raises."

    Lucky for me that the EEG sensors will not be able to penetrate my tinfoil hat.

  • The Last Mimzy should teach them what Hollywood does to your preferred tales when you don't behave. And IBM will just make it even worse (if possible).
  • I think we can all imagine a 9-year-old little shit banging his toy against a wall just to hear it say "ow"
  • Cue thousands of kids thinking they have schizophrenia.
  • I wonder if they'll include the V-chip option to electrocute your child every time he says a cuss word?

    Bill

  • "I love you, cold unfeeling robot arm." ~ Invader Zim
  • I can see how this may be helpful, but only if used in conjunction with a real therapist as well. I don't think the use of verbal cues alone by a computer will necessarily help reduce unwanted behavior. It will also be difficult to catch the inappropriate behavior across different environments or circumstances. The part about the warnings "would you like someone to do that to you' in a softer tone of voice along with a visual cue as well" also doesn't seem to go along with general theories behind behavioral
  • The patent system is so out of control. The wording of this patent is so broad that it could apply to any biofeedback system.

    There is no mention of a "teddy" btw, just an "interactive electronic device".

  • Does this mean that my being a dad means I am also a patent violation?
