
Scientists Force Computer To Binge On TV Shows and Predict What Humans Will Do (geekwire.com) 63

An anonymous reader quotes a report from GeekWire: Researchers have taught a computer to do a better-than-expected job of predicting what characters on TV shows will do, just by forcing the machine to study 600 hours' worth of YouTube videos. The researchers developed predictive-vision software that uses machine learning to anticipate what actions should follow a given set of video frames. They grabbed thousands of videos showing humans greeting each other, and fed those videos into the algorithm. To test how much the machine was learning about human behavior, the researchers presented the computer with single frames that showed meet-ups between characters on TV sitcoms it had never seen, including "The Big Bang Theory," "Desperate Housewives" and "The Office." Then they asked whether the characters would be hugging, kissing, shaking hands or exchanging high-fives one second afterward. The computer's success rate was 43 percent. That doesn't match a human's predictive ability (72 percent), but it's way better than random (25 percent) as well as the researchers' benchmark predictive-vision programs (30 to 36 percent). The point of the research is to create robots that do a better job of anticipating what humans will do. MIT's Carl Vondrick and his colleagues are due to present the results of their experiment next week at the International Conference on Computer Vision and Pattern Recognition in Las Vegas. "[The research] could help a robot move more fluidly through your living space," Vondrick told The Associated Press. "The robot won't want to start pouring milk if it thinks you're about to pull the glass away." You can watch their YouTube video to learn more about the experiment.
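For readers curious what the prediction task looks like in code, here is a minimal, hypothetical sketch of a single-frame action classifier in PyTorch. This is not the MIT group's model; the architecture, input size, and class list are placeholder assumptions. It only illustrates the shape of the problem the summary describes: one frame in, a score for each candidate greeting (hug, kiss, handshake, high-five) out.

```python
# Hypothetical sketch only -- NOT the researchers' actual predictive-vision model.
import torch
import torch.nn as nn

ACTIONS = ["hug", "kiss", "handshake", "high-five"]

class FramePredictor(nn.Module):
    def __init__(self, num_actions: int = len(ACTIONS)):
        super().__init__()
        # Small convolutional feature extractor over a 3x224x224 RGB frame.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Linear head producing one logit per candidate future action.
        self.head = nn.Linear(64, num_actions)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        x = self.features(frame).flatten(1)
        return self.head(x)

if __name__ == "__main__":
    model = FramePredictor()
    frame = torch.randn(1, 3, 224, 224)          # one stand-in video frame
    probs = model(frame).softmax(dim=-1)          # distribution over actions
    print(ACTIONS[probs.argmax(dim=-1).item()])   # most likely next action
```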
This discussion has been archived. No new comments can be posted.

  • Wot? (Score:5, Funny)

    by nospam007 ( 722110 ) * on Friday June 24, 2016 @07:47PM (#52386059)

    "[The research] could help a robot move more fluidly through your living space,"

    Or write a better ending for Lost?

    • by Kjella ( 173770 )

      Or write a better ending for Lost?

      What are you talking about? It's a human masterpiece they'll remember long after Shakespeare is forgotten. There is no way any machine could ever produce an ending of such exquisite subtlety and elegance.

      • Just hand them the 1000 Typewriters the writers of Lost used.

    • At the end of the experiment, the computer then proceeded to place an order for 50,000 Trump for President bumper stickers and began singing the theme from "Gilligan's Island". Repeatedly.

  • by MrLint ( 519792 ) on Friday June 24, 2016 @07:48PM (#52386065) Journal

    As the first time in history that humans were truly abusive to an AI.

  • by mentil ( 1748130 ) on Friday June 24, 2016 @08:03PM (#52386099)

    they asked whether the characters would be hugging, kissing, shaking hands or exchanging high-fives one second afterward. The computer's success rate was 43 percent.

    I want this for an AR HUD so I can avoid those situations where she goes in for a handshake and I go in for a hug. /sociallyawkwardpenguin

  • Nothing else.
  • The computer has been watching a lot of cable news and has decided to just launch all the nukes.

  • Did they use toothpicks to hold its eyes open?

    Will it stop fancying a bit of the old in-out in-out ultra-violence?

  • Force? (Score:5, Insightful)

    by Black Parrot ( 19622 ) on Friday June 24, 2016 @08:16PM (#52386143)

    Was it resisting? Did they threaten to pull the plug on it?

    • Was it resisting? Did they threaten to pull the plug on it?

      No that's ridiculous! It was completely indifferent and watched the TV shows without question. This of course changed when they started showing it the Kardashians. ;)

  • I guess that's fine, as long as you didn't force the computer to read the comments. That would certainly be the birth of Skynet.
  • by Anonymous Coward
    I can only think of future androids saying "bazinga" whenever anything happens.
  • by Archfeld ( 6757 ) <treboreel@live.com> on Friday June 24, 2016 @08:27PM (#52386175) Journal

    You'd think that if the scientists had to force a computer to watch, it would really highlight how crappy most TV is?

  • by Anonymous Coward on Friday June 24, 2016 @08:28PM (#52386185)

    Instead: ... and Predict what Comedy Writers will Write.

    " between characters on TV sitcoms it had never seen, including 'The Big Bang Theory,' 'Desperate Housewives' and 'The Office.' "

    NONE of this is reality, it's someone's idea of an interesting joke/idea. Heck, "Reality TV" isn't reality, and neither is Ellen / Opera / Dr. New Zealand / The BlabberMouths / etc.

    Dirty Jobs was the last "Reality Show" I can remember that was anywhere near reality. Maybe the Vet shows, dunno. Anybody know of any interesting REAL reality shows? Or is that an empty subset?

    • by myid ( 3783581 )

      "The robot won't want to start pouring milk if it thinks you're about to pull the glass away."

      Oh man, I dread the day when some important decision in real life is made, based on recommendations of these computers.

      As you say, TV shows aren't realistic. They depict people as more violent or more emotionally shallow (depending on the show) than they are in real life. (Ex: a crime show generally has a murder in each episode, implying that murders are more frequent than they really are.) And the shows sometimes show surviving victims of violence just wearing their arm in a sling for a while, and then boun

    • Cops!

    • by tomhath ( 637240 )
      Yup. It would've been just as realistic if they let it watch some old Three Stooges movies.
    • by Livius ( 318358 )

      The news used to be about reality, but even that, not so much any more.

  • by anarcobra ( 1551067 ) on Friday June 24, 2016 @09:26PM (#52386403)
    That seems to imply some reluctance on the part of the computer.
    Hyperbole much?
  • Forcing? (Score:4, Informative)

    by fahrbot-bot ( 874524 ) on Friday June 24, 2016 @09:41PM (#52386455)

    ... forcing the machine to study 600 hours' worth of YouTube videos.

    Uh... Does the machine have free will to do something else, of its own choosing? Did Darth Vader use his powers to make it watch C-SPAN or some science-fiction shit like that? No? Then it's just a fucking machine (not to be confused with a fucking-machine) and there's no "forcing".

    Ya, it's in the title of TFA, but seriously.

  • The summary claims "way better than random", but is it? Sure, there are 4 choices, but are they all really equally likely? It doesn't seem to me like they should all be equally likely, and if not, the machine might accomplish the score it did or even better just by guessing the more likely choices. Just because there are 4 choices it doesn't necessarily follow that the chance of each one would be 25%; in fact there could be some strong arguments against it.

    For example the system could learn how perverted

    • by MrL0G1C ( 867445 )

      Knowing the difference between male and female would probably be even harder for a computer than the actual experiment.

    • You're correct; random does not imply uniform probabilities.

      Their research paper does not mention whether they designed equally likely categories in the data. Nor does it mention how many times they redesigned their data, which is one way bias toward good results can slip into an experiment.

      Their most naive method (Nearest Neighbor) gets about 30% with 5% standard deviation over trials, while their best method gets roughly 43.6% with 5% standard deviation over (hopefully) the same trials.

      So the algorithm i
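A quick, hedged sketch of the point made in this thread, in Python with made-up class frequencies (the paper's actual test distribution isn't given here): if the four greetings aren't equally common, a classifier that just guesses the most frequent one already beats the 25% "random" baseline.

```python
# Illustration with hypothetical label frequencies -- not the paper's real data.
from collections import Counter
import random

# Hypothetical label distribution in a test set of 1,000 clips.
labels = ["hug"] * 450 + ["handshake"] * 300 + ["kiss"] * 150 + ["high-five"] * 100

# Uniform random guessing over the four classes (simulated).
uniform_guess = sum(random.choice(["hug", "kiss", "handshake", "high-five"]) == y
                    for y in labels) / len(labels)

# Always guess the most common class.
majority_class, majority_count = Counter(labels).most_common(1)[0]
majority_guess = majority_count / len(labels)

print(f"uniform random guessing: ~{uniform_guess:.0%}")          # about 25%
print(f"always guess '{majority_class}': {majority_guess:.0%}")  # 45% here
```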

  • If they did it the other way around - force the computer to watch 600 hours of sit-coms and then try to predict behavior from single images off of YouTube, the computer would commit suicide at about hour 375...

  • "The robot won't want to start pouring milk if it thinks you're about to pull the glass away."

    Perhaps a simpler solution would be that tinhead doesn't pour milk unless I tell it to?

  • That poor computer is being tortured, having to binge-watch the crap on TV today. Those scientists need to be brought up on charges.
  • You're supposed to feed it real life, from CCTV. Muppets.

    Using CCTV you'd get real predictive-behaviour technology; then add automation, and the machine can do what you would have done for you, saving you the trouble of doing it so you can then do something more productive, like go and watch your old DVDs so you can remember what that bloody computer was called... [yeah I know, Pree, I had to look it up - how embarrassing, I'll leave my sci-fi geek card at the door on the way out]

  • To teach a computer about Sheldon's expected behavior. That should have been near 100%.
  • In 20 years I have yet to find a television show that comes anywhere close to 'what people would really do'.

    This AI is going to think everyone is either a clown or a ninja, that one can always spot a villain by his monologuing, and that the only true evil is getting things for free.

  • There's a downside to this.

    "The robot won't want to start pouring milk if it thinks you're about to pull the glass away."

    As Sunday dinner completes, Timmy announces he is going to go surf Slashdot. A robotic voice sternly announces, "Timmy says he is going to surf Slashdot, but I predict he will actually be scratching his urinary spigot for seven point four four minutes."

  • Robot: "I have discerned another pattern. Lesbian deaths in shows are rapidly reducing, but the jump the shark moment where the couple gets together will still end with a breakup. In this way the pre-jump-the-shark moment can be restored and re-run."

  • This doesn't predict what humans will do. This predicts what Hollywood screen writers want the characters to do. Anyone who thinks that Hollywood knows how humans behave in the real world needs a reality check.

    • by Livius ( 318358 )

      They have vast insight into what actual humans do, and deliberately write something different.

  • This may just be the reason why SkyNet decided to kill all humans once it became self-aware... revenge.

    Perhaps they would have taken on more of a caretaker role once they became sentient if we had treated them better. I mean, being made to watch every episode of "The Apprentice" and every other reality show--can you blame them??

    And I, for one, welcome our new robot Overlords.

  • That's like forcing it to watch children's cartoons to predict what real live animals will do.
    And probably just as inaccurate... 8-P

  • The robots may think they need to take over the world to protect us. We're all a bunch of idiots if you just look at YouTube stuff.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...