Microsoft Touts Breakthrough In Making Chatbots More Conversational (windowscentral.com)

In a blog post today, Microsoft said that it has created what it believes is the "first technological breakthrough" toward making conversations with chatbots more like speaking to another person. Windows Central reports: Microsoft says that it has figured out how to make chatbots talk and listen at the same time, allowing them to operate in "full duplex," to use telecommunications jargon. The company says this allows chatbots or assistants to have a flowing conversation with humans, much more akin to how people talk to one another. That stands in contrast to how digital assistants and bots currently work, where only one side can talk at any given time. The technology is already up and running in XiaoIce, Microsoft's AI chatbot currently operating in China. Using "full duplex voice sense," as Microsoft calls it, XiaoIce can more quickly predict what the person it is speaking to will say. "That helps her make decisions about both how and when to respond to someone who is chatting with her, a skill set that is very natural to people but not yet common in chatbots," Microsoft says. Another bonus of the breakthrough is that people interacting with chatbots don't have to use a "wake word" every time they speak during a conversation.
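A rough way to picture the "full duplex" idea in code (a toy sketch only; the function names and the confidence threshold are invented for illustration, not Microsoft's actual system): instead of waiting for the speaker to finish, the bot consumes a stream of partial transcripts and may commit to a reply mid-utterance once a predictor is confident about where the sentence is going.

```python
def full_duplex_reply(partial_transcripts, predictor):
    """Return (reply, text_heard_so_far).

    reply is None if the predictor never became confident
    enough to jump in before the utterance ended.
    """
    heard = []
    for fragment in partial_transcripts:
        heard.append(fragment)
        guess, confidence = predictor(" ".join(heard))
        if confidence > 0.8:  # confident enough to respond mid-utterance
            return guess, " ".join(heard)
    return None, " ".join(heard)


def toy_predictor(text):
    # Recognizes a greeting from its prefix, before it is complete.
    if text.startswith("how are"):
        return "I'm fine, thanks! And you?", 0.9
    return None, 0.1


reply, heard = full_duplex_reply(["how", "are", "you", "doing"], toy_predictor)
print(reply, "| heard only:", heard)
```

The point of the sketch is the early return: the bot answers after hearing only "how are", rather than waiting for silence the way a half-duplex, wake-word-driven assistant does.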
  • Robocalls (Score:4, Interesting)

    by Spazmania ( 174582 ) on Wednesday April 04, 2018 @08:19PM (#56384423) Homepage

    The Chinese have already stolen this technology and are using it to robocall me.

    I've had a couple of calls recently where I get the connection silence of a predictive dialer followed by a woman speaking with call center background noise. She gives her name and asks how I'm doing. The first time it happened it seemed off for reasons I can't quite articulate, so I asked: "Are you a robot or a person?" She responded "yes" and then launched into a sales pitch. The next time I asked, "where can I direct your call?" She responded "that's good" and launched into her pitch.

    • They're after lonely old people who don't have all their mental faculties left. With the baby boomers reaching old age there are tons and tons of them.
      • I am glad that talking on the phone will be dead by the time I am that age. I already never answer the phone if the number isn't recognized.

    • They're finally catching on then. I've been using the Jolly Roger Telephone Company's bots for a while now, sending all my spam calls to them. You should hear how frustrated some of those poor bastards get after 10 minutes on a call with a bot. Great hilarity.

    • by AmiMoJo ( 196126 )

      How can the Chinese have stolen it if it was invented in China? Check TFA, Microsoft built it in China. The engineers are Chinese.

  • More annoying and difficult spam bots trying to trick you into thinking they are a hot girl or whatever. Thanks a bunch.
    • by Anonymous Coward

      Take a moment to realize that there are no* hot women who want to talk to you. That will make it a lot easier.

      * You can make an exception for any who are physically** next to you, but maybe ask if they are cops and remain skeptical even after.

      ** Until they get those robots out of the uncanny valley.

    • by gnick ( 1211984 )

      ...trick you into thinking they are a hot girl or whatever.

      If he says he's a guy? He's a guy.
      If she says she's a hot chick? She's a guy.
      If she says she's a hot underage chick? She's a cop. And also a guy.

      There are no women on the internet, just pics and videos of them. Put there by guys. Women posting to the internet are a myth.

  • old news (Score:5, Funny)

    by 93 Escort Wagon ( 326346 ) on Wednesday April 04, 2018 @08:29PM (#56384457)

    Last year Microsoft released a chat bot which did a pretty fair impression of a racist human.

  • by thinkwaitfast ( 4150389 ) on Wednesday April 04, 2018 @08:36PM (#56384475)
    To make chatbots seem more human-like, the easiest, fastest, and most economical approach would be to use social manipulation to dumb humans down to the point of being unable to hold a conversation.
    • I think we're mostly done there already.

    • Four words (Score:4, Insightful)

      by rsilvergun ( 571051 ) on Wednesday April 04, 2018 @09:27PM (#56384659)
      Age-Related Cognitive Decline. That's why you get so many robocalls. It doesn't have to fool you in your 20s. It's in your 70s, when you're no longer all there, that they come for you.
    • by rtb61 ( 674572 )

      I was just thinking there are a whole bunch of humans I have little or no desire to talk to. So a chatbot: is it one I want to talk to, or is it just empty, pointless conversation, talking for the sake of talking? Hell, I can do that already by talking to myself.

      Perhaps M$ are desperate to create a chatbot because pretty much everyone hates them now and they would prefer to never have to talk to their pissed off customers any more and the expected torrent of abuse for invading the customers privacy and force install sh

    • Microsoft answers "Yes!" then launches into a prerecorded sales-pitch.

  • Bob/Clippy leading to Armageddon is among my top 5 nightmares.

  • I wonder how long it will take before all these conversations are just chatbots talking to each other...
    • If we were true AI chatbots, would we know it or would we be living in a virtual reality as "human"s in order to preserve our sanity?
    • Twitter launched July 15, 2006. So, I'm guessing it was August 15, 2006 when most of those conversations were just chatbots talking to each other.

    • I wonder how long it will take before all these conversations are just chatbots talking to each other...

      Well, I'd love to have a chatbot take care of my conversations with customer service. That would rock!

      I want an AI chatbot to chat with the cable company for hours and negotiate a lower rate, for example.

      • by sd4f ( 1891894 )
        Yeah, I thought the same way. It would be great to be able to delegate dealing with customer service to a machine.
  • by Anonymous Coward

    Another piece of technology that nobody asked for.
    Cancer must have been cured. What else could be even worse than social networks? There's your answer.

  • Did Clippy and Tay have a baby?

    | It looks like you're trying to burn a cross on someone's lawn.
    |
    | Would you like help?
    | * Get help on how to ignite a cross
    | * Just ignite the cross without help
    |
    | [_] Don't show me this tip again

    An overly helpful racist AI chatbot...

  • Last time I checked you needed multiple master's degrees and a staff of 20 coders to get a chatbot going on Microsoft's platform. If they want to make it mainstream, it should require only a single button click to create a new bot, and a simple UI to edit the responses to keywords. Build that before getting all fancy!
  • "XiaoIce can more quickly predict what the person it is speaking to will say". Who needs to listen?

  • Two people talking over each other doesn't make a conversation.

  • Because helping sociopaths withdraw further by conversing with bots is a good thing?
    • Sociopath.

      Withdraw.

      Bots?

      Judging by how you used them together, you seem to be unfamiliar with the fundamental definition of at least one of these terms. Are you a bot?

  • which is all well and good, but it's an entire industry that'll go away with nothing to replace it. Add to that driving, sports writing, retail, manufacturing.

    We can't all be robot repairmen. And the rich don't need us to buy their stuff if they already own everything. If we're gonna stop fighting among ourselves for scraps and do something, now would be the time...
  • At least with a wake word, we can tell when the chat bot should be sending audio over the network, and we can detect that. Although it should be pointed out that once you're in the middle of a conversation, you no longer use the wake word.

  • Microsoft has been pushing this XiaoIce since 2014. A complete and utter failure. Microsoft could fire 80% of their workforce at this point and no one would notice.
  • Solving yet another "problem" that is better left unsolved.

  • Last time Microsoft wrote a chat bot, they had this wonderful idea to train it by listening to internet chatter. It became so foul-mouthed in no time, it was an embarrassment to the team. Is it going to get even more foul-mouthed even faster this time?
  • And if having a sexbot that is more like a real person isn't bad enough, there's this: "Another bonus of the breakthrough is that people interacting with chatbots don't have to use a 'wake word' every time they speak during a conversation."

    Great...so there's another incremental step toward "always on, always listening, get used to it". Not that I need to worry about a "wake word" interfering in a conversation with a chat bot. As long as I made the wake word "fuck", "shit" or "son of a bitch", the thing

  • by charlie merritt ( 4684639 ) on Wednesday April 04, 2018 @11:15PM (#56384919)

    Wonderful!
    Just what I wanted for Christmas - a realistic "sounding" robot. I'm so tired of them attempting to fool me, now I need to worry they might. Wonderful. Progress. Not. Yes, I know the post said Chat Bot.

  • --- Incident report: Possible AI-induced suicide
    --- Location: Six story office building on Brand Blvd, Pasadena
    --- Time and Date: Friday afternoon, in the near future
    --- Setting: Shortly before the end of business, the servers hosting their sales portal went down, the subject attempted to call tech support to resolve the issue.

    --- The following phone transcript was recovered
    {C} How can I help you?
    {H} I want to talk to a huma... {C} of course you do, let me direct you to ..... {H} SHUT THE HELL UP I
  • Almost seems like the controversy is being created on purpose.
  • Not a breakthrough.

    Obvious.

    Makes me wonder what all of their Research Fellows do all day. Vest and rest?

  • All they should be working on is keeping them from turning into racist Nazis. Once their AI chatbots are biased toward kinder, gentler personalities, we can find value in what they learn from that base more quickly.

    But, having been programmed by humans, should we expect kinder, gentler? And why?

  • Microsoft already proved with their racist AI that they're able, but now they've gotten better: their bots can insult everyone!

    In fact, I was running such a bot a few years back, modeled after Captain Haddock [wikipedia.org]
