The Future of Google Search and Natural Language Queries

eldavojohn writes "You might know the name Peter Norvig from the classic big green book, 'AI: A Modern Approach.' He has been working for Google since 2001 as Director of Search Quality. An interview with Norvig at MIT's Technology Review has a few interesting insights into the 'search mindset' at the company. It's somewhat surprising that he claims they have no intention of supporting natural-language questions. Instead he posits, 'We think what's important about natural language is the mapping of words onto the concepts that users are looking for. But we don't think it's a big advance to be able to type something as a question as opposed to keywords ... understanding how words go together is important ... That's a natural-language aspect that we're focusing on. Most of what we do is at the word and phrase level; we're not concentrating on the sentence.'"

  • by Arancaytar ( 966377 ) <arancaytar.ilyaran@gmail.com> on Tuesday December 18, 2007 @11:22AM (#21739660) Homepage
    "I'm sorry Dave, I'm afraid I can't search that."
    • Go back to Excite and the search engines before: you have a box, and you get a list of 10 results, with a little bit of information accompanying each result. We've just stuck with that.
      TR: What has changed?
      PN: The scale. There's probably a thousand times more information.
      1000x? That's got to be the understatement of the year! If not the understatement of the second.
  • by yagu ( 721525 ) * <yayagu.gmail@com> on Tuesday December 18, 2007 @11:23AM (#21739668) Journal

    I tend to agree with Norvig's focus on keywords and less emphasis on natural language. Trying to define a natural language layer on top of a query engine introduces complexity that is probably unnecessary. Natural language also introduces noise that interferes with defining, as accurately as possible, what the user is asking for.

    Google has done a good job, and they get better with each iteration at figuring out what the user is looking for. I find their suggestion feature [google.com] an effective way not only to constrain a query but also to spell-check pre-emptively. If you've not used this, install the Firefox Google toolbar, or use the experimental Google "Suggest" [google.com]. Often Google will provide suggestions in the drop-down menu that refine your search in ways you hadn't considered, driving toward a more direct and accurate representation of your intended query. Of course, if their suggestions don't satisfy, you get to continue typing your keywords to your heart's desire.

    (I have to offer an example of suggestion's effectiveness. I often Google to get to the Chicago Tribune (I don't visit there often enough to have created a bookmark, plus it's easy to do this in anyone's browser). Simply typing the first four letters, "chic", I see the first suggestion is "Chicago Tribune". A simple TAB and RETURN later, I'm on the Google page with the first link or so being my link to the Tribune (with the added bonus of Google's breakout of sublinks).) Your mileage may vary (Google's ranking system may vary the order and options that appear in the drop-down over time), but I find it an amazingly effective research tool (suggestion, not the Trib).

    Natural language is mostly trying to guess intent from structure and key words (as opposed to keywords), but at the end of the day, if you filter out the natural language and focus on the keywords, you're going to end up in close to the same place.

    • by krog ( 25663 )
      No, I'd say the phrase "natural language" is just about perfect at describing what natural language is.
    • Primitive question words like "what is" or "where is" or "how many" would still be nice to have, though - I agree, however, that trying to understand a full question is overkill.

      "What is" is already mapped to "define:" as far as I know. "Who is" works in a similar way.

      "Why did World War I start" or "what does a duck eat" are questions that require too much understanding and explanation of the concepts. But simple definitions, locations or numbers shouldn't be that difficult to spew out. "how many" co
      • by pluther ( 647209 ) <plutherNO@SPAMusa.net> on Tuesday December 18, 2007 @11:58AM (#21740156) Homepage

        "Why did World War I start" or "what does a duck eat" are questions that require too much understanding and explanation of the concepts.

        Not at all. I type that kind of question into Google all the time.

        Googling for "Why did World War I start" brings up, as the first result, an article titled "The Causes of World War I".

        Followed by a few million more hits if that one isn't good enough.

        And the question "What does a duck eat" gets many hits as well. The first one has, in the summary:

        Ducks in the wild eat a variety of plants, insects, and native foods that will differ from...

        I know it's just picking out keywords from the query and matching them to the sites, not trying to parse the natural language, but it works pretty damn well.

        • by 0100010001010011 ( 652467 ) on Tuesday December 18, 2007 @12:16PM (#21740354)
          Fine, those were easy. Let's see Google understand this one: Women.
          • For that, the top hit on Google is:

            "The daily destination for women, with horoscopes, health and pregnancy information, message boards and blogs, celebrity gossip, beauty and more."

            I think it's pretty much on the money there too!
        • by Sciros ( 986030 )
          It does do some low-level parsing. Google "how tall is Mt. Hood" for example.
          • by MrMr ( 219533 )
            This came up as #1

            How tall is Mt. Hood? According the U.S. Geological Survey, Mt. Hood is 3426 Meters (11239 Feet) tall. To learn more about Mt. Hood geology visit ...

            • by Sciros ( 986030 )
              Eh? That comes up as #2. This is what comes up as #1:

              Mount Hood -- Elevation: 11,249 feet (3,429 meters)
              • by pluther ( 647209 )
                Interesting that the two numbers aren't the same.
                When I was growing up, I'd always heard that it was 11,235 feet tall, which I thought was very cool.
        • And that's exactly why almost the entire field of information retrieval is focused on these 'statistical' approaches instead of some sort of deep semantics. It works. Semantic analysis is very difficult, highly language-dependent, and slow as hell. For Google to do something like that, they would have to not only make it work, but make it work for many different languages and make it fast.

          Let's not forget that information retrieval requires highly optimized algorithms. Linear time (over the size of the doc

        • by trawg ( 308495 )
          I often do the same, not because I expect Google to be able to magically figure out what I want, but because I figure Google have already indexed a page where someone has asked the exact same question before. I frequently use quotes around it (to search for the whole string), when trying to find something really specific and simple (eg, "how much does the earth weigh"). It's really easy to tweak the question to try alternatives.
        • I know it's just picking out keywords from the query and matching them to the sites, not trying to parse the natural language, but it works pretty damn well.

          This is because Google uses a popularity-dependent algorithm. It's not popular to ask/answer questions like "What does a duck eat?" where "duck" is meant in the sense of ducking, or something like that. Obviously, a natural language processor should use the same mechanism. There'd only be confusion here if two different meanings of the word competed for the top results (i.e. both being popularly asked), *or* if you searched for an unusual meaning of the word, but in a context that made it look like some other q
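          A toy illustration of the popularity-weighting idea above; the scoring function, weights, and documents are invented for the example and are not Google's actual ranking:

          ```python
          # Toy scoring: overlap between the query terms and a document's terms,
          # weighted by how popular the document is. Names and numbers are invented.
          def score(query_terms, doc_terms, popularity):
              return len(query_terms & doc_terms) * popularity

          docs = [
              ({"duck", "eat", "plants", "insects"}, 0.9),  # popular page about ducks' diet
              ({"duck", "eat", "dodge", "boxing"}, 0.1),    # obscure page about ducking punches
          ]
          query = {"what", "does", "a", "duck", "eat"}
          best = max(docs, key=lambda d: score(query, d[0], d[1]))
          print(best)  # the popular "ducks' diet" page wins
          ```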

      • Searching for how many female redneck skanks with a doctorate degree in nanobiology would date me made me really sad.

        Those women don't know what they are missing.
      • Sometimes I like to idly type things into Google Suggest and see what comes up:

        why is everything
        can you eat
        can you die from
        where can I go to get
        is it possible to
        how would you

        From playing with it for a few minutes, it seems that Google is mostly used by women in various stages of pregnancy, people worried that they might be arrested for using Limewire, and people looking for Wiis.
    • Not to mention, by understanding the root concepts of word structures, without going whole-sentence, eventually you can get good results from sentences, or questions, as an emergent behavior.

      I tried several questions in Google, and it performed really well, only having trouble with:
      Why does Ask suck and Google not?

      That came back with a bunch of results saying Google sucked. My other questions seemed to produce very useful results. I think that was the point the dev wanted people to understand.
      • by Intron ( 870560 )
        I think your post explains the difficulties of natural language processing very well.
    • by porcupine8 ( 816071 ) on Tuesday December 18, 2007 @11:35AM (#21739858) Journal
      I would find the drop-down suggestions a lot more useful if I could read more than the first two words. As I type in, for example, "Chicago dog boarding", all I see is a list of "Chicago do... " I'm sure there must be a way to make the search space take up more of the toolbar (I don't really need that much room in the URL space, since most URLs that long are nonsense), but I don't know how, and I don't really want my browser window to be the width of my screen.
      • by yagu ( 721525 ) *

        If you're using the Google Suggest page, I think the width is sufficient if you have the browser at any reasonable width, so I'm assuming you're talking about the drop-down from the toolbar, in which case you're in luck. Type something to invoke the drop-down, or click the arrow to look at history. In the lower right, you should see a handle; expand to your heart's content. It's nicely implemented; it even pushes the box to the left if your browser's too close to the right side of your screen. Enjoy.

        • Hm, no handle for me. I'm in Firefox 2.0 on OS 10.3.9. Maybe this only works in Windows FF.

          It would really help if the right half of the drop-down weren't taken up by the word "Sugges..." on the first line, which for some reason also creates a big blank space on all the lines below it. Couldn't they just put that as the first line, if they really need to point out that they're suggesting things to me?

          • Re: (Score:3, Informative)

            by encoderer ( 1060616 )
            I think you two are confusing each other...

            The parent to my post is talking about the Google search box built into Firefox. The GP to my post is talking about the Google search page that has Suggest activated within it. It looks basically like the normal Google search page up to the point you start typing in queries.
    • Exactly what I was thinking. Search engines, especially Google, are great at picking out the important search terms, even if you do type the query as a standard question. So being able to specifically parse natural-language questions seems to have a low reward-to-effort ratio. If you're going to do natural language processing, the goal should be to simplify a difficult problem. For example, translating between languages automatically, which takes YEARS of training for an individual to be able to do consistently.

      S
      • Actually, one of the main challenges with natural language is that we humans perform so badly to begin with. Half the time we neither say what we mean, nor mean what we say. But it hardly matters: far more than half the time, the person (or people) listening hear either what they expected to hear, or what they wanted to hear, or they already knew they would disagree with whatever you were about to say before you even opened your mouth.

        Sometimes it does matter. However, by the time you design a linguistic
    • Another way to try Google Suggest would be just to install Firefox itself, sans toolbar, and use the browser's Search Box...
    • by Malkin ( 133793 )
      I agree. Being a programmer, I think natural language is good for talking to other human beings, and hopelessly inefficient for anything else. Why recite Dickens to a dishwasher, when it has perfectly good knobs and buttons? Why do we constantly suffer under this mad delusion that computers are somehow meant to act like people? Alas, Turing, why did you steer us off this cliff?
      • Re: (Score:3, Funny)

        by Intron ( 870560 )
        You: What larks, eh, Pip?
        Dishwasher: CHANGE TO MODE POTS_AND_PANS
        You: I ent dun nuffink!
        Dishwasher: CANCEL RINSE CYCLE
    • by SL Baur ( 19540 )

      Simply typing the first four letters, "chic", I see the first suggestion is "Chicago Tribune". A simple TAB and RETURN later, I'm on the Google page with the first link or so being my link to the Tribune (with the added bonus of Google's breakout of sublinks).) Your mileage may vary (Google's ranking system may vary the order and options that appear in the drop-down over time), but I find it an amazingly effective research tool (suggestion, not the Trib).

      I find this unlikely in the extreme. When most people start typing at Google and reach "chic", Chicago is not exactly what they're looking for. (Or hot Chicago pizza, for that matter.)

  • I wonder if any of these types of translation or recognition engines use Lojban as an intermediary. The unambiguous yet rich grammar of Lojban is ideal for representing different languages. Eventually, it will be used directly.
  • by eln ( 21727 ) on Tuesday December 18, 2007 @11:31AM (#21739786)
    The problem with natural language searches is that natural language itself is a moving target. Sure, ten years ago "How do you change the air filter in a Toyota Camry?" would have been a legitimate question to ask a search engine online, but these days it would probably be asked like "lol how do u chng filtr in my pos car? kthxbye :)". I don't know how Google is supposed to keep up with that.
    • Re: (Score:2, Funny)

      by Anonymous Coward
      u opn hd, tk off top of big rnd blk thng, rmv rnd sqzbx thng, put new sqzbx in, rplc top, cls hd.

      lol easy, looser

      :P

      • Welcome to Mikita's. How may I serve you?
        - I'd like 'rullers, 'ugar, 'ucks and a Mikita 'cup... And then I think I would like a large... ...with 'eam.
        - And could I please have 'elly donut and... ...raspberry and a 'nge drink?

        What?
        - I'm sorry. And 'eaker 'oken.

        Let me recap the order: A cruller, two sugar pucks, a large coffee with cream, a raspberry jelly doughnut, orange drink, a box of five-holes.

        - Yeah.
        Thank you. Drive around, please.
    • Re: (Score:2, Informative)

      by AnyoneEB ( 574727 )
      Most linguists currently believe in the existence of something called "universal grammar", which is a set of properties common to all acquirable human languages (that is, languages which can be learned as a native language). If you were able to get a computer to comprehend one language (or probably a few, to make sure you have sufficiently generalized your principles), then additional languages and dialects would be relatively easy: just give it enough examples of sentences in that language, and the computer
      • by vertinox ( 846076 ) on Tuesday December 18, 2007 @01:09PM (#21741122)
        Most linguists currently believe in the existence of something called "universal grammar", which is a set of properties common to all acquirable human languages (that is, languages which can be learned as a native language).

        The argument against universal grammar is of course is non-Latin languages like Japanese (and possibly Russian) which don't play by the rules. I'm not really a language expert on either, but I'm tried to learn Japanese and its really tough.

        Everything is relationship based off the speaker and to the person or object he is talking about and then the audience. As in... If I'm talking about a pencil sitting on my desk, it has a different tense than a pencil on your desk, and then a different tense in someone else's hand or a pencil that is sitting at a far off place (-sara or -kara? I can't remember). And we haven't even gotten to issues about ownership like if it was in my hand or your hand.

        Whereas Latin-based languages are more concerned with action or tense of ownership, but not relationship to the speaker or audience. Hence... it is argued universal grammar does not apply in that respect.
        • It's more a matter of principles and parameters that every language chooses from: there's a universal set of principles and a universal set of parameters, and every language is built up on a general structure developed from a combination of several of the first set plus several of the second set. One could imagine that a new language invented ex nihilo would begin by assigning signs (usually phonemes) to signifieds (objects in the real world, concepts, processes, etc.) and relate them to one another by means o
        • Re: (Score:3, Funny)

          by OWJones ( 11633 )

          I'm not really a language expert on either, but I'm tried to learn Japanese and its really tough.

          Perhaps you should try and nail down English first. :)

          Cheers,
          -jdm

        • Behind every sentence is an idea (or several). And an idea can be parsed and stored unambiguously. (Allow me to remove the ambiguity in the previous sentence... there can be a machine readable representation of every idea, in which the representation is unambiguous. Even ambiguous ideas can have an unambiguous representation.)

          Check out the language Lojban [lojban.org] for just one way to do this.

        • by rsborg ( 111459 )

          The argument against universal grammar is of course is non-Latin languages like Japanese (and possibly Russian) which don't play by the rules. I'm not really a language expert on either, but I'm tried to learn Japanese and its really tough.

          You do realize that universal grammar allows different languages to have different rules, right? The universality is about how nouns and verbs exist in sentences, and relate to each other, etc. Even in French vs. English, genderization and possession are wildly differ

        • As the other people who have already replied to you have mentioned, the differences you specify between Japanese and languages more familiar to you have nothing to do with it not following universal grammar; they are simply ways Japanese differs from the other languages you have encountered. There actually are aspects of Japanese, which also appear in other languages, which linguists have trouble explaining (classifiers and double-nominative verbs, likely among other features I am not familiar with), but tha

  • Most of what we do is at the word and phrase level; we're not concentrating on the sentence.
    What's the difference between a phrase and a sentence?
    • Re:phrase/sentence? (Score:5, Informative)

      by harmonica ( 29841 ) on Tuesday December 18, 2007 @11:38AM (#21739902)
      A phrase is part of a sentence. WP [wikipedia.org]
    • What's the difference between a phrase and a sentence?

      i'd assume it's something like the difference between "how do i set up my d-link router?" and "d-link router set up". i believe google already parses out "natural language" queries about as well as any other search engine, including ask jeeves, which was supposed to use natural language as its unique selling proposition. google does give different results for both queries but both sets of results seem to be relevant.

      i'm more curious about how the use

      • by teslar ( 706653 )

        i'd assume it's something like
        Not to be pedantic, but why assume when Wikipedia [wikipedia.org] is just a click away and can show you that your assumption is wrong?
        • Not to be pedantic, but why assume when Wikipedia is just a click away and can show you that your assumption is wrong?
          what about that link makes my assumption wrong? "d-link router set up" seems to be a noun phrase. set up is used as a noun quite often.
          • by teslar ( 706653 )

            what about that link makes my assumption wrong?
            To be really pedantic, the lack of "the" or "a" ;) An article would be required to turn your collection of words into something that is a single unit within a sentence, especially since you're using the singular. As you presented it, your assumption seemed to be "a phrase is a collection of words that is not a proper sentence" and that is wrong, at least from a linguist's perspective.
    • by iabervon ( 1971 )
      The most significant thing in their case is that the phrases they deal with don't have verbs (and the associated syntactic function words). They're talking about noun phrases, which go from just after a word like "the" to the next word that's nested less deeply than "the". For example, "most significant thing" or "associated syntactic function words". You know, like the things you might type into a Google search...
  • What is 'natural' about the English language?
    • Re: (Score:2, Informative)

      by lstellar ( 1047264 )
      Everything. All languages are natural. In fact, the spoken word is as good a subject for studying evolution and 'survival of the fittest' (to a degree) as any biological organism. The way that different languages and dialects have collided over the years and weeded out words, phrases and structures that work or don't work is one of the most complex and interesting topics around. Despite its quirks, the English language is as natural as any creole or foreign language out there; it simply evolved differently.
    • Comment removed based on user account deletion
  • by dpbsmith ( 263124 ) on Tuesday December 18, 2007 @11:32AM (#21739816) Homepage
    Isaac Asimov's fictional Multivac was a huge computer with some near-universal knowledge database that answered natural-language questions, giving Asimov all sorts of opportunities to present philosophical conundrums as entertaining short stories.

    In the 1960s and thereabouts, when I used to hack around on minicomputers but personal computers weren't well known to the general public, I always found it difficult to explain what computers did. One of the commonest questions people asked was "Well, how does it work, do you type in questions and does it answer them?" Programming in assembly language didn't really fit that description.

    Many technological fantasies seem to remain surprisingly distant. I tried ViaVoice and gave up: it's not a "voice typewriter." Roomba is not a general-purpose housekeeping humanoid-form robot, and neither are the machines that weld automobile chassis.

    However, it seems to me that Google is within striking distance of Asimov's "Multivac" fantasy.

    Incidentally, if you type in queries as complete sentences Google seems to do any worse than if you don't. Sort of the converse of adventure games, where one begins by typing "Walk over to the table on the left and pick up the silver key with your left hand" and quickly learns to use telegraphic style: "Go table. Take key."
    • I meant that "if you type in queries as complete sentences, Google doesn't do any worse than if you don't." That is, even though it's not an advertised feature, you can use natural language with Google if you like. It just doesn't help you; you might just as well use truncated phrases.
    • I tried ViaVoice about eight years ago, and I hear that signal recognition - both OCR and VR - has come a long way since then. I haven't tried it though, and I don't know how good ViaVoice is now.

      Also, "What is the sine of half pi and a half times the cosine of one quarter plus the answer to life, the universe, and everything?" works correctly. I found this pretty awesome.
    • Google comes close with simple "what" questions like:

      What is one plus one? [google.com] will give you 2.

      What is the speed of light? [google.com] will give you 299 792 458 m / s.

      And maybe even something like...

      What is the Capital of Sweden? [google.com] will give you Stockholm.

      In each case, the answer appears right at the top of the results page.

      Of course if you type

      What is the reason for Napoleon's 1812 defeat? [google.com]

      It will give you the 1812 Overture as the first hit, so it still has a bit further to go on context.
      • by SL Baur ( 19540 )
        Another example of questions that work well is "What does <acronym> mean?"

        The search engines have always been good at playing Jeopardy. Typing the exact text of an error message tends to lead straight to answers for "What caused this?" and "How do I fix this?".
      • What is the reason for Napoleon's 1812 defeat?

        It will give you the 1812 overture as the first hit so it has a bit more to go on context.


        That, or Google believes it is on to a little known historical secret, related to the role of music in warfare. :P
        • (No wait: Actually, it just has the causal relationship reversed. After all, Napoleon's defeat was directly responsible for the 1812 overture!)
  • It would actually be a great advance, but the advantages would not offset the resources required, since 99% of the time you can find what you're looking for using keywords and phrases.
  • I tell new users that they should just ask Google a question in plain English. That gives them a more natural context in which to embed their keywords. I know Google is just picking up on the keywords and ignoring the filler words, but it usually gets the correct results, and it's a lot easier for people who are just starting out on the Internet.
  • this is also why (Score:5, Insightful)

    by circletimessquare ( 444983 ) <circletimessquar ... m ['gma' in gap]> on Tuesday December 18, 2007 @11:40AM (#21739916) Homepage Journal
    text-to-speech or speech-to-text is also useless (unless you're blind/ deaf/ driving a car)

    the idea of interacting with a computer like a human is an artificial hangover from being introduced to the computer the first time. after using it for awhile, you realize that interacting with a computer, in small limited ways, like searching information, is easier NOT using natural language

    for the very simple reason that it takes more thought, and more typing to interact naturally. it is easier to train a human to interact with a computer than it is to train a computer to interact with a human. and for the human, it is more rewarding, because the human realizes he doesn't need to exert so much effort

    "what is the capital of france?"

    versus

    "france capital"

    if you were to shout "france capital" at someone, it would be rude and confusing. but for a computer, it's actually superior

    it is the conservation of communication effort at work here that wins out over natural language in computer interaction

    • by CruddyBuddy ( 918901 ) on Tuesday December 18, 2007 @11:58AM (#21740150)
      Paris Hilton says:

      "That's easy! The capital of France is 'F'."

    • by garcia ( 6573 )
      if you were to shout "france capital" at someone, it would be rude and confusing. but for a computer, it's actually superior

      I know what you're trying to get at, but that example wasn't exactly a good one. The search engine could simply strip all the words that are pointless (is, the, and of); a rough sketch of that kind of stripping follows below. I'm sure that if it accepted natural-language searches, words like "what" would automatically be eliminated too.

      My biggest question is how many searches come from people in a natural way? Since Sunday only two have landed a
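      A minimal sketch of the stop-word stripping described above; the word list and function name are invented for illustration and are not Google's actual pipeline:

      ```python
      # Drop the "pointless" filler words from a query, keeping the content-bearing terms.
      STOP_WORDS = {"what", "is", "the", "of", "a", "an", "and", "to", "in", "how", "does"}

      def to_keywords(query):
          """Strip punctuation and stop words; return the remaining keywords."""
          tokens = [t.strip("?.,!").lower() for t in query.split()]
          return [t for t in tokens if t and t not in STOP_WORDS]

      print(to_keywords("What is the capital of France?"))  # ['capital', 'france']
      ```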
    • by Sciros ( 986030 )
      This is only true for basic queries where interpreting the queries as bags of words would suffice.

      Besides, for communication via speech it's completely unnatural to say "france capital" to a machine as opposed to "what is the capital of France," even. So for speech recognition systems NLP really helps out.
    • by xant ( 99438 )
      I don't think t2s or s2t are useless at all; just useless for controlling a computer. Not all the time, but in some situations I would very much like to be able to dictate an email into my phone, or call something that reads my email to me. And then, being able to log my conversation with another human being to text, which then gets emailed to me, is a righteously good app.

      These aren't the holy grail technologies they were once hailed to be, for sure. But they have some very important niches.
    • Well, the long-term idea is that computers will be able to "understand" what you mean, kind of like humans can understand what you mean most of the time. We currently don't "see it" because the current generation of speech-to-text (and vice versa) just sucks. It's phrase-based, not semantics-based.

      Do you see people still using keyboards 200 years from now? How about 50? If not... then -something- has got to replace 'em... Also notice the lack of keyboards on Star Trek :-)
  • Real questions ... (Score:5, Interesting)

    by foobsr ( 693224 ) on Tuesday December 18, 2007 @11:40AM (#21739922) Homepage Journal
    Typing "What is the capital of France?" won't get you better results than typing "capital of France." ... Most of what we do is at the word and phrase level; we're not concentrating on the sentence. We think it's important to get the right results rather than change the interface.

    This misses situations like searching for "That sf-short-story where the crew of the visiting spaceship is given a dog as a present", in which googling failed, at least for me, or, more technically, when you have absolutely no idea what the relevant terms within the outcome might be. In short, if you have a real question.

    CC.
    • Yeah, recently I had that problem with "that one Bradbury story where the spacemen landed on Venus and it rained all the time and half of them went insane, also there were natives hunting them I think."
    • by Agripa ( 139780 )
      That sf-short-story where the crew of the visiting spaceship is given a dog as a present

      Just make sure not to report that the offog came apart under gravitational stress. That would really upset headquarters.

      Wow. There is ONE Google reference to that now and perhaps there will soon be two.
  • by harmonica ( 29841 ) on Tuesday December 18, 2007 @11:41AM (#21739932)
    These days, hardly any user enters queries in the form of natural language questions, judging from log files. That was different a couple of years ago.

    Just like "Click here to do X" isn't used as much on Web pages anymore. People now tend to know that they can click on underlined text to find out more.
    • Exactly. If there was ever a real need to be able to do natural language searches, that time is basically over. People have been learning how to search the internet effectively, it's not really that hard to do it successfully. As the general populace gets more and more computer and internet savvy, a lot of this sort of thing becomes almost intuitive.
  • Most of what we do is at the word and phrase level; we're not concentrating on the sentence.
    No wonder AI: A Modern Approach is such a tough read!

  • do people really type questions into search boxes? that always stumped me about the ask jeeves thing....who the crap really ASKED anything. I thought you just googled what you wanted to know about (or nowadays, hit the wikipedia page for it for starters).

    Maybe I'm just not up on my search engine technology (or, rather, I don't know anything about it). I just don't know anybody who'd think to put a regular question into google.
  • Develop a natural language search engine that provides results that are as effective as Google's.

    I wonder if MS or Yahoo are listening...

    RS

  • by Animats ( 122034 ) on Tuesday December 18, 2007 @12:18PM (#21740376) Homepage

    If you have the opportunity to look at query logs, you see how dumb most search engine queries are.

    First, a big fraction of queries are simply navigational. Many are just URLs. The major search providers recognize these in the front-end machines and send back canned answers, without even passing them to the real search engine. If you type "myspace" into Google, very little work is expended returning the canned reply (a hypothetical sketch of this kind of short-circuit follows at the end of this comment).

    After that, most queries are one word. Phrase queries are less common.

    Few people seem to have noticed, but Google started returning results based on synonyms and homonyms a few weeks ago. There have been some significant algorithm changes recently.

    Less than 1% of queries use any operators, like '"' or '-'.

    The real problem with natural language queries, though, is that "Ask Jeeves" was a flop. Remember Ask Jeeves? That was a system designed to process queries written as sentences. But it wasn't used that way, and didn't succeed commercially.
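    As referenced above, a hypothetical sketch of a front-end short-circuit for navigational queries; the lookup table, names, and fallback are made up for illustration and are not Google's implementation:

    ```python
    # Answer navigational queries from a tiny lookup table before touching the full index.
    CANNED = {
        "myspace": "http://www.myspace.com/",
        "slashdot": "http://slashdot.org/",
    }

    def full_index_search(query):
        return []  # stand-in for the expensive path through the real search engine

    def handle(query):
        key = query.strip().lower()
        if key in CANNED:                # cheap front-end hit, no index work
            return [CANNED[key]]
        return full_index_search(query)  # fall through to the real engine

    print(handle("myspace"))  # ['http://www.myspace.com/']
    ```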

    • Ask Jeeves was a flop because it started returning stupid results like, "Would you like to buy a Subatomic Physics?".
    • by Alomex ( 148003 )
      Few people seem to have noticed, but Google started returning results based on synonyms and homonyms a few weeks ago. There have been some significant algorithm changes recently.

      I've noticed because the quality of the results went down noticeably.

  • He's lying (Score:4, Insightful)

    by helicologic ( 845077 ) on Tuesday December 18, 2007 @12:20PM (#21740416)
    I think Norvig's lying. Google may not be pursuing linguistic structure above the phrase level in searches, but I'd bet a donut they're working their asses off trying to analyze crawled docs linguistically. To get relevance, they need to extract what a document is about. That implies sentence-level syntax analysis, which is input to sentence-level semantics, which is input to paragraph-level semantics, which is input to "pragmatic" analysis. I think what he's not saying is that the linguistic research dollars are going somewhere other than parsing "Where is Paris?"
    • by clodney ( 778910 )
      Even with Google's computing resources, I think attempting to do natural language analysis of the entire Internet would be a daunting proposition.

      Even though the number of queries processed every day is immense, the amount of query text to analyze pales in comparison to the amount of text on the pages they crawl every day.

      Of course, they could prune their search set considerably if they just assumed that there is no semantic content in most MySpace pages and blog entries.
  • by peter303 ( 12292 ) on Tuesday December 18, 2007 @12:30PM (#21740524)
    How much natural language do you really need for a search? Not much.
  • by 192939495969798999 ( 58312 ) <info AT devinmoore DOT com> on Tuesday December 18, 2007 @12:48PM (#21740766) Homepage Journal
    All you have to do is look at Yahoo answers' average question clarity to get a sense of why whole-sentence AI may not be the best strategy for a search engine.
  • NLP is very useful (Score:3, Informative)

    by Sciros ( 986030 ) on Tuesday December 18, 2007 @01:10PM (#21741138) Journal
    Natural language processing is useful when it is well-done. Getting it well-done is the tough part. Don't let Google reps trick you into thinking otherwise just because their R&D in the field isn't where they'd probably like it to be.

    Here are some situations where it's useful:
    1) interpreting a question rather than just treating it as a "bag of words." For instance, one can type "how tall is Mt. Everest" in the search bar and Google, rather than searching for documents that contain those 5 (or so) tokens, will interpret that as a query asking for height and also search for documents that contain "Mt.", "Everest", and "height". Take that a step further and it might look for strings that represent height, such as a number followed by "ft" or "meters" or "m". (A toy sketch of this kind of rewriting appears after this comment.)

    2) Condensing query chains. Suppose you want to know what sport our 4th president enjoyed playing most. You can ask "what sport did the fourth president of the US like playing?" and the system will give you an answer by first interpreting "fourth president of the US" as Madison, and then searching for what sports Madison enjoyed playing. If not for such interpretation you would either have to run 2 queries (first to find out who the 4th president was, then what sports he liked), or hope that there is a document out there that Google's indexed that contains the words in that initial query.

    3) Speech recognition! If you want to run a Q/A session with a computer system that has a speech recognition front end, it is more natural (easier and faster) to ask it "how tall is mt. everest?" than to say "mount everest height" or whatever you would end up typing into Google today. People like to speak using *natural language,* after all. They would gladly do it with computers if the SR systems in them were good enough (some are).

    4) More precise query results. What's better, getting back a document that is likely to contain the answer to your query, or getting back the sentence that contains it? Or better yet, getting back the answer and nothing else? The more robust an NLP system the more complicated queries it can interpret and the more elegant its result can be.

    On that note, Google actually *does perform* NLP on queries, despite what (from the summary; I didn't RTFA) look like claims to the contrary. If you ask Google "how tall is Mt. Everest?" it actually DOES interpret that particular sentence and gives you the answer -- 29,000 ft or thereabouts. And you only get such an elegant result if you type "how tall is Mt. Everest" (without quotes) or "Mt. Everest how tall". Other queries of this nature will not give you quite as precise a response.
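    Point 1 above can be illustrated with a trivial pattern-rewriting sketch; the patterns and output form are invented for the example, and real question interpretation is far more involved:

    ```python
    import re

    # Map a few question templates onto keyword-style queries. Illustrative only.
    PATTERNS = [
        (re.compile(r"^how tall is (.+?)\??$", re.I), r"\1 height"),
        (re.compile(r"^what is the capital of (.+?)\??$", re.I), r"\1 capital"),
        (re.compile(r"^who wrote (.+?)\??$", re.I), r"\1 author"),
    ]

    def rewrite(question):
        for pattern, template in PATTERNS:
            match = pattern.match(question.strip())
            if match:
                return match.expand(template)
        return question  # no pattern matched; fall back to the raw query

    print(rewrite("How tall is Mt. Everest?"))  # Mt. Everest height
    ```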
  • by Dan East ( 318230 ) on Tuesday December 18, 2007 @01:18PM (#21741254) Journal
    > wii
    Your query does not include a verb.

    > find wii
    Whose "wii" do you want me to find?

    > find wii review
    Unable to find any reviews authored by "wii".

    > find review about wii
    No reviews found concerning the common noun "wii".

    > find review about Wii
    Here is the most recent review about the proper noun "Wii": [url to a page full of keywords related to Wii]

    > find review about Wii order by relevence
    "relevence" is not an English word. Did you mean "relevance"?

    > find review about Wii order by relevance
    Here is the most relevant review about Wii: [url to a 2 year old pre-review of the Wii before it was launched]

    > find review about Wii order by relevance then date
    Here is the most recent and most relevant review about Wii: [url to a fanboy site]

    > find all reviews about Wii order by relevance then date
    Working...

    > abort
    Abort what?

    > abort search
    I am currently performing 1,231,415 searches. Which search do you want me to abort?

    > abort last search
    You do not have permission to abort others' searches.

    > abort my last search
    Last search aborted.

    > find several reviews about Wii order by relevance then date
    "Several" is not a quantifiable adjective. Do you mean "seven"?

    > find seven reviews about Wii order by relevance then date
    Here are your results. For better search results please capitalize the first word of sentences, and end sentences with proper punctuation.

    Dan East
    • by Sciros ( 986030 )
      Wow Dan East your NLP system is a real piece of trash. You should look at how most systems of this sort are actually put together before making a pointless straw man. :-P
      • Not that anyone cares one way or another, but my post was meant to be a joke.

        Dan East
        • by Sciros ( 986030 )
          Oh, well then nvm ^_^

          I used to have a friend that said things along the lines of what you said, but in seriousness to try and argue against something. I approached your post with the wrong mindset and I'm glad I was mistaken.
  • While natural language might seem like a good idea to people who are less technical, it's actually a really bad idea. It would slow a lot of things down in terms of search and would bring with it deep inefficiencies. Frankly, I think search engines would be improved if they offered advanced features with brief commands (kind of like how Unix abbreviates 'copy' as 'cp' or 'move' as 'mv'). For example, which do you think is better when you want to move quickly, a vehicle with wheels, or a bipedal vehicle w
  • I guess they'll just let Powerset become the next Google. Face it, "keywordese" language is often not adequate. Questions constitute a significant fraction of search engine traffic, and all search engines fail miserably on anything but "how to" queries. Just yesterday I was looking for a comparison between two products on the web. I've found it, eventually, but there's no real reason why it shouldn't be the first hit after I enter "comparison between X and Y". It's not a question in itself, yet it's a disti
  • or at least the option. That includes escaping "+" and "-". That would do sooooo much to improve searches.
  • Unless a computer knows things the way a human does, it's not possible for natural language queries to ever work.

"Being against torture ought to be sort of a multipartisan thing." -- Karl Lehenbauer, as amended by Jeff Daiell, a Libertarian

Working...