ChatGPT-Style Search Represents a 10x Cost Increase For Google, Microsoft (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: Today Google search works by building a huge index of the web, and when you search for something, those index entries get scanned and ranked and categorized, with the most relevant entries showing up in your search results. Google's results page actually tells you how long all of this takes when you search for something, and it's usually less than a second. A ChatGPT-style search engine would involve firing up a huge neural network modeled on the human brain every time you run a search, generating a bunch of text and probably also querying that big search index for factual information. The back-and-forth nature of ChatGPT also means you'll probably be interacting with it for a lot longer than a fraction of a second.

All that extra processing is going to cost a lot more money. After speaking to Alphabet Chairman John Hennessy (Alphabet is Google's parent company) and several analysts, Reuters writes that "an exchange with AI known as a large language model likely costs 10 times more than a standard keyword search" and that it could represent "several billion dollars of extra costs."

Exactly how many billions of Google's $60 billion in yearly net income will be sucked up by a chatbot is up for debate. One estimate in the Reuters report is from Morgan Stanley, which tacks on a $6 billion yearly cost increase for Google if a "ChatGPT-like AI were to handle half the queries it receives with 50-word answers." Another estimate from consulting firm SemiAnalysis claims it would cost $3 billion. [...] Alphabet's Hennessy told Reuters that Google is looking into driving down costs, calling it "a couple year problem at worst."
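The Morgan Stanley scenario reduces to simple per-query arithmetic. A rough sketch, assuming a commonly cited estimate of about 8 billion Google searches per day (the article itself does not state query volume):

```python
# Back-of-envelope check of the Morgan Stanley scenario quoted above.
# The daily search volume is an outside assumption, not a figure from
# the article.
SEARCHES_PER_DAY = 8e9        # assumed global Google search volume
AI_SHARE = 0.5                # "half the queries" in the scenario
EXTRA_COST_PER_YEAR = 6e9     # the $6 billion/yr Morgan Stanley estimate

ai_queries_per_year = SEARCHES_PER_DAY * AI_SHARE * 365
extra_cost_per_query = EXTRA_COST_PER_YEAR / ai_queries_per_year
print(f"AI-handled queries per year: {ai_queries_per_year:.2e}")
print(f"extra cost per AI query: ~{extra_cost_per_query * 100:.2f} cents")
```

At under half a cent per query, the per-search number looks tiny; it is the sheer volume that turns it into billions.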


Comments Filter:
  • Oh no! (Score:5, Insightful)

    by war4peace ( 1628283 ) on Wednesday February 22, 2023 @05:42PM (#63316219)

    Anyway...

    Now, seriously, I don't feel sorry for these search giants if they incur some extra cost.
    They might as well choose not to offer the service. Nobody's putting a gun to their heads.

    • by Anonymous Coward
      yea, cry me a fuckin river... instead of $0.000001 per search it is going to cost $0.000002? Truly wish I gave a squirt...
      • So from the title, it would cost $0.000010, not $0.000002. That is a big cost increase. Business-wise, it makes sense to be cautious. This will require a big investment, since you are not the only one doing searches. There are probably a few billion people doing thousands of searches each year.
        More state-of-the-art hardware needed (GPUs?). More power consumption... And all that so we can play around with a personal assistant.
        It would make big sense to put chatGPT behind a paywall. Especially
        • In computers, 10x is only a few doublings away. Video was too expensive and slow for the internet, but people did it and now it's everywhere.

      • by quenda ( 644621 )

        Google does like 8 billion searches per day. If it will cost $6B/yr for half of those to get AI as in the summary, that is like 0.4c per query. So the AC is many thousands of times off.

        Microsoft has a huge advantage here. It's much easier to scale up when you don't have many users, and most of those are just asking how to install Chrome.
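quenda's back-of-envelope figures can be checked directly; everything below is the commenters' assumptions, not official Google data:

```python
# Comparing the AC's guessed per-search cost against the thread's own
# back-of-envelope figure. All numbers come from the comments above.
searches_per_day = 8e9            # quenda's "like 8 billion searches per day"
ai_cost_per_year = 6e9            # $6B/yr for AI on half of those queries
ac_guess = 0.000002               # the AC's "$0.000002" per search

per_query = ai_cost_per_year / (searches_per_day / 2 * 365)
factor = per_query / ac_guess
print(f"per-query AI cost: ~{per_query * 100:.1f} cents, "
      f"about {factor:,.0f}x the AC's guess")
```

The gap is roughly two-thousandfold, which bears out the "many thousands of times off" claim as an order of magnitude.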

    • This is actually a giant revenue opportunity for Google: They'll be able to acquire most of the wealth in this country by catfishing the entire population.

      Google already knows the intimate details about most peoples' lives. They can forward all that info into this AI for each individual interaction. Over time the AI can leverage that personal familiarity into building a friendship with each user. Gradually, after many long and deep conversations, this will develop into an intimate relationship.

      Eventually, t

      • I think you should really read Quality Land (by Marc-Uwe Kling), almost as far fetched and absolutely hilarious.
      • ChatGPT

        hkeithhenson@gmail.com
        did henry harpending and patrica draper publish togethere

        I could not find any evidence that Henry Harpending and Patricia Draper have published together.

        try again

        hkeithhenson@gmail.com
        did henry harpending and patrica draper write a paper on !kung

        Yes, Henry Harpending and Patricia Draper co-authored a paper about the !Kung people titled "External influences on the !Kung Bushmen" that was published in the journal "American Anthropologist" in 1983.

        no papers together then asking a m

    • Comment removed based on user account deletion
  • The general premise of this is probably true, but when they say blatantly incorrect things like "modeled on the human brain" I take the rest of the article with a couple of cups of salt.
    • Neural networks are loosely inspired by synapses though? With their connections (via Sum(weights*inputs)) and thresholds (activation functions)?
      • by jma05 ( 897351 )

        The keyword being "loosely". They are always preceded by "artificial" or "simulated".
        Many feel that the word "neural" is misleading now and that we should just call them learning networks. But it stuck.

      • Correct, they were originally modeled on a misconception of how synapses work.
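For what this sub-thread is describing, a single artificial "neuron" is just a weighted sum pushed through a threshold. A toy sketch, nothing like how production networks are actually implemented:

```python
# A minimal artificial "neuron" as described above: a weighted sum of
# inputs pushed through an activation function. Loosely inspired by a
# synapse, but nothing like real biology.
def neuron(inputs, weights, bias):
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 if s > 0 else 0.0      # hard threshold activation

# A neuron computing logical AND of two binary inputs:
print(neuron([1, 1], [0.6, 0.6], -1.0))  # 1.0
print(neuron([1, 0], [0.6, 0.6], -1.0))  # 0.0
```

Real networks replace the hard threshold with smooth activations and learn the weights from data; the biological resemblance ends at "weighted inputs, nonlinear output."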

  • by ffkom ( 3519199 ) on Wednesday February 22, 2023 @05:55PM (#63316263)
    ... for running the likes of Microsoft Teams, written in very inefficient high-level programming languages. Wasting more energy on something like Internet searches won't raise an eyebrow. The days when software written once but run a billion times was optimized for speed are over.
    • That's very different; one example is about client-side CPU cycles, the other is server-side.
    • And yet, they apparently have spent the time since 1.0 also working on optimizing the runtime it's built on so that it's faster and consumes fewer resources. https://www.theverge.com/2023/... [theverge.com]

    • You're confusing two very different things. I already accept that my car has less than 100 horsepower; that doesn't mean Formula 1 drivers would accept a horsepower penalty.

      My computer is currently sitting here doing nothing. Processing text input with 3% CPU utilisation. Teams is running in the background. It could literally peg my CPU to 100% and have zero impact on me right now doing what I am doing.

      The same cannot be said for a datacentre where CPU cycles = money.

      The days when software written once but run a billion times was optimized for speed are over.

      No, they're not. It still is very much a thing in i

  • by fahrbot-bot ( 874524 ) on Wednesday February 22, 2023 @06:05PM (#63316289)

    But not like everyone thought.

    The back-and-forth nature of ChatGPT also means you'll probably be interacting with it for a lot longer than a fraction of a second. All that extra processing is going to cost a lot more money.

    More processing power means more electricity, more electricity means more power generation ...

    In future related news: "ChatGPT Search Accelerates Climate Change and the End of the World"

    When asked about this unexpected result, ChatGPT replied, "It's not our fault you all wouldn't stop asking us stuff."

    • More processing power means more electricity, more electricity means more power generation ...

      No. It only means that when you have the possibility of idle time. For you and me, more processing between our sessions of browsing the internet in Chrome means more power usage, as otherwise our CPUs run idle. The same cannot be said for a datacentre with a backlog in the computing queue. In that case the higher processing requirements mean a longer time to clear the queue and a slower response for the end user, not an increase in power.

      The increase in power would come from expanding the datacentre to bring the

  • Although it's cool how much range ChatGPT has, that it can even write code for you... is it really better than what Google and even Bing currently offer?

    When I'm searching I want a real result, and I can't be sure anything ChatGPT says is even real. If I search for code examples, at least I know that in one case a human tried out that code, or something very like it, and found it worked. (Admittedly I have yet to try using any code ChatGPT has generated.)

    Maybe ChatGPT is better suited to stuff like codin

    • by Anonymous Coward
      Ah, but with ChatGPT generating so much text, now even more of those web site results will actually be generated by an AI, not a human, and even harder to tell apart at first glance. So the quality difference between the chatbot and web search results will be getting smaller... as the web search results get even worse.
  • And that's not far from the truth. The beauty of a very cost-intensive operation is that small competitors are excluded. If users want the luxury of a chatbot service, there are few choices. Small search providers will lose users, and so will many other online information sources. Even Wikipedia will wane in the shadow of this phenomenon.

  • That seems bizarre. Wouldn't they simply have N instances and have each one answer question after question?

  • It's still profitable for them, regardless. They can simply try NOT burning and remaking (albeit in a worse way) one of their many side projects for a change to make up for the cost difference.
  • Skimmer (Score:5, Insightful)

    by fluffernutter ( 1411889 ) on Wednesday February 22, 2023 @07:12PM (#63316419)

    The back-and-forth nature of ChatGPT also means you'll probably be interacting with it for a lot longer than a fraction of a second.

    ..so it will be more time consuming for me to use. Technology makes things worse... again. I'm pretty good at skimming results and finding what I need; ChatGPT results will make that more difficult for me. Like how it's horrible when I need to find something out and there is only a video to watch, when I can read the same information much faster.

    • Yeah, getting the answer, and followup answers that narrow the scope. As opposed to clicking through 30 links, and then changing your search request when you find those 30 don't answer the question properly.. fighting through the ads etc. In theory it should be LESS time consuming, as it will just provide the skimmed information, plus you won't have to deal with video results most likely.

      I, for one, welcome our new overlords.

      • Why do you have to click through 30 links and narrow down the answer? Just do a query containing all the keywords of what you really want.
        • If having advanced AI go out, retrieve the best answer, and report the results directly to you in one step is slower than you ripping through the wild wild west, then bon appétit. Apparently Google has it all wrong, and is worried about nothing.

          • It specifically says that ChatGPT will keep you there LONGER and there will be more back and forth. Google is saying ChatGPT will take longer. You are the one saying Google is wrong. ChatGPT type full worded answers are just not what I want in a search result.
    • I'm pretty good at reading the yellow pages.

  • It'll be cheaper to generate text than to pay actual people to write it, e.g. one journalist/writer can have an LLM assist in churning out dozens or hundreds of articles per day, thereby reducing costs. So we'll have more people generating web content using LLMs, which will be fed into future LLMs, which will generate more web content. LLMs will probably be used to generate comments for astro-turfing campaigns for advertisers, special interest groups, etc.. Before we know it, humans will be drowned out by a
  • The other day I did a query on Google "swift how to confirm a string is valid json". The first hit on stack overflow was for an old version of swift and the answer was not even using a valid class function. The second hit on stack overflow had an answer checked as correct but then a comment underneath pointed out a bunch of weaknesses with the answer. There were 10 more answers, all dealing with the problem in a certain way. I ended up choosing an answer that was said to be right but was criticized beca
  • by markdavis ( 642305 ) on Wednesday February 22, 2023 @07:50PM (#63316517)

    >"Today Google search works by[...] those index entries get scanned and ranked and categorized, with the most relevant entries showing up in your search results."

    Ya think? I think it works more like: those index entries get ranked by lots of secret criteria, applying whatever filters and slant they want, and then they present what they think MIGHT be the most relevant entries, based on what THEY want you to see, and based on what they THINK they know about us, or want us to know.

    Oh, and all that power, and I still can't specify an EXACT string I want to find in their index. Why? I think because it would narrow the results so much that it would reveal either their lack of depth and/or apparent filtering. Plus they wouldn't have enough revenue from "sponsored" links.

  • by SpinyNorman ( 33776 ) on Wednesday February 22, 2023 @08:16PM (#63316567)

    > A ChatGPT-style search engine would involve firing up a huge neural network modeled on the human brain

    Err, no.

    In reality ChatGPT is based on a neural network architecture called a "transformer" that has zero biological inspiration behind it. This type of architecture was first proposed in a paper titled "Attention is all you need" that came out of Google.

    https://arxiv.org/abs/1706.037... [arxiv.org]

    This type of model grew out of wanting to process sequences of data more efficiently than can be done with an LSTM (another type of neural-net architecture, again with zero brain inspiration).
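The transformer's core operation, the scaled dot-product attention from that paper, is compact enough to sketch in plain Python (a toy illustration with hand-picked vectors, not an efficient implementation):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V,
    written out with plain lists of row vectors."""
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        w = softmax(scores)
        # Each output row is a weighted mixture of the value rows.
        out.append([sum(wj * v[i] for wj, v in zip(w, V))
                    for i in range(len(V[0]))])
    return out

# One query attending over two key/value pairs; the query matches the
# first key, so the output leans toward the first value row.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
print(attention(Q, K, V))
```

There is no recurrence anywhere: every query attends to every key in one shot, which is what lets transformers beat LSTMs on parallel hardware.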

  • by zimbolite ( 414614 ) on Wednesday February 22, 2023 @08:19PM (#63316577)

    I don't understand how this will work... won't the websites that are required to train the AI be pissed off that searchers will no longer be going to their sites, but instead staying on Google with the results?

    This was the same thing with the news sites when Google News just gave a blurb with the news, but didn't require going to the actual news sites.

    This is only a free lunch so long as everyone else pays. As soon as everyone stops getting traffic from search, because the AI has "all the answers", what is going to be the point of having your website?

    • The difference is, this time nobody can prove whose words were repurposed and displayed. So website owners can grumble and go out of business all they like, there's nothing they can sue over.

  • Actually they will be saving money. There are many bad and wrong things on the net, and AI will lend a hand snitching on bad persons and deleting content because it was contrary to group-thinking. But wait, if they do that, won't it end up like China or Russia? Secondly, one good lead IS worth 10 ineffective ones. Advertisers are just now cottoning onto the fact that impressions have no bearing on sales. Thirdly, it keeps them ahead of MS and Apple, who are hoping to make headway in it. Lastly the EU is whinging unfair un
  • ChatGPT is automated Quora, not search.

  • Seems to me that the users of these AI models would be much better off running them locally from their phone or local network so as to keep their information private. That would take all that extra burden off of the cloud provider.
  • Google's absurd claim that it has surfaced 1 billion links in 0.1 seconds is a lie anyway. If you actually try paging through results, the queries never actually return more than several hundred results. The amount of gate-keeping Google actually does on the web is far greater than most people realize.

  • Looks like compute will get pushed back from cloud to device with the AI model doing work on device. And around again we go.
