ChatGPT-Style Search Represents a 10x Cost Increase For Google, Microsoft (arstechnica.com) 46
An anonymous reader quotes a report from Ars Technica: Today Google search works by building a huge index of the web, and when you search for something, those index entries get scanned and ranked and categorized, with the most relevant entries showing up in your search results. Google's results page actually tells you how long all of this takes when you search for something, and it's usually less than a second. A ChatGPT-style search engine would involve firing up a huge neural network modeled on the human brain every time you run a search, generating a bunch of text and probably also querying that big search index for factual information. The back-and-forth nature of ChatGPT also means you'll probably be interacting with it for a lot longer than a fraction of a second.
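To make the cost contrast concrete: classic keyword search is cheap because the expensive work (building the index) happens once, ahead of time, and a query is just a fast set intersection over precomputed postings lists. A minimal sketch, with toy documents invented for illustration:

```python
from collections import defaultdict

# Toy inverted index: the core data structure behind classic keyword search.
# Real engines add ranking signals, sharding, and caching, but the query-time
# work is essentially a fast set intersection over precomputed postings lists.
index = defaultdict(set)

docs = {
    1: "neural network models of language",
    2: "search engine index construction",
    3: "ranking search results with neural signals",
}

# Indexing happens once, offline.
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def keyword_search(query):
    """Return ids of documents containing every query term (AND semantics)."""
    postings = [index[term] for term in query.split()]
    return set.intersection(*postings) if postings else set()

print(sorted(keyword_search("neural search")))  # [3]
```

A generative model, by contrast, does billions of floating-point operations per generated token at query time, which is where the claimed 10x cost difference comes from.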
All that extra processing is going to cost a lot more money. After speaking to Alphabet Chairman John Hennessy (Alphabet is Google's parent company) and several analysts, Reuters writes that "an exchange with AI known as a large language model likely costs 10 times more than a standard keyword search" and that it could represent "several billion dollars of extra costs."
Exactly how many billions of Google's $60 billion in yearly net income will be sucked up by a chatbot is up for debate. One estimate in the Reuters report is from Morgan Stanley, which tacks on a $6 billion yearly cost increase for Google if a "ChatGPT-like AI were to handle half the queries it receives with 50-word answers." Another estimate from consulting firm SemiAnalysis claims it would cost $3 billion. [...] Alphabet's Hennessy told Reuters that Google is looking into driving down costs, calling it "a couple year problem at worst."
Oh no! (Score:5, Insightful)
Anyway...
Now, seriously, I don't feel sorry for these search giants if they incur some extra cost.
They might as well choose not to offer the service. Nobody's putting a gun to their heads.
Re: (Score:1)
Re: (Score:2)
More state-of-the-art hardware needed (GPUs?), more power consumption... and all that so we can play around with a personal assistant.
It would make good sense to put ChatGPT behind a paywall. Especially
Re: (Score:2)
In computing, a 10x cost is only a few doublings away. Video was once too expensive and slow for the internet, but people did it anyway, and now it's everywhere.
Re: (Score:2)
Google does something like 8 billion searches per day. If it costs $6B/yr for half of those to get AI answers as in the summary, that works out to roughly 0.4 cents per query. So the AC is many thousands of times off.
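The back-of-envelope arithmetic above can be checked directly, using the figures from the comment and the summary (8 billion daily searches is a rough estimate, not an official Google number):

```python
# Back-of-envelope check of the per-query cost of the Morgan Stanley scenario.
searches_per_day = 8e9        # rough daily Google query volume (estimate)
ai_share = 0.5                # scenario: AI handles half of all queries
extra_cost_per_year = 6e9     # estimated added cost, dollars per year

ai_queries_per_year = searches_per_day * ai_share * 365
cost_per_query_cents = extra_cost_per_year / ai_queries_per_year * 100

print(round(cost_per_query_cents, 2))  # 0.41
```

About 0.4 cents of extra cost per AI-handled query, which is why the estimates land in the billions per year despite the tiny per-query figure.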
Microsoft has a huge advantage here. It's much easier to scale up when you have not many users, and most of those are just asking how to install Chrome.
Re: (Score:3)
This is actually a giant revenue opportunity for Google: They'll be able to acquire most of the wealth in this country by catfishing the entire population.
Google already knows the intimate details about most peoples' lives. They can forward all that info into this AI for each individual interaction. Over time the AI can leverage that personal familiarity into building a friendship with each user. Gradually, after many long and deep conversations, this will develop into an intimate relationship.
Eventually, t
Re: (Score:2)
Re: (Score:2)
ChatGPT
hkeithhenson@gmail.com
did henry harpending and patrica draper publish togethere
I could not find any evidence that Henry Harpending and Patricia Draper have published together.
try again
hkeithhenson@gmail.com
did henry harpending and patrica draper write a paper on !kung
Yes, Henry Harpending and Patricia Draper co-authored a paper about the !Kung people titled "External influences on the !Kung Bushmen" that was published in the journal "American Anthropologist" in 1983.
no papers together then asking a m
Re: (Score:2)
"Modeled on a human brain" - um no (Score:1)
Re: (Score:3)
Re: (Score:2)
The keyword being "loosely". They are always preceded by "artificial" or "simulated".
Many feel that the word "neural" is misleading now and that we should just call them learning networks. But it stuck.
Re: (Score:2)
Correct, they were originally modeled on a misconception of how synapses work.
People already accepted 10x computational costs.. (Score:5, Informative)
Re: (Score:2)
Re: (Score:1)
And yet, they apparently have spent the time since 1.0 also working on optimizing the runtime it's built on so that it's faster and consumes fewer resources. https://www.theverge.com/2023/... [theverge.com]
Re: (Score:2)
Re: (Score:2)
You're confusing two very different things. I already accept that my car has less than 100 horsepower; that doesn't mean Formula 1 drivers would accept a horsepower penalty.
My computer is currently sitting here doing almost nothing, processing text input at 3% CPU utilisation with Teams running in the background. It could literally peg my CPU at 100% and have zero impact on what I am doing right now.
The same cannot be said for a datacentre where CPU cycles = money.
The days when software that was written once but run a billion times got optimized for speed are over.
No it's not. It still is very much a thing in i
ChatGPT ends the World (Score:5, Funny)
But not like everyone thought.
The back-and-forth nature of ChatGPT also means you'll probably be interacting with it for a lot longer than a fraction of a second. All that extra processing is going to cost a lot more money.
More processing power means more electricity, more electricity means more power generation ...
In future related news: "ChatGPT Search Accelerates Climate Change and the End of the World"
When asked about this unexpected result, ChatGPT replied, "It's not our fault you all wouldn't stop asking us stuff."
Re: (Score:2)
More processing power means more electricity, more electricity means more power generation ...
No. It only means that when there is idle time available. For you and me, more processing between our sessions browsing the internet in Chrome means more power usage, because otherwise our CPUs would sit idle. The same cannot be said for a datacentre with a backlog in the computing queue. There, higher processing requirements mean a longer time to clear the queue and a slower response for the end user, not an increase in power.
The increase in power would come from expanding the datacentre to bring the
Is it even better (Score:2)
Although it's cool all of the range ChatGPT has, that it can even write code for you... is it really better than what Google and even Bing currently offer?
When I'm searching I want a real result, and I can't be sure anything ChatGPT says is even real. If I search for code examples, at least I know that in one case a human tried out that code, or something very like it, and found it worked. (Admittedly I have yet to try using any code ChatGPT has generated.)
Maybe ChatGPT is better suited to stuff like codin
Re: (Score:1)
- but we'll make it up in volume! - (Score:2)
And that's not far from the truth. The beauty of a very cost-intensive operation is that small competitors are excluded. If users want the luxury of a chatbot service, there are few choices. Small search providers will lose users, and so will many other online information sources. Even Wikipedia will wane in the shadow of this phenomenon.
new neural net for each query? (Score:2)
That seems bizarre. Wouldn't they simply have N instances and have each one answer question after question?
drop in the bucket (Score:2)
Skimmer (Score:5, Insightful)
The back-and-forth nature of ChatGPT also means you'll probably be interacting with it for a lot longer than a fraction of a second.
..so it will be more time-consuming for me to use. Technology makes things worse... again. I'm pretty good at skimming results and finding what I need; ChatGPT results make that harder. It's like when I need to find something out and there is only a video to watch, when I could read the same information much faster.
Re: (Score:2)
Yeah, getting the answer, plus followup answers that narrow the scope. As opposed to clicking through 30 links, then changing your search request when you find those 30 don't answer the question properly, fighting through the ads, etc. In theory it should be LESS time-consuming, as it will just provide the skimmed information, and you most likely won't have to deal with video results either.
I, for one, welcome our new overlords.
Re: (Score:2)
Re: (Score:1)
If having an advanced AI go out, retrieve the best answer, and report the results directly to you in one step is slower than you ripping through the wild wild west, then bon appétit. Apparently Google has it all wrong and is worried about nothing.
Re: (Score:2)
Re: (Score:1)
I'm pretty good at reading the yellow pages.
Closed loop (Score:2)
unclear (Score:2)
"Most relevant" (Score:3)
>"Today Google search works by[...] those index entries get scanned and ranked and categorized, with the most relevant entries showing up in your search results."
Ya think? I think it works more like those index entries get ranked by lots of secret criteria, putting whatever filters and slant they want, and then present what they think MIGHT be the most relevant entries, based on what THEY want you to see, and based on what they THINK they know about us, or want us to know.
Oh, and all that power, and I still can't specify an EXACT string I want to find in their index. Why? I think because it would narrow the results so much that it would reveal either their lack of depth and/or apparent filtering. Plus they wouldn't have enough revenue from "sponsored" links.
NOT modelled after any brain (Score:5, Insightful)
> A ChatGPT-style search engine would involve firing up a huge neural network modeled on the human brain
Err, no.
In reality ChatGPT is based on a neural network architecture called a "transformer" that has zero biological inspiration behind it. This type of architecture was first proposed in a paper titled "Attention is all you need" that came out of Google.
https://arxiv.org/abs/1706.037... [arxiv.org]
This type of model grew out of wanting to process sequences of data more efficiently than can be done with an LSTM (another type of neural net architecture, again with zero brain inspiration).
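The transformer's core operation is scaled dot-product attention as defined in "Attention Is All You Need": softmax(QK^T / sqrt(d_k)) V. A minimal sketch in NumPy, with random toy embeddings; as the comment says, nothing here mimics a synapse, it's just matrix multiplication and a normalization:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """softmax(Q K^T / sqrt(d_k)) V, the core transformer operation."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                    # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the keys
    return weights @ v                                 # weighted mix of values

# Three tokens with four-dimensional embeddings; self-attention uses Q = K = V.
x = np.random.default_rng(0).normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query matches each key.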
Google news all over again (Score:5, Interesting)
I don't understand how this will work... won't the websites that are required to train the AI be pissed off that searchers will no longer go to their sites, but instead stay on Google with the results?
This is the same thing that happened with news sites when Google News gave a blurb of the news without requiring a visit to the actual news sites.
This is only a free lunch so long as everyone else pays. As soon as everyone stops getting traffic from search, because the AI has "all the answers", what is the point of having a website?
Re: (Score:2)
The difference is, this time nobody can prove whose words were repurposed and displayed. So website owners can grumble and go out of business all they like, there's nothing they can sue over.
Compliance cost SAVINGS (Score:2)
Quora (Score:2)
ChatGPT is automated Quora, not search.
AI models should be local to the user (Score:1)
Search results are a lie anyway (Score:2)
Google's absurd claim that it has surfaced 1 billion links in 0.1 seconds is a lie anyway. If you actually try paging through the results, queries never return more than several hundred. The amount of gate-keeping Google actually does on the web is far greater than most people realize.
We go round and round (Score:2)