

ChatGPT Users Send 2.5 Billion Prompts a Day
ChatGPT now handles 2.5 billion prompts daily, with 330 million from U.S. users. This surge marks a doubling in usage since December when OpenAI CEO Sam Altman said that users send over 1 billion queries to ChatGPT each day. TechCrunch reports: These numbers show just how ubiquitous OpenAI's flagship product is becoming. Google's parent company, Alphabet, does not release daily search data, but recently revealed that Google receives 5 trillion queries per year, which averages to just under 14 billion daily searches. Independent researchers have found similar trends. Neil Patel of NP Digital estimates that Google receives 13.7 billion searches daily, while research from SparkToro and Datos -- two digital marketing companies -- estimates that the figure is around 16.4 billion per day.
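For anyone who wants to sanity-check the figures in the summary, here is a quick back-of-the-envelope calculation; the 365-day divisor and the ratio are my own arithmetic, not numbers from the article.

# Back-of-the-envelope check of the query volumes quoted in the summary.
google_per_year = 5e12                    # Google queries per year (from the summary)
google_per_day = google_per_year / 365    # -> roughly 13.7 billion per day
chatgpt_per_day = 2.5e9                   # ChatGPT prompts per day (from the summary)

print(f"Google: ~{google_per_day / 1e9:.1f}B queries/day")
print(f"ChatGPT: {chatgpt_per_day / 1e9:.1f}B prompts/day")
print(f"Google handles ~{google_per_day / chatgpt_per_day:.1f}x the volume")

That works out to just under 14 billion Google searches a day, or about 5.5x ChatGPT's reported prompt volume.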
How do I write a resignation letter? (Score:2)
2.5 billion people looking for another job.
Financials may be shaky (Score:3)
Re: (Score:3)
Especially since the biggest use is probably just as a chat bot behaving similarly to a search engine. So in effect it is directly competing with the economics of Google.
Re: (Score:2)
Of course.
I send 50 just to make sure they pay for it.
Re: (Score:2)
I have mixed feelings about this due to the environmental impact and the fact that it inflates their numbers for investments. But I kinda really want to see those investors get burned anyway.
Re: (Score:3)
How long did it take for Amazon to make a profit? How long for Facebook? I guess 2.5B queries a day is good motivation for investors to keep investing.
Re: (Score:2)
I see OpenAI in the future as a part of GAFAMO. I don't like it, but I think it's hard to avoid. Maaaaybe Anthropic takes the place.
First, inference is getting way cheaper every year (recent estimates range from roughly 9x to 900x per year, depending on the application). The GPT-3.5 model, with 175 billion parameters, can be replaced with Llama 3.2 3B (3 billion parameters) now. That runs on a good smartphone.
Second, they have already started making deals with publishers. If laws about training data ever arrive, they will be better positioned.
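To put the "runs on a good smartphone" claim in perspective, here is a rough memory estimate; the quantization levels and the ~8 GB phone-RAM budget are my own illustrative assumptions, not figures from the comment above.

# Rough memory footprint of a 3B-parameter model at different weight precisions.
# The precisions and the phone-RAM budget are illustrative assumptions.
params = 3e9  # Llama 3.2 3B

for name, bytes_per_weight in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    gb = params * bytes_per_weight / 2**30
    print(f"{name}: ~{gb:.1f} GB of weights")

# fp16: ~5.6 GB, int8: ~2.8 GB, int4: ~1.4 GB. Only the quantized versions
# fit comfortably alongside everything else in roughly 8 GB of phone RAM.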
Re: (Score:2)
Try, try, try again (Score:2)
Some of those prompts are just checks to see whether the first prompt was even the right question. Simon says. That inflates the number of prompts.
Re: (Score:2)
Indeed, I always have to tell it: Again, without the em dashes, the fucking oval buttons that cannot be copied into my research.
Each and every time!
It has a memory file, but it NEVER reads it.
Re: (Score:2)
ChatGPT: I'll do whatever I want, dad.
Re: (Score:2)
without the em dashes, the fucking oval buttons that cannot be copied into my research.
Not "research". Copy-pasting AI-slop and calling it "research".
Re:Try, try, try again (Score:5, Informative)
Re: (Score:2)
Wanting to copy actual html links instead of a fucking oval button is not 'hiding'.
2.5 billion daily prompts (Score:2)
Honestly, I’m one of those 2.5 billion daily prompts. Last week I was trying to figure out how to get a legit discount on Bluehost hosting (most coupon sites are full of expired links or fake codes).
I asked ChatGPT, and it pointed me to a Reddit thread where someone shared a working setup method — no coupon code, just direct steps. I ended up getting 75% off + a free domain.
Sharing in case anyone else is in the same boat: https://www.reddit.com/r/Blueh... [reddit.com]
Say what you want about ChatGPT, but it saved me real money here.
Re: (Score:2)
And how did just googling do?
Because the "AI" is still really nothing more than a glorified search engine wrapped in a nonsense blurb-spouter.
Re: (Score:3)
And how did just googling do?
Because the "AI" is still really nothing more than a glorified search engine wrapped in a nonsense blurb-spouter.
I don't use ChatGPT, but I do use search engines a lot. And I can well believe that, at least for now, the results of an LLM query are probably a lot more focused and relevant than those of a search engine query.
Of course, it's likely that over time LLM results will end up serving ads and favouring various agendas having nothing to do with the query made. But for now, I suspect that there's more wheat and less chaff in LLM results than in search engine "slop".
Re: 2.5 billion daily prompts (Score:2)
I just tried the same question on Google. The reddit thread was the second result; the first was the link to the provider website. And it was like 10X faster than any chatbot (and likely 10-100X cheaper in terms of machine resource usage).
Re: 2.5 billion daily prompts (Score:2)
You should start by finding a better hosting company.
Re: (Score:2)
Have you never used a search engine before or something? Do you think the internet pre-2022 was just people entering random letters and numbers in the address bar and praying for the best?
Profit (Score:4, Insightful)
Is it profitable yet?
Even the $200 / month tier isn't profitable, apparently.
And they were trying to push a $10,000 "PhD-level" tier to try to make some cash.
While you're giving stuff away, I'm sure it's popular. What happens when you start charging even cost price for it? Or when you start trying to profit from it?
Because you only really have another year or so before investors want answers like that.
Google was "free" for years, great, useful and very popular, but had to then basically pivot into the advertising business to survive.
Before we redesign how we do everything and get locked into buying this stuff and making things dependent on it... it would be nice to know how you intend to fund it when investors start demanding a return.
I mean, obviously, if it was actually real AI it would be making its own money, just like any intelligent being. Has ChatGPT got so much as a Saturday job yet?
Re: (Score:3)
Has ChatGPT got so much as a Saturday job yet?
It's taken loads of customer service jobs. They probably don't pay it though. And they shouldn't because its customer service is pure shite and goes in circles.
Re: (Score:3)
... its customer service is pure shite and goes in circles.
How is this different from most human-based customer support?
Re: (Score:2)
It's going to be really funny when all of these companies who fired half their staff get the pot turned up to "we need to be profitable" pricing.
Yes, it will probably still be cheaper than a human being, especially a western one. But OpenAI would be stupid not to charge 5 digits to replace a 6 digit office worker.
Re: (Score:2)
Are you talking about queries or amortized training costs? For $200 you can execute a lot of queries. Training is another matter, but currently it seems that amortizing it into the query price adds roughly 10% on top of the inference price. That of course depends on the number of queries (see article) and how long the same model stays in service.
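To make the amortization point concrete, here is a toy calculation; every dollar figure and the one-year model lifetime below are made-up placeholders, not OpenAI's real numbers.

# Toy amortization of a one-time training cost into the per-query price.
# All figures are illustrative assumptions, not real OpenAI numbers.
training_cost = 100e6          # hypothetical one-time training cost, USD
queries_per_day = 2.5e9        # daily prompt volume from the article
model_lifetime_days = 365      # assume the model is served for one year

amortized_per_query = training_cost / (queries_per_day * model_lifetime_days)
inference_per_query = 0.001    # hypothetical inference cost per query, USD

print(f"Amortized training: ${amortized_per_query:.6f} per query")
print(f"As a share of inference cost: {amortized_per_query / inference_per_query:.0%}")

With these placeholder numbers the training surcharge comes out around 11% of the inference cost, which is in the same ballpark as the +10% figure above.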
It is just another tool (Score:5, Informative)
If you treat it as a tool it works just fine. Yes, if you want better results you need to learn to ask better questions, but it is not difficult to figure out. As one person noted in the comments here, it is basically a glorified search engine. When I google for information I get links that I then evaluate to see if they fit my issue. With ChatGPT I describe the situation, ask the question, and generally get an answer I can use. But, and this is HUGE, you need to know what you are doing in the subject you are asking about, because YOU are the final arbiter of what might and might not work.
Lots of wasted queries (Score:1)
How many of this huge number is "how can I get a date" ??
So... (Score:2)
Re: (Score:3)
Are we just going to end up with a generation unable to think or reason when this bubble bursts?
I think that ship has already sailed beyond the horizon.
2.5B pieces of spam (Score:2)
Less than I expected, but that's still a lot of spam.
I can Grok that .. (Score:2)
Tons of people not knowing how to think? (Score:2)
Yep, makes a lot of sense. ChatGPT will not fix that though, as it is programmed to tell complete idiots what they want to hear. Probably why it is so successful.