ChatGPT Users Send 2.5 Billion Prompts a Day

ChatGPT now handles 2.5 billion prompts daily, with 330 million from U.S. users. This surge marks a doubling in usage since December when OpenAI CEO Sam Altman said that users send over 1 billion queries to ChatGPT each day. TechCrunch reports: These numbers show just how ubiquitous OpenAI's flagship product is becoming. Google's parent company, Alphabet, does not release daily search data, but recently revealed that Google receives 5 trillion queries per year, which averages to just under 14 billion daily searches. Independent researchers have found similar trends. Neil Patel of NP Digital estimates that Google receives 13.7 billion searches daily, while research from SparkToro and Datos -- two digital marketing companies -- estimates that the figure is around 16.4 billion per day.
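
For scale, here is a quick back-of-the-envelope check of those figures (a minimal Python sketch, assuming Google's queries are spread evenly over a 365-day year; every input number is taken from the summary above):

google_per_year = 5e12                    # Alphabet: ~5 trillion queries per year
google_per_day = google_per_year / 365    # ~13.7 billion searches per day
chatgpt_per_day = 2.5e9                   # ChatGPT: 2.5 billion prompts per day

print(f"Google searches/day: {google_per_day / 1e9:.1f} billion")
print(f"ChatGPT prompts/day: {chatgpt_per_day / 1e9:.1f} billion")
print(f"ChatGPT share of Google's query volume: {chatgpt_per_day / google_per_day:.1%}")

On those figures, ChatGPT's prompt volume works out to roughly 18% of Google's daily search volume.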


Comments Filter:
  • 2.5 billion people looking for another job.

    • by evanh ( 627108 )

      Especially since the biggest use is probably just as a chat bot behaving similar to a search engine. So in effect directly competing with the economics of Google.

    • Of course.

      I send 50 just to make sure they pay for it.

      • > I send 50 just to make sure they pay for it.

        I have mixed feelings about this due to the environmental impact and the fact that it inflates their numbers for investments. But I kinda really want to see those investors get burned anyway.
    • by allo ( 1728082 )

      How long did it take for Amazon to make profit? How long for Facebook? I guess 2.5B queries a day is a good motivation for investors to keep investing.

      • I am not a financial analyst, but personally, I think OpenAI is overvalued and I don't see a path to positive ROI unless something significant changes. There are a lot of factors, including significant competition for talent and customers plus huge financial outlays required, and there doesn't seem to be much of a moat around their technology, which doesn't seem to be improving very quickly (just appearing in more places). OpenAI currently has something like a mindshare advantage related to a significant c
        • by allo ( 1728082 )

          I see OpenAI in the future as a part of GAFAMO. I don't like it, but I think it's hard to avoid. Maaaaybe Anthropic takes the place.

          First, inference is getting way cheaper every year (latest trend was 9-900 per year, depending on the application). The ChatGPT 3.5 Model with 175 billion parameters can be replaced with Llama 3.2 3B (3 billion parameters) now. That runs on a good smartphone.
          Second, they already started making deals with publishers. In case there will be laws about training data, they are bette

          • I think/hope the OpenAI lawsuits are just getting started, and just litigating could be a huge cost to them. OpenAI is knowingly and recklessly releasing non-deterministic technologies that generate inaccurate output, which can result in harm and increasingly exacerbates certain human mental frailties. It's also possible to jailbreak these systems - apparently just using complex phrasing - to output content that is intentionally harmful (though in my experience, grok is actually worse - fo
  • Some of the prompts are just to see if the first prompt asked the right question. Simon says. That inflates the number of prompts.

    • Indeed, I always have to tell it: again, without the em dashes or the fucking oval buttons that cannot be copied into my research.

      Each and every time!

      It got a memory file but it NEVER reads it.

      • by Krneki ( 1192201 )

        Chatgpt: I'll do whatever I want, dad.

      • without the em dashes, the fucking oval buttons that cannot be copied into my research.

        Not "research". Copy-pasting AI-slop and calling it "research".

      • by JoshuaZ ( 1134087 ) on Tuesday July 22, 2025 @07:36AM (#65536386) Homepage
        I'm not sure what "research" you are doing. But if you are copying and pasting ChatGPT output as "research" and going out of your way to disguise that it is from ChatGPT, it seems like there are multiple problems here. These systems can be useful, but they are not reliable and everything needs to be checked. And even when the output is correct, you'll understand it much better if you read it and then rewrite it in your own words, tweaking it with your own ideas and other sources of information, rather than simply copying and pasting it.
  • Honestly, I’m one of those 2.5 billion daily prompts. Last week I was trying to figure out how to get a legit discount on Bluehost hosting (most coupon sites are full of expired links or fake codes).

    I asked ChatGPT, and it pointed me to a Reddit thread where someone shared a working setup method — no coupon code, just direct steps. I ended up getting 75% off + a free domain.

    Sharing in case anyone else is in the same boat: https://www.reddit.com/r/Blueh... [reddit.com]

    Say what you want about ChatGPT, but it s

    • by ledow ( 319597 )

      And how did just googling do?

      Because the "AI" is still really nothing more than a glorified search engine wrapped in a nonsense blurb-spouter.

      • And how did just googling do?

        Because the "AI" is still really nothing more than a glorified search engine wrapped in a nonsense blurb-spouter.

        I don't use ChatGPT, but I do use search engines a lot. And I can well believe that, at least for now, the results of an LLM query are probably a lot more focused and relevant than those of a search engine query.

        Of course, it's likely that over time LLM results will end up serving ads and favouring various agendas having nothing to do with the query made. But for now, I suspect that there's more wheat and less chaff in LLM results than in search engine "slop".

    • I just tried the same question on Google. The reddit thread was the second result - the first was the link to the provider's website. And it was like 10X faster than any chatbot (and likely 10-100X cheaper in terms of machine resource usage).

    • You should start by finding a better hosting company.

    • by Calydor ( 739835 )

      Have you never used a search engine before or something? Do you think the internet pre-2022 was just people entering random letters and numbers in the address bar and praying for the best?

  • Profit (Score:4, Insightful)

    by ledow ( 319597 ) on Tuesday July 22, 2025 @08:42AM (#65536462) Homepage

    Is it profitable yet?

    Even the $200 / month tier isn't profitable, apparently.

    And they were trying to push a $10,000 "PhD-level" tier to try to make some cash.

    While you're giving stuff away, I'm sure it's popular. What happens when you start charging even cost price for it? Or when you start trying to profit from it?

    Because you only really have another year or so before investors want answers like that.

    Google was "free" for years, great, useful and very popular, but had to then basically pivot into the advertising business to survive.

    Before we redesign how we do everything and get locked into buying this stuff and making things dependent on it... it would be nice to know how you intend to fund it when investors start demanding a return.

    I mean, obviously, if it was actually real AI it would be making its own money, just like any intelligent being. Has ChatGPT got so much as a Saturday job yet?

    • Has ChatGPT got so much as a Saturday job yet?

      It's taken loads of customer service jobs. They probably don't pay it though. And they shouldn't because its customer service is pure shite and goes in circles.

      • ... its customer service is pure shite and goes in circles.

        How is this different from most human-based customer support?

    • It's going to be really funny when all of these companies who fired half their staff get the pot turned up to "we need to be profitable" pricing.

      Yes, it will probably still be cheaper than a human being, especially a western one. But OpenAI would be stupid not to charge 5 digits to replace a 6 digit office worker.

    • by allo ( 1728082 )

      Do you talk about queries or amortized training? For $200 you can execute many queries. Training is another thing, but currently it seems that amortizing it into the query price is +10% on the inference price. That of course depends on the number of queries (see article) and how long the same model is used.
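
      A minimal sketch of that amortization arithmetic (every cost figure below is invented purely for illustration; only the 2.5 billion queries per day comes from the article). The point is just that the training overhead per query shrinks as query volume and model lifetime grow:

      training_cost_usd = 100e6             # assumed one-off training cost (illustrative)
      inference_cost_per_query_usd = 0.003  # assumed marginal serving cost per query (illustrative)
      queries_per_day = 2.5e9               # from the article
      model_lifetime_days = 365             # assumed time the same model stays in service

      total_queries = queries_per_day * model_lifetime_days
      amortized_training_per_query = training_cost_usd / total_queries
      overhead = amortized_training_per_query / inference_cost_per_query_usd

      print(f"Amortized training cost per query: ${amortized_training_per_query:.6f}")
      print(f"Training as an overhead on inference cost: {overhead:.1%}")

      With these made-up numbers the overhead comes out to a few percent; the ~10% figure mentioned above would correspond to higher training costs, fewer queries, or a shorter model lifetime.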

  • by Fifth of Five ( 451664 ) on Tuesday July 22, 2025 @08:52AM (#65536476)

    If you treat it as a tool it works just fine. Yes, if you want better results you need to learn to ask better questions, but it is not difficult to figure out. As one person noted in the comments here - it is basically a glorified search engine. When I google for information I get links I then evaluate to see if they fit my issue. With ChatGPT I describe the situation, ask the question, and generally get an answer I can use, but, and this is HUGE, you need to know what you are doing in the subject you are asking about because YOU are the final arbiter of what might and might not work.

  • How many of this huge number is "how can I get a date" ??

  • So is it just that everyone knows it's pure garbage and makes shit up as it goes, but uses it anyway because it's easy and that's what everyone else is doing? Are we just going to end up with a generation unable to think or reason when this bubble bursts?
    • Are we just going to end up with a generation unable to think or reason when this bubble bursts?

      I think that ship has already sailed beyond the horizon.

  • Less than I expected, but that's still a lot of spam.

  • I may use ChatGPT, but not for any serious research, since it suffers from hallucinations, as in it just makes stuff up. ChatGPT's knowledge cuts off at September, while Grok is current up to July 22, 2025, although there is a limit of three searches per hour.
  • Yep, makes a lot of sense. ChatGPT will not fix that though, as it is programmed to tell complete idiots what they want to hear. Probably why it is so successful.
