Intel Businesses The Almighty Buck

Intel Reports Largest Quarterly Loss In Company History (cnbc.com) 61

In the company's first-quarter earnings results (PDF) on Wednesday, Intel reported a 133% annual reduction in earnings per share. "Revenue dropped nearly 36% year over year to $11.7 billion," adds CNBC. From the report: In the first quarter, Intel swung to a net loss of $2.8 billion, or 66 cents per share, from a net profit of $8.1 billion, or $1.98 per share, last year. Excluding the impact of inventory restructuring, a recent change to employee stock options and other acquisition-related charges, Intel said it lost 4 cents a share, which was a narrower loss than analysts had expected. Revenue decreased to $11.7 billion from $18.4 billion a year ago.
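The eye-catching 133% figure is just the year-over-year swing in GAAP earnings per share, from +$1.98 to -$0.66. A quick sanity check of the headline percentages, using only the numbers quoted above:

```python
# Sanity check of CNBC's headline percentages, from the figures in the summary.
eps_prev, eps_now = 1.98, -0.66        # GAAP EPS: Q1 2022 vs. Q1 2023
rev_prev, rev_now = 18.4, 11.7         # revenue, billions of dollars

print(f"EPS change: {(eps_now - eps_prev) / eps_prev:+.0%}")      # -133%
print(f"Revenue change: {(rev_now - rev_prev) / rev_prev:+.0%}")  # -36%
```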

It's the fifth consecutive quarter of falling sales for the semiconductor giant and the second consecutive quarter of losses. It's also Intel's largest quarterly loss of all time, beating out the fourth quarter of 2017, when it lost $687 million. Intel hopes that by 2026 it can manufacture chips as advanced as those made by TSMC in Taiwan and compete for custom work like Apple's A-series chips in iPhones. Intel said on Thursday it was still on track to hit that goal.

Intel's Client Computing group, which includes the chips that power the majority of desktop and laptop Windows PCs, reported $5.8 billion in revenue, down 38% on an annual basis. Intel's server chip division, under its Data Center and AI segment, suffered an even worse decline, falling 39% to $3.7 billion. Its smallest full line of business, Network and Edge, posted $1.5 billion in sales, down 30% from the same time last year. One bright spot was Mobileye, which went public last year but is still controlled by Intel. Mobileye makes systems and software for self-driving cars, and reported 16% sales growth to $458 million.


Comments Filter:
  • No kidding? (Score:5, Insightful)

    by beheaderaswp ( 549877 ) * on Thursday April 27, 2023 @08:47PM (#63482170)

    Who wouldn't have predicted this?

    Behind on process. Over budget on energy. Stiff competition.

    If anyone is surprised, they haven't been paying attention.

    • I think the real problem is that the market for graphics cards dropped out before they could get to market. There's still plenty of money to be made there, but they're not going to get a bonanza that instantly pays back their investment, like they could have if they'd been three or four years sooner to the party.

      I do hope they stick with it. They have a history of pretty solid drivers and hardware. But at the same time I've been shying away from their GPUs. Then again I also replaced my rx580 with a used GTX 1080, so I'm hardly an ideal consumer.
      • Re:No kidding? (Score:4, Interesting)

        by Entrope ( 68843 ) on Thursday April 27, 2023 @09:58PM (#63482242) Homepage

        Graphics cards were a sideshow for Intel. They might have hoped to disrupt that duopoly, but Xe products were never (plausibly) going to yield significant revenue in this time frame. They will keep pumping out integrated graphics for the low-end market, but they need to deliver new process nodes to remain competitive there -- and those new process nodes are also what they need for their bread-and-butter products.

        • by dknj ( 441802 )

          Discrete graphics are a patent minefield

        • Re:No kidding? (Score:4, Interesting)

          by sg_oneill ( 159032 ) on Thursday April 27, 2023 @10:44PM (#63482298)

          If I were running Intel, other than completely running it into the ground because I suck at money, I'd be looking heavily at the emerging trends in machine learning, and observing that, unlike junk like blockchains, AI/ML probably isn't going away, but it is also incredibly constrained in its ability to run on end nodes due to the ML core counts and GPU RAM of consumer devices. So I'd start looking at how to build a consumer-friendly tech for doing that. If you make it possible to run a 48GB model on a laptop, you change the whole game, because the most interesting parts of things like ChatGPT are currently very dependent on running giant models expensively in server rooms.

          That means actually taking that tech and making it more interesting, i.e. fully interactive IoT home or business automation that you can just talk to ("Computer, I'd like to make a Thai green curry tonight, can you work out a good recipe based on the ingredients in the kitchen, and if there's anything missing, order it for me"), Star Trek style, without expensive subscriptions to OpenAI or whatever. Just a device that can be installed in a house to give it AI-like powers. That idea sort of exists at cloud scale with the Google Assistants and Alexas, but it's tied to specific cloud vendors and remarkably stupid in ways that the big transformer GPT-style models are not.

          Basically, an ML accelerator card/chipset that can actually run modern models.
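          For what it's worth, a crude software-only version of this already exists: llama.cpp and its Python bindings run quantized models on ordinary laptops. A minimal sketch, assuming llama-cpp-python is installed; the model path is a placeholder, and the prompt just riffs on the parent's curry example:

          ```python
          # Local inference with llama-cpp-python (pip install llama-cpp-python).
          # The model path is a placeholder -- any 4-bit-quantized LLaMA-family
          # model file will do.
          from llama_cpp import Llama

          llm = Llama(model_path="models/llama-7b-q4_0.gguf", n_ctx=2048)

          out = llm(
              "Q: Suggest a Thai green curry recipe using chicken, coconut milk "
              "and basil. A:",
              max_tokens=200,
              stop=["Q:"],
          )
          print(out["choices"][0]["text"])  # completion text, generated locally
          ```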

          • Neural engine (Score:4, Interesting)

            by sbszine ( 633428 ) on Friday April 28, 2023 @02:30AM (#63482504) Journal
            This is what the Apple neural engine cores in their M1 / M2 SoCs are aiming at. It's early days yet, but it's the right track for this sort of consumer ML. You can imagine a future Mac Pro or high-end Mac Studio starting to play in this space.
            • Right, and Apple's definitely on the right path, although they are far too feeble to run big models. I've managed to get the 7B LLaMA model with 4-bit quantization running on my M1 laptop, but it kind of felt like it had been lobotomized due to the quantization and the 7B size, compared to the full 48GB (or whatever it is) unquantized LLaMA model, which is... surprisingly smart for a GPT clone.

              I expect the LLaMA-style models to be the ones that'll end up in IoT devices for talkie-toaster [youtube.com] type appliances wil

              • Re:Neural engine (Score:4, Interesting)

                by Entrope ( 68843 ) on Friday April 28, 2023 @06:51AM (#63482688) Homepage

                The largest LLaMA model is 65B parameters, which (with 4-bit quantization) should fit easily into 48 GB RAM. M1 Max laptops can have 64 GB, and M2 Max goes up to 96 GB. It just takes 400 ms per token or whatever, an order of magnitude slower than the 7B model. It should be faster if mapped into the Neural Engine or GPU, but I haven't seen any definite examples of that.
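                Back-of-the-envelope arithmetic supports this. A small script using the published LLaMA parameter counts, counting the weights only and ignoring the KV cache and quantization group metadata (which add a few GiB more):

                ```python
                # Weights-only memory footprint of the LLaMA family at 4-bit quantization.
                GIB = 1024 ** 3

                def weights_gib(params_billions: float, bits: int = 4) -> float:
                    """GiB needed for the raw weights alone."""
                    return params_billions * 1e9 * bits / 8 / GIB

                for size in (7, 13, 33, 65):
                    print(f"LLaMA-{size}B @ 4-bit: ~{weights_gib(size):.1f} GiB")
                # LLaMA-65B @ 4-bit: ~30.3 GiB -- under 48 GB with room to spare,
                # matching the parent's claim.
                ```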

              • by r0nc0 ( 566295 )
                I can't speak for the Apple Silicon team, but the focus is not on chips for building and computing the models themselves; it's on operating them completely independent of any backing server. All of Apple's features on the Watch and iPhone, for example, run locally -- no data is ever sent to Apple for those features, because Apple doesn't maintain server infrastructure to support them. By features I mean things like the health features on the watch, workout classifications/detection, etc. -- I have no idea what happens
      • by TheRealMindChild ( 743925 ) on Thursday April 27, 2023 @10:22PM (#63482276) Homepage Journal

        I think the real problem is the market for graphics cards dropped out before they could get to market
         
        This isn't the first, or second, or third time Intel has tried to enter the video card market. And they are always at the bottom. They simply have no idea what they are doing.

      • I do hope they stick with it. They have a history of pretty solid drivers and hardware. But at the same time I've been shying away from their gpus. Then again I also replaced my rx580 with a used GTX 1080 so I'm hardly an ideal consumer.

        Damn I knew you were poor but...fuck...

      • > the market for graphics cards dropped out

        That's not the issue. Intel's GPUs [wikipedia.org] have been discussed [slashdot.org] before. This isn't Intel's first rodeo either; they have a LONG history of failed GPUs: Intel 740, Extreme Graphics, GMA, HD Graphics, Larrabee, etc.

        Arc failed to make a dent because gamers went with Nvidia or AMD due to performance and mature drivers. Intel needs to invest another few years into discrete GPUs with stable D3D and Vulkan drivers to show people that they are serious. Until then, they will

      • Intel has had crap graphics devices since forever.
      • by gweihir ( 88907 )

        Really? I mean, they failed at that before. I think it was the other way round, namely Intel hoping to compensate for its screwed-up planning and bad engineering by, again, trying to be somebody in the GFX market.

        • Their plan before was to use a shitload of x86 cores to do the graphics processing, and as you doubtless know, it went pretty shit.

          Maybe they have a better plan now, but they are definitely at least a generation behind; people only buy their cards for video encoding. It was a good plan for them to focus on that, since they clearly couldn't win on the basis of graphics performance, but the other vendors won't let them pull that trick again, so they're going to have to get their shit together in the performance

          • by Entrope ( 68843 )

            Their plan before was to use a shitload of x86 cores to do the graphics processing, and as you doubtless know, it went pretty shit.

            All they have is a hammer, and they were shocked when they couldn't saw through that tree limb.

            • by gweihir ( 88907 )

              Yep. It is always fascinating to see mega-corps do really stupid things when they should know better and probably had enough people who knew. Just not people placed high enough, as it seems.

          • by gweihir ( 88907 )

            Yes, I am aware. They essentially tried to replace their non-existent GFX skills with their mediocre CPU skills (Intel can only keep up because of in-house manufacturing and cutting corners). With predictable results. Let's see whether what they do now will turn out better in the long run.

      • You may have noticed that Alchemist didn't work out that well for them, from a technical angle. Here's hoping Battlemage is a better product.

      • I think the real problem is the market for graphics cards dropped out before they could get to market.

        The problem is that our brains are apparently OK with three competitors, but we seem to prefer two. Our minds, when it comes to competition, seem to have kind of a Sith Lord thing. "Always two there are".

        Back in the day it was Intel or PPC?
        Windows or Mac?
        Then iOS or Android?
        Boeing or Airbus?
        Uber or Lyft?

        Since about, oh, the late 1970s or so, established industries have all been heading toward a "two for the cage-match" thing (AMC got eaten by Chrysler, then Chrysler got eaten by various foreign companies).

    • Re:No kidding? (Score:5, Informative)

      by timeOday ( 582209 ) on Thursday April 27, 2023 @11:24PM (#63482330)
      Intel's weakness isn't helping, but more than that it's a Covid hangover: "Gartner Says Worldwide PC Shipments Declined 30% in First Quarter of 2023."

      https://www.gartner.com/en/new... [gartner.com]

      The proof is that AMD's earnings have also plummeted:

      https://seekingalpha.com/symbo... [seekingalpha.com]

      • by sbszine ( 633428 )
        Ahh, so the idea is that everyone who was in the market for a new PC bought one during the lockdowns and the market is saturated. I would guess inflation is hitting them too.
      • NVidia will be interesting. The estimates are for growth, but maybe they are still betting on crypto sales. Will AI boost them enough?

        • NVidia will be interesting. The estimates are for growth, but maybe they are still betting on crypto sales. Will AI boost them enough?

          Probably. AI users want lots of RAM so they are buying high-end parts, and there is typically more profit on higher-dollar products. People (corporations, really) are still in the process of building GPU-based clusters for this purpose, let alone all the hobbyists.

        • by r0nc0 ( 566295 )
          After all the news about how Microsoft had to scramble to secure enough Nvidia GPUs to build its LLM models, it seems hard to imagine that Nvidia would be hurting in the same way as Intel, but I have no idea how much of that market makes up Nvidia's revenues.
    • by Chas ( 5144 )

      You're also forgetting COVID.

      During the whole thing, you initially had businesses shutting down and panicking.
      Intel, AMD, NVIDIA. All of them dropped significant fab contracts, thinking they couldn't meet the sales expectations.

      Then, once the fight-or-flight reaction wore off, it occurred to them: "Hey! Remote access is kinda a thing!"

      And over the next 6-12 months, we saw massive shifts in computer purchases and one of the largest spending binges in history.

      People were buying everything in sight. More or l

    • Intel is not behind on process.

      They measure feature size differently than AMD.

  • Intel CEO tells reporters he regrets Intel's Bud Light campaign and in future will stick to CPUs.
  • by NFN_NLN ( 633283 ) on Thursday April 27, 2023 @08:53PM (#63482178)

    Extended: 31.35 +1.49 (+4.99%)
    This is with the news priced in.

  • Thank You, AMD (Score:5, Informative)

    by zenlessyank ( 748553 ) on Thursday April 27, 2023 @09:08PM (#63482198)

    The suits-to-engineers ratio at Intel got out of hand, and this is what happens.

    • Too big to fail though, so it's not that important if they make a profit. If things get bad, taxpayers will bail them out.
    • They were hiring and promoting based on diversity. You would have to compete with your teammates for the opportunity of promotion, and then rely on your team manager's arguing skills to actually get you promoted over those being put forward from other teams.
      The motivation is to join a team that is otherwise less capable than you, with fewer minorities (read: women), and with an aggressive manager. None of that is good for the company, imo, and the result is many middle aged white males getting pa

      • Intel wishes they had a Lisa Su.
      • by gtall ( 79522 )

        I should think having a lot of different viewpoints would be an asset. If you put a bunch of clones on a project, you'll guarantee a mediocre product.
         

        • Re: Thank You, AMD (Score:5, Interesting)

          by dwater ( 72834 ) on Friday April 28, 2023 @06:27AM (#63482664)

          Indeed, but it seemed they were prioritising diversity over competence, in some situations anyway, or at least giving 'diverse' people more opportunity to succeed by (e.g.) giving them second chances in interviews, etc.
          Also, I'm not sure where you get 'a bunch of clones' from... that's not the alternative. Hiring/promoting based purely on competence is the key.
          Their system was quite obtuse, imo, and worked directly against teamwork: performance appraisals were fitted to a normal curve within each team, so teams that all did excellently could still only have (e.g.) one person who 'exceeded', while the same was true of a team full of idiots who completely failed (not that I knew of any such people, you understand). Ditto for people who had to improve their performance.
          In layoffs, they let go people who hadn't been promoted in (e.g.) two years, irrespective of what managers said.
          I seem to recall it being named 'stack ranking' or something close to that. I forget the details now... it's been a long time, and I'm sure they've changed their system.
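          To make that appraisal mechanism concrete, here is a minimal sketch of a per-team forced-distribution pass as the poster describes it; the bucket names and quotas are invented for illustration, not Intel's actual ones:

          ```python
          # Hypothetical per-team forced-distribution ("stack ranking") appraisal.
          # Ratings are fitted to a curve *within each team*, so a strong team and
          # a weak team each get the same quota of top and bottom marks.
          def stack_rank(scores: dict[str, float],
                         quotas=(("exceeds", 0.2), ("meets", 0.6), ("improve", 0.2))):
              ranked = sorted(scores, key=scores.get, reverse=True)
              ratings, start = {}, 0
              for label, frac in quotas:
                  count = max(1, round(frac * len(ranked)))
                  for name in ranked[start:start + count]:
                      ratings[name] = label
                  start += count
              return ratings

          # Even a uniformly strong team is forced onto the curve: someone
          # scoring 89 still lands in the bottom bucket.
          strong_team = {"ann": 95, "bob": 93, "cy": 91, "dee": 90, "ed": 89}
          print(stack_rank(strong_team))  # 'ed' gets 'improve' despite an 89
          ```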

          • No. Hiring based on pure competence is key to success for jobs requiring individuals to complete assigned, predetermined tasks. There is ample evidence that diverse teams produce better outcomes for problem solving and complex tasks, or any task requiring creativity. Having just a bunch of single-minded experts promotes groupthink.

            In any case, none of this has anything to do with Intel's current results. There's been an industry-wide crash. If your talking point is about diversity, it says more about you than anyone

    • Intel's performance has been horrible, especially in the data centre group (DCG); however, the market as a whole for semiconductor firms is also particularly awful right now (see: Samsung). Everyone is getting beaten up.

      In a stronger market, Intel would really feel the pain, since AMD would be picking off their customers due to Sapphire Rapids being such a terrible product. If Intel can remain solvent (big if!), they at least have a chance to deliver a few product improvements before it's too late.

  • Intel died with Andy Grove.
