Intel Reports Largest Quarterly Loss In Company History (cnbc.com) 61
In the company's first-quarter earnings results (PDF) on Thursday, Intel reported a 133% annual reduction in earnings per share. "Revenue dropped nearly 36% year over year to $11.7 billion," adds CNBC. From the report: In the first quarter, Intel swung to a net loss of $2.8 billion, or 66 cents per share, from a net profit of $8.1 billion, or $1.98 per share, last year. Excluding the impact of inventory restructuring, a recent change to employee stock options and other acquisition-related charges, Intel said it lost 4 cents a share, which was a narrower loss than analysts had expected. Revenue decreased to $11.7 billion from $18.4 billion a year ago.
It's the fifth consecutive quarter of falling sales for the semiconductor giant and the second consecutive quarter of losses. It's also Intel's largest quarterly loss of all time, beating out the fourth quarter of 2017, when it lost $687 million. Intel hopes that by 2026 it can manufacture chips as advanced as those made by TSMC in Taiwan and compete for custom work like Apple's A-series chips in iPhones. Intel said on Thursday it was still on track to hit that goal.
Intel's Client Computing group, which includes the chips that power the majority of desktop and laptop Windows PCs, reported $5.8 billion in revenue, down 38% on an annual basis. Intel's server chip division, under its Data Center and AI segment, suffered an even worse decline, falling 39% to $3.7 billion. Its smallest full line of business, Network and Edge, posted $1.5 billion in sales, down 30% from the same time last year. One bright spot was Mobileye, which went public last year but is still controlled by Intel. Mobileye makes systems and software for self-driving cars, and reported 16% sales growth to $458 million.
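As a quick sanity check, the headline percentages fall straight out of the per-share and revenue figures quoted above (a minimal arithmetic sketch; the inputs are just the numbers from the summary):

```python
# Reproduce the headline percentages from the figures quoted above.
eps_prior = 1.98      # Q1 2022 earnings per share (profit)
eps_now = -0.66       # Q1 2023 earnings per share (loss)
revenue_prior = 18.4  # billions of dollars
revenue_now = 11.7

eps_change = (eps_now - eps_prior) / eps_prior * 100
rev_change = (revenue_now - revenue_prior) / revenue_prior * 100
print(f"EPS change: {eps_change:.0f}%")      # -133%
print(f"Revenue change: {rev_change:.1f}%")  # -36.4%, i.e. "nearly 36%"
```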
No kidding? (Score:5, Insightful)
Who wouldn't have predicted this?
Behind on process. Over budget on energy. Stiff competition.
If anyone is surprised they haven't paid attention.
Re: (Score:3)
I do hope they stick with it. They have a history of pretty solid drivers and hardware. But at the same time I've been shying away from their gpus. Then again I also replaced my rx580 with a used GTX 1080 so I'm hardly an ideal consumer.
Re:No kidding? (Score:4, Interesting)
Graphics cards were a sideshow for Intel. They might have hoped to disrupt that duopoly, but Xe products were never (plausibly) going to yield significant revenue in this time frame. They will keep pumping out integrated graphics for the low-end market, but they need to deliver new process nodes to remain competitive there -- and those new process nodes are also what they need for their bread-and-butter products.
Re: (Score:3)
Discrete graphics are a patent minefield
Re:No kidding? (Score:4, Interesting)
If I were running Intel, other than completely running it into the ground because I suck at money, I'd be looking heavily at the emerging trends in machine learning, and observing that unlike junk like blockchains, AI/ML probably isn't going away, but it is also incredibly constrained in its ability to run on end nodes due to the limited ML core count and GPU RAM of consumer devices. So I'd start looking at how to build consumer-friendly tech for doing that. If you make it possible to run a 48 GB model on a laptop, you change the whole game. The most interesting parts of things like ChatGPT currently depend on running giant models expensively in server rooms, so the opportunity is to take that tech and make it more interesting, i.e. fully interactive IoT home or business automation that you can just talk to ("Computer, I'd like to make a Thai green curry tonight, can you work out a good recipe based on the ingredients in the kitchen and, if there's anything missing, order it for me?"), Star Trek style, without expensive subscriptions to OpenAI or whatever. Just a device that can be installed in a house to give it AI-like powers. That idea sort of exists at a cloud-like scale with the Google Assistants and Alexas, but it's tied to specific cloud vendors and remarkably stupid in ways that the big transformer GPT-style models are not.
Basically, a ML accelerator card/chipset that can actually run modern models.
Neural engine (Score:4, Interesting)
Re: (Score:2)
Right, and Apple's definitely on the right path, although their chips are far too feeble to run big models. (I've managed to get the 7B LLaMA model with 4-bit quantization running on my M1 laptop, but it kind of felt like it had been lobotomized due to the quantization and the 7B size, compared to the full 48 GB (or whatever it is) unquantized LLaMA model, which is... surprisingly smart for a GPT clone.)
I expect the llama style models to be the ones that'll end up in IoT devices for talkie-toaster [youtube.com] type appliances wil
Re:Neural engine (Score:4, Interesting)
The largest LLaMA model is 65B parameters, which (with 4-bit quantization) should fit easily into 48 GB RAM. M1 Max laptops can have 64 GB, and M2 Max goes up to 96 GB. It just takes 400 ms per token or whatever, an order of magnitude slower than the 7B model. It should be faster if mapped into the Neural Engine or GPU, but I haven't seen any definite examples of that.
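A back-of-the-envelope check of those sizes (a minimal sketch counting weights only; real runtimes such as llama.cpp add KV-cache and quantization-scale overhead on top):

```python
# Rough weight-only memory footprint for LLaMA-class models.
def weight_footprint_gb(params_billions: float, bits_per_weight: float) -> float:
    """Size of the raw weights in GB, ignoring cache and runtime overhead."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for params in (7, 13, 33, 65):
    print(f"LLaMA-{params}B: ~{weight_footprint_gb(params, 16):.0f} GB fp16, "
          f"~{weight_footprint_gb(params, 4):.1f} GB 4-bit")

# LLaMA-65B works out to ~130 GB at fp16 but ~32.5 GB at 4-bit, which is why
# it can fit (with headroom) on a 48 GB or 64 GB unified-memory laptop.
```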
Re: (Score:1)
Re:No kidding? (Score:4, Funny)
I think the real problem is the market for graphics cards dropped out before they could get to market.
This isn't the first, or second, or third time Intel has tried to enter the video card market. And they are always at the bottom. They simply have no idea what they are doing.
Re: (Score:2)
I do hope they stick with it. They have a history of pretty solid drivers and hardware. But at the same time I've been shying away from their gpus. Then again I also replaced my rx580 with a used GTX 1080 so I'm hardly an ideal consumer.
Damn I knew you were poor but...fuck...
Re: (Score:2)
So, what, a GTX 1030?
Re: (Score:2)
> the market for graphics cards dropped out
That's not the issue. Intel's GPUs [wikipedia.org] have been discussed [slashdot.org] before. This isn't Intel's first rodeo either; they have a LONG history of failed GPUs: Intel 740, Extreme Graphics, GMA, HD Graphics, Larrabee, etc.
Arc failed to make a dent because gamers went with Nvidia or AMD due to performance and mature drivers. Intel needs to invest another few years into discrete GPUs with stable D3D and Vulkan drivers to show people that they are serious. Until then, they will
Re: (Score:2)
Re: (Score:2)
Really? I mean they failed in that before. I think that was the other way round, namely Intel hoping to compensate for its screwed-up planning and bad engineering by, again, trying to be somebody in the GFX market.
Re: (Score:2)
Their plan before was to use a shitload of x86 cores to do the graphics processing, and as you doubtless know, it went pretty shit.
Maybe they have a better plan now, but they are definitely at least a generation behind, people only buy their cards for video encoding. It was a good plan for them to focus on that since they clearly couldn't win on the basis of graphics performance, but the other vendors won't let them pull that trick again so they're going to have to get their shit together in the performance
Re: (Score:2)
Their plan before was to use a shitload of x86 cores to do the graphics processing, and as you doubtless know, it went pretty shit.
All they have is a hammer, and they were shocked when they couldn't saw through that tree limb.
Re: (Score:2)
Yep. It's always fascinating to see mega-corps do really stupid things when they should know better and probably had enough people that knew. Just not people placed high enough, it seems.
Re: (Score:2)
Yes, I am aware. They essentially tried to replace their non-existent GFX skills with their mediocre CPU skills (Intel can only keep up because of in-house manufacturing and cutting corners). With predictable results. Let's see whether what they do now will turn out better in the long run.
Re: (Score:2)
You may have noticed that Alchemist didn't work out that well for them, from a technical angle. Here's hoping Battlemage is a better product.
Re: (Score:2)
I think the real problem is the market for graphics cards dropped out before they could get to market.
The problem is that our brains are apparently OK with three competitors, but we seem to prefer two. Our minds, when it comes to competition, seem to have kind of a Sith Lord thing. "Always two there are".
Back in the day it was Intel or PPC?
Windows or Mac?
Then iOS or Android?
Boeing or Airbus?
Uber or Lyft?
Since about, oh, the late 1970s or so, established industries have all been heading toward a "two for the cage-match" thing (AMC got eaten by Chrysler, then Chrysler got eaten by various foreign companies).
Re:No kidding? (Score:5, Informative)
https://www.gartner.com/en/new... [gartner.com]
The proof is that AMD's earnings have also plummeted:
https://seekingalpha.com/symbo... [seekingalpha.com]
Re: (Score:2)
Re: No kidding? (Score:2)
NVidia will be interesting. The estimates are for growth, but maybe they are still betting on crypto sales. Will AI boost them enough?
Re: (Score:2)
NVidia will be interesting. The estimates are for growth, but maybe they are still betting on crypto sales. Will AI boost them enough?
Probably. AI users want lots of RAM, so they are buying high-end parts, and there is typically more profit on higher-dollar products. People (corporations, really) are still in the process of building GPU-based clusters for this purpose, not to mention all the hobbyists.
Re: (Score:1)
Re: (Score:1)
You're also forgetting COVID.
During the whole thing, you initially had businesses shutting down and panicking.
Intel, AMD, NVIDIA. All of them dropped significant fab contracts, thinking they couldn't meet the sales expectations.
Then, once the fight-or-flight reaction wore off, it occurred to them: "Hey! Remote access is kinda a thing!"
And over the next 6-12 months, we see massive shifts in computer purchases and one of the largest spending binges in history.
People were buying everything in sight. More or l
Re: (Score:2)
Intel is not behind on process.
They measure feature size differently than AMD.
those marketing decisions (Score:1)
Market has a different outlook (Score:4, Informative)
Extended: 31.35 +1.49 (+4.99%)
This is with the news priced in.
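For what it's worth, the extended-hours quote is internally consistent (a tiny arithmetic check, nothing more):

```python
# The +1.49 move on a 31.35 extended-hours price implies a 29.86 close,
# and 1.49 / 29.86 is the quoted ~4.99% gain.
last, change = 31.35, 1.49
previous_close = last - change
print(f"previous close ~ {previous_close:.2f}")         # 29.86
print(f"percent move ~ {change / previous_close:.2%}")  # ~4.99%
```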
Thank You, AMD (Score:5, Informative)
The suits to engineers ratio at Intel got out of hand and this is what happens.
Re: (Score:1)
Re: (Score:3)
Won't that be poetic.
Re: (Score:2)
Re: (Score:2)
Apparently I kicked your arse so hard last time you've had a stroke.
Re: Thank You, AMD (Score:2, Funny)
They were hiring and promoting based on diversity. You would have to compete with your teammates for the opportunity of promotion and then rely on your team manager's arguing skills to get you actually promoted over those who were being put forward from other teams.
The motivation is to join a team who are otherwise less capable than you and with fewer minorities (read: women), and with an aggressive manager. None of that is good for the company, imo, and the result is many middle-aged white males getting pa
Re: (Score:3)
Re: (Score:2)
I should think having a lot of different viewpoints would be an asset. If you put a bunch of clones on a project, you'll guarantee a mediocre product.
Re: Thank You, AMD (Score:5, Interesting)
Indeed, but it seemed they were prioritising diversity over competence, in some situations anyway, or at least giving 'diverse' people more opportunity to succeed, by (eg) giving them second chances in interviews/etc.
Also, I'm not sure where you get 'a bunch of clones' from...that's not the alternative. Hiring/promoting based purely on competence is the key.
Their system was quite convoluted, imo, and worked directly against teamwork: performance appraisals were forced onto a normal curve within each team, so a team where everyone did excellently could still only have (eg) one person who 'exceeded', while the same was true of a team full of idiots who completely failed (not that I knew of any such people, you understand). Ditto for people who had to improve their performance. (There's a rough sketch of that forced curve after this comment.)
In layoffs, they let people go who hadn't been promoted in (eg) 2 years, irrespective of what managers said.
I seem to recall it being named 'stack ranking' or something close to that. I forget the details now...it's been a long time, and I'm sure they've changed their system.
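A minimal sketch of the forced-curve appraisal described above (the quotas, names, and scores here are made up purely for illustration, not Intel's actual process):

```python
# Forced-curve ("stack ranking") appraisal: ratings are assigned by rank
# within each team, regardless of how good the team is in absolute terms.
def force_rank(team_scores: dict, exceed_quota: int = 1, improve_quota: int = 1) -> dict:
    ordered = sorted(team_scores, key=team_scores.get, reverse=True)
    ratings = {}
    for i, person in enumerate(ordered):
        if i < exceed_quota:
            ratings[person] = "exceeds"
        elif i >= len(ordered) - improve_quota:
            ratings[person] = "improvement required"
        else:
            ratings[person] = "meets"
    return ratings

strong_team = {"A": 95, "B": 93, "C": 92, "D": 91}  # everyone did excellent work
weak_team = {"E": 40, "F": 35, "G": 30, "H": 20}    # everyone struggled

print(force_rank(strong_team))  # still only one "exceeds", one person flagged
print(force_rank(weak_team))    # identical distribution despite poor results
```

Either way, the distribution of ratings is fixed in advance, which is the complaint being made.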
Re: (Score:2)
No. Hiring based on pure competence is key to success for jobs requiring individuals to complete assigned, predetermined tasks. There is ample evidence that diverse teams produce better outcomes for problem solving and complex tasks, or any task requiring creativity. Having just a bunch of single-minded experts promotes groupthink.
In any case, none of this has anything to do with Intel's current result. There's been an industry-wide crash. If your talking point is about diversity, it says more about you than anyone
Re: Thank You, AMD (Score:1)
You're just describing competence.
Re: (Score:1)
Re: (Score:2)
Intel's performance has been horrible, especially in the data centre group (DCG); however, the market as a whole for semiconductor firms is also particularly awful right now (see: Samsung). Everyone is getting beaten up.
In a stronger market, Intel would really feel the pain since AMD would be picking off their customers due to Sapphire Rapids being such a terrible product. If Intel can remain solvent (big if!) they at least have a chance to deliver a few product improvements before it's too late.
Re: (Score:2)
Some of these tech companies probably need to "lose jobs".
Re: (Score:2)
Money sent to Ukraine supports the US tech industry. Money buys arms. Arms use US tech products. The arms that are bought are the ones that are actually needed, not the ones that some diversity department decided are better for the climate.
Re: (Score:1)
Your kind seem to think "all together now" is freedom.
Andy Grove (Score:2)
Intel died with Andy Grove.
Intel is infected (Score:2)
I think 2 reasons
https://en.wikipedia.org/wiki/... [wikipedia.org]
https://en.wikipedia.org/wiki/... [wikipedia.org]