Suddenly, what Intel's CEO means by new products needing to deliver 50% gross profit [1], and by it being too late to catch up with AI [2], is starting to become clearer.
The CEO of a silicon company saying his business is "too late for AI" is a CEO either without vision or without guts: an accountant, the safe option. If it's anywhere close to true, Intel is looking to sell itself for parts.
[1]: https://www.tomshardware.com/tech-industry/semiconductors/in...
[2]: https://www.tomshardware.com/tech-industry/intel-ceo-says-it...
BeetleB 5 hours ago [-]
There's a whole backstory to this.
When he joined only a few months ago, he set the vision of making Intel a worthy participant in the AI space.
Then just a few months later, he announced "we cannot compete".
What happened in the middle? Recent articles came out about the conflict between him and Frank Yeary, the head of the Intel board. He wanted to acquire a hot AI startup, and Frank opposed it. Two factions formed on the board, and they lost a lot of time battling it out. While this was going on, a FAANG came in and bought the startup.
I think his announcement that Intel cannot compete was his way of saying "I cannot do it with the current Intel board."
horsawlarway 5 hours ago [-]
Feels like a fair statement.
My read is basically that Intel's board is frustrated they can't part the company out, take their cash, and go home.
I'd also be incredibly frustrated working with a board that seems deadset on actively sabotaging the company for short term gains.
Scramblejams 5 hours ago [-]
What startup was it?
BeetleB 3 hours ago [-]
The news articles didn't name it.
downrightmike 4 hours ago [-]
What happened is they fired tens of thousands of their workforce. They knee-capped themselves. Having people with experience and institutional knowledge is required. You can't just toss that out the window and expect things to work.
BeetleB 3 hours ago [-]
By any measure, Intel was bloated. All their competitors are doing a lot more with a lot fewer people.
Now whether they fired the right tens of thousands is another matter.
snerbles 3 hours ago [-]
> All their competitors are doing a lot more with a lot fewer people.
According to Wikipedia, for FY25:
Intel: 102,000 employees
AMD: 28,000 employees
Nvidia: 36,000 employees
I'm pretty sure the latter two have been growing headcount lately, and even then combined they still have fewer employees than Intel.
AnotherGoodName 3 hours ago [-]
You need to add TSMC's employees to the AMD and Nvidia counts, since Intel is a foundry. Adding ~82k puts it around equal.
Now Intel isn't doing as well on the chip design side as AMD or Nvidia, nor as well on the fab side as TSMC, but I suspect that's down to leadership thrashing constantly more than anything else.
BeetleB 2 hours ago [-]
Yes, but TSMC does a ton more volume than Intel's fabs.
tester756 2 hours ago [-]
Intel (~100k) should be compared with AMD (28k) + TSMC (84k).
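For reference, the arithmetic behind these comparisons, using the rough headcounts quoted above (counting all of TSMC against a single fabless customer is generous, since TSMC serves many customers):

    # Rough comparison using the approximate headcounts quoted in this thread;
    # these figures are not authoritative and shift over time.
    headcount = {
        "Intel (design + fabs)": 102_000,
        "AMD (fabless)": 28_000,
        "Nvidia (fabless)": 36_000,
        "TSMC (foundry)": 84_000,
    }

    amd_plus_tsmc = headcount["AMD (fabless)"] + headcount["TSMC (foundry)"]
    nvidia_plus_tsmc = headcount["Nvidia (fabless)"] + headcount["TSMC (foundry)"]

    print(f"Intel:         {headcount['Intel (design + fabs)']:>8,}")
    print(f"AMD + TSMC:    {amd_plus_tsmc:>8,}")    # ~112,000
    print(f"Nvidia + TSMC: {nvidia_plus_tsmc:>8,}")  # ~120,000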
HPsquared 3 hours ago [-]
Both statements can be true.
panick21_ 48 minutes ago [-]
Intel has bought and destroyed a large number of startups already. Not sure that is the solution.
tester756 5 hours ago [-]
The real quote is:
>"On training, I think it is too late for us,"
Not too late for AI, but too late for training; meanwhile there's an inference opportunity, or something like that.
freedomben 6 hours ago [-]
Agreed, either their business situation is far more critical than we know, this is a gross indictment of their R&D, or this is malpractice on the part of the leadership
aDyslecticCrow 5 hours ago [-]
Too late for the AI boom if they have to spend another 2 years and more manufacturing investment to get a product out for the segment. We are over-inflated on AI hype. Its relevance will remain, but betting a company on it isn't a wise idea.
checker659 5 hours ago [-]
> of a silicon company
With their own fabs, at that
h2zizzle 5 hours ago [-]
Or, a sly way of calling the AI bubble.
moralestapia 5 hours ago [-]
Well, but if it's true and there's a better strategy, why wouldn't he do it?
Seems like you'd prefer yet another +1 selling AI snake oil and promises ...
phkahler 5 hours ago [-]
Cutting products that don't have 50 percent margins seems like a bad choice when their goal should be filling their advanced fabs. Keeping that investment at or near capacity should be their goal. They said they'd have to cancel one node if the foundry business couldn't get enough customers, and yet they're willing to cut their own product line? Sure they need to make a profit, but IMHO they should be after volume at this point.
KronisLV 5 hours ago [-]
Even the Arc B580 GPUs could have been a bigger win if they were actually ever in stock at MSRP; I say that as someone who owns one in my daily driver PC. Yet it seemed oddly close to a paper launch, or nowhere near the demand, to the point where the prices were so far above MSRP that it made the value really bad.
Same as how they messed up the Core Ultra desktop launch, of their own volition - by setting the prices so high that they can't even compete with their own 13th and 14th gen chips, not even mentioning Ryzen CPUs that are mostly better both in absolute terms and in price/perf. A sidegrade isn't the end of the world, but a badly overpriced sidegrade is dead on arrival.
Idk what Intel is doing.
FirmwareBurner 4 hours ago [-]
Where do you live? Because Intel Arc cards ARE available at MSRP almost everywhere in western nations. It really isn't a paper launch.
AnotherGoodName 2 hours ago [-]
To be fair to the parent they absolutely were a paper launch. Looks like supply has finally caught up now but we’re also a long way from launch.
AnthonyMouse 2 hours ago [-]
> Cutting products that don't have 50 percent margins seems like a bad choice when their goal should be filling their advanced fabs.
It seems like a bad choice at all times. A product with a 45% margin -- or a 5% margin -- is profitable. It's only the products with negative (net) margins that should be cut.
And meanwhile products with lower margins are often critical to achieving economies of scale and network effects. It's essentially the thing that put Intel in its current straits -- they didn't want to sell low-margin chips for phones and embedded devices which gave that market to ARM and TSMC and funded the competition.
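As a toy sketch of that contribution-margin argument (every number below is invented for illustration, not from Intel's actual financials): a 5% margin product still contributes gross profit toward fixed costs like fab depreciation, so cutting it only helps if the freed capacity can be refilled with something better.

    # Toy contribution-margin sketch; every figure here is made up.
    def contribution(units, price, unit_cost):
        """Gross profit a product line contributes toward fixed costs."""
        return units * (price - unit_cost)

    fixed_costs = 550_000.0  # hypothetical fab depreciation + overhead

    high_margin = contribution(units=10_000, price=100.0, unit_cost=50.0)  # 50% margin
    low_margin = contribution(units=40_000, price=50.0, unit_cost=47.5)    # 5% margin

    print(f"50%-margin line contributes: ${high_margin:,.0f}")  # $500,000
    print(f" 5%-margin line contributes: ${low_margin:,.0f}")   # $100,000
    print(f"Both lines vs fixed costs:   ${high_margin + low_margin - fixed_costs:,.0f}")
    print(f"High-margin line only:       ${high_margin - fixed_costs:,.0f}")
    # With these made-up numbers, dropping the 5% line flips the operation from
    # a small profit to a loss, because the fab's fixed costs don't shrink.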
panick21_ 42 minutes ago [-]
Well, often you cut a 5% margin product because you should focus your people and your capability on growing your 50% products. Sure, if the 5% products are well established, keep selling them, but usually in tech you need to continue to invest in the 5% product to keep it up to date.
Intel did this for memory in the 80s. Memory was still profitable, and could be more so again (see Micron), but it required much investment.
But Intel might not be in this position, and filling the fabs by itself can definitely be worth it.
But if you don't have the capacity in the new fab, maybe that isn't an issue, so it's hard to say from the outside.
ndiddy 3 hours ago [-]
Intel's Arc GPUs are fabricated by TSMC, not Intel.
tester756 5 hours ago [-]
The real quote is:
>"On training, I think it is too late for us,"
Not too late for AI, but too late for training; meanwhile there's an inference opportunity, or something like that.
risho 5 hours ago [-]
I will note that their source appears to be Moore's Law Is Dead, which is a speculative YouTube channel that has a long history of being wrong about the death of Arc. The dude has been predicting the imminent death of Arc since the first one released years ago. It wouldn't surprise me if this did lead to the death of Arc, but it certainly isn't because this moron predicted it.
nomel 2 hours ago [-]
> because this moron predicted it.
I don't think that's fair, at all, for two reasons:
My whole career has been in HW. In the few startups I was in, where I was privy to the higher decisions, the executives themselves were predicting whether hardware would continue next year. It's normal for things to flip-flop and decisions to change, especially when a product isn't competitive and you don't have the resources to keep it that way. A leaker can be 100% correct in what they say the current decisions are, and 100% wrong the next week. I've seen it happen, with whole teams fired because they weren't needed anymore.
He makes it very clear when he's speculating and when he's relaying what he's been told, and his confidence for them all (some leakers are proven trustworthy, some are not). This alone is why he doesn't deserve to be called a moron. Morons can't comprehend or communicate uncertainty.
immibis 3 hours ago [-]
It is because the moron predicted it, if the CEO read the prediction and killed Arc.
chao- 6 hours ago [-]
It is very hard to put any belief in the rumor mill surrounding Intel's discrete desktop GPUs. Already this year, there have been at least three "leaks" saying "It's canceled!", and every time, a counter-rumor comes about saying "It isn't canceled!"
In all accounts I have seen, their single SKU from this second generation consumer lineup has been well-received. Yet the article says "what can only be categorized as a shaky and often rudderless business", without any justification.
Yes, it is worth pondering what the Nvidia investment means for Intel Arc Graphics, but "rudderless"? Really?
belval 5 hours ago [-]
Honestly, the rumor mill surrounding Intel is actually very similar to AMD in 2015-2016, pre-Zen (not saying that they will see the same outcome). I swear I saw the same "the x86 license is not transferable, [other company] might sue them" 9 years ago, or "Product Y will be discontinued".
When it comes to GPUs, a $4T company probably couldn't care less what their $150B partner does in their spare time, as long as they prioritize the partnership. Especially when the GPUs in question are low-end units, in a segment where Nvidia has no competition and isn't even shipping that many. If they actually asked them to kill it, it would be 100% out of pettiness.
Sometimes I wonder if these articles are written for clicks and these "leakers" are actually just the authors making stuff up and getting it right from time to time.
chao- 5 hours ago [-]
From a corporate strategy perspective, cancel Arc or keep Arc, I can see it both ways.
Intel has so many other GPU-adjacent products and they will doubtless be continuing most of them, even if they don't pursue Arc further: Jaguar Shores, Flex GPUs for VDI, and of course their Xe integrated graphics. I could possibly see Intel not ship a successor to Flex? Maybe? I cannot see a world where they abandon Xe (first-party laptop graphics) or Jaguar Shores ("rack scale" datacenter "GPUs").
With all of that effort going into GPU-ish designs, is there enough overlap that the output/artifacts from those products support and benefit Arc? Or if Arc only continues to be a mid-tier success, is it thus a waste of fab allocation, a loss of potential profit, and an unnecessary expense in terms of engineers maintaining drivers, and so forth? That is the part I do not know, and why I could see it going either way.
I want to acknowledge that I am speaking out of my depth a bit: I have not read all of Intel's quarterly financials, and have not followed every zig and zag of every product line. Yet while I can see it both ways, in no world do I trust these supposed leaks.
AnthonyMouse 2 hours ago [-]
GPUs are parallel compute engines. The difference between a high performance CPU core design from Intel/AMD/Apple and the low end stuff is a bunch of fancy branch prediction, out of order execution, cache hierarchies, etc. all designed to improve single-thread performance. The primary difference between a large GPU and a small GPU is that a large GPU has more cores.
Sufficiently far in the past you might have been able to get away with an integrated GPU that didn't even have meaningful 3D acceleration etc., but those days are gone. Even web browsers are leaning on the GPU to render content, which matters for iGPUs for battery life, which makes performance per watt the name of the game. And that's the same thing that matters most for large GPUs because the constraint on performance is power and thermals.
Which is to say, if you're already doing the work to make a competitive iGPU, you've done most of the work to make a competitive discrete GPU.
The thing Intel really needs to do is to get the power efficiency of their own process on par with TSMC. Without that they're dead; they can't even fab their own GPUs.
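A crude toy model of that "same design, different scale" point, with invented per-core figures (not real Intel or Nvidia specs): if a product line is roughly N copies of the same core, both throughput and power scale close to linearly with N, so performance per watt is set by the core and process design that the iGPU and dGPU share, not by the product tier.

    # Toy scaling model; per-core figures are invented, not real GPU specs.
    def gpu_tier(cores, tflops_per_core=0.5, watts_per_core=6.0, overhead_w=15.0):
        throughput = cores * tflops_per_core          # TFLOP/s
        power = cores * watts_per_core + overhead_w   # watts, plus fixed overhead
        return throughput, power

    for name, cores in [("iGPU-class", 8), ("mid dGPU", 20), ("big dGPU", 64)]:
        tflops, watts = gpu_tier(cores)
        print(f"{name:10s}: {tflops:5.1f} TFLOP/s, {watts:5.0f} W, "
              f"{1000 * tflops / watts:5.0f} GFLOP/s per W")
    # Perf/W converges to tflops_per_core / watts_per_core as the fixed overhead
    # is amortized -- the efficiency ceiling is the core + process, not the SKU.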
belval 4 hours ago [-]
> From a corporate strategy perspective, cancel Arc or keep Arc, I can see it both ways.
Me too, I just really really doubt that it would come from Nvidia
cubefox 5 hours ago [-]
Yeah that is bizarre. They have been very focused and even managed to upstage AMD by several years in the ML acceleration department (XeSS).
jeffbee 6 hours ago [-]
There has never been any information conveyed by the "Moore's Law Is Dead" account. If you want to know whether Intel has cancelled their next dGPU, you might as well ask a coin.
TiredOfLife 6 hours ago [-]
Source is "Moore's law is dead" youtuber. A coin toss is more reliable than him.
gregbot 6 hours ago [-]
Really? I've been following him for years and he has always been 100% accurate. What has he been wrong about?
dralley 6 hours ago [-]
I agree that he's not that bad, but he's definitely not 100% accurate, in particular with respect to Intel.
Notably this is about the 3rd time in 2 years that he's reported that the Intel dGPU efforts are being killed off.
Even on the latest developments the reporting is contradictory, so someone is wrong and I suspect it's him. https://www.techpowerup.com/341149/intel-arc-gpus-remain-in-...
So far everything he said in that video has happened, and he did not say that Intel would never release another dGPU, just that it would be a token release, which is absolutely exactly what has happened.
dralley 2 hours ago [-]
IIRC the "token release" claim was more of a cleanup after the first and strongest version of his "reporting" didn't really pan out. IIRC he also said that there would be no gen 3 dGPU release after the "token" gen 2 dGPU release and that doesn't seem accurate if the gen 3 dGPU efforts still exist 2 years later, in order to be cancelled now.
nodja 5 hours ago [-]
They had videos saying Intel was gonna cancel the dGPU division and focus on datacenter pretty much since the Intel cards came out, amongst many other things they've said. I used to follow them too, but they speak with too much confidence about things they know nothing about.
They're a channel focused on leaks, but most of their leaks are just industry insider gossip masked as factual to farm clicks. Their leaks are useless for any sort of predictions, but may be interesting if you'd like to know what insiders are thinking.
A quick google search also yielded this[1] 2-year old reddit thread that shows videos they deleted because their predictions were incorrect. There's probably many more. (That subreddit seems to be dedicated to trashing MLID.)
[1] https://www.reddit.com/r/BustedSilicon/comments/yo9l2i/colle...
Instead of invectives could you just say what specific leak of his was inaccurate? Everything he said about intel dGPU has happened exactly as he said it would. Have you watched his video about that yourself?
nodja 3 hours ago [-]
Did you not look at the link I provided?
I stopped watching him completely around the time of the intel dGPU release. He would show leaked roadmaps of intel's dGPU launch with Celestial and Druid on there, but the video would be him basically repeating the narrative that the division is on the verge of cancellation and has no future, etc. The documents he has leaked almost never match the titles and narratives he pushes. He's not always wrong, but his biases are clear and the titles are more often misleading clickbait than factual.
Tom Petersen (Lead Intel GPU Engineer) has shown up in multiple interviews with LTT and GN, among other tech channels, and has talked at length about the things his team did on the current gen, with heavy implications of what's coming up next (as far as he can without breaking NDA). His in-depth analyses of GPU architecture are a much better use of my time than listening to a guy who was given 2 leaked slides from a 6-month-old powerpoint speculate on how it spells doom for whatever company.
If you think logically, it makes zero sense to cancel Celestial now. According to Petersen, Arc's hardware team has been working on Druid for months now, and unless the software team is severely behind with the drivers and support, at the very least Celestial will receive a limited release. They already did a limited release of Battlemage to put more resources on Celestial; it would be a shame to throw all that effort away now.
carlhjerpe 5 hours ago [-]
Yeah, all the videos I saw where he was right had 100% accuracy, which you'll be reminded of in the next video; the times he was wrong won't be advertised the same way.
gregbot 5 hours ago [-]
Why don't you just say what he's been wrong about?
carlhjerpe 5 hours ago [-]
Because I won't invest my time or money into rewatching every video, I don't get paid to be here.
gregbot 5 hours ago [-]
You replied to a comment that asked:
> What has he been wrong about
…
carlhjerpe 5 hours ago [-]
And I said he's been right about everything he's been right about, because that's the stuff you'll remember.
cyberax 3 hours ago [-]
Well, remember Elop and Nokia? The playbook is the same. Start with "we can't compete" ("burning platform memo"), then decimate the internal talent, and then sell off the company for cheap.
2OEH8eoCRo0 6 hours ago [-]
I think we overestimate desktop GPU relevance. Are gaming GPUs really that lucrative?
mrweasel 5 hours ago [-]
If they weren't, why would Nvidia keep making them? They do seem like an increasingly niche product, but apparently not so niche that Nvidia is willing to just exit the market and focus on the datacenters.
They aren't just for gaming, there's also high-end workstations, but that's probably even more niche.
tempest_ 5 hours ago [-]
Have you seen the latest generation of Nvidia gaming cards? They increasingly look like an afterthought.
MostlyStable 5 hours ago [-]
I'm honestly curious why they keep making them. As far as I can tell, NVIDIA can sell literally as many datacenter AI chips as they can produce, and that would probably continue to be true even if they significantly increased prices. And even without increasing prices, the datacenter products are considerably higher margin than the consumer GPUs. Every consumer GPU they sell is lost revenue in comparison to using that fab capacity for a datacenter product.
The only reason I can imagine for them leaving the money on the table is that they think that the AI boom won't last that much longer and they don't want to kill their reputation in the consumer market. But even in that case, I'm not sure it really makes that much sense.
Maybe if consumer GPUs were literally just datacenter silicon that didn't make the grade or something, it would make sense but I don't think that's the case.
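A hedged sketch of that opportunity-cost argument, with entirely made-up wafer economics (real die counts, prices, and costs aren't public at this level of detail):

    # Opportunity cost of a wafer: consumer GPUs vs datacenter GPUs.
    # Every figure below is an invented placeholder, not Nvidia's real economics.
    wafer_cost = 20_000.0  # assumed cost of one advanced-node wafer

    consumer   = {"dies_per_wafer": 150, "sell_price": 600.0,    "other_costs": 250.0}
    datacenter = {"dies_per_wafer": 60,  "sell_price": 30_000.0, "other_costs": 4_000.0}

    def profit_per_wafer(p):
        revenue = p["dies_per_wafer"] * p["sell_price"]
        costs = wafer_cost + p["dies_per_wafer"] * p["other_costs"]  # packaging, memory, etc.
        return revenue - costs

    print(f"Consumer GPUs per wafer:   ${profit_per_wafer(consumer):>12,.0f}")
    print(f"Datacenter GPUs per wafer: ${profit_per_wafer(datacenter):>12,.0f}")
    # In this ballpark, every wafer spent on consumer parts forgoes a large
    # multiple of its profit -- the "money left on the table" framing above.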
nagisa 3 hours ago [-]
How would one learn to be a marketable AI dev/researcher if not playing with the ecosystem/tooling on a consumer hardware? If nobody is exploring AI at home, the influx of fresh minds ceases, the development of the field slows down or stops entirely, market gets disillusioned and the field eventually disappears.
kccqzy 4 hours ago [-]
This is classic short term thinking. Whether or not AI is a bubble like the dot com bubble remains to be seen. But gamers have been buying Nvidia since before the dot com bubble and it is a market demand that has existed for a long time and will continue indefinitely. It doesn't make sense to cede this market to AMD.
I purposefully compare AI boom with the dot com bubble because we all knew how important the internet became eventually, but investments in it were way ahead of its time.
MostlyStable 3 hours ago [-]
I pretty explicitly mentioned that possibility. I'm just a bit skeptical that, even if they completely abandoned the consumer GPU market, they wouldn't be able to get back into it in 5 years or so when/if the bubble bursts. The longer they are out, the harder it would be to get back in with non-trivial market share (since the longer they are out, the more their brand would have eroded), but also, the longer they are out, the more money they would have left on the table by staying in for so long.
tempest_ 3 hours ago [-]
They pretty much are doing the bare minimum in the gaming space already.
It is a false dichotomy. They can spend the bare minimum to stay in the game card market while fabbing AI cards. At this point that is just an insurance premium.
hhh 6 hours ago [-]
No. It used to be more even between datacenter and gaming for NVIDIA, but that's not been the case for a few years. Gaming has brought in less money than networking (mellanox) since '24 Q4.
But the same thing that makes GPUs powerful at rendering is what AI needs: modern gaming GPUs are basically supercomputers that provide HW and SW to do programmable, embarrassingly parallel work. That is modern game rendering, but also AI and crypto (and various science and engineering workloads), which is the second revolution that Intel completely missed (the first one being mobile).
https://morethanmoore.substack.com/p/nvidia-2026-q2-financia...
patagurbon 5 hours ago [-]
AI (apparently) needs much lower precision in training and certainly in inference than gaming requires though. A very very large part of the die on modern datacenter GPUs is effectively useless for gaming
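To put rough numbers on the precision point (byte widths are standard; the 70B-parameter model size is just an arbitrary example): narrower formats cut memory and, roughly, the math cost per operation, which is what the low-precision tensor hardware on datacenter dies is there for and what games mostly don't touch.

    # Memory needed for a model's weights at different precisions.
    params = 70e9  # arbitrary example model size (70B parameters)
    bytes_per_param = {"FP32": 4, "FP16/BF16": 2, "FP8": 1, "INT4": 0.5}

    for fmt, width in bytes_per_param.items():
        gib = params * width / 2**30
        print(f"{fmt:10s}: {gib:8.1f} GiB of weights")
    # Training and inference increasingly run at FP8/INT4-ish precision, while
    # game shading sticks to FP32/FP16, so wide low-precision units sit idle in games.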
vlovich123 5 hours ago [-]
I disagree that HW blocks for lower precision take up that much die space. Data center GPUs are useless for gaming because it's tuned that way. H100 still has 24 raster operating units (4050 has 32) and 456 texture mapping units (4090 has 512). That's because there's only so much they can tune the HW architecture to one use-case or the other without breaking some fundamental architecture assumptions. And consumer cards still come with tensor units and support for lower precision. This is because the HW costs and unit economics are such that it's much more in favor of a unified architecture that scales to different workloads vs discrete implementations specific to a given market segment.
They've also not bothered investing in SW to add the H100 to their consumer drivers to work well on games. That doesn't mean it's impossible and none of that takes away from the fact that H100 and consumer GPUs are much more similar and could theoretically be made to run the same workloads at comparable performance.
jlarocco 5 hours ago [-]
I don't think anybody is using gaming GPUs to do serious AI at this point, though.
vlovich123 5 hours ago [-]
But you can use a gaming card to do AI and you can use H100 to game. The architecture between them is quite similar. And I expect upcoming edge AI applications to break down and end up using GPUs more than having dedicated AI accelerator HW because A) you need something to do display anyway B) the fixed function DSPs that have been called "AI accelerators" are worse than useless for running LLMs.
pjmlp 6 hours ago [-]
Depends if one cares about a PlayStation/XBox like experience, or Switch like.
eYrKEC2 5 hours ago [-]
Intel has always pursued agglomeration into the main CPU. They sucked up the math co-processor. They sucked up the frontside bus logic. They sucked up the DDR controllers more and more. They have sucked in integrated graphics.
Everything on-die, and with chiplets in-package, is the Intel way.
Default, average integrated graphics will continue to "satisfice" for a greater and greater portion of the market, with integrated graphics continuing to grow in power.
carlhjerpe 5 hours ago [-]
Intel made fun of AMD for "taping chips together". Intel did everything on a monolithic die for way too long.
The smaller the node, the lower the yield; chiplets are a necessity now (or architectural changes like Cerebras).
Running tests and then fusing off broken cores or shared caches helps to recover lots of yield for bigger chips. Certain parts of the silicon are not redundant, but Intel's designs have redundancy for core pieces and for chunks that are very large and hence probabilistically more prone to a manufacturing error.
carlhjerpe 5 hours ago [-]
Yep, Cerebras takes that to the next level with their "wafer chips". A common technique is killing defective cores entirely (that's how all cheaper CPUs are made).
But reducing size will still increase yield since you can pick and choose.
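The yield argument can be made concrete with the usual Poisson-style approximation, yield ≈ exp(-D·A) for defect density D and die area A (the defect density below is an assumed figure; real values vary by node and maturity):

    import math

    # Poisson die-yield approximation: yield ~= exp(-defect_density * die_area).
    D = 0.1  # assumed defects per cm^2

    def die_yield(area_cm2, defects_per_cm2=D):
        return math.exp(-defects_per_cm2 * area_cm2)

    monolithic = die_yield(6.0)   # one 600 mm^2 monolithic die
    per_chiplet = die_yield(1.5)  # four 150 mm^2 chiplets covering the same area

    print(f"600 mm^2 monolithic die yield: {monolithic:5.1%}")   # ~54.9%
    print(f"150 mm^2 chiplet yield:        {per_chiplet:5.1%}")  # ~86.1%
    print(f"Good silicon per wafer area:   {per_chiplet / monolithic:.2f}x with chiplets")
    # Defective chiplets are discarded individually instead of killing a whole big
    # die, and fusing off broken cores (as described above) recovers yield further.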
nodja 5 hours ago [-]
They're a market entry point. CUDA became popular not because it was good, but because it was accessible. If you need to spend $10k minimum on hardware just to test the waters of what you're trying to do, that's a lot to think about, and possibly tons of paperwork if it's not your money. But if you can test it on $300 hardware that you probably already own anyway...
justincormack 5 hours ago [-]
Gaming GPUs make up 7% of Nvidia's business, 93% is datacentre. So, no.
gpderetta 6 hours ago [-]
They kept Nvidia in business for a long time, until their datacenter breakthrough.
anonym29 5 hours ago [-]
The value proposition of Intel's graphics division wasn't in the current generation of gaming GPUs; it was the internal growth of talent that could target higher and higher end chips at a much lower price than Nvidia, until they were knocking on the door of the A100/H200-class chips - the chips that Nvidia produces for $2k and then sells for $40k.
Not to mention, Intel having vertical integration gave Intel flexibility, customization, and some cost saving advantages that Nvidia didn't have as much of, Nvidia being a fabless designer who are themselves a customer of another for-profit fab (TSMC).
If TFA is true, this was an anticompetitive move by Nvidia to preemptively decapitate their biggest competitor in the 2030s datacenter GPU market.
flufluflufluffy 5 hours ago [-]
Can someone explain what the heck Battlemage means in this context?
nodja 5 hours ago [-]
Intel Arc - Intel's dedicated GPUs. Each GPU generation has a name, in alphabetical order; the names are taken from nerd culture.
Alchemist - First gen GPUs. The A310 is the low end, the A770 the high end. Powerful hardware for cheap, very spotty software at release. Got fixed up later.
Battlemage - Second gen (current gen); only the B570 and B580 GPUs came out. They said they weren't gonna release more Battlemage GPUs after these because they wanted to focus on Celestial, but probably went back on it seeing how well the B580 was reviewed, and the B770 is due to be released by the end of the year.
Celestial - Next gen GPUs; they were expected for release in early 2026. This article claims it was cancelled, but personally I think a GPU this late in production is too far along to cancel. Especially when they basically skipped a generation to get it out faster.
daemonologist 5 hours ago [-]
Battlemage, aka Xe2, is Intel's current, second-generation GPU architecture. (Like RDNA 4 for AMD or Blackwell for Nvidia.)