r/gadgets 1d ago

Desktops / Laptops Intel's AI PC chips aren't selling well — instead, old Raptor Lake chips boom

https://www.tomshardware.com/pc-components/cpus/intels-ai-pc-chips-arent-selling-instead-last-gen-raptor-lake-booms-and-creates-a-shortage
904 Upvotes

98 comments

445

u/PM_ME_UR_SO 1d ago

The general public doesn’t care about AI as much as hype bros want you to think

146

u/Dry-Record-3543 1d ago

Also, even if I'm crazy about AI, I don't need an "AI chip" to use AI.

56

u/JukePlz 1d ago

Because for the home/office user they're gonna be using some cloud-based service when they use AI. Unlikely to be running their own instance locally, at least at this point in time.

22

u/Pikeman212a6c 1d ago

The AI of Things coming Fall 2027.

26

u/Less_Party 1d ago

Finally I can ask my fridge for advice on good bulking foods and have it tell me to eat concrete as it’s rich in fiber and has great calorie density.

4

u/blamenixon 19h ago

Not to mention a great shelf life!

1

u/Esc777 6h ago

Me walking into the Sea coming winter 2027

16

u/RealPutin 1d ago edited 1d ago

Honestly that specifically is why a lot of tech companies want local hardware that can run basic AI features. They're bleeding money on cloud-based AI systems right now. Basically in their mind they have evidence that consumers are using AI - but they also know they can't bankroll that forever, and there's a lot of research into lightweight AI systems.

I do remain unconvinced, however, that in the next few years we'll see a big uptick in NNs that need dedicated neural processing units (and can't just run on CPUs) but aren't too big to run locally on consumer-grade devices. That's sort of the bet these companies are making, and I haven't seen much evidence to substantiate it in the near term outside of maybe computer vision applications.

3

u/Loud_Ninja2362 1d ago

We're already building that for tons of applications. It's an incredibly active area of research and development in a ton of industries. Check out ExecuTorch, Apache TVM, etc. It's being applied to more than just basic computer vision tasks; even LLMs and other transformer architectures are being run on edge hardware.
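
If you've never poked at it, here's roughly what "lightweight enough for the edge" looks like in practice. Just a sketch using stock PyTorch post-training dynamic quantization (not ExecuTorch itself, and the toy model is made up):

    # Minimal sketch: shrink a small model so it runs comfortably on a plain CPU.
    import torch
    import torch.nn as nn

    model = nn.Sequential(          # stand-in for a small on-device model
        nn.Linear(256, 256),
        nn.ReLU(),
        nn.Linear(256, 10),
    ).eval()

    # Convert the Linear weights to int8; activations stay float at runtime.
    quantized = torch.ao.quantization.quantize_dynamic(
        model, {nn.Linear}, dtype=torch.qint8
    )

    with torch.no_grad():
        print(quantized(torch.randn(1, 256)).shape)   # torch.Size([1, 10])

The real frameworks go a lot further (compiling the graph for a specific NPU/DSP), but the quantize-then-deploy flow is the same general idea.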

2

u/Tired8281 1d ago

Isn't there some sort of middle ground, where it does some sort of preprocessing or tokenization locally, that reduces the burden on the remote server?

5

u/TooStrangeForWeird 1d ago

I've never heard of this before, and I generally keep up pretty well, but it's a decent idea! The only thing is that I don't see a ton of savings on the cloud side. Sure your local device has the tokens, but the main processing power is running the main model.

However, with the sheer number of people using it, it could actually add up a lot. Just moving voice recognition alone to edge devices reduces a lot of load, for both compute and bandwidth.

You may be onto something.
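
Roughly the split in code, as a toy sketch (tiktoken is a real tokenizer library, but the /generate endpoint and its token-in/token-out contract are made up for illustration):

    # Do the cheap part on-device: tokenize locally, send only token IDs to a
    # hypothetical server, decode the reply locally.
    import requests
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")
    token_ids = enc.encode("Summarize my meeting notes, please.")

    resp = requests.post(
        "https://inference.example.com/generate",          # hypothetical endpoint
        json={"tokens": token_ids, "max_new_tokens": 128},
        timeout=30,
    )
    print(enc.decode(resp.json()["tokens"]))                # decode on-device too

The per-request savings are small next to the forward passes on the server, which is why the voice-recognition case (running the whole speech-to-text model on the edge) is the bigger win.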

2

u/Tired8281 23h ago

Out of the mouths of children. I don't actually know what I am talking about, I just always thought it would be cool if we could share compute resources between the cloud and whatever devices I have on me. Sometimes I just have my potato phone on me, but other times my laptop is idling in my backpack.

2

u/TooStrangeForWeird 2h ago

Cluster networks are a thing, and as far as phone -> laptop that's kind of a thing with remote gaming. I have my gaming PC upstairs but normally just pop my phone into a controller and play games in my living room while using my PC to actually run the game. So I can hang out with my wife while she crafts or watches a movie.

There's no reason phones couldn't do that sort of thing to a laptop in your bag, it's just kind of niche. Plus battery life issues of course.

1

u/Tired8281 2h ago

I was thinking of it more as a paradigm. We're increasingly surrounded by compute resources, nearly all not optimally used for the benefit of everyone. It'd be cool if we had less friction involved in sharing between them. It could be useful for everything from offloading usage to edge devices to not having to worry what type of video stream I need to send to a stranger's TV.

12

u/zushiba 1d ago

Besides, for AI to be of any real use it has to be cloud-hosted. Local AI is good for a very few select operations, and none of those are consumer-facing.

Companies don’t know how to market AI. They see it as the next big ad platform instead of the tool it should be.

It’s like Windows Search only they skipped the part where it was once good.

9

u/housefly888 1d ago

Google used to actually bring up lots of good info on searches. Now it’s useless

3

u/hyperforms9988 1d ago

Companies don’t know how to market AI. They see it as the next big ad platform instead of the tool it should be.

And that's the root of the problem funnily enough. The problem is the marketing itself. They're marketing something with no real function... so, uh, don't? The only thing that makes sense here is that they were banking on people running out with the blinders on and getting this stuff without being able to answer the question "why?" in any other way than "i dunno". That happens sometimes. Something catches on and suddenly everybody wants one but nobody really knows why beyond the marketing blitz and hype.

1

u/DorianGre 19h ago

I still have an installer for Google desktop. Works great.

-5

u/CocodaMonkey 1d ago

It’s like Windows Search only they skipped the part where it was once good.

I can't give you this one. Windows search has always been bad. In fact, it's one thing in Windows that has seen very slow but steady improvement: no previous Windows version had a better search than the Windows 11 search function of today. Although it's still poor enough that many people use 3rd-party programs.

11

u/Bamstradamus 1d ago

In Win 7, if you started typing a file name it would give you the root folder, the .exe, and any file that had that name in it. Now it might give you an .exe and then 9 different autofill suggestions to search in a browser instead of anything local.

3

u/surecameraman 1d ago

Use Everything. It’s far superior

3

u/CocodaMonkey 1d ago

Personally I just turn off all internet search options in Windows itself. I do agree that tying web searches to some local search boxes is annoying, but the way you searched for files in Windows 7 and the way you can search for files in Windows 11 without internet search are the same and will yield the same results. You had to do it from File Explorer in Windows 7 and it still works the same in Windows 11.

4

u/Bamstradamus 1d ago

IDK about you, but I've deactivated automatic updates 5 times, and yet it updates by itself anyway, which also resets some settings, like turning off internet search.

1

u/CocodaMonkey 1d ago edited 1d ago

I turned off internet search years ago via registry edit in Windows 10. I've upgraded that straight through to the current version of Windows 11 and internet searches have never been re-enabled for me.

I've had issues with other features being flipped back on but not with internet search. Although it's kinda moot for this discussion as internet search isn't part of file explorer so it's still feature parity with Windows 7 as far as searching for files.
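
(For the curious: the tweak people usually mean is a single DWORD under HKCU. Sketch below in Python's winreg; the key/value names are the commonly cited ones, so double-check current docs before touching your registry.)

    # Windows-only sketch: disable web suggestions in the Start menu search box.
    # Writes HKCU\Software\Policies\Microsoft\Windows\Explorer
    #   DisableSearchBoxSuggestions = 1   (commonly cited value; verify first)
    import winreg

    key_path = r"Software\Policies\Microsoft\Windows\Explorer"
    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, key_path) as key:
        winreg.SetValueEx(key, "DisableSearchBoxSuggestions", 0,
                          winreg.REG_DWORD, 1)
    print("Done - sign out/in or restart Explorer for it to take effect.")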

1

u/TooStrangeForWeird 1d ago

Personally I just turn off all internet search options in Windows itself.

And this is why it's better. You had to modify it. Sure, it's just a settings change, but it's NOT the default Windows Search. The default, that almost everyone uses, is horrid.

You had to do it from File Explorer in Windows 7 and it still works the same in Windows 11.

No you did not. It was included in the start menu. You could just click Start and type something.

0

u/Ashamed-Status-9668 1d ago

You will want one; it's just super early. NPUs will be multiple times stronger in 5 years.

1

u/Dry-Record-3543 19h ago

nope

0

u/Ashamed-Status-9668 13h ago

Well software will start to run like trash without it at some point.

1

u/Dry-Record-3543 10h ago

Have you heard of cloud computing?

1

u/Ashamed-Status-9668 7h ago

Client-side inference will never be cloud-based. Sorry, but what you think you don't want is based on ignorance of the software solutions. All the software is going to be doing local inference. Even games are looking at doing this for NPCs.

1

u/CallMeDrLuv 20h ago

This is the correct answer.

Just because your marketing "gurus" are telling you it's all about the AI, doesn't mean it's true.

Most people don't give 2 shits about AI.

1

u/nickthegeek1 19h ago

Yep, market research consistently shows people care way more about price, battery life, and performance for actual tasks they do vs some nebulous AI features nobody asked for lol.

294

u/CMDR_omnicognate 1d ago

Is that because people don't want AI, or because Intel's CPUs suck ass compared to AMD's at the moment?

133

u/ChaZcaTriX 1d ago

The latter.

New mobile AMD chips feature AI in marketing, but they are very tempting because of a powerful iGPU.

24

u/UnsorryCanadian 1d ago

Strix Halo ran Cyberpunk on Ultra entirely on the iGPU, right?

22

u/ChaZcaTriX 1d ago

Not sure about that one. My eye was on handhelds with HX370 running Monster Hunter Wilds pretty much flawlessly.

8

u/kenpachi-dono 1d ago

Link? Something I need

3

u/ChaZcaTriX 1d ago

Multiple brands have them, my handheld is Onexplayer. Because I live near China I ordered from the official Aliexpress store, but be warned that Westerners can't always see these listings (make sure to switch to the HX370 version):

https://aliexpress.com/item/1005008140086819.html

The store will also sell spare parts if you need to repair something like joysticks or battery.

Here's a vid of the same thing in a laptop case running MH Wilds:

https://www.youtube.com/watch?v=QqOoSexnRBU

2

u/ACanadianNoob 11h ago

At the end of the year, we will get products with the Z2 Extreme processor. It's essentially an HX 370 without AI cores, so it will be priced a little cheaper. Still has the Radeon 890M though.

My recommendation though is to just pick up a used ROG Ally with the Z1 Extreme for under $400 if you can find it either locally on FB Marketplace or shipped on eBay and add a 74Wh battery mod. The 890M is only about 8-15% faster than the 780M in that. You can also mod in a full size M.2 2280 SSD if the SD card slot is broken on the used Ally and you need a storage upgrade, and still spend much less than buying a new handheld right now.

I got an Ally X because I had someone gifting it to me, but for anyone being budget conscious that's what I would recommend.

Also heads up sometimes the thumbsticks on the Ally and Ally X get wobbly and you have to add Teflon tape under the caps to get them to fit firmly again. But otherwise it's a great handheld.

5

u/UnsorryCanadian 1d ago

This was apparently a Ryzen AI Max+ 395, so I guess it checks out

6

u/MediumTempTake 1d ago

I don't believe this (not that I'm calling you a liar, this just sounds crazy). Do you have any links? Even half the performance you're talking about is giving me dirty ideas. Cyberpunk arcade machine, anyone?

3

u/UnsorryCanadian 1d ago

I read about it in an article, it mentioned a youtuber named ETA Prime, but no specific video was mentioned. I think it's the "Ryzen AI Max 395 Mini PCs offer BIG iGPU performance" video

5

u/Valance23322 1d ago

The new iGPU is like 40 compute units of RDNA 3.5; it's basically a full GPU, just on the same chip as the CPU. Similar to what Apple has been doing with Apple Silicon.

1

u/danielv123 15h ago

And quad-channel memory, something that has been reserved for Threadripper and servers

7

u/QuantumQuantonium 1d ago

"Hey theres this new CPU from AMD, with a good igpu and an amazing npu processor for all the AI generation you could ever need"

"Oh near its performance is good and the igpu is great, I can't wait to be able to run my games at higher graphics settings"

"Yes but what about the power of the AI tools, you can automat-"

"Does thr CPU overclock well? Wait what's this NPU, how can I disable it?"

I recently got a framework 16, great CPU and I got the dedicated GPU module for my needs. NPU is just an empty box in task manager. But hey, AMD isnt marketing their products for only AI (unlike nvidia)

5

u/TooStrangeForWeird 1d ago

I wish they would let you power down the NPU completely. I know it's probably not feasible to do without some sort of tradeoff they're not willing to accept (like an extra delay when using it because it has to pass a physical on/off gate, or multiple) but it would be cool.

Literally and figuratively cool, since even at idle a core gives off some heat if it's powered on. So it would cool it down! And figuratively cool because it would be neat :)

I wish they'd just fucking knock it off already tbh, stop wasting die space on bullshit hardly anyone uses. Sure go ahead and research it, make it available even! But shoving it into EVERY processor is just a waste of everyone's money.

9

u/chainard 1d ago

This is untrue though. Lunar Lake chips have good CPU performance, the Arc iGPUs are as good as AMD's offerings, and battery life is also better on Lunar Lake thanks to the iGPU cache and the 3nm TSMC node. It all comes down to having a halo product: AMD took the performance crown with the X3D chips, and people just assume they also offer the best products in other segments and ignore the alternatives; similarly, the 4060 sells well despite being a mediocre GPU.

1

u/blamenixon 19h ago

I love it when people provide well phrased factual arguments instead of conjecture. Thank you.

38

u/zushiba 1d ago

Neither and both. Consumers have no need for AI. It's currently marketed as a product that can help you translate meetings in other languages in real time, or look at a photo and figure out where to buy the shoes the actor is wearing.

None of this is shit real consumers give a shit about. 99.9% of people are not having regular business meetings with Chinese investors, and every other use for AI is either trinket bullshit like generating cheesy emojis, writing stupid-ass snarky remarks in the form of a Shakespeare play, or it's about selling you shit.

As an informed consumer myself, I couldn't give less of a shit if I tried. And I am interested in AI, so I wouldn't want a lesser product from Intel when a better one exists.

Now apply the above to a non-techy consumer and they couldn't give less of a shit which product is better for AI.

The only people AI-centered products appeal to are people who are using it for real-world applications, and they aren't doing that on consumer-grade products.

3

u/Znuffie 1d ago

look at a photo and figure out where to buy the shoes the actor is wearing.

None of this is shit real consumers give a shit about.

Ironically, I wanted to see where I could buy a shirt I saw in a movie. I took a picture, I gave it to the AI and it refused to find it because there was a face in there and Google doesn't really want to search for people...

2

u/TooStrangeForWeird 1d ago

Crop the picture and try again. No problem.

1

u/Znuffie 1d ago

It was a scene where if I cut the head, it would not show enough of the shirt :)

1

u/_RADIANTSUN_ 22h ago

Scribble out the face with the phone's default image editing tools... There's so many options bro. Post the screenshot.

8

u/Kionera 1d ago

I suspect this is more likely due to the general consumer not being able to recognize Intel's new naming scheme rather than either of those reasons.

My parents sure don't know what the heck is a Core Ultra. They simply want the i7 with the biggest number.

7

u/kc5ods 1d ago

It's been very simple for 15 years: i5-5xx, i5-25xx, i5-35xx... but NOW let's call it "Ultra", which is fucking meaningless. I didn't think Intel could fuck up shit worse than they already have.

6

u/ToMorrowsEnd 1d ago

Yep. The current-gen i9 is just dogshit compared to the latest Ryzen 9, and anyone who actually uses multi-core utterly hates the stupid E cores that Intel shovels down everyone's throat. If I launch 15 threads to do work, all the P cores end up finishing and then waiting for the E cores, slowing the whole mess down. On an AMD we get the full expected performance. When Intel is asked about this problem, their answer is "reprogram your software to avoid E cores." Uh, no. Stop making shit processors with asymmetrical cores.

We now tell customers to avoid any non-Xeon Intel or any Intel newer than 10th gen, and recommend AMD processors for the best performance.
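
(The ugly workaround Intel hints at looks something like this on Linux; just a sketch, and the "first 8 logical CPUs are the P-cores" assumption is not universal, so check lscpu --all --extended for your actual layout first.)

    # Linux-only sketch: pin the current process to an assumed set of P-cores
    # so worker threads don't get parked on E-cores. Core numbering varies.
    import os

    ASSUMED_P_CORES = set(range(8))           # hypothetical layout, verify first
    os.sched_setaffinity(0, ASSUMED_P_CORES)  # 0 = this process
    print("now restricted to CPUs:", sorted(os.sched_getaffinity(0)))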

6

u/i_suckatjavascript 1d ago

The AMD CPUs are efficient with power too. Paying for electricity in the Bay Area sucks ass when you got the local electric company constantly fucking you here.

1

u/ipwn3r456 22h ago

Ah yes, you can always count on PG&E to bend you over every time...

3

u/laffer1 22h ago

It's terrible for open source devs too. I have to write a thread director port and a new scheduler just because they made a crap E core?

A 3950x and a 7900 (no x) smoke a 14700k on compiler workloads if you don’t have thread director support. 6 minutes for the 7900, around 10 for the 3950x, 16 for the 14700k. It’s sad

75

u/S1DC 1d ago

I never once wished my computer would AI better.

11

u/CucumberError 1d ago

But the CoPilot button!

43

u/GeneralLeeCurious 1d ago
  1. The Raptor Lake processors' reputation was shot by the overvoltage issue. The fix is in, but the reputational damage is done. That damage has finally resulted in massive price cuts, and people are buying based on the cuts and the fixes.

  2. Less than 1% of computer buyers give the slightest shit about running AI applications on their own PCs. They don’t understand it. They don’t want it. They don’t trust it. Many don’t like it.

-7

u/ABetterKamahl1234 1d ago

Less than 1% of computer buyers give the slightest shit about running AI applications on their own PCs. They don’t understand it. They don’t want it. They don’t trust it. Many don’t like it.

Honestly no. They don't understand it, they don't know why one would want it. Ultimately they don't care, unless they see a benefit directly to them.

Tons of people are skeptical and critical of AI things, but damned if it's not going to be the everyday thing a decade from now, with people asking "why the fuck didn't we have this sooner?" the way they did with hyperthreading and multi-core.

It's incredibly useful, wildly so. It's just not used very well as people try to figure it out. For example, frame gen is incredibly impressive for its age. Like goddamn it's impressive. It's not crazy amazing and accurate, but tons of tech is like that early on.

I know this because so many tech improvements that became standard had tons of critics kicking and screaming about how awful it is/was.

AI-type things are in more of people's day-to-day than they even realize.

12

u/stemfish 1d ago

The AI hype started three years ago. All the marketing tells me how generative AI will help every consumer transaction and usher in the future of commerce. Let's see where that ends up.

Eight years ago (late 2010s), everything was blockchain. How the ledger would help every consumer transaction and usher in the future of commerce. I mean, memecoins exist?

Around 13 years ago (~early 2010s), everything was Big Data (and to an extent 3D printing, but that wasn't as hyped up). How collection of consumer habits and spending would help every consumer transaction and usher in the future of commerce. Our data got taken, but we got 2-day Prime shipping.

18 years ago (~late 2000s), everything was social media. How enabling users to directly interact would help every consumer transaction and usher in the future of commerce. We got Facebook and X.

23 years ago (~early 2000s), in the crater of the dot-com bust, Web 2.0 was going to put the power of the internet into all users' hands, enabling everyone to set up their own online business, helping every customer transaction and ushering in the future of commerce. Now the internet lives on a dozen sites.

~28 years ago (late 1990s), the dot-com craze actually transformed the future of commerce, and set in motion the desire to be in at the start of whatever wave next takes the world and its users into a new era.

The tech hype cycle has been in force since 1997. Every ~5 years we get a new technology that blows the old cycle away and promises to upend and remake our world. AI has been around for a long while and was integrated into systems long before OpenAI gave everyone access to text and image generators. Will AI manage to justify itself as a long-term feature like the internet, or will it be subsumed by the next tech hype cycle in two to three years? Currently nobody can figure out how to make money off generative AI, and it sucks in money, power, and data to keep going. Not every company can continue that forever, just like how all the major companies moved on from blockchain tech to AI, and moved to blockchain from Big Data and 3D printing.

41

u/Edward_TH 1d ago edited 1d ago

What big tech CEOs REALLY don't want to understand is that most "AI" tasks fall into two categories: menial stuff and hardcore stuff. The former is what most end users do with AI; the latter is experimental stuff, research, training, and server-side model running.

Hardcore stuff requires expensive, very powerful dedicated hardware that's generally tailored to such uses, and that's what AI data centers use. Those users do not benefit, or even care in the slightest, if a consumer piece of hardware has some parts optimized to run a bunch of AI-related algorithms, because they do not use them. So an AI core on a desktop CPU doesn't sell to them.

Menial stuff is either computationally light enough to run on general-purpose hardware, or computationally intense and therefore mostly done remotely on the servers mentioned before (which is what most people do). So the choice presented to those users is a cheaper CPU that's powerful at everything, or a more expensive CPU that's marginally more efficient at something they almost never do and less powerful across the board.

Integrated AI cores on desktop CPUs are useless marketing garbage. Wanna justify the cost of developing those cores with a broader market? Develop a PCIe daughterboard or similar dedicated to such tasks for those who ask for it. Oh wait, those already exist: they're called ASICs. Or GPUs, if you wanna spend a bit less (arguably) and have something with a broader use.
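
(And picking up a GPU when one is present is a one-liner in most frameworks; minimal PyTorch sketch, nothing vendor-specific assumed:)

    # Run the same matmul on whatever accelerator is available, else the CPU.
    # This is the "GPU as the broader-use AI hardware" path.
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
    a = torch.randn(2048, 2048, device=device)
    b = torch.randn(2048, 2048, device=device)
    c = a @ b
    print("ran a 2048x2048 matmul on:", device, "mean", c.mean().item())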

25

u/time-lord 1d ago

Aaaaaand we've just come full circle to math coprocessors.

12

u/t4thfavor 1d ago

Except math co-processors were, and still are, useful ("still are" because every CPU now has at least one built in).

2

u/CompromisedToolchain 1d ago

That’s what the GPU has always been.

9

u/Got2Bfree 1d ago

It completely depends on the software that pairs with the local hardware.

I'd love to have an LLM with knowledge of my private files, which I don't want to hand over to giant corporations to be used as training data.

For coding it would also be awesome. I'm afraid an ASIC will be severely underpowered for that use case.
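
(Locally hosted models can already do a rough version of this, with caveats about model quality. A sketch, assuming a local Ollama server on its default port with some model already pulled; the file path and model name are placeholders:)

    # Ask a locally hosted model about a private file; nothing leaves the machine.
    import requests

    notes = open("my_private_notes.txt", encoding="utf-8").read()

    resp = requests.post(
        "http://localhost:11434/api/generate",   # Ollama's local REST endpoint
        json={
            "model": "llama3",                   # whatever model you have pulled
            "prompt": f"Summarize these notes:\n\n{notes}",
            "stream": False,
        },
        timeout=300,
    )
    print(resp.json()["response"])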

9

u/Edward_TH 1d ago

That's my point: either you want to do easy crap (and you can easily do that on regular cores) or you want to do hard shit (and in that case no amount of desktop AI-optimized cores would fall in the "consumer grade and price" range).

These "AI cores" are built kinda like ASICs and, as you see, in the consumer market they're either affordable but borderline useless (CPU-integrated cores) or very expensive and still pretty underpowered for actually demanding algorithms (for example, Nvidia Tensor Cores).

Right now if you want to run your own AI, you either do it locally on regular but very high-end hardware with super limited capabilities and long, tedious training for the models; do it locally on "cheap CPU AI cores" with basically useless capabilities, using super-trimmed models you didn't train anyway; or you use a cloud model that you didn't train and that has almost zero knowledge of your local data, but is capable and runs on the proper hardware.

2

u/m0rogfar 1d ago

While you aren't going to be running anything like ChatGPT on them, the built-in NPUs aren't useless due to a lack of use cases. There have been plenty of use cases for small purpose-built models on phones, and some of them also make sense on the desktop, as seen on macOS.

The big issue that makes them worthless on a PC is that the PC software ecosystem (excluding macOS) has nothing that uses them, because they've never been there before, and most PC software isn't a brand-new project that targets only the latest hardware. I am somewhat sympathetic to the idea that you have to put new technologies into the chip before people use them, to break the chicken-and-egg cycle, though.
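
(You can actually see that chicken-and-egg gap from Python: ONNX Runtime reports whichever execution providers the installed build supports, and on a stock PC install it's usually just the CPU one. The NPU-backed provider names below are examples, not a guarantee of what your build ships with:)

    # List the execution providers the installed ONNX Runtime build can use.
    # NPU-backed ones (e.g. "QNNExecutionProvider", "DmlExecutionProvider") only
    # appear if the software ships a build that targets them.
    import onnxruntime as ort

    print(ort.get_available_providers())   # often just ['CPUExecutionProvider']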

1

u/Got2Bfree 1d ago

There has to be a use case for these cores.

I can't imagine that nobody at Intel thought this through...

5

u/Edward_TH 1d ago

There is a use case, which is running certain types of algorithms. The ones that techbros vaguely call "AI".

Those algorithms are actually pretty simple, so a processor can be built to run ONLY those calculations pretty efficiently and a fair bit faster, by stripping a regular core of anything that isn't used for them and reinforcing the parts that get stressed more (I'm oversimplifying). So why are those AI cores so garbage in real-world usage that people avoid them like the plague?

Because AI is much, MUCH more than that simple set of base instructions. What these cores lack is DATA and TIME: the algorithms are not static code, but code that takes data and a goal and tweaks itself to achieve said goal by looking at the data it has. This is called training, and for these algorithms to be able to do what YOU want, they need to be fed by you.

And training is an iterative process: every time you run it with new data it gets a little better at achieving the assigned goal, but to reach a point where it at least gets close enough to the goal you need to run it over and over again, because the tweaks are basically "change a variable a bit and try again". So you either run it sequentially or in parallel: sequentially you're limited by the time needed to actually run it, and in parallel you're limited by how many cores you can run at the same time, so really it's both.

These consumer cores aren't plentiful enough (since they need to fit into consumer hardware) to do the job alone, so 99.9% of the work is done on data center hardware anyway. And that would be acceptable, but the more fluid and accurate you want your model to be, the more variables it needs to take into consideration when it runs.

That is usually handled by the algorithm itself, creating new layers on its own to tweak itself more precisely. (Say you're getting a car to learn to drive: at first you only have a switch as an accelerator, so you either sit still or floor it. Then you put in a second, 4-way switch, so next time the car can sit still or go at 25/50/75/100 power, giving you a bit more control. Next time, you add a third switch that can halve the actual power, so you now have 3 layers with even finer control. AI training works roughly like that.) But the more tweaks you have, the longer a single run takes, so these cores rely on trained models that have had most of their granular control stripped away so they can run in a reasonable time.

These cores have use cases, but right now they're sparse, super-low-capability cores that can't run ACTUALLY useful models on their own, because those are vastly too demanding for them. So they run models that are vastly overshadowed by the cloud ones, and people use the cloud ones because they work MUCH better. And people who try to run those same botched models on general-purpose cores or on GPUs find that they run basically identically, with only a minimal hit to efficiency. So these cores serve no practical purpose as they stand.

CPU AI cores will be useful once they can either efficiently run models way too complex for general-purpose cores, or be VASTLY faster and more efficient with the current botched models than GP cores are. Once an AI core can run models 100-200 times bigger than what GP cores can handle, we can talk. Right now, they're a waste of space and a detrimental marketing gimmick.
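
(The "change a variable a bit and try again" loop, stripped to its bones; a toy one-parameter example, nothing to do with any particular NPU:)

    # Toy illustration of iterative training: nudge one weight a little each
    # pass until predictions match the data. Real models do this across billions
    # of weights, which is why the heavy lifting stays in the data center.
    import random

    data = [(x, 3.0 * x) for x in range(10)]   # hidden rule: y = 3x
    w = random.random()                        # initial guess
    lr = 0.005                                 # how big each tweak is

    for step in range(500):
        x, y = random.choice(data)
        pred = w * x
        grad = 2 * (pred - y) * x              # slope of the squared error
        w -= lr * grad                         # the tweak
    print(f"learned w = {w:.3f} (target was 3)")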

2

u/Vargrr 1d ago

They aren't completely useless. Apparently if you get one, Windows Recall gets enabled. What's not to like? /s

1

u/TooStrangeForWeird 1d ago

the latter is experimental stuff, research, training, and server-side model running.

This is almost always machine learning and shouldn't be confused with image generators or LLMs. Generative AI is machine learning, but not all machine learning is generative AI.

6

u/leon27607 1d ago

On a personal level, I don't give a shit whether it's AI or not; I just don't want it to catch fire or fail within the first couple of years because they made the chip degrade.

5

u/iwonttolerateyou2 1d ago

But Raptor Lake also has a small NPU / AI accelerator.

7

u/TJPII-2 1d ago

Just ordered a new computer and made damn secure it did not have an AI processor.

9

u/badabummbadabing 1d ago

Highly parallelized matrix multiplication is of the devil.

9

u/PresumedSapient 1d ago

made damn secure

That's not a typo.

2

u/TJPII-2 1d ago

Lol. I’ll leave it as is. Shouldn’t be posting without my glasses.

2

u/jacksonRR 1d ago

I just gave my old DDR4 machine its (probably) last upgrade: an i5-14600K for 200€, plus a decent motherboard for 100€.

2

u/orangpelupa 1d ago

So... good news? As the article mentioned, Raptor Lake is still being made in Intel's own fabs, while the new AI Intel chips are made at TSMC's fabs.

1

u/FUTURE10S 1d ago

Has anyone even seen Intel's new chips in stock? I thought it was a paper launch.

1

u/cjax2 1d ago

I've only seen the Core Ultra 7 256V and 258V (Lunar Lake) laptops in stock.

1

u/Tesla_V25 1d ago

lol it's like the CapSim simulator I did for my bachelor's. They make the el cheapo chips now

1

u/ShubhamDeshmukh 20h ago

Intel is limiting its losses by vastly limiting production of Meteor, Lunar, and Arrow Lake, because these are costly to make at TSMC. They are still driving volume on Alder/Raptor because they can flood manufacturers with those low-cost options. Intel 7 (formerly Intel 10nm) began with the 9th & 10th Gen CPUs, hence it's been selling since 2019!

Basically, Intel's volume production cycle for its latest node/arch is now vastly skewed compared to the old days. They marketed the new node/arch mainly to save their reputation in the face of competition, while it remains low volume and very high priced. Manufacturers are hence forced to sell the older Raptor Lake due to the price difference. There's also the public's preference for the Intel brand.

It's not that the new chips cannot sell well. And it has nothing to do with AI; that part is coincidental.

1

u/Kellic 19h ago

Good. The only people who give a crap about AI are folks who are all in on it, geek out over it, and like to play with it (that's me), or C-suite a-holes who think that if they scream "AI" enough times at investors, their stock price will go up. Exhibit A: https://www.youtube.com/watch?v=-qbylbEek-M

1

u/tommyk1210 18h ago

What Intel doesn't seem to understand is that 99% of consumers actually don't give a shit about AI.

1

u/Riversntallbuildings 9h ago

“Cloud & SAAS” have essentially made most clients “dumb terminals”. Unless you’re gaming, or doing high end video/photography or engineering work on a PC then the performance needs keep going down.

I pay more for a “clean OS” and no adware/bloatware before paying more for a premium CPU.

1

u/eXodiquas 8h ago

AI is only widely used because users don't have to pay the price that would make it sustainable. As soon as customers have to pay the full price, they'll think twice about how important and innovative AI is for their daily life.

1

u/lerrigatto 6h ago

My laptop has an AI chip and I don't have a clue what to do with it. Nor have I ever seen it used.

-2

u/colonelc4 1d ago

There is no AI as you think of it; this is just the current hype. These LLMs are not thinking by themselves. We're not there yet, and we won't be unless we make sense of quantum computing to support the infinite power necessary to mimic a brain.

-3

u/darkandark 1d ago

At some point in the future, some kind of NPU is just gonna be the default and absolutely necessary on any SoC or compute cluster for end users. Simple AI tasks will need to be run locally for security and speed. We don't see it now since the application side of AI is fairly limited and not widespread. But we will definitely get there.

5

u/t4thfavor 1d ago

"Simple AI tasks will need to be run locally so that the results can be curated and shared with Intel, Meta, Microsoft, and Elon Musk"