submitted 11 months ago by L4s@lemmy.world to c/technology@lemmy.world

Intel CEO laments Nvidia's 'extraordinarily lucky' AI dominance, claims it coulda-woulda-shoulda have been Intel::Intel CEO Pat Gelsinger has taken a shot at his main rival in high performance computing, dismissing Nvidia's success in providing GPUs for AI modelling as "extraordinarily lucky."…

top 27 comments
[-] monkeyman512@lemmy.world 51 points 11 months ago

Having read the article, it sounds like Pat is mostly complaining that Intel would have been positioned to milk the AI cow if the previous CEOs weren't fucking idiots.

[-] nomecks@lemmy.world 20 points 11 months ago

He's right. Intel dumped their tick-tock fast development cycle and let their product lines languish. Real stupid way to generate short-term investor gains.

[-] poopkins@lemmy.world 1 points 11 months ago

ALL the tech giants these days are solely focused on short-term gains.

[-] trolololol@lemmy.world 5 points 11 months ago

Ya, sure he woulda-coulda done things differently if he were in their place, because he's Super Hindsight Man.

[-] CmdrShepard42@lemm.ee 2 points 11 months ago

Well, Bob Swan did dump $20 billion on a single stock buyback that only elevated the share price for like two weeks, and Brian Krzanich saw delay after delay on 14nm, which turned their two-year lead into a two-year deficit. There's definitely truth to Pat's words.

[-] pdxfed@lemmy.world 30 points 11 months ago

I think it's quicker to name the things Intel did strike gold on than the boats they missed despite all kinds of advantages.

They missed the entire mobile processor explosion, inevitable as it was since the Android launch in 2009 (or 2007, if you want to say they should have seen what Apple was doing and thought they could compete, which, if you're an exec at Intel, you should have).

They missed cloud computing.

Classic "big company hires/keeps overpaid check drawers instead of those with finger on the pulse". Intel deserves the very little innovation, success and relevancy theyve had post 86.

Bought my first AMD computer this year, a Ryzen 7 6800 with an on-processor Radeon 680M GPU that's roughly equivalent to a discrete Nvidia 2050 card. Game over for Intel.

[-] Ottomateeverything@lemmy.world 13 points 11 months ago

Bought my first AMD computer this year, a Ryzen 7 6800 with an on-processor Radeon 680M GPU that's roughly equivalent to a discrete Nvidia 2050 card. Game over for Intel.

While the rest of your post is logical, this is insane cope. No one is buying integrated graphics for gaming. 2050s are a joke in terms of power; you're talking about a two-year-old budget mobile GPU. If anything, this is basically an "I need to do some Photoshop but don't want a dedicated GPU on my laptop" type card. Intel has never given a fuck about mobile graphics. Their offerings have always been "serviceable, but get a real GPU if you want one". Laptops are arguably better with ARM, so there's competition there anyway.

Intel is still selling their bread and butter and still has a huge stranglehold on their core market. Claiming "game over" because of an edge case in an offshoot of one of their secondary markets is a huge overreaction.

[-] rambaroo@lemmy.world 8 points 11 months ago

Plenty of people game on integrated graphics, and 2050-level performance is damn powerful for an integrated card. Not everyone is a hyper nerd who builds their own PC or pays big bucks for a good gaming laptop. There are tons of casual players using integrated GPUs.

[-] pdxfed@lemmy.world 6 points 11 months ago

Of course they still have a stranglehold on computer CPUs; the point is they haven't done anything with that for 30 years. Their bread and butter is an ever-lower-margin game in PCs that have largely peaked; they missed mobile phones and are set to miss whatever comes next with ARM.

My purchase of a laptop without their proc isn't even a drop in the ocean, but the point is that the one market they do have, they can't even dominate anymore; they just keep shipping marginally faster CPUs once a year, like they have for the past 15.

[-] neshura@bookwormstory.social 3 points 11 months ago

I remember when Zen dropped and everyone thought Intel had some sort of secret super-architecture stored in the archives that would allow them to compete. Turns out no, they had nothing. They still have nothing. All they do is make the silicon larger and crank up the clocks, ironically reducing their margins in an attempt to out-compete a vastly superior technology.

At this point I'm afraid AMD will get into the same position Intel was in during the Bulldozer era and price gouge just as much, but unless Intel somehow manages to make chiplets work without Infinity Fabric (they can't use it because of patents), I really don't see them putting out compelling products in the next few years. AMD is steadily gaining market share because, year after year, their products are objectively better in more and more categories. Zen was just plain cheap enough to counteract the lacking performance, but now AMD has the cheaper AND faster tech for most use cases. The last bastion Intel really has is laptops, and once that's gone I can see a lot of OEMs start selling AMD products en masse.

[-] Ottomateeverything@lemmy.world -1 points 11 months ago* (last edited 11 months ago)

just keep shipping marginally faster CPUs once a year, like they have for the past 15.

Yeah, exactly. My disagreement is... So fucking what?

I'm much happier with a company that is satisfied with its market, does what it does well, and leaves it at that. I'm not a believer in "more money for the money gods, ever-increasing profits, let's fuck over some more consumers and further line the shareholders' pockets".

By moving into other markets, they'd be competing with people who know those spaces well and probably better than they do. If they push someone else out, that's more specialties lost.

I'm generally against this monopolistic machine mindset everyone has these days. I'm much happier with a content company continuing to do what it does, instead of taking up market space trying to do something else that someone else does.

Not that Intel is a perfect example here, but I'm much happier that their GPUs have generally flopped, they haven't made it in mobile, and they aren't trying to be another ARM manufacturer. That's not their thing. So I can continue to go to them for a reliable desktop CPU and they can continue being a force in that market instead of trying to wear 17 different hats and losing their way.

[-] neshura@bookwormstory.social 1 points 11 months ago

I’m much happier with a company that is satisfied with its market, does what it does well, and leaves it at that

No problem here, yet

I'm not a believer in "more money for the money gods, ever-increasing profits, let's fuck over some more consumers and further line the shareholders' pockets".

*points at Skylake* Explain that then. Let's not pretend Intel was a saint while AMD was busy running their business into the ground. Intel was price gouging the hell out of consumers back then: a whopping 9% performance gain over two CPU generations. We get more than that now in a single generation (well, if you're not Intel, that is, but more on that later). Intel is not one of those saintly companies you outlined in your previous statement, and it never was. Time and time again, when they had the chance to price gouge, they showed that they would do exactly that. And let's not get into their anti-competitive practices whenever AMD actually manages to get something good out.

And while we're at it, Intel is not even good at making desktop CPUs anymore. They are stuck on monolithic chips that cost a shitload to manufacture, so while AMD is busy reducing production costs and improving flexibility, Intel is still reeling from Zen 1 and seemingly can't do anything other than make bigger and bigger chips at higher and higher voltages. I don't have their internal financials, but looking at what's publicly available, they are running out of excess margin to bleed off. Their (high-end) products don't make them much money anymore, if any, because the yield on them is just abysmally bad. They got hit out of left field by Zen, but instead of sitting down and acknowledging that they fucked up and "innovated" themselves into a corner, they doubled down on monolithic chips and dug their grave deeper. And why? Simple: innovating costs money, and Intel is all about profit, so that was a big no-no.

If they push someone else out, that’s more specialties lost.

History shows that rather happens due to monopolies preventing new players from entering a field (infamously, the dozens of potential cancer cures that just landed in big pharma's drawer of patents that don't make enough money), but you do you, I guess.

but I’m much happier that their GPUs have generally flopped

You shouldn't be; we have two companies competing there, and it isn't going very well for the consumer. Fewer companies in an industry = less competition.

Oh, and of course:

reliable desktop CPU

Yeah, well, that one's easy if you stop drastically changing your product while still raising prices as if you were.

[-] EatYouWell@lemmy.world 0 points 11 months ago

Intel's mobile on-processor GPU does beat AMD's, though.

[-] pdxfed@lemmy.world 2 points 11 months ago

Since the Ryzen 6/7 series in the last 1-2 years?

[-] EatYouWell@lemmy.world 2 points 11 months ago

Yes. Their newest slightly outperforms AMD's.

[-] TheGrandNagus@lemmy.world 2 points 11 months ago* (last edited 11 months ago)

Not really. It did for one generation, against AMD's ancient Vega iGPU, and only if you ignore that ~30% of games were buggy or straight up didn't work on Intel iGPUs.

[-] turbohz@programming.dev 2 points 11 months ago
[-] TheGrandNagus@lemmy.world 2 points 11 months ago* (last edited 11 months ago)

Meteor Lake isn't even out yet, so I wasn't counting that. By the time it comes out, the 7840U will be nine months old and about to be replaced.

Plus, tbh, I'm hesitant to believe Intel benchmarks. In the past I've seen them do things like use much higher memory speeds on their own chips, which is critical for an iGPU.

[-] Shadywack@lemmy.world 22 points 11 months ago

From the article

Gelsinger explained how he thinks Nvidia and its CEO, Jensen Huang, just happened to be in the right place at the right time.

Well, no shit. That's how ALL rich people got rich: luck.

[-] Plopp@lemmy.world 5 points 11 months ago

What about the ones who planned and worked real hard to be born rich?

[-] TheGrandNagus@lemmy.world 15 points 11 months ago* (last edited 11 months ago)

I'm becoming tired of Pat's whining every other week. You don't hear Lisa Su crying about Nvidia (or Intel, for that matter) just being lucky, spouting this woe-is-me bullshit. You hear from her, and from Jensen, only when they have something meaningful to show.

I've noticed this in a lot of Intel slides and presentations recently: they talk more about their competitors' products than they do their own!

Perhaps if Intel had branched out into GPUs earlier than they did, while they still had mountains of money to do so, they could've leveraged the AI boom.

But what did they do? They spaffed billions up the wall on stupid acquisitions like fucking McAfee Antivirus.

Rather than swallowing their pride and using their manufacturing and CPU design skills to make ARM mobile CPUs, they made the stupid decision to pay phone manufacturers to use x86 CPUs that simply weren't efficient enough for mobile use. Needless to say, those all flopped hard.

They let their foundries and design teams rot to the extent that AMD, a dying company, was able to surpass them.

Now they're struggling to release CPUs without them being six months to a year late. Sometimes they don't release at all (where is desktop Meteor Lake, Intel?), and that's their bread-and-butter product, FFS! If Intel can't do CPUs, what can they do?!

Intel deserves everything they're getting right now. Them getting left behind in AI is a problem entirely of their own making.

[-] eager_eagle@lemmy.world 10 points 11 months ago

If anything, it should've been AMD. Intel is barely keeping up with the CPU competition these days.

[-] EvergreenGuru@lemmy.world 6 points 11 months ago

AMD dropped the ball when it came to software and has now split their GPU architecture so that only their enterprise cards target data science. NVIDIA got in early and made CUDA the default across all product lineups, so consumer cards could be used as entry-level cards by hobbyists. While it would've been nice to see more competition, the only company taking this space seriously has been NVIDIA.
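For a sense of what that hobbyist on-ramp looks like in practice: a toy kernel like the sketch below compiles with plain nvcc and runs unchanged on anything from a cheap consumer GeForce to a data-centre card. (An illustrative sketch, not code from the article or thread.)

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy CUDA kernel: each thread adds one pair of elements.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                 // ~1M elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    // Unified memory keeps the example short; works on any
    // reasonably modern CUDA GPU (Kepler or newer).
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // 256 threads per block, enough blocks to cover all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %.1f\n", c[0]);         // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

That "write once, run on any of our cards" property is the accessibility being described above, and it's what AMD's consumer lineup lacked for years.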

[-] hansl@lemmy.world 6 points 11 months ago

Not really. ATI were always "G is for graphics" and built video game cards. They never really saw the potential (nor did they have the resources anyway) for GPGPU, which is why NVIDIA had a huge first-mover advantage (CUDA is 16 years old, dating to two years before AMD acquired ATI). By the time AMD bought them, it was already very late.

Then AMD wanted to build cards for people to buy while NVIDIA was more than happy selling overpriced cards to crypto miners.

OpenCL was an ambitious project that was too big and too open for what the Khronos group was capable of delivering. Vulkan was too late.

Intel could have done it, but IIRC the CEO at the time (can't remember the name) didn't want to diversify their products after Itanium was a failure. They just doubled down on CPUs.

[-] eager_eagle@lemmy.world 4 points 11 months ago* (last edited 11 months ago)

They never really saw the potential (nor did they have the resources anyway) for GPGPU

Maybe ATI, which ended in 2010.

AMD launched ROCm in 2016, after the first AI boom of 2012 but before GANs and transformers exploded. In recent years they're better positioned than Intel ever was.

[-] TheGrandNagus@lemmy.world 1 points 11 months ago

Disagree. GCN cards were incredibly compute-focused.

Shit, AMD even invented HBM because they saw the value in ridiculously high-bandwidth, dense, energy-efficient memory for data centre applications. HBM is still used today in the enterprise market.

AMD's problem was that they had no money at the time and couldn't build out their software ecosystem like Nvidia could. They had to bank on just getting the ball rolling and open-sourcing their efforts in the hope that others would contribute, which didn't happen to the extent they'd have liked, especially when Nvidia, with its mountains of cash, could just pump out CUDA and flood universities with free GPUs to get them hooked on the Nvidia software stack.

[-] bruhduh@lemmy.world 1 points 11 months ago

Brother, the ATI Radeon HD 5000 series was basically a VLIW-architecture single-board computer in a GPU package, and that was before AMD bought Radeon.

this post was submitted on 20 Dec 2023
72 points (87.5% liked)
