
cross-posted from: https://lemm.ee/post/53805638

[-] Grandwolf319@sh.itjust.works 147 points 3 weeks ago

It’s a reaction to thinking China has better AI, not thinking AI has less value.

[-] GreatAlbatross@feddit.uk 69 points 3 weeks ago

Or from the sounds of it, doing things more efficiently.
Fewer cycles required, less hardware required.

Maybe this was an inevitability: if you cut off access to the fast hardware, you create a natural advantage for more efficient systems.

[-] sugar_in_your_tea@sh.itjust.works 38 points 3 weeks ago

That's generally how tech goes though. You throw hardware at the problem until it works, and then you optimize it to run on laptops and eventually phones. Usually hardware improvements and software optimizations meet somewhere in the middle.

Look at photo and video editing, you used to need a workstation for that, and now you can get most of it on your phone. Surely AI is destined to follow the same path, with local models getting more and more robust until eventually the beefy cloud services are no longer required.

[-] jmcs@discuss.tchncs.de 43 points 3 weeks ago

The problem for American tech companies is that they didn't even try to move to stage 2.

OpenAI is hemorrhaging money even on its most expensive subscription, and its entire business plan was to hemorrhage money even faster, to the point of using entire power stations to run its data centers. The plan makes about as much sense as trying to dig yourself out of a hole by digging through to the other side of the globe.

[-] sugar_in_your_tea@sh.itjust.works 17 points 3 weeks ago

Hey, my friends and I would've made it to China if recess was a bit longer.

Seriously though, the goal for something like OpenAI shouldn't be to sell products to end customers, but to license models to companies that sell "solutions." I see these direct to consumer devices similarly to how GPU manufacturers see reference cards or how Valve sees the Steam Deck: they're a proof of concept for others to follow.

OpenAI should be looking to be more like ARM and less like Apple. If they do that, they might just grow into their valuation.

[-] theunknownmuncher@lemmy.world 38 points 3 weeks ago* (last edited 3 weeks ago)

China really has nothing to do with it, it could have been anyone. It's a reaction to realizing that GPT4-equivalent AI models are dramatically cheaper to train than previously thought.

It being China is a notable detail because it drives another nail into NVIDIA's coffin: China has been fenced off from NVIDIA's most expensive AI GPUs, which were thought to be required to pull this off.

It also makes the US government look extremely foolish for making major foreign-policy and relationship sacrifices to try to delay China by a few years. It's January and China has already caught up; those sacrifices did not pay off. In fact, they backfired: they benefited China, will allow it to accelerate, and have hurt US tech/AI companies.

[-] Redditsux@lemmy.world 5 points 3 weeks ago

Oh, the US has been doing this kind of thing for decades! This isn't new.

[-] golli@lemm.ee 27 points 3 weeks ago

It’s a reaction to thinking China has better AI

I don't think this is the primary reason behind Nvidia's drop. As long as they have a massive technological lead, it doesn't matter as much to them who has the best model, as long as those companies use Nvidia GPUs to train them.

The real change is that the compute resources (which are Nvidia's product) needed to create a great model suddenly fell off a cliff. Until now, the name of the game was that more is better and scale is everything.

China vs the West (or upstart vs big players) matters to those who are investing in creating those models. Take Meta, for example: they presumably spend a ton of money on highly paid engineers and data centers, and somehow got upstaged by someone with a fraction of their resources.

[-] YellowParenti@lemmy.wtf 11 points 3 weeks ago

...in a cave with Chinese knockoffs!

[-] sith@lemmy.zip 2 points 3 weeks ago

I really don't believe the technological lead is massive.

[-] golli@lemm.ee 1 points 3 weeks ago

Looking at the market cap of Nvidia vs their competitors, the market believes it is: they just lost more than AMD, Intel, and the likes are worth combined, and are still valued at around $2.9 trillion.

And by technology I mean both the performance of their hardware and the software stack they've created, which is a big part of their dominance.

[-] sith@lemmy.zip 2 points 3 weeks ago

Yeah. I don't believe market value is a great indicator in this case. In general, I would say that capital markets are rational at a macro level, but not micro. This is all speculation/gambling.

My guess is that AMD and Intel are at most a year behind Nvidia when it comes to the tech stack. "China", maybe two years, probably less.

However, if you can make chips with 80% of the performance at 10% of the price, it's a win. People can keep telling themselves that big tech will always buy the latest and greatest whatever the cost. That doesn't make it true; in fact, it hasn't been true for a long time. Google, Meta, and Amazon already make their own chips. That's probably true for DeepSeek as well.

[-] nieceandtows@lemmy.world 12 points 3 weeks ago

From what I understand, it's more that training your own LLMs with comparable capabilities now costs a lot less than paying to license one of the expensive ones. Somebody correct me if I'm wrong.

[-] Tramort@programming.dev 9 points 3 weeks ago
[-] CheeseNoodle@lemmy.world 4 points 3 weeks ago

I wouldn't be surprised if China spent more on actual AI development than the West did. Sure, here we spent tens of billions while China only invested a few million, but that few million was actually spent on development, while out of the tens of billions all but $5 went to bonuses and yachts.

[-] sirboozebum@lemmy.world 6 points 3 weeks ago

Does it still need people spending huge amounts of time to train models?

After doing neural networks, fuzzy logic, etc. in university, I really question the usability of what is called "AI" outside niche use cases.

[-] ayyy@sh.itjust.works 5 points 3 weeks ago

Ah, see, the mistake you’re making is actually understanding the topic at hand.

[-] wreckedcarzz@lemmy.world 1 points 3 weeks ago* (last edited 3 weeks ago)
```
If inputText = "hello" Then
    Respond.text("hello there")
ElseIf inputText (...)
```
[-] tburkhol@lemmy.world 3 points 3 weeks ago

Exactly. Galaxy brains on Wall Street are realizing that Nvidia's monopoly pricing power is coming to an end. This was inevitable: China has 4x as many workers as the US, trained in the best labs and best universities in the world, interning at the best companies, and then, because of racism, sent back to China. Blocking sales of Nvidia chips to China drives them to develop their own hardware rather than getting them hooked on Western hardware. China's AI may not be as efficient or as good as the West's right now, but it will be cheaper, and it will get better.

this post was submitted on 27 Jan 2025
804 points (98.1% liked)