submitted 6 months ago by misk@sopuli.xyz to c/hardware@lemmy.ml
[-] suburban_hillbilly@lemmy.ml 28 points 6 months ago

Bet you a tenner that within a couple of years they start using these systems as distributed processing for their in-house AI training to subsidize costs.

[-] 8ender@lemmy.world 6 points 6 months ago

That was my first thought. Server-side LLMs are extraordinarily expensive to run. Download the costs to users.
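The scheme being speculated about here is essentially federated learning: each user's device trains on its own local data and only sends model updates back to the server, which averages them. A minimal sketch of that averaging step (the toy one-parameter model and all names are illustrative, not any vendor's actual system):

```python
# Minimal sketch of federated averaging (FedAvg-style), the kind of scheme
# that could push training compute onto users' NPUs.
# Everything here is illustrative -- a toy scalar model, not a real LLM.

def local_update(weights, data, lr=0.1):
    """Each device nudges the shared weights using only its own data.
    The 'model' is a single scalar fitted to the data by squared error."""
    w = weights
    for x in data:
        grad = 2 * (w - x)   # gradient of (w - x)^2
        w -= lr * grad
    return w

def federated_average(weights, device_datasets):
    """Server averages the locally trained weights; raw user data never
    leaves the devices, only the updated parameters do."""
    updates = [local_update(weights, d) for d in device_datasets]
    return sum(updates) / len(updates)

# Three hypothetical 'devices', each holding private data around 4.0.
devices = [[3.9, 4.1], [4.0, 4.2], [3.8, 4.0]]
w = 0.0
for _round in range(50):
    w = federated_average(w, devices)
print(round(w, 1))  # converges toward the overall data mean, ~4.0
```

The point of the design is exactly what the comments suggest: the expensive gradient computation runs on user hardware, and the server only pays for aggregation.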

this post was submitted on 20 May 2024
47 points (92.7% liked)
