submitted 4 months ago by mr_MADAFAKA@lemmy.ml to c/pcgaming@lemmy.ca
[-] bouldering_barista@lemmy.world 21 points 4 months ago

Who in the heck are the 16%?

[-] Honytawk@lemmy.zip 16 points 4 months ago
  • The ones who have investments in AI

  • The ones who listen to the marketing

  • The ones who are big Weird Al fans

  • The ones who didn't understand the question

[-] Glytch@lemmy.world 10 points 4 months ago

I would pay for Weird-Al enhanced PC hardware.

[-] quicksand@lemmy.world 5 points 4 months ago

Those Weird Al fans will be very disappointed

[-] desktop_user@lemmy.blahaj.zone 1 points 4 months ago

  • The nerds that care about privacy but want chatbots or better autocomplete
[-] barfplanet@lemmy.world 5 points 4 months ago

I'm interested in hardware that can better run local models. Right now the best bet is a GPU, but I'd be interested in a laptop with dedicated AI chips that work with PyTorch. I'm a novice, but I know it takes forever on my current laptop.

Not interested in running copilot better though.
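
For what it's worth, a minimal sketch of how PyTorch picks up whatever accelerator a machine actually exposes (assuming a recent PyTorch build; the device names and the tiny stand-in model are just placeholders, not anything specific to a vendor's "AI chip"):

```python
import torch

def pick_device() -> torch.device:
    """Return the best locally available device for inference."""
    if torch.cuda.is_available():                  # NVIDIA (or ROCm-built) GPU
        return torch.device("cuda")
    mps = getattr(torch.backends, "mps", None)     # Apple Silicon GPU, if present
    if mps is not None and mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")                     # fallback: still works, just slowly

device = pick_device()
model = torch.nn.Linear(16, 4).to(device)          # tiny stand-in for a real model
x = torch.randn(8, 16, device=device)
print(device, model(x).shape)
```

Dedicated NPUs generally need their own backend or runtime on top of this, which is exactly the gap the comment above is pointing at.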

[-] x0x7@lemmy.world 3 points 4 months ago* (last edited 4 months ago)

Maybe people doing AI development who want the option of running local models.

But baking AI into all consumer hardware is dumb. Very few want it. SaaS AI is a thing. To the degree SaaS AI doesn't offer the privacy of local AI, networked "local" AI on devices you don't fully control offers even less. So it makes no sense for people who value convenience, and it offers no value for people who want privacy. It only offers value to people doing software development who need more playground options, and I can go buy a graphics card myself, thank you very much.

[-] 31337@sh.itjust.works 1 points 3 months ago

I would if the hardware were powerful enough to do interesting or useful things, and there were software that did interesting or useful things. Like, I'd rather run a model locally to remove backgrounds from images or to upscale them than send the images to Adobe's servers (this is just an example; I don't use Adobe products and don't know whether that's how Adobe does it). I'd also rather do OCR locally and quickly than send it to a server. Same with translations. There are a lot of use-cases for "AI" models.
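
As an illustration of the local-OCR case, a minimal sketch using pytesseract (my choice for the example, not something the commenter named; it needs the Tesseract binary plus the pytesseract and Pillow packages, and "scan.png" is a placeholder filename):

```python
from PIL import Image          # Pillow, for loading the image
import pytesseract             # thin wrapper around the local Tesseract binary

# Everything runs on-device: the image never leaves the machine.
text = pytesseract.image_to_string(Image.open("scan.png"), lang="eng")
print(text)
```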

this post was submitted on 17 Jul 2024
690 points (99.0% liked)
