[-] pyr0ball@reddthat.com 10 points 1 day ago

The pricing question assumes the current model (cloud inference, centralized compute, hyperscaler margins) is the only model.

Local inference flips that math entirely. If the model runs on your hardware, the marginal cost to the provider is close to zero. The pricing problem is a distribution problem, not a compute problem.

What I think actually happens: cloud AI settles at $20-50/month for power users who need the latest frontier models and don't want to manage hardware. That's sustainable. The "free tier" disappears or gets severely throttled.

But for a large chunk of use cases (summarization, classification, drafting, local assistants) models small enough to run on a consumer GPU are already good enough. That market doesn't need to pay $50/month to Anthropic. It needs a good local runner and a one-time hardware investment.
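To put rough numbers on that trade-off, here's a back-of-the-envelope break-even sketch. All figures are hypothetical placeholders (a $600 used GPU, a $50/month subscription), not real prices:

```python
# Break-even point for a one-time local-hardware buy vs. a recurring
# cloud subscription. Figures below are illustrative, not real prices.

def breakeven_months(hardware_cost: float, monthly_subscription: float) -> float:
    """Months until the one-time hardware cost is paid off relative to
    the subscription it replaces."""
    return hardware_cost / monthly_subscription

# e.g. a $600 consumer GPU vs. a $50/month frontier-model plan
months = breakeven_months(600, 50)
print(f"break-even after {months:.0f} months")  # 12 months
```

The point isn't the exact numbers; it's that the local option has a fixed cost that amortizes, while the cloud option's cost is open-ended.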

The companies that will survive the pricing correction are the ones who either have genuinely differentiated frontier capability, or who make local deployment easy enough that users own their own stack.

[-] HobbitFoot@thelemmy.club 2 points 1 day ago

There are also going to be issues with how bleeding-edge AI gets sold. If AI that can reliably detect security exploits is real, its owner isn't going to sell open access to that model.

I suspect that, if the AI is really that good for certain tasks, it won't get sold on a token model but something more akin to human work.

[-] nullify3112@lemmy.world 1 point 45 minutes ago* (last edited 45 minutes ago)

I have created a machine

It can know all your secrets

I will sell your own secrets back to you

Because I own my machine and it now owns you too

this post was submitted on 23 Apr 2026
66 points (97.1% liked)

Asklemmy

54075 readers
435 users here now

A loosely moderated place to ask open-ended questions

If your post meets the following criteria, it's welcome here!

  1. Open-ended question
  2. Not offensive: at this point, we do not have the bandwidth to moderate overtly political discussions. Assume best intent and be excellent to each other.
  3. Not regarding the use of or support for Lemmy: for context, see the list of support communities and tools for finding communities below
  4. Not ad nauseam inducing: please make sure it is a question that would be new to most members
  5. An actual topic of discussion

Looking for support?

Looking for a community?

Icon by @Double_A@discuss.tchncs.de

founded 7 years ago