[-] QuadratureSurfer@lemmy.world 47 points 1 month ago

Someone just got the AWS bill.

[-] crunchy@lemmy.dbzer0.com 19 points 1 month ago

That's got to be it. Cloud compute is expensive when you're not being funded in Azure credits. Once the dust settles from the AI bubble bursting, most of the AI we'll see will probably be specialized agents running small models locally.

[-] fmstrat@lemmy.nowsci.com 6 points 1 month ago

I'm still running Qwen32b-coder on a Mac mini. Works great, a little slow, but fine.

[-] And009@lemmynsfw.com 2 points 1 month ago

I'm somewhat tech savvy. How do I run an LLM locally? Any suggestions? How do I know if my local data is safe?

[-] Llak@lemmy.world 3 points 1 month ago

Check out LM Studio https://lmstudio.ai/ and pair it with the Continue extension for VS Code https://docs.continue.dev/getting-started/overview.

[-] Retro_unlimited@lemmy.world 1 points 1 month ago

I have been using a program called GPT4All, which lets you download many models and run them locally. At startup it asks whether you want to share data or not. I select no and use it offline anyway.
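For reference, LM Studio (like several similar apps) can expose an OpenAI-compatible HTTP server on localhost, so prompts never leave your machine. A minimal sketch, assuming LM Studio's default endpoint (`http://localhost:1234/v1`); the model name here is a placeholder, so check the app's server tab for your actual values:

```python
import json
import urllib.request

def build_chat_request(model, prompt, temperature=0.7):
    """Build a chat payload for any OpenAI-compatible local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def send_chat_request(base_url, payload):
    # Everything stays on localhost, so no data leaves your machine.
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# Example usage (requires a model loaded in LM Studio's local server):
#   payload = build_chat_request("qwen2.5-coder-32b", "Write a haiku about RAM.")
#   print(send_chat_request("http://localhost:1234/v1", payload))
```

Since the server binds to localhost, you can confirm with a firewall log, or by simply unplugging the network, that nothing is phoning home.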

[-] Ulrich@feddit.org 15 points 1 month ago* (last edited 1 month ago)

Ah they're learning from the "unlimited" mobile carriers.

"Unlimited" until you meet your limit, then throttled.

[-] Jesusaurus@lemmy.world 14 points 1 month ago

Sounds like chargeback territory

[-] Admax@lemmy.world 11 points 1 month ago

Hopefully (?) this is the start of a trend, and people will begin to realize that these products are not worth their price, and that AI is an overhyped mess made to hook users before exploiting them...

[-] cley_faye@lemmy.world 1 points 1 month ago

The whole industry is projecting something like negative $200B for next year. They know it's not worth the price.

[-] Glitchvid@lemmy.world 6 points 1 month ago

Imagine the price hikes when they need to recoup the hundreds of billions they've poured into these models, datacenters, and electricity.

[-] napkin2020@sh.itjust.works 4 points 1 month ago
[-] TrumpetX@programming.dev 4 points 1 month ago

Well shit, I've been on vacation, and I signed up with Cursor a month ago. Not allowed at work, but for side projects at home in an effort to "see what all the fuss is about".

So far the experience has been rock solid, but I assume when I get home I'll be unpleasantly surprised.

Has anyone here had rate limiting hit them?

[-] errer@lemmy.world 5 points 1 month ago* (last edited 1 month ago)

I’ve primarily used claude-4-sonnet in Cursor and was surprised to see a message telling me it would start costing extra above and beyond my subscription. This was prolly after 100 queries or so. However, switching to “auto” instead of a specific model continues to not cost anything, and that still uses claude-4-sonnet when it thinks it needs to. The main difference I’ve noticed is that it’s actually faster, because it’ll sometimes hit cheaper/dumber APIs to address simple code changes.

It’s a nice toy that does improve my productivity quite a bit and the $20/month is the right price for me, but I have no loyalty and will drop them without delay if it becomes unusable. That hasn’t happened yet.

[-] br3d@lemmy.world -5 points 1 month ago
[-] errer@lemmy.world 4 points 1 month ago

I mean yeah? I wasn’t counting in detail, it’s an estimate.

Previously you got 500 requests a month and then it’d start charging you, even on “auto.” So the current charging scheme seems to be encouraging auto use so they can use cheaper LLMs when they make sense (honestly a good thing).
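That kind of cost-aware routing can be sketched in a few lines. This is a toy illustration, not Cursor's actual algorithm; the model names, per-call costs, and difficulty heuristic below are all made up:

```python
# Toy sketch of cost-aware model routing, loosely in the spirit of "auto" mode.
# Ordered cheapest-first; names and numbers are invented for illustration.
MODELS = [
    {"name": "cheap-small",     "cost_per_call": 0.001, "max_difficulty": 3},
    {"name": "mid-tier",        "cost_per_call": 0.01,  "max_difficulty": 6},
    {"name": "claude-4-sonnet", "cost_per_call": 0.05,  "max_difficulty": 10},
]

def estimate_difficulty(prompt):
    # Crude heuristic: longer prompts and certain keywords imply harder tasks.
    score = min(len(prompt) // 200, 5)
    if any(k in prompt.lower() for k in ("refactor", "architecture", "debug")):
        score += 4
    return min(score, 10)

def route(prompt):
    # Pick the cheapest model rated for the task's estimated difficulty.
    difficulty = estimate_difficulty(prompt)
    for model in MODELS:
        if model["max_difficulty"] >= difficulty:
            return model["name"]
    return MODELS[-1]["name"]
```

A real router would classify the request with a model rather than keyword matching, but the economics are the same: simple edits go to cheap endpoints, and only genuinely hard tasks pay for the expensive one.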

[-] br3d@lemmy.world -5 points 1 month ago

I was questioning the use of the word "prolly"

[-] brutalist@lemmy.world 6 points 1 month ago

Nah, you should find a new bone to pick.

[-] naught@sh.itjust.works 4 points 1 month ago

it means "probably" 🤗

[-] SreudianFlip@sh.itjust.works 0 points 1 month ago

In the English language, specifically North American dialects, this is a form of idiom.

[-] Confused_Emus@lemmy.dbzer0.com 0 points 1 month ago

That’s not an idiom, it’s just an elided word.

[-] SreudianFlip@sh.itjust.works 0 points 1 month ago* (last edited 1 month ago)

Well, we can argue over the niceties of the word idiom, but since it refers to the way the word is pronounced in specific regions of North America, it qualifies as meeting one of the definitions of idiom.

~~Elision refers more to the absence of an understood word, such as saying 'my bad'.~~

My bad, elision can also refer to slurring syllables together, so it's both.

[-] Confused_Emus@lemmy.dbzer0.com 0 points 1 month ago* (last edited 1 month ago)

An elision is the absence of a sound or syllable in a word. An idiom is an entire phrase or expression that does not mean what it literally says.

There’s no argument here, you’re just wrong.

No, it isn't both.

[-] SreudianFlip@sh.itjust.works 0 points 1 month ago

I dunno, cf. 1.b definition of idiom in the OED: dialect usage, and 2.a is dialect usage for effect. Maybe the definition is changing with the ages, or your usage is overly strict.

[-] Confused_Emus@lemmy.dbzer0.com 0 points 1 month ago* (last edited 1 month ago)

Idiom. Elide. It's really not that confusing. Idioms are about meaning, elision is about sound.

[-] SreudianFlip@sh.itjust.works -1 points 1 month ago

Hm, I guess an encyclopedia article is more relevant than a dictionary definition, so sure. I was using the looser secondary definition... in this case, an elision that references a dialect in order to call up regional relevance to the opinion expressed.

[-] axEl7fB5@lemmy.cafe 2 points 1 month ago

Common People

this post was submitted on 06 Jul 2025
94 points (97.0% liked)

Technology
