They're going to try to put AI in everything.
(hexbear.net)
I don't think this LLM-in-everything trend is going to last very long. It's way too expensive for it to be in literally all consumer things. I can imagine it finding some success in B2B applications, but who is going to pay Logitech to pay OpenAI $30 per million tokens? (Lambda, for comparison, is $0.20 per 1M requests if you pay the public rate.)
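A rough back-of-envelope sketch of that comparison. The $30/1M-token and $0.20/1M-request figures come from the comment above; the tokens-per-interaction number is an assumption I'm making for illustration:

```python
# Back-of-envelope: per-token LLM pricing vs. per-request serverless pricing.
# Rates are from the comment above; tokens-per-interaction is an assumed value.

LLM_PRICE_PER_TOKEN = 30 / 1_000_000         # $30 per 1M tokens
LAMBDA_PRICE_PER_REQUEST = 0.20 / 1_000_000  # $0.20 per 1M requests (public rate)

TOKENS_PER_INTERACTION = 500  # assumed average prompt + response size

llm_cost = TOKENS_PER_INTERACTION * LLM_PRICE_PER_TOKEN
ratio = llm_cost / LAMBDA_PRICE_PER_REQUEST

print(f"LLM cost per interaction: ${llm_cost:.5f}")      # $0.01500
print(f"Lambda cost per request:  ${LAMBDA_PRICE_PER_REQUEST:.7f}")
print(f"LLM is ~{ratio:,.0f}x more expensive per call")  # ~75,000x
```

Obviously the two prices aren't measuring the same thing (a Lambda invocation still has compute and data costs on top), but even as a loose comparison the gap is several orders of magnitude per call.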
There will be another massive financial recession when it finally dawns on them this shit was never gonna make any fucking money for anyone
What's wild to me is that there are continuing mass layoffs in tech in the middle of a huge AI bubble. When it finally bursts, it's going to be utterly brutal.
Yeah, this feels like it should be obvious to everyone it’s a bubble now
The crypto bubble lasted a long time, and unlike it, AI actually does something (not anything useful, or terribly well, but something), so I expect the bubble will last a while yet.
Throwing unlimited money and resources at the "make customer support chat bots 3% better" technology while the world burns.
Throwing unlimited money and burning fossil fuels to do that which makes the world burn faster.
I disagree, because I think what will happen is that these companies won't use "AI" hosted in the cloud. Instead they'll ship some minimally functional model to users that runs on their GPU, and later NPU (as those become common), and engage in screen recording and data collection on you and everything your mouse clicks on.
Disabling AI/data collection will disable any mouse technology or feature implemented after 1999, because AI or something.
At this point, I think AI stands for “absolute intrusion” when it comes to consumer products.
I don't really see why they need AI for that but yes I imagine companies will want to deploy AI on user equipment. These aren't going to be nearly as sophisticated or useful as what can run in the cloud though.
That’s sort of the point. It’s not really that the AI is useful, it’s that it’s the next big unregulated and misunderstood thing.
Companies are using the idea of “training models” to harvest user data well beyond any reasonable scope, so they can sell it.
The breadth of information that's being openly collected under the guise of "AI" was unconscionable 10 years ago, and even 5 years ago folks would have been freaked out. Now businesses are pretending it's just a run-of-the-mill requirement to make their software work.
Case in point of how commodified our data is: Kaiser Permanente intentionally embedded tracking software in their site and now has to class the collected data as a user data breach. These products are likely from Google, Facebook, Adobe, Microsoft, or Salesforce. And they share the collected data, which can easily be de-anonymized, with their advertising partners, who share it with their partners, until it winds up in the database of a data broker. This has been known to be an issue for a while: Some Hospital Websites May Be Violating Privacy Rules By Sharing Data With Third-Party Trackers.
Anyway, sorry. Soapbox. I’ll put it away.