46
submitted 2 months ago by corbin@infosec.pub to c/technology@lemmy.world
[-] corbin@infosec.pub -1 points 2 months ago* (last edited 2 months ago)

I wouldn't really trust Ed Zitron's math analysis when he gets a very simple thing like "there is no real AI adoption" plainly wrong. The financials of OpenAI and other AI-heavy companies are murky, but most tech startups run at a loss for a long time before they either turn a profit or get acquired. It took Uber over a decade to stop losing money every quarter.

OpenAI keeps getting more funding capital because (A) venture capital guys are pretty dumb, and (B) they can easily ramp up advertisements once the free money runs out. Microsoft has already experimented with ads and sponsored products in chatbot messages, and ChatGPT will probably do something similar.

[-] JeremyHuntQW12@lemmy.world 6 points 2 months ago

I wouldn’t really trust Ed Zitron’s math analysis when he gets a very simple thing like “there is no real AI adoption” plainly wrong

Except he doesn't say that. The author of this article simply made that up.

There is a high usage rate (almost entirely ChatGPT, btw, despite all the money sunk into AI by others like Google), but it's all the free stuff, and they are losing bucketloads of money at a rate that is rapidly accelerating.

but most tech startups run at a loss for a long time before they either turn a profit or get acquired.

There is no path to profitability.

[-] corbin@infosec.pub 0 points 2 months ago

I wrote the article, Ed said that in the linked blog post: "There Is No Real AI Adoption, Nor Is There Any Significant Revenue - As I wrote earlier in the year, there is really no significant adoption of generative AI services or products."

There is a pretty clear path to profitability, or at least to much lower losses. A lot more phones, tablets, computers, etc. now ship with GPUs or other hardware optimized for running small LLMs/SLMs, and both the large and small models are becoming more efficient. With both of those trends, a lot of the current uses for AI will move to on-device processing (this is already a thing with Apple Intelligence and Gemini Nano), and the tasks that still need a cloud server will be more efficient and consume less power.
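
To put rough numbers on the on-device point: a model's weight memory is approximately parameter count times bytes per weight, which is why quantized ~3B-parameter SLMs fit in a phone's RAM while full-precision ones don't. A back-of-envelope sketch (the 3B figure is illustrative, not the exact size of any named product):

```python
def model_footprint_gib(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight memory: params * (bits / 8) bytes, expressed in GiB."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 2**30

# A ~3B-parameter SLM at 16-bit precision vs. 4-bit quantization:
fp16 = model_footprint_gib(3, 16)  # ~5.6 GiB: too big for most phones
q4 = model_footprint_gib(3, 4)     # ~1.4 GiB: fits alongside apps in 8 GiB RAM
```

Same parameter count, a quarter of the memory and bandwidth, which is the basic reason on-device inference keeps getting more practical.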

[-] meowgenau@programming.dev 1 points 2 months ago

a lot of the current uses for AI will move to on-device processing

How exactly will that make OpenAI and the likes more profitable?! That should be one of the scenarios that will make them less profitable.

this post was submitted on 16 Aug 2025
46 points (97.9% liked)