this post was submitted on 25 Dec 2025
141 points (86.5% liked)
Programming
Ok there's a whole lot of wtf going on here.
AI codebots in the cloud doing your code for you, cool, I guess.
So you need to watch them? And presumably intervene if necessary? Ok.
So then:
They decided that they'd stream a video of the AI codebots doing their thing.
At 40Mbps per stream.
For "enterprise use".
Where presumably they want lots of users.
And then they didn't know about locked-down enterprise internet and had to engineer a fallback to JPEG for when things aren't great for them. Newsflash: with streaming video peaking at 40 Mbps per user, things will never be great for your product in the real world.
How, in any way, does this scale to anything approaching success? Their back end now has to have the compute power to encode and serve up gigabits of streaming video for anything more than ~50 concurrent users, let alone the compute usage of the actual "useful" bit, the AI codebots.
For, say, 5 users out of a site of 200, IT departments will now see hundreds of megabits of streaming traffic - and if they're proactive, they will choke those endpoints to a crawl so that their pathetic uplink has a chance to serve the other 195 users.
All of this for a system that is fundamentally working on maybe 5kB of visible unicode text at any particular moment.
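A quick back-of-the-envelope check of the figures in the comment above (40 Mbps per stream, ~50 concurrent users, ~5 kB of visible text), just to make the disparity concrete:

```python
# Back-of-the-envelope bandwidth comparison using figures from the thread.
VIDEO_MBPS = 40          # peak per-user video stream (from the post)
TEXT_BYTES = 5 * 1024    # ~5 kB of visible Unicode text at any moment

users = 50
video_total_mbps = VIDEO_MBPS * users
print(f"{users} video streams: {video_total_mbps} Mbps "
      f"({video_total_mbps / 1000:.1f} Gbps)")

# Even re-sending the entire 5 kB text state 10 times per second:
text_mbps = TEXT_BYTES * 8 * 10 / 1_000_000
print(f"Full-text updates at 10 Hz: {text_mbps:.2f} Mbps per user")

print(f"Video is roughly {VIDEO_MBPS / text_mbps:,.0f}x heavier per user")
```

So 50 viewers of video is already ~2 Gbps of upstream, while naively re-broadcasting the whole text state would be well under half a megabit per user.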
My favorite line was (paraphrasing); "What if your video feed backs up and you're hung watching the AI code with 30s of lag, then BOOM, it's merged directly into main and your repo is toast."
Way to gloss over at least 3 layers of permission your agentic AI should absolutely never have been given. Surely the idea of sandboxing your AI models would demonstrate some awareness of the issue of automating AI permissions? Apparently not.
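A sketch of the kind of guardrail the comment above is implying (branch name and workflow are hypothetical; real enforcement, such as branch protection on main, is configured server-side on the forge):

```shell
set -e
# Hypothetical setup: a throwaway repo standing in for the real one.
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b main
git -c user.email=a@b -c user.name=ci commit -q --allow-empty -m init

# Guardrail: the agent only ever commits to its own scratch branch.
git switch -q -c agent/scratch-work
git -c user.email=a@b -c user.name=agent commit -q --allow-empty -m "agent change"

# main is untouched; landing the work requires a human-reviewed merge.
git rev-parse main
```

With this shape, a laggy video feed costs you nothing: the worst the agent can do unsupervised is pile commits onto its own branch.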
it's actually the customers' bandwidth, not theirs. everything is self-hosted by the customer.
not sure if that's worse
Quit reading at:
But your comment made me go back and look out of disbelief. How does a person get this far down a rabbit hole?
I don't know. Software engineering is tangential to my field but I have to wonder, is software efficiency even a consideration these days?
It seems that maybe a week of just solid thinking about what they have and what they need - WITHOUT touching a keyboard - could have put them in a better position. But move fast and break things doesn't seem to accommodate that kind of approach.
I think it's more of a "chatgpt, design my product for me" type situation.
Sometimes I wonder if people come up with the most inefficient application on purpose just to come up with a bandwidth heavy use case for some new communication tech (5G wireless, or 1Gb fiber, or..)
What a glorious future AI is heralding.
AI psychosis is a real thing, and it's apparently a lot easier to fall into these rabbit holes than most people think (unless, I suspect, like me, you have a thick foundation of rock-solid cynicism that the AI simply will never penetrate). This is probably another interesting example of it.
This is a really good point.
This post is a great example of what skipping research and just trusting the first solution you find leads to.
When you are researching the thing yourself, you usually don't find the solution immediately. And if you immediately have something that seems to work, you're even less likely to give up on that idea.
However, even taking this into account (because the same can probably happen even if you do research the thing yourself - jumping to the first solution), I don't understand how it's possible that the post doesn't make a single mention of any remote desktop protocols. I'm struggling to figure out how you would have to phrase your questions/prompts/research so that VNC/RDP - you know, the tools made for exactly the problem they are trying to solve - does not come up even once during your development.
Like, every single search I've tried about this problem has immediately led me to RDP/VNC. The only way I can explain the ignorance displayed in the post is that they ignored it on purpose - lacking any real knowledge about the problem they are trying to solve, they simply jumped to "we'll have a 60 FPS HD stream!", and their problem statement was never "how to do low-bandwidth remote desktop/video sharing", but "how to stream 60 FPS low-latency desktop".
It's mind-boggling. I'd love to see the thought and development process that was behind this abomination.
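For contrast, a minimal sketch of the text-first approach the commenters keep pointing at (function name is hypothetical; this just uses the stdlib's difflib): stream unified diffs of the agent's visible text between snapshots, rather than pixels.

```python
import difflib

# Hypothetical alternative: ship only the textual change between two
# snapshots of the agent's output, not a video frame of it.
def text_delta(prev: str, curr: str) -> str:
    """Return a unified diff between two text snapshots."""
    return "".join(difflib.unified_diff(
        prev.splitlines(keepends=True),
        curr.splitlines(keepends=True),
    ))

prev = "def add(a, b):\n    return a + b\n"
curr = "def add(a, b):\n    # TODO: validate inputs\n    return a + b\n"

delta = text_delta(prev, curr)
print(f"delta payload: {len(delta)} bytes")
```

A delta like this is measured in bytes; a single 60 FPS HD frame is measured in megabits, and the watcher's client can reconstruct the full state by applying deltas in order.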
Do we know each other or something :).
Honestly great comment, couldn’t agree more.
You are right but this comment gives me iPod introduction feelings. That company will be huge in some years.