It's rude to show AI output to people | Alex Martsinovich
(distantprovince.by)
I’ve had these interactions with the head of my IT department. I asked him to procure a license for JFrog Artifactory. He literally copy/pasted a ChatGPT response to me that began like this:
It came with a bunch of theoretical risks that are completely resolved by simply not being a complete fucking moron.
It was really frustrating: I tried to talk with my IT leader, and instead found a proxy for ChatGPT.
After that, he created a group chat with him, me, and my colleagues in security. He proceeded to paste ChatGPT output outlining bullshit risks and theories, with the implicit expectation that I rhetorically address each of them in my own response. I’d explain things like,
In some cases, I even had to explain that the problems he was raising already exist in the current ecosystem, completely unrelated to the software I was proposing. ChatGPT was just straight up presenting an architectural problem as a risk of the software.
I’d reply, and I swear to god he’d just feed my text to ChatGPT and paste its reply back to me.
I lost a lot of respect for him. Why the fuck would you do that?
That guy to all his friends: "AI makes me 10x more productive!"
I'm fast coming to the conclusion that AI can indeed replace jobs. The thing is, the only job it can actually replace is that of a lazy middle manager. AI is great at responding to email if (a) you don't know what you're talking about, or (b) you don't respect the other person enough to spend the time formulating an actual response. AI, in my experience, is only really good at faking that there's someone on the other end. The fact that there's an entire management class it can convincingly impersonate is a pretty searing indictment as far as I'm concerned.