Stack Overflow in freefall: 78 percent drop in number of questions
(www.techzine.eu)
I don’t think so. All AI needs now is formal specs of some technical subject, not even human readable docs, let alone translations to other languages. In some ways, this is really beautiful.
Technical specs don't capture the bugs, edge cases and workarounds needed for technical subjects like software.
I can only speak for myself, obviously, and my context here is some very recent and very extensive experience applying AI to new software developed internally in the org where I participate. So far, AI has eliminated any need for assistance with understanding it, and it was definitely not trained on this particular software. It's hard to imagine why I'd ever go to SO to ask questions about this software, even if I could. And if it works this well on such a tiny edge case, I can't imagine it will do a bad job on something used at scale.
If we go by personal experience: we recently had the time of several people wasted troubleshooting an issue with a very well-known commercial Java app server. The AI overview hallucinated a fake system property for addressing the issue we had.
The person who proposed the change neglected to mention they got it from AI until someone noticed the setting didn't appear anywhere in the official system properties documented by the vendor. Now their reputation is that they can't be trusted, and they look lazy on top of it, because they couldn't use their own eyes to read a one-page document.
Lol no, AI can't do a single thing without humans who have already done it hundreds of thousands of times feeding it their data
I used to push back, but now I just ignore it when people assume these models have cognition because companies have pushed so hard to call them "AI".
The whole point of StackExchange is that it contained everything that isn't in the docs.
It can't handle things it's not trained on very well, or at least not anything substantially different from what it was trained on.
It can usually apply rules it was trained on to a small corpus within its training data: "give me a list of female YA authors" works fine. But ask it something outside that pattern (how many R's there are in certain words) and it often fails.
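For what it's worth, the letter-counting task is trivial as code, which is part of why the failure is usually blamed on tokenization (the model sees tokens, not characters) rather than missing knowledge. A minimal sketch; the example words are just illustrative:

```python
# Counting a letter's occurrences is a one-liner in code -- the point
# being that when an LLM gets this wrong, it's not a hard problem,
# it's that the model operates on tokens rather than characters.
def count_letter(word: str, letter: str) -> int:
    return word.lower().count(letter.lower())

for w in ["strawberry", "raspberry", "referrer"]:
    print(w, count_letter(w, "r"))
# strawberry 3
# raspberry 3
# referrer 4
```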