Meanwhile, some new details emerged about the days leading up to Altman's firing. "In the weeks leading up to his shocking ouster from OpenAI, Sam Altman was actively working to raise billions from some of the world's largest investors for a new chip venture," Bloomberg reported. Altman reportedly was traveling in the Middle East to raise money for "an AI-focused chip company" that would compete against Nvidia.
As Bloomberg wrote, "The board and Altman had differences of opinion on AI safety, the speed of development of the technology and the commercialization of the company, according to a person familiar with the matter. Altman's ambitions and side ventures added complexity to an already strained relationship with the board."
"According to people familiar with the board's thinking, members had grown so untrusting of Altman that they felt it necessary to double-check nearly everything he told them," the WSJ report said. The sources said it wasn't a single incident that led to the firing "but a consistent, slow erosion of trust over time that made them increasingly uneasy." "Also complicating matters were Altman's mounting list of outside AI-related ventures, which raised questions for the board about how OpenAI's technology or intellectual property could be used."
When I Google an issue, I quickly get a list of possible solutions, with other developers commenting on them with corrections. People can often upvote and downvote answers to indicate whether they work and whether they have stopped working.
With AI I get a single source of information without the equivalent of peer review. The answer may be out of date, and it may misunderstand my request. It may also make the same mistake I am making, one that I would have caught with a quick Googling.
The AI may occasionally be able to produce boilerplate code without too much rework, but boilerplate code is not that hard to write in the first place.
The AI is massively more expensive to run than a search engine, and I have not seen any indication that will change soon. This is the biggest problem in my mind. I don't ever expect to have to pay for Google. I expect that in the future the AI will need to be paid for somehow, and I have a feeling providers will have to charge too much to justify using AI for software development work.
AI has plenty of good uses, but I do not believe software development is the winner. The hash-chained data structures behind blockchain, for instance, are massively useful in git repositories, but blockchain itself was not useful for many of the crazy things companies attempted to use it for.
If you use Bing's search AI, it cites its sources. It basically does what you would do yourself when looking through sources and ratings, but when you find the info you want, you can click the link it used to generate the answer.
It's also free, I believe.
Right now, AI like that is heavily subsidized by investors. My concern about AI's feasibility is that training is so expensive that it won't be able to stay free. Remember, training can only stop once the field itself stops developing. Also, if the AI can source its answer with a link, has it really provided me with a new service that is better than a search engine?
Yes, because you have your answer and further reading if needed.
Rather than having to read through search results and figure out which are relevant.