It will boil down to open-source tooling and corporate data sets. It's the same pattern as everything else, but these people never learn their lesson. The fad will fade, most people will use free tools, and corporate interests will pay the bigger players for their models and datasets.
So do you think the centralisation of AI "power" will come down to who has the most/best data? I was just reading Nexus, where Yuval Harari suggests that power centralises around whoever controls the most resources (data centers) and the data itself. Wondering what your take is, or whether I'm not getting it.
The problem for all the investor-funded AI companies is that data centers are a huge cost. They're burning through billions of dollars every month. That only makes sense if one or two of them emerge as dominant players who own most of the market share for future AI businesses.
If they all keep undercutting each other with open-source releases, it's more likely companies like OpenAI will crash and burn first.
It really does feel like a trillion-dollar investor game of chicken, where some investor or company has to admit defeat at some point.
The companies that are illegally training on copyrighted data and can keep moving forward with an obfuscated dataset will hang in there. The ones who can't, or who get sued into oblivion, will eventually just get acquired or give up. If "centralized" means anything in this arena, it's the generalized training data, yes.
Think of it like this: all that companies want these "AI" platforms for is to make their own data more easily parsed and accessible, right? The ones that have engineering resources may be paying for OpenAI now, but once the tooling on the FOSS side is a bit more complete, don't you think these customers of OpenAI would rather just host their own model and run it on their own data (see the sketch below)? That's where things are already shifting.
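To make that shift concrete, here's a minimal sketch of what "hosting their own" can look like once the FOSS tooling is in place, assuming an open-weight model served locally through an OpenAI-compatible endpoint (the kind of API that servers like vLLM or llama.cpp's server expose). The base URL, model name, and prompt below are placeholders, not anything from this thread:

```python
# Hypothetical sketch: querying a self-hosted open-weight model through an
# OpenAI-compatible API served on your own hardware. The base URL and model
# name are placeholders for whatever you actually deploy.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # your own server, not OpenAI's
    api_key="not-needed-for-local",       # local servers typically ignore this
)

response = client.chat.completions.create(
    model="my-org/self-hosted-model",  # placeholder model identifier
    messages=[
        {"role": "user", "content": "Summarise this internal document: ..."},
    ],
)
print(response.choices[0].message.content)
```

The point being: the client code barely changes, only the endpoint does, which is exactly why paying customers could migrate off OpenAI once the open tooling matures.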
This all follows a pattern that has happened time and time again. Last decade it was all the stupid "smart" assistant craze (those are all dead, btw), and now it's this stupid thing. Nothing new to see here.
Thanks for further elaborating. I think I get your point a bit better now.