330m is not much.
It's more than 0m
Which is more than -$1.00.
Maybe they can buy... three Nvidia GPUs and electricity to run them!
In that case, I'll let them get a discount on the one I'm selling... Now only $100m.
That's thousands of salaries for a year; not too bad for an unknown company. It's more than enough to produce something that can attract more funding. Many startups became successful with less funding.
$330m is not nothing. But with funding split between a telecom CEO and a shipping & logistics CEO, one has to wonder what sort of direction and tuning the team might be encouraged to explore. How will they stack up against existing, proven open-source non-profits with impressive releases, like EleutherAI?
These open-source projects are neat in that they give the average person the opportunity to peek under the hood of an LLM they'd never be able to run on consumer-level hardware. There are some interesting things to find, especially in the dataset snapshots that Eleuther made available.
In general, it's kind of cool to see France on the cutting edge of these things. And I think it's worth saluting any project that moves to decentralize power away from states and megacorps, who seal wonderful, powerful things in black boxes.
France is indeed on the cutting edge of AI: FAIR (the Facebook AI research lab) has a big office in Paris, and its head is Yann LeCun. So there are plenty of researchers getting trained on the state of the art.
Makes sense it'd be the French again. They pioneered the internet after all.
Ideally, they'd just blow the entire $330M training an LLM, and release the weights. In reality, much of that money will probably go into paying salaries, various smaller research projects, etc.
Ideally, they wouldn't be paying salaries? What?
The context is that LLMs need a big up-front capital expenditure to get started, because of the compute time needed to train these giant neural networks. This is a huge barrier to the development of a fully open-source LLM. Once such a foundation model is available, building on top of it is relatively cheap; one can then envision an explosion of open-source models targeting specific applications, which would be amazing.
So if the bulk of this €300M could go into training, it would go a long way to plugging the gap. But in reality, a lot of that sum is going to be dissipated into other expenses, so there's going to be a lot less than €300M for actual training.
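To illustrate the "building on top is cheaper" point: here's a minimal sketch of parameter-efficient fine-tuning with Hugging Face `transformers` and `peft`, assuming the open foundation model in question gets published on the Hub (the model name below is just a stand-in). Only a small set of adapter weights gets trained, which is why downstream work costs a tiny fraction of the original pre-training.

```python
# Minimal sketch: fine-tune small LoRA adapters on top of an existing open
# foundation model instead of training a full LLM from scratch.
# The model name is a stand-in; swap in whatever open weights actually get released.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "mistralai/Mistral-7B-v0.1"  # stand-in for a hypothetical open foundation model
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Attach low-rank adapters to the attention projections; only these are trained.
lora = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"],
                  task_type="CAUSAL_LM")
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # typically well under 1% of all parameters
```

That asymmetry is the whole argument: the €300M-scale cost is in producing the base weights; everything built on top of them can run on comparatively modest hardware.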
Is there any way we can decentralize the training of neural networks?
I recall something being released a while ago that let people use their computers for scientific computations. Couldn't something similar be done for training AI?
There is a project (AI Horde) that allows you to donate compute for inference. I'm not sure why the same doesn't exist for training. I think the RAM/VRAM requirements just can't be lowered/split.
Another way to contribute is by helping with training data. LAION, which created the dataset behind Stable Diffusion, is a volunteer effort. Stable Diffusion itself was developed at a tax-funded public university in Germany. However, the cost of the compute for training and so on was covered by a single rich guy.
Good luck training an LLM without any developers.
I hope they actually do, unlike "Open"AI
This is the best summary I could come up with:
This morning at Scaleway’s ai-PULSE conference, French billionaire and Iliad CEO Xavier Niel gave some extra details about his plans for an AI research lab based in Paris.
Six men took the stage this morning to talk about their previous work and what they have in mind for the research lab — Patrick Perez, Edouard Grave, Hervé Jegou, Laurent Mazaré, Neil Zeghidour and Alexandre Defossez.
Kyutai has also put together a team of scientific advisors who are well-known AI researchers — Yejin Choi, Yann LeCun and Bernhard Schölkopf.
“When it comes to the timeline, I don’t think our aim is necessarily to go as fast as Mistral, because our ambition is to provide a scientific purpose, an understanding and a code base to explain the results,” Defossez said at the press conference.
Macron also used this opportunity to define and defend France’s position on Europe’s AI Act, saying that use cases should be regulated, not model makers.
“It’s not a question of defining good models, but we need to ensure that the services made available to our citizens are safe for them, for other economic players and for our democracy,” Macron said.
The original article contains 905 words, the summary contains 192 words. Saved 79%. I'm a bot and I'm open source!
Please put a space between the link and parenthesis so the link doesn't break
There, I fixed the link. Sadly it's still in some weird arcane language, but never mind.
Smart. Even Google knows they can't compete with open-source models, since open-source development of AI models is much more optimized, and a model built to serve compliance requirements can't catch up with it.
So an open source model is their best way to leapfrog these giants.
nice :)
I would be happy to see the real story behind this kind of tech news. You know, where the real money will be.