[-] Action_Bastid@lemmy.world 148 points 1 year ago* (last edited 1 year ago)

I'm not terribly surprised. A lot of the major leaps we're seeing now came out of open source development after leaked builds got out. There were all sorts of articles flying around at the time about employees from various AI-focused companies saying they were watching people solve, in hours or days, problems they had been trying to fix for months.

Then they all freaked the fuck out that it might mean they would lose the AI race and locked down their repos as tight as Fort Knox, completely ignoring the fact that a lot of them were barely making any ground while they kept everything locked up.

Seems like the simple fact of the matter is that they need more eyes and hands on the tech, but nobody wants to do that because they're all afraid their competitors will benefit more than they will.

[-] goldenbug@kbin.social 47 points 1 year ago

You point out a very interesting issue. I'm unsure how this ties in with GPT-4 becoming worse at problem solving.

[-] Action_Bastid@lemmy.world 46 points 1 year ago* (last edited 1 year ago)

I'd wager they're attempting to replicate or integrate tools developed by the open source community, or ones that came out of Meta's Llama leak. The problem is, most of those were either built on the back of Meta's work or were kludged-together solutions that OSS nerds banged together for a specific use case, often without many of the protections a company would need when it wants to monetize the software and could be held liable for its results.

Now, the problem is that Meta's Llama code isn't based on GPT-4, so OpenAI has to reverse engineer a lot of those useful traits and tools and retrofit them into their pre-existing code. They're obviously hitting technical hurdles somewhere in that process, but I couldn't say exactly where or why.

[-] roguetrick@kbin.social 15 points 1 year ago* (last edited 1 year ago)

Personally, I think this is just a result of the same thing that's driving Reddit's behavior. Interest rates went up, and companies are finding ways to make up the shortfall that accounting is now presenting them with. Cutting compute costs by making your language model less introspective is one way to do that, and it's less detrimental than raising your prices or firing your key employees.

[-] teft@sh.itjust.works 21 points 1 year ago

Money and greed holding us back and ruining everything as always.

[-] Czeron@lemmy.world 3 points 1 year ago

Greed and stupidity!

[-] AustralianSimon@lemmy.world 9 points 1 year ago

The code might be open source, but the training material and data pipeline are important too.

[-] Steeve@lemmy.ca 4 points 1 year ago

Meta just fully open-sourced their AI model.
