submitted 11 months ago by misk@sopuli.xyz to c/technology@lemmy.world
[-] Mahlzeit@feddit.de 5 points 11 months ago

Took 'em long enough.

I wonder if they used ChatGPT to create any of the training data.

[-] eager_eagle@lemmy.world 4 points 11 months ago

The transformer architecture GPT is based on came from Google. I'm sure the delay has more to do with Google trying to mitigate liability issues that arise with large scale general public usage, and letting OpenAI "test the waters" first.

[-] Mahlzeit@feddit.de 0 points 11 months ago

Quite possible. Whatever the case, they apparently saw no pressure to innovate. It implies that tech development is being slowed down by the Big Tech monopolies.

[-] eager_eagle@lemmy.world 1 points 11 months ago

Lack of innovation in what, exactly? They're all exploring new things if you look.

[-] Mahlzeit@feddit.de 0 points 11 months ago

It just seems that Google should have been able to move faster. Yes, they did publish a lot of important stuff, but seeing the splash that came from Stability and OpenAI, they seem to have done so little with it. What their researchers published was important, but I can't help thinking that a public university would have disseminated such research more openly and widely. Well, I may be wrong. I don't have inside knowledge.

[-] theherk@lemmy.world 2 points 11 months ago

Not likely. They may have tested it as an adversarial feedback tool, but it would be much more accurate and efficient to get the source data rather than paying OpenAI for maybe-correct information.

They did, I believe, trick ChatGPT into exposing some of its training data, though it was only a few hundred MB.

[-] Mahlzeit@feddit.de 1 points 11 months ago* (last edited 11 months ago)

For the fine-tuning stage at the end, where you turn it into a chatbot, you need specific training data (e.g., OpenOrca). People have used ChatGPT to generate such data. Come to think of it, if you use Mechanical Turk, then you almost certainly include text from ChatGPT.
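Such fine-tuning datasets are basically lists of instruction/response records. A minimal sketch of what one record looks like, using OpenOrca's published field names (`system_prompt`, `question`, `response`); the prompt template below is a hypothetical example, not any model's actual chat format:

```python
def format_example(record):
    """Flatten one instruction record into a single training string.

    The "### System/User/Assistant" template is an illustrative
    convention; real fine-tuning pipelines each use their own template.
    """
    return (
        f"### System:\n{record['system_prompt']}\n\n"
        f"### User:\n{record['question']}\n\n"
        f"### Assistant:\n{record['response']}"
    )

# One OpenOrca-style record; the response field is exactly the kind of
# text that may have been generated by ChatGPT in the first place.
record = {
    "system_prompt": "You are a helpful assistant.",
    "question": "What is the capital of Finland?",
    "response": "The capital of Finland is Helsinki.",
}

text = format_example(record)
```

The point is that the `response` strings are the expensive part — which is why people generate them with an existing model rather than paying human annotators.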

[-] theherk@lemmy.world 1 points 11 months ago

Yes, it could be done that way, and maybe GPT models were used, but calling these APIs isn't free, and there are plenty of open (and surely internal) models that could be used for that purpose.

this post was submitted on 06 Dec 2023