AI bros (fed.dyne.org)
submitted 3 months ago by jaromil@fed.dyne.org to c/memes@lemmy.ml
[-] peto@lemm.ee 66 points 3 months ago

Isn't the entire purpose of Copilot that it shouldn't need much in the way of training? I think the extent of it at my employer is "this is the one you use."

I've tried it a few times; the only thing it seems remotely good for is when your recollection of a source is too fuzzy to form a traditional search query around. "What's that book series I read in the early 2000s about kids who traveled to another world, and the things they brought back from it just looked like junk?" Those kinds of questions.

[-] Amanduh@lemm.ee 45 points 3 months ago

That's my favorite use of AI: remembering old-ass movies from my childhood that I only have fragments of memories of.

[-] Sc00ter@lemm.ee 17 points 3 months ago

This was our company too. They struck some sort of deal with OpenAI where we use the base ChatGPT model, but nothing we enter is fed back into their training. Feels like a pretty reasonable approach, in my opinion.

So our training was, "Use ours. Don't use anyone else's, because we don't want our proprietary information out there where it can never be scrubbed from the internet."

[-] Tar_alcaran@sh.itjust.works 6 points 3 months ago* (last edited 3 months ago)

It's pretty decent at unimportant optimisation tasks with limited options. Like "I'm driving from X to Y, my friend travels by train from Z, what are good places to pick them up?"

[-] ggppjj@lemmy.world 3 points 3 months ago

I'm a self-taught C# dev, and I've found tremendous success just describing what I want to do in dumb language that I'd feel stupid asking people about IRL and that isn't googleable without already knowing what the terms "null-coalescing" and "non-merchandise supergroup" are describing.
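
For anyone who hasn't run into the term, here's a minimal C# sketch (mine, not the commenter's) of what "null-coalescing" refers to; it's exactly the kind of thing that's hard to search for by description alone:

```csharp
// Minimal illustration of C#'s null-coalescing operators:
// `??` returns the left-hand value unless it is null, in which case it
// falls back to the right-hand value; `??=` assigns only when the target is null.
using System;

string? nickname = null;
string displayName = nickname ?? "Anonymous"; // falls back to "Anonymous"

nickname ??= "guest";                         // assigns because nickname was null
Console.WriteLine($"{displayName} / {nickname}"); // prints "Anonymous / guest"
```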

There are a lot of patterns that don't have obvious names and that aren't easy to describe without laying out a specific scenario in a way that might only make sense institutionally, or with additional context that your average person doesn't have. ChatGPT is fairly good at being the "buddy you have a bunch of in-jokes with who remembers things better than you". I can skip a lot of the explaining of why I need to do a thing a certain way, like I can with my coworkers (none of whom are programmers), and I can get helpful answers to programming questions that my coworkers don't know the answers to.

It's frustrating to see this incredibly advanced, context-aware autocorrect on steroids get used in ways that ignore what LLMs are actually great at. It's infuriating to have that potential actively misused and packaged as a service, and to have that mediocre service sold to you once a month as a necessity by idiots in suits watching a line on a chart.

[-] peto@lemm.ee 1 points 3 months ago

Yeah, this is much the same kind of use. If you work on the assumption that it's just something that has read everything, and everything that has been written about everything, you can find its utility. Folk want it to be some kind of fact genie, but the only facts it knows are which words go together, and it literally doesn't know the difference between real and made up.
