[-] waspentalive@lemmy.one 24 points 1 year ago

Every time I have asked ChatGPT to code something, it seems to lose the thread halfway through and starts giving nonsensical code. I asked it to do something simple in HP-41C calculator code and it invented functions out of whole cloth.

[-] averagedrunk@lemmy.ml 8 points 1 year ago

I asked it for something in PowerShell and it did the same thing. When I asked how it came up with that function, it said the function doesn't exist, but if it did, that's how it would work.

[-] CloverSi@lemmy.comfysnug.space 4 points 1 year ago

Quality of output depends a lot on how common the code is in its training data. I would guess it's best at something like Python, given the wealth of teaching materials and examples out there.

[-] Cethin@lemmy.zip 5 points 1 year ago

It depends on how common the language is and how novel the idea is. It cannot create something new. It isn't creative. It spits out what is predictable based on what other people have written before. It isn't intelligent. It's glorified auto-complete.

[-] Ubermeisters@discuss.online 3 points 1 year ago* (last edited 1 year ago)

When it starts going off the rails like that, I also ask it to "check its work when it's done," and that seems to extend the amount of usable time before it loses the plot and suggests I use VBA or something.

this post was submitted on 21 Sep 2023
527 points (97.0% liked)

Programmer Humor
