45
submitted 1 year ago by flashgnash@lemm.ee to c/asklemmy@lemmy.ml

I have a theory that it should have a very different "personality" (probably more like writing style) depending on language because it's an entirely different set of training data

In English, ChatGPT is rather academic and has a recognisable style of writing; if you've used it a bit, you can usually tell something was written by it just by reading it.

Does it speak in a similar tone, with similar mannerisms, in other languages? (Where possible, obviously; some things don't translate.)

I don't know a second language well enough to have a natural conversation, so I'm unable to test this myself, and I may have worded things awkwardly from a lack of understanding.

[-] Newtra@pawb.social 46 points 1 year ago

In the two languages I'm learning, German and Chinese, I've found it to suffer from "translationese". It's grammatically correct, but the sentence structure and word choice feel like the answer was first written in English and then translated.

No single sentence is wrong, but overall it sounds unnatural and has none of the "flavor" of the language. That also makes it bad for learning: it avoids a lot of sentence patterns you'll see/hear in day-to-day life.

[-] flashgnash@lemm.ee 8 points 1 year ago

Curious. Maybe it was trained using existing translation tech rather than on actual examples of the language, like it was for English?

[-] relevants@feddit.de 24 points 1 year ago

As a native German speaker I agree that ChatGPT is very English-flavored. I think it's just because the sheer amount of English training data is so much larger that the patterns it learned from that bleed over into other languages. Traditional machine translations are also often pretty obvious in German, but they are more fundamentally wrong in a way that ChatGPT isn't.

It's also somewhat cultural. The output you get from ChatGPT often sounds overly verbose and downright ass-kissing in German, even though I know I wouldn't get that impression from the same output in English, simply because the way you communicate in professional environments is vastly different. (There is no German equivalent to "I hope this email finds you well", for example.)

[-] PlexSheep@feddit.de 1 points 1 year ago

"Ich hoffe, diese Nachricht erreicht Sie." Would work, but I haven't seen it used too. I also haven't seen the English version, but that makes sense, as I work for a German company.

[-] relevants@feddit.de 3 points 1 year ago

Yeah, I mean, you can translate it literally, but it means nothing. The English equivalent of what it communicates in German would be more like "I hope this email gets delivered to you," which is just a weird thing to say.

[-] MaggiWuerze@feddit.de 2 points 1 year ago

Wouldn't you just write "ich hoffe Ihnen geht es gut" ("I hope you are doing well") if you wanted to express concern about the other person's well-being?

[-] relevants@feddit.de 1 points 1 year ago

Yeah, but even that is stretching it for a work email unless there is a concrete reason you'd be concerned, like you know they're dealing with stuff. Otherwise, at least in my northern German circles, that's already getting pretty personal.

[-] MaggiWuerze@feddit.de 5 points 1 year ago

Yeah, as a fellow northern German I would never write that either. Maybe on a greeting card.

[-] TheGalacticVoid@lemm.ee 6 points 1 year ago

Doubt it. It was probably trained the most on English, and as a result, it applies English characteristics to other languages.

[-] CanadaPlus@futurology.today 2 points 1 year ago

There's a lot more English-language data to start with, so it was inevitable that they either did this or just trained it primarily in English.

[-] GiddyGap@lemm.ee 5 points 1 year ago

> No single sentence is wrong, but overall it sounds unnatural and has none of the "flavor" of the language.

I've also found that it's often contextually wrong. Like it doesn't know what's going on around it or how to interpret the previous paragraph or even the previous sentence, let alone the sentence two pages back that was actually relevant to the sentence it's now working on.

[-] JulyTheMonth@lemmy.ml 1 points 1 year ago

Well, probably because it doesn't know what's going on around it. It only knows the words. It can't interpret the words, only guess the most likely next word, one word at a time.
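
For a rough picture of what "guessing word by word" means, here is a minimal sketch of greedy next-word generation in Python. Everything in it is made up for illustration (the tiny vocabulary, the scores, and the fact that it only looks at the previous word); it is not how ChatGPT is actually implemented, just the shape of the loop: score candidate next words, pick one, append it, repeat.

```python
# Toy sketch of greedy next-word generation. The "model" is a hand-made
# lookup table mapping the previous word to scored candidate next words;
# nothing like a real language model's weights, only the loop's shape.
TOY_MODEL = {
    "hope": {"this": 0.6, "you": 0.3, "to": 0.1},
    "this": {"email": 0.5, "message": 0.4, "is": 0.1},
    "email": {"finds": 0.7, "reaches": 0.2, "is": 0.1},
    "finds": {"you": 0.8, "its": 0.2},
    "you": {"well": 0.5, "soon": 0.5},
}

def next_word(words):
    """Pick the highest-scoring candidate given only the previous word."""
    candidates = TOY_MODEL.get(words[-1])
    if not candidates:
        return None  # the toy model has nothing to say; stop generating
    return max(candidates, key=candidates.get)

def generate(prompt, max_new_words=5):
    words = prompt.split()
    for _ in range(max_new_words):
        word = next_word(words)
        if word is None:
            break
        words.append(word)  # no planning ahead, just append and continue
    return " ".join(words)

print(generate("I hope"))  # -> "I hope this email finds you well"
```

A real model conditions on its whole context window rather than just the previous word, and it samples from a probability distribution instead of always taking the top choice, but it still produces text one token at a time with no separate "understanding" step.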
