submitted 12 hours ago by Ivysaur@hexbear.net to c/technology@hexbear.net

...or at least the kind of model that isn't fueled by burning a small forest for every query? I want to play an old video game, but I'm still only just learning the language and could use an aid. I really, really want to avoid any of this incredibly wasteful AI stuff.

[-] lime@feddit.nu 2 points 6 hours ago

i knew this would happen. everything's AI now, so everything must be bad. translation utilities have used machine learning for years, long before gpts got useful, and translation doesn't have anywhere near the power requirements of generative stuff. also, for the record, it's not using generative ai that's wasteful, it's training the model, and the models are already trained.

but anyway, it sounds like you want to run something on your own machine. there are tools for this. some of them are engine-specific, others work through video capture. RPG Maker and various visual novel engines, for example, have tons of them, where you just run the game using the tool and it replaces all the text. there are similar things for emulators.

is that what you want, or do you want to use the game more as a language learning aid, so the original text stays up? in that case the OCR route is probably better, using something like translumo or ugt. they show the translation in a separate window.
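if you're curious what those OCR tools are doing under the hood, it's roughly this. quick python sketch, not what translumo or ugt actually run: it assumes pillow, pytesseract with the japanese language pack, and argostranslate with a ja→en model installed, and the screen coordinates are made up.

```python
# rough sketch: grab a screen region, OCR it, translate locally, print the result.
# assumes pillow, pytesseract (plus a tesseract install with jpn data) and
# argostranslate with a ja->en model downloaded. REGION is a made-up example.
import time

import pytesseract
import argostranslate.translate
from PIL import ImageGrab

REGION = (100, 600, 1180, 720)  # (left, top, right, bottom) of the dialogue box

last_seen = ""
while True:
    shot = ImageGrab.grab(bbox=REGION)                 # capture just the text box
    text = pytesseract.image_to_string(shot, lang="jpn").strip()
    if text and text != last_seen:                     # only translate new lines
        last_seen = text
        print(text)
        print(argostranslate.translate.translate(text, "ja", "en"))
    time.sleep(0.5)                                    # poll a couple of times a second
```

the real tools add better text detection, hotkeys and an overlay window, but the capture → OCR → local translate loop is the whole idea.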

[-] Awoo@hexbear.net 1 point 5 hours ago

> where you just run the game using the tool and it replaces all the text. there are similar things for emulators.

These are typically locally running LLMs that do the translating as you go along and then cache it on the device. That's why there's a delay before the text is replaced; the delay is shorter on higher-end machines that process it faster.
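Roughly that pattern, as a sketch. The names here are hypothetical, and translate_line() is just a stand-in for whatever local model a given tool actually bundles:

```python
# minimal sketch of the translate-then-cache pattern: the first time a line of
# dialogue appears you pay the cost of running the local model (the visible
# delay), after that it's just a lookup from a file on disk.
import json
from pathlib import Path

CACHE_FILE = Path("translation_cache.json")
cache = json.loads(CACHE_FILE.read_text(encoding="utf-8")) if CACHE_FILE.exists() else {}

def translate_line(text: str) -> str:
    # hypothetical stand-in: this is where the slow local MT / LLM call would go
    return f"[translated] {text}"

def get_translation(text: str) -> str:
    if text not in cache:                  # first encounter: run the model
        cache[text] = translate_line(text)
        CACHE_FILE.write_text(json.dumps(cache, ensure_ascii=False), encoding="utf-8")
    return cache[text]                     # repeats come straight from the cache
```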

[-] lime@feddit.nu 2 points 5 hours ago

well, not typically llms. these tools have been around longer than the term.

besides, i don't see why it matters? energy-wise, the problem isn't the tech, it's the immense scale it's deployed on in order to be instantly available to millions of people. running a translator locally is unlikely to show up on your electric bill if you play any games on your computer.
