
The subjects that you can't even bring up without getting downvoted, banned, fired, expelled, cancelled etc.

[-] Mothra@mander.xyz 14 points 3 days ago

"I've asked ChatGPT about xyz" , and "how to use chatGPT for xyz" in my experience gets me downvotes fast.

People are quick to presume you have no ability to fact-check anything and that you'll follow its advice blindly instead of ever asking a human (for example about medical conditions, though not limited to that topic), even though advice was never what you were asking for in the first place. People presume you're trying to cut the human factor out of the equation completely and are quick to remind you of your sins. God forbid you ever use a chatbot to test ideas, ask for a summary of a topic so you can expand your research later, or get creative with it in any way. If you do, most people don't want to know.

[-] BurgerPunk@hexbear.net 17 points 3 days ago

I think the bigger problem is that each answer it gives basically destroys a forest

[-] MonkeMischief@lemmy.today 4 points 2 days ago

To be fair: "For each answer it gives", nah. You can run a model on your home computer even. It might not be so bad if we just had an established model and asked it questions.
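(For the curious, here's a rough sketch of what "running a model on your home computer" can look like, using the Hugging Face transformers library; distilgpt2 is just an illustrative small model, and a larger one would need more RAM or VRAM.)

```python
# Minimal local text generation with Hugging Face transformers.
# distilgpt2 is only an illustrative small model; any locally
# downloaded model can be swapped in.
# pip install transformers torch
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

prompt = "Running a language model locally means"
result = generator(prompt, max_new_tokens=50, do_sample=True)

print(result[0]["generated_text"])
```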

The "forest destroying" is really in training those models.

Of course at this point I guess it's just semantics, because as long as it gets used, those companies are gonna be non-stop training those stupid models until they've created a barren wasteland and there's nothing left....

So yeah, overall pretty destructive and it sucks...

[-] wuphysics87@lemmy.ml 1 points 2 days ago* (last edited 2 days ago)

Training a model takes more power than what? Generating a single poem? Using it to generate an entire 4th grade class's essays? Answering every question asked in Hawaii for 6 months? What is the scale? The break-even point for training is far, far less than total usage.

Have you ever used one locally? Depending on your hardware, it's anywhere from glacially slow to morgue's-AC slow. To the average person on the average computer it's nearly unusable compared to the instant gratification of the web interface.

That gives you a sense of the resources required to do the task at all, but it doesn't scale linearly. Two computers aren't twice as fast as one; the scaling is closer to logarithmic, with diminishing returns. In the end, this means one 100-word response uses the equivalent of about 3 bottles of water.

How many queries are made per hour? How does that add up over time with continued usage of the same model? To more than training the model cost. A lot more.
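(A rough back-of-the-envelope sketch of that break-even argument. The numbers are widely cited but approximate public estimates, used here only for illustration: on the order of 1,300 MWh to train a GPT-3-class model and roughly 0.3 Wh per chat query, with the daily query volume being a purely hypothetical figure.)

```python
# Back-of-the-envelope: when does cumulative inference energy
# overtake one-time training energy? All numbers are rough
# public estimates / assumptions, for illustration only.

TRAINING_ENERGY_MWH = 1_300          # ~GPT-3-scale training run (assumed)
ENERGY_PER_QUERY_WH = 0.3            # per chat response (assumed)
QUERIES_PER_DAY = 1_000_000_000      # hypothetical usage of a popular service

training_wh = TRAINING_ENERGY_MWH * 1_000_000   # MWh -> Wh
breakeven_queries = training_wh / ENERGY_PER_QUERY_WH
days_to_breakeven = breakeven_queries / QUERIES_PER_DAY

print(f"Break-even after ~{breakeven_queries:,.0f} queries")
print(f"At {QUERIES_PER_DAY:,} queries/day, that's ~{days_to_breakeven:.1f} days")
```

Under those assumed numbers the one-time training cost is amortized within days, after which ongoing inference dominates total energy use, which is the commenter's point about scale.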

[-] Mothra@mander.xyz 2 points 3 days ago

Okay, that's a valid point and one that so far nobody has come up with. Congrats
