submitted 1 year ago by floofloof@lemmy.ca to c/technology@lemmy.ml
[-] LemmysMum@lemmy.world 5 points 1 year ago

I can read a copyrighted work and create a new work from the experience and knowledge gained. At what point is what I'm doing any different from what the AI does?

[-] mkhoury@lemmy.ca 4 points 1 year ago

For one thing: when you do it, you're the only one who can express that experience and knowledge. When the AI does it, everyone can express that experience and knowledge. It's kind of like the difference between artisanal and industrial: there's a big difference of scale, and that has a great impact on the livelihood of the creators.

[-] LemmysMum@lemmy.world 3 points 1 year ago

Yes, it's wonderful. Knowledge might finally become free with the advent of AI tools, and we might finally see the death of the copyright system. Oh, how we can dream.

[-] Phanatik@kbin.social 0 points 1 year ago

I'm not sure what you mean by this. Information has always been free if you look hard enough. With the advent of the internet, you're able to connect with people who possess this information and you're likely to find it for free on YouTube or other websites.

Copyright exists to protect against plagiarism and theft (in an ideal world). I understand the frustration that comes with archaic laws and the fact that updates to them move at a glacial pace; however, the death of copyright would harm more people than you're expecting.

Piracy has existed as long as the internet has. Companies have complained ceaselessly about lost profits, but once LLMs came along, they were suddenly fine with piracy as long as it's masked behind a glorified search algorithm. They're fine with cutting jobs and replacing them with an LLM that produces lower-quality output at significantly cheaper rates.

[-] LemmysMum@lemmy.world 2 points 1 year ago

> Information has always been free if you look hard enough. With the advent of the internet, you're able to connect with people who possess this information and you're likely to find it for free on YouTube or other websites.

And with the advent of AI we no longer have to look hard.

[-] Phanatik@kbin.social 2 points 1 year ago

For one thing, you can do the task completely unprompted; the LLM has to be told what to do. You have an idea in your head of the task you want to achieve and how you want to go about doing it, and the output is unique because it's determined by your perceptions. The LLM doesn't really have perceptions, it has probabilities. It has broken down the outputs of human creativity into numbers and is attempting to replicate them.
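
To make the "probabilities, not perceptions" point concrete, here is a minimal toy sketch in Python. The candidate tokens and scores are invented purely for illustration (a real LLM scores tens of thousands of vocabulary tokens with a neural network); the sketch only shows what next-token prediction reduces to: turning scores into a probability distribution and sampling from it.

```python
import math
import random

# Toy illustration only: the "logits" below are made-up numbers standing in for
# what a real model would compute over its entire vocabulary.
logits = {"paintbrush": 2.1, "keyboard": 1.3, "banana": -0.5}

def softmax(scores):
    """Convert raw scores into a probability distribution."""
    exps = {tok: math.exp(s) for tok, s in scores.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

probs = softmax(logits)
next_token = random.choices(list(probs), weights=list(probs.values()))[0]

print(probs)       # roughly {'paintbrush': 0.66, 'keyboard': 0.29, 'banana': 0.05}
print(next_token)  # one continuation, chosen by probability rather than perception
```

The only point of the sketch is that the model's "choice" is a draw from a distribution learned from training text, not an experience the model has had.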

[-] LemmysMum@lemmy.world -1 points 1 year ago* (last edited 1 year ago)

The AI does have perceptions, fed into it by us as inputs. I give the AI my perceptions, it creates a facsimile, and I adjust the perceptions I feed it until I receive an output that meets my requirements. That's no different from doing it myself, except that I didn't need to read all the books and learn all the lessons myself. I still tailor the end product, just not at the same micro scale that was traditionally needed.
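
The adjust-until-it-fits workflow described here can be sketched as a simple loop. Note that `generate` and `meets_requirements` below are hypothetical placeholders for whatever AI tool and acceptance criteria the user actually has; this is a sketch of the workflow, not a real API.

```python
# Sketch of the refine-the-inputs loop described above.
# `generate` and `meets_requirements` are hypothetical stand-ins, not a real API.

def generate(prompt: str) -> str:
    """Placeholder for a call to whatever AI tool is being used."""
    raise NotImplementedError("swap in the actual model/tool call here")

def meets_requirements(output: str) -> bool:
    """Placeholder for the user's own judgement of the result."""
    raise NotImplementedError("swap in the actual acceptance check here")

def refine(initial_prompt: str, max_rounds: int = 10) -> str:
    prompt = initial_prompt
    output = ""
    for _ in range(max_rounds):
        output = generate(prompt)
        if meets_requirements(output):
            break  # the user accepts the result
        # In practice the user rewrites the prompt by hand at this point;
        # the appended note is just a stand-in for that adjustment.
        prompt += "\nPlease adjust: closer to what I described."
    return output
```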

[-] Phanatik@kbin.social 1 points 1 year ago

You can't feed it perceptions any more than you can feed me your perceptions. You give it text, and the quality of the output is determined by how the LLM has been trained to understand that text. If by feeding it perceptions you mean what it's trained on, I have to remind you that the reality GPT is trained on is the one dictated by the internet, with all of its biases. The internet is not a reflection of reality; it's how many people escape from reality and share information, and it's highly subject to survivorship bias. If information doesn't appear on the internet, GPT is unaware of it.

To give an example: if GPT gives you a bad output and you tell it so, it will apologise. This seems smart, but it isn't really. It doesn't actually feel remorse; it's giving a predetermined response based on what it has understood from your text.

[-] LemmysMum@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

We're not talking about perceptions as in making an AI literally perceive anything. I can feed you prompts and ideas of my own and get an output no different than if I were using AI tools; the difference is that AI tools have already gathered the collective knowledge you'd get from, say, doing a course in Photoshop, taking an art class, reading an encyclopaedia or a novel, going to school for music theory, etc.

[-] Phanatik@kbin.social 1 points 1 year ago

I get that part, but I think what gets taken more seriously is how "human" the responses seem, which is a testament to how good the LLM is. But that's set dressing when GPT has been known to give incorrect, outdated or contradictory answers. Not always, but unless you know what kind of answer to expect, you have to verify what it's telling you, which means you'll spend half the time fact-checking the LLM.

[-] LemmysMum@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Exactly. How is the end result not that of the user, if they need to craft, modify, adjust and manipulate the prompts, inputs and outputs of the AI to produce something new or coherent?

It's just a tool. A tool that will improve access to human knowledge and improve each individual's ability to create and produce more complex works with less effort. Each of those works will feed back into the algorithm, expanding the knowledge and capacity of both AI and human ingenuity.

[-] BraveSirZaphod@kbin.social 2 points 1 year ago* (last edited 1 year ago)

There is a practical difference in the time required and the sheer scale of output in the AI context that makes a very material difference to the actual societal impact, so it's not unreasonable to consider treating it differently.

Set up a lemonade stand on a random street corner and you'll probably be left alone unless you have a particularly Karen-dominated municipal government. Try to set up a thousand lemonade stands in every American city, and you're probably going to start to attract some negative attention. The scale of an activity is a relevant factor in how society views it.
