submitted 3 weeks ago* (last edited 3 weeks ago) by federalreverse@feddit.org to c/nottheonion@lemmy.world

The lawsuit says the Hingham High School student handbook did not include a restriction on the use of AI.

"They told us our son cheated on a paper, which is not what happened," Jennifer Harris told WCVB. "They basically punished him for a rule that doesn't exist."


cross-posted from: https://lemmy.zip/post/24633700

Case file: https://storage.courtlistener.com/recap/gov.uscourts.mad.275605/gov.uscourts.mad.275605.8.0.pdf
Case file: https://storage.courtlistener.com/recap/gov.uscourts.mad.275605/gov.uscourts.mad.275605.13.0.pdf

[-] Humanius@lemmy.world 62 points 3 weeks ago* (last edited 3 weeks ago)

I'm guessing they have rules against plagiarism, or passing off other people's work as your own.
So it would come down to whether using AI (without disclosure?) counts as plagiarism or not

[-] 667@lemmy.radio 55 points 3 weeks ago* (last edited 3 weeks ago)

Most of the larger LLM providers state that the model's output generated from a user's prompt belongs intellectually to the user.

It’s a massive grey area, and the sum of these kinds of cases is what will define ownership of LLM output for the next ~50 years.

Don’t get me wrong, kid absolutely did not comply with the spirit of the assignment.

E: @Blue_Morpho@lemmy.world makes an excellent point:

If the student hired someone to write their essay and the author assigned all copyrights to the student, it's still plagiarism.

Who legally owns the work isn't the issue with plagiarism.

[-] BananaTrifleViolin@lemmy.world 24 points 3 weeks ago

The LLMs can claim whatever they like, it holds no weight or value. They are basically advanced plagiarism engines and the law has already made it clear you cannot copyright the output of an LLM.

This particular case will go nowhere, but there are plenty of legal cases between content creators and AI makers that are slowly moving through the legal system that will go somewhere.

[-] hedgehog@ttrpg.network 3 points 3 weeks ago

the law has already made it clear you cannot copyright the output of an LLM.

That’s true in this context and often true generally, but it’s not completely true. The Copyright Office has made it clear that the use of AI tools has to be evaluated on a case-by-case basis, to determine if a work is the result of human creativity. Refer to https://www.copyright.gov/ai/ai_policy_guidance.pdf for more details.

For example, they state that the selection and arrangement of AI outputs may be sufficient for a work to be copyrightable. And that’s without doing any post-processing of the AI’s outputs.

They don’t talk about situations like this, but I suspect that, if given a prompt like “Rewrite this paragraph from third person to first person,” where the paragraph in question is copyrighted, the output would maintain the same copyright as the input (particularly if performed faithfully and without hallucinations). Such a revision could be made with non-LLM technology, after all.
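To illustrate that last point, a crude third-to-first-person rewrite really can be done without any generative model. Here's a minimal, hypothetical Python sketch (the pronoun table and function name are my own invention; real grammar handling is far messier than this):

```python
import re

# Hypothetical rule-based (non-LLM) third-to-first-person rewriter.
# Naively maps a few third-person pronouns to first-person equivalents.
PRONOUN_MAP = {
    "she": "I", "he": "I",
    "her": "my", "his": "my",
    "herself": "myself", "himself": "myself",
}

def to_first_person(text: str) -> str:
    def swap(match: re.Match) -> str:
        word = match.group(0)
        repl = PRONOUN_MAP[word.lower()]
        # Preserve capitalization at sentence starts.
        return repl.capitalize() if word[0].isupper() else repl

    # \b boundaries keep "her" from matching inside "herself", etc.
    pattern = re.compile(r"\b(" + "|".join(PRONOUN_MAP) + r")\b", re.IGNORECASE)
    return pattern.sub(swap, text)

print(to_first_person("She packed her bag herself."))  # I packed my bag myself.
```

This breaks immediately on real prose (e.g. "her" can mean "my" or "me" depending on grammatical role), which is exactly why such rewrites are tedious by hand, but the point stands: the transformation itself is mechanical, not generative.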

[-] Flax_vert@feddit.uk 2 points 3 weeks ago

So who owns the copyright then? Is the output just public domain?

[-] Blue_Morpho@lemmy.world 20 points 3 weeks ago* (last edited 3 weeks ago)

It doesn't matter what the LLM license states. Replace the LLM with a person doing exactly what the LLM does and ask yourself if it is plagiarism.

If I do your homework for you and say, "Because you prompted me with the questions, the answers belong to you," that isn't a free 'get out of plagiarism' card for you. What I tell you isn't relevant.

It's not gray at all.

Edit: that's weird. I got a personal message but the reply showed up here.

[-] Blue_Morpho@lemmy.world 10 points 3 weeks ago* (last edited 3 weeks ago)

If the student hired someone to write their essay and the author assigned all copyrights to the student, it's still plagiarism.

Who legally owns the work isn't the issue with plagiarism.

[-] spankmonkey@lemmy.world 9 points 3 weeks ago* (last edited 3 weeks ago)

Most of the larger LLMs state the results of the model stemming from the user’s prompt intellectually belong to the user.

Who cares what they say to avoid being sued for copyright infringement?

[-] saltesc@lemmy.world 18 points 3 weeks ago

I sometimes use an LLM to "tidy up" my work and paste a bunch of writing in to see if it comes up with anything better. Some parts it will, others it won't, and I'll use or tweak some of it. I wonder if that counts? It's all my work going in, but it's using other people's work to make adjustments.

[-] Blue_Morpho@lemmy.world 17 points 3 weeks ago

Replace the LLM with a person. If it were a person editing your work, would that make it plagiarism?

A common proofreading technique is to give your work to another person to read and make comments. That's not plagiarism.

[-] Saik0Shinigami@lemmy.saik0.com 6 points 3 weeks ago

People who proofread generally only make recommendations to edit. LLMs often "rewrite" the vast majority of the document.

If I tell a person who's my editor the concept of my paper and about 20-30% of the actual content that's in the end paper... sounds like someone else wrote the paper to me.

It's all up to how you're using the tool. Lots of kids out there will simply tell ChatGPT to write something for them. Others will simply ask for basic proofreading. It's a bitch to tell the difference on the grading side.

[-] Blue_Morpho@lemmy.world 2 points 3 weeks ago

Yes, that's exactly my opinion on the subject. ( I realize this is a contentless reply but I didn't want you to think I downvoted you.)

[-] Saik0Shinigami@lemmy.saik0.com 1 points 3 weeks ago

I didn’t want you to think I downvoted you.

I'm admin on my small instance. I can see the votes. No worries. In this case the downvote is from xektop@lemmy.world.

Anyway, the most I ever use LLMs professionally for is to help rearrange content for better flow or maybe convert more rambly bits into something that's concise. I tend to be more verbose than I need to be (mostly because my documentation for stuff is wildly verbose since I tend to forget stuff, which is great for documentation... not always great for talking through something for a client).

[-] dharmacurious@slrpnk.net 3 points 3 weeks ago

I write my own papers, but will put paragraphs through an LLM and ask it how they can be improved (normally Grammarly's 'AI'), and sometimes I take its advice, but half the time I dislike what it's done. Sometimes I give it a bunch of information on what I need to write, and it'll spit something out, and then I'll sort of use it as a skeleton for my paper, but to be honest, it's kind of shit, regardless of which one I've tried. And it lies. So much.

[-] BonerMan@ani.social -1 points 3 weeks ago

But those rules don't apply here.

this post was submitted on 22 Oct 2024
120 points (89.0% liked)