905
submitted 1 year ago by L4s@lemmy.world to c/technology@lemmy.world

College professors are going back to paper exams and handwritten essays to fight students using ChatGPT::The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

top 50 comments
[-] AlmightySnoo@lemmy.world 131 points 1 year ago* (last edited 1 year ago)

I think that's actually a good idea? Sucks for e-learning as a whole, but I always found online exams (and also online interviews) to be very easy to game.

[-] Goodtoknow@lemmy.ca 80 points 1 year ago

Really sucks for people with disabilities and handwriting issues.

[-] Mugmoor@lemmy.dbzer0.com 73 points 1 year ago

It's always sucked for them, and it always will. That's why we make accommodations for them, like extra time or a smaller/more private exam hall.

[-] Instigate@aussie.zone 19 points 1 year ago

And readers/scribes! I’ve read and scribed for a friend who had dyslexia in one of her exams and it worked really well. She finished the exam with time to spare and got a distinction in the subject!

[-] Tavarin@lemmy.ca 13 points 1 year ago

Yep, my girlfriend acted as a scribe for disabled students at a university. She loved it, and the students were able to complete their written work and courses just fine as a result.

[-] Naia@lemmy.blahaj.zone 19 points 1 year ago

My handwriting has always been terrible. It was a big issue in school until I was able to turn in printed assignments.

Like with a lot of school things, they do a shit thing without thinking about negative effects. They always want a simple solution to a complex problem.

[-] Tavarin@lemmy.ca 12 points 1 year ago

My uni just had people with handwriting issues do the exam in a separate room with a scribe they could narrate their answers to.

People have been going to universities for millennia before the advent of computers, we have lots of ways to help people with disabilities that don't require computers.

[-] HexesofVexes@lemmy.world 127 points 1 year ago

Prof here - take a look at it from our side.

Our job is to evaluate YOUR ability, and AI is a great way to mask poor ability. We have no way to determine whether you did the work or an AI did, and if called into court to certify your expertise we could not do so beyond a reasonable doubt.

I am not arguing exams are perfect, mind, but I'd rather doubt a few students' inability (maybe it was just a bad exam for them) than always doubt their ability (is any of this their own work?).

Case in point: ALL students on my course with low (<60%) attendance this year scored 70s and 80s on the coursework and 10s and 20s in the OPEN BOOK exam. I doubt those 70s and 80s are a real reflection of those students' ability, but they do suggest the students can obfuscate AI work well.

[-] mwguy@infosec.pub 76 points 1 year ago

They're about to find out that gen Z has horrible penmanship.

[-] Holyginz@lemmy.world 25 points 1 year ago

Millennial here, haven't had to seriously write out anything consistently in decades at this point. There's no way their handwriting can be worse than mine and still be legible lol.

[-] crwcomposer@lemmy.world 17 points 1 year ago* (last edited 1 year ago)

As a millennial with gen Z teens, theirs is worse, though somehow not illegible, lol. They just write like literal 6 year olds.

[-] MaggiWuerze@feddit.de 70 points 1 year ago

has led some college professors to reconsider their lesson plans for the upcoming fall semester.

I'm sure they'll write exams that actually require an understanding of the material rather than regurgitating the seminar PowerPoint presentations as accurately as possible...

No? I'm shocked!

[-] OhNoMoreLemmy@lemmy.ml 48 points 1 year ago

We get in trouble if we fail everyone because we made them do a novel synthesis, instead of just repeating what we told them.

Particularly for an intro course, remembering what you were told is good enough.

[-] zigmus64@lemmy.world 22 points 1 year ago

The first step to understanding the material is exactly that: remembering what the teacher told them.

[-] aulin@lemmy.world 59 points 1 year ago

There are places where analog exams went away? I'd say Sweden has always been at the forefront of technology, but our exams were always pen-and-paper.

[-] neptune@dmv.social 48 points 1 year ago

This isn't exactly novel. Some professors allow a cheat sheet. But that just means that the exam will be harder.

Physics exam that allows a cheat sheet asks you to derive the law of gravity. Well, OK, you write the answer at the bottom, pulled from your cheat sheet. Now what? If you recall how it was originally derived, you probably write Newton's three laws at the top of your paper... and then start doing some math.
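
(For instance, here's a rough sketch of one textbook route, assuming circular orbits and taking Kepler's third law as given, with the constants hand-waved:)

$$a = \frac{v^2}{r}, \qquad v = \frac{2\pi r}{T} \;\Rightarrow\; a = \frac{4\pi^2 r}{T^2}, \qquad T^2 \propto r^3 \;\Rightarrow\; a \propto \frac{1}{r^2}$$

$$F = ma \;\Rightarrow\; F \propto \frac{m}{r^2}, \qquad \text{and symmetry between the two masses gives} \quad F = G\,\frac{Mm}{r^2}$$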

Calculus exam that lets you use Wolfram Alpha? Just a really hard exam where you must show all of your work.

Now, with ChatGPT, it's no longer enough to have a take-home essay to force students to engage with the material, so you find new ways to do so. Written, in-person essays are certainly one way to do that.

[-] Rozz@lemmy.sdf.org 47 points 1 year ago

Am I wrong in thinking students can still generate an essay and then copy it by hand?

[-] CrimsonFlash@lemmy.ca 56 points 1 year ago

Not during class. Most likely a proctored exam. No laptops, no phones, teacher or proctor watching.

[-] Mugmoor@lemmy.dbzer0.com 38 points 1 year ago

When I was in College for Computer Programming (about 6 years ago) I had to write all my exams on paper, including code. This isn't exactly a new development.

[-] whatisallthis@lemm.ee 27 points 1 year ago

So what you’re telling me is that written tests have, in fact, existed before?

What are you some kind of education historian?

[-] UsernameIsTooLon@lemmy.world 34 points 1 year ago

You can still have AI write the paper and then copy it out by hand. If anything, this will make AI harder to detect, because it's now AI + human error during the transcription process rather than straight copying and pasting.

[-] Four_lights77@lemm.ee 33 points 1 year ago

This thinking just feels like moving in the wrong direction. As an elementary teacher, I know that by next year all my assessments need to be practical or interview based. LLMs are here to stay and the quicker we learn to work with them the better off students will be.

[-] pinkdrunkenelephants@sopuli.xyz 25 points 1 year ago

And forget about having any sort of integrity or explaining to kids why it's important for them to know how to do shit themselves instead of being wholly dependent on corporate proprietary software whose accessibility can and will be manipulated to serve the ruling class on a whim 🤦

[-] Not_Alec_Baldwin@lemmy.world 21 points 1 year ago* (last edited 1 year ago)

It's insane talking to people that don't do math.

You ask them any mundane question and they just shrug, and if you press them they pull out their phone to check.

It's important that we do math so that we develop a sense of numeracy. By the same token it's important that we write because it teaches us to organize our thoughts and communicate.

These tools will destroy the quality of education for the students who need it the most if we don't figure out how to rein in their use.

If you want to plug your quarterly data into GPT to generate a projection report I couldn't care less. But for your 8th grade paper on black holes, write it your damn self.

[-] joel_feila@lemmy.world 28 points 1 year ago

Well, if I go back to school now I'm fucked, I can't read my own handwriting.

[-] TimewornTraveler@lemm.ee 28 points 1 year ago

Can we just go back to calling this shit Algorithms and stop pretending it's actually Artificial Intelligence?

[-] WackyTabbacy42069@reddthat.com 36 points 1 year ago

It actually is artificial intelligence. What are you even arguing against, man?

Machine learning is a subset of AI and neural networks are a subset of machine learning. Saying an LLM (based on neural networks doing prediction) isn't AI because you don't like it is like saying rock and roll isn't music.
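
Just to pin down what "prediction" means there, here's a toy sketch (everything in it is made up for illustration: a tiny bigram lookup table stands in for the neural network, whereas a real LLM learns its next-token probabilities with billions of parameters, but the generation loop is the same idea):

```python
import random

# Hypothetical bigram "model": next-token probabilities. In a real LLM these
# come from a trained neural network over a huge vocabulary; this table is invented.
model = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"ran": 0.8, "sat": 0.2},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def generate(token, steps=4):
    out = [token]
    for _ in range(steps):
        probs = model.get(out[-1])
        if not probs:
            break  # no known continuation for this token, stop generating
        # sample the next token according to the predicted probabilities
        out.append(random.choices(list(probs), weights=list(probs.values()))[0])
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat down"
```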

[-] joel_feila@lemmy.world 13 points 1 year ago

But then the investors won't throw wads of money at these fancy tech companies.

[-] thedirtyknapkin@lemmy.world 26 points 1 year ago

As someone with wrist and hand problems that make writing a lot by hand difficult, I'm so lucky I finished college in 2019.

[-] Mtrad@lemm.ee 25 points 1 year ago

Wouldn't it make more sense to find ways to utilize AI as a tool and set up criteria that incorporate the use of it?

There could still be classes / lectures that cover the more classical methods, but I remember being told "you won't have a calculator in your pocket".

My point is, they should be prepping students for the skills to succeed with the tools they will have available, and then give them the education to cover the gaps that AI can't solve. For example, you basically need to review what the AI outputs for accuracy. So maybe a focus on reviewing output and better prompting techniques? Training on how to spot inaccuracies? Spotting possible bias in a system that's skewed by its training data?

[-] revv@lemmy.blahaj.zone 14 points 1 year ago

Training in how to use "AI" (LLMs demonstrably possess zero actual reasoning ability) feels like it should be a separate pursuit from (or a subset of) general education to me. In order to effectively use "AI", you need to be able to evaluate its output and reason for yourself whether it makes any sense or simply bears a statistical resemblance to human language. Doing that requires solid critical reasoning skills, which you can only develop by engaging personally with countless unique problems over the course of years and working them out for yourself. Even prior to the rise of ChatGPT and its ilk, there was emerging research showing diminishing reasoning skills in children.

Without some means of forcing students to engage cognitively, there's little point in education. Pen and paper seems like a pretty cheap way to get that done.

I'm all for tech and using the tools available, but without a solid educational foundation (formal or not), I fear we end up a society of snake-oil users in search of the blinker fluid.

[-] Atomic@sh.itjust.works 14 points 1 year ago

That's just what we tell kids so they'll learn to do basic math on their own. Otherwise you'll end up with people who can't even do 13+24 without having to use a calculator.

[-] settxy@lemmy.world 12 points 1 year ago

There are some universities looking at AI from this perspective, finding ways to teach proper usage of AI and then building testing methods around the knowledge that students are using it.

Your point on checking for accuracy is spot on. AI doesn't always puke out good information, and ensuring students don't just blindly believe it NEEDS to be taught. Otherwise you end up like these guys... https://apnews.com/article/artificial-intelligence-chatgpt-courts-e15023d7e6fdf4f099aa122437dbb59b

[-] jordanlund@lemmy.one 22 points 1 year ago

ChatGPT: answer this question, add 4 consistent typos. Then hand-transcribe it.

[-] SocialMediaRefugee@lemmy.world 16 points 1 year ago

Might as well go back to oral exams and ask the student questions on the spot.
