College professors are going back to paper exams and handwritten essays to fight students using ChatGPT
(www.businessinsider.com)
When I was in college for computer programming (about six years ago), I had to write all my exams on paper, including code. This isn't exactly a new development.
So what you’re telling me is that written tests have, in fact, existed before?
What are you some kind of education historian?
He's not pointing out that handwritten tests themselves aren't new, but that using handwritten tests instead of typed ones to reflect a student's actual abilities isn't new either.
I had some teachers ask for handwritten programming exams too (that was more like 20 years ago for me) and it was just as dumb then as it is today. What exactly are they preparing students for? No job will ever require the skill of writing code on paper.
Maybe something like a whiteboard interview...? They're still incredibly common, especially for new grads.
A company that still does whiteboard interviews is one I have no interest in working for. When I interview candidates, I want to see how they will perform in their job. Their job will not involve writing code on whiteboards, solving weird logic problems, or knowing how to solve the traveling salesman problem off the top of their heads.
That's a valid opinion, and I largely share it. But all these students need to work somewhere. This is something the industry needs to change before the schools change it.
Also, I've definitely done whiteboard coding discussions in practice, e.g., going into a room and writing up ideas on the whiteboard (including small snippets of code or pseudocode).
I've done that too back before the remote work era, but using a whiteboard as a visual aid is not the same thing as solving a whole problem on a whiteboard.
It's a close enough problem; a lot of professors I've known aren't going to sweat the small stuff on paper. They're not plugging your code into a computer and expecting it to build; they're just looking at the algorithm design and making sure there's no grotesque violation of the language rules.
Sure, some are going to be a little harsher ("you missed a semicolon here"), but even then, if you're doing your work, that's not a hard thing to overcome, and it's only going to cost you a handful of points (if that sort of thing is your only mistake, you get a 92 instead of a 100).
And what happens when you run into the company that wants people who can prove they conceptually understand what the hell it is they're doing on their own, which requires a whiteboard?
I program as a hobby and I'll jot down code and plans for programs on paper when I am out and about during the day. The fuck kind of dystopian hellhole mindset do you have where you think all that matters is doing the bare minimum to survive? You know that life means more than that, don't you?
The ability to conceptually understand what they're doing is exactly what I'm testing for when interviewing. Writing a full program on a whiteboard is definitely not required for that. I can get that from asking them questions, observing how they approach the problem, noting what kinds of questions they ask me, etc.
I definitely don't want them to do just the bare minimum to survive, or to need to ask me for advice at every step (I've had people who ended up taking more of my time than it would've taken me to do their job myself).
I've never needed to write more than a short snippet of code at a time on a whiteboard, in a Slack channel, in a code review, etc. in my almost 20 years in the industry, and definitely not to solve a whole problem blindly. In fact, I see it as a red flag when a candidate writes a lot of code without ever stopping to execute and test each piece individually; it simply becomes progressively harder to debug the more you add to it, and that's common sense.
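To make that concrete, here's a minimal Python sketch (the functions are made up for illustration, not from any real interview or codebase): verifying each small piece as soon as it exists keeps any bug confined to the last few lines you wrote.

```python
# Build and verify one small piece at a time instead of writing the
# whole solution blind: a bug stays localized to the last few lines.

def normalize(word):
    """Lowercase a word and strip surrounding punctuation."""
    return word.lower().strip(".,!?;:")

# Check this piece immediately, before building anything on top of it.
assert normalize("Hello,") == "hello"
assert normalize("WORLD!") == "world"

def word_counts(text):
    """Count occurrences of each normalized word in a text."""
    counts = {}
    for word in text.split():
        word = normalize(word)  # relies on the piece verified above
        counts[word] = counts.get(word, 0) + 1
    return counts

# A failure here now points at word_counts itself, not at normalize.
assert word_counts("Hello, hello world!") == {"hello": 2, "world": 1}
```

Write twenty such pieces blind on a whiteboard and the first failure could be anywhere; check as you go and it's almost always in the piece you just wrote.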
Which is equally useless. In the end you're developing a skill that will only be used in tests. You're training to be evaluated instead of to do a job well.
I personally never had a problem performing well on those tests; I happen to have the skill of compiling code in my head, and it's a helpful skill in my job (I've been a software engineer for 19 years now). But it's definitely not a required skill and shouldn't be treated as one.
Education is not just for getting a job, you dink.
Same. All my algorithms and data structures courses in undergrad and grad school had paper exams. I have a mixed view of these, but the bottom line is that I'm not convinced they're any better.
Sure, they might reflect some of the student's abilities better, but if you're an evaluator interested in assessing a student's knowledge, a more effective way is to ask directed questions.
What ends up happening a lot of the time is implementation questions that ask too much of the student at once: interpretation of the problem; knowledge of helpful data structures and algorithms; abstract reasoning; edge-case analysis; syntax; time and space complexity; and a good sense of planning, since you're supposed to answer in a few minutes without the luxury and conveniences of a text editor.
This last one is my biggest problem with it. It adds a great deal of difficulty and stress without adding any value for the evaluator.