College professors are going back to paper exams and handwritten essays to fight students using ChatGPT
The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.
I had some teachers ask for handwritten programming exams too (that was more like 20 years ago for me) and it was just as dumb then as it is today. What exactly are they preparing students for? No job will ever require the skill of writing code on paper.
Maybe something like a whiteboard interview…? They’re still incredibly common, especially for new grads.
A company that still does whiteboard interviews is one I have no interest in working for. When I interview candidates, I want to see how they will perform in their job. Their job will not involve writing code on whiteboards, solving weird logic problems, or knowing how to solve the traveling salesman problem off the top of their heads.
That’s a valid opinion, and I largely share it. But all these students need to work somewhere. This is something the industry needs to change before the schools do.
Also, I’ve definitely done whiteboard coding discussions in practice, e.g., going into a room and writing up ideas on the whiteboard (including small snippets of code or pseudocode).
I’ve done that too, back before the remote work era, but using a whiteboard as a visual aid is not the same thing as solving a whole problem on a whiteboard.
It’s a close enough problem; a lot of professors I’ve known aren’t going to sweat the small stuff on paper. Like, they’re not plugging your code into a computer and expecting it to build; they’re just looking for the algorithm design and checking that there’s no grotesque violation of the language rules.
Sure, some are going to be a little stricter about “you missed a semicolon here”, but even then, if you’re doing your work, that’s not a hard thing to overcome, and it’s only going to cost you a handful of points (if that sort of thing is your only mistake, you get a 92 instead of a 100).
The ability to conceptually understand what they’re doing is exactly what I’m testing for when interviewing. Writing a full program on a whiteboard is definitely not required for that. I can get that from asking them questions, observing how they approach the problem, what kinds of questions they ask me, etc.
I definitely don’t want them to do just the bare minimum to survive, or to need to ask me for advice at every step (I’ve had people who ended up taking more of my time than it would’ve taken me to do their job myself).
I’ve never needed to write more than a short snippet of code at a time on a whiteboard, in a Slack channel, in a code review, etc. in my almost 20 years in the industry. Definitely not to solve a whole problem blindly. In fact, I definitely see it as a red flag when a candidate writes a lot of code without ever stopping to execute and test each piece individually. It simply becomes progressively more difficult to debug the more you add to it; that’s common sense.
Really? We do that frequently when problem solving. No one expects the written code to be perfect, but it shouldn’t demand impossible things.
Which is equally useless. In the end you’re developing a skill that will only be used in tests. You’re training to be evaluated instead of to do a job well.
I personally never had a problem performing well on those tests; I happen to have the skill of compiling code in my head, and it is helpful in my job (I’ve been a software engineer for 19 years now). But it’s definitely not a required skill and should not be treated as such.