Education AI

'I Used to Teach Students. Now I Catch ChatGPT Cheats' (thewalrus.ca) 227

Philosophy/ethics professor Troy Jollimore looks at the implications of a world where many students are submitting AI-generated essays. ("Sometimes they will provide quotations, giving page numbers that, as often as not, do not seem to correspond to anything in the actual world...") Ideally if the students write the essays themselves, "some of them start to feel it. They begin to grasp that thinking well, and in an informed manner, really is different from thinking poorly and from a position of ignorance. That moment, when you start to understand the power of clear thinking, is crucial.

"The trouble with generative AI is that it short-circuits that process entirely." One begins to suspect that a great many students wanted this all along: to make it through college unaltered, unscathed. To be precisely the same person at graduation, and after, as they were on the first day they arrived on campus. As if the whole experience had never really happened at all. I once believed my students and I were in this together, engaged in a shared intellectual pursuit. That faith has been obliterated over the past few semesters. It's not just the sheer volume of assignments that appear to be entirely generated by AI — papers that show no sign the student has listened to a lecture, done any of the assigned reading, or even briefly entertained a single concept from the course...

It's other things too... The students who beg you to reconsider the zero you gave them in order not to lose their scholarship. (I want to say to them: Shouldn't that scholarship be going to ChatGPT?) It's also, and especially, the students who look at you mystified. The use of AI already seems so natural to so many of them, so much an inevitability and an accepted feature of the educational landscape, that any prohibition strikes them as nonsensical. Don't we instructors understand that today's students will be able, will indeed be expected, to use AI when they enter the workforce? Writing is no longer something people will have to do in order to get a job.

Or so, at any rate, a number of them have told me. Which is why, they argue, forcing them to write in college makes no sense. That mystified look does not vanish — indeed, it sometimes intensifies — when I respond by saying: Look, even if that were true, you have to understand that I don't equate education with job training.

What do you mean? they might then ask.

And I say: I'm not really concerned with your future job. I want to prepare you for life...

My students have been shaped by a culture that has long doubted the value of being able to think and write for oneself — and that is increasingly convinced of the power of a machine to do both for us. As a result, when it comes to writing their own papers, they simply disregard it. They look at instructors who levy such prohibitions as irritating anachronisms, relics of a bygone, pre-ChatGPT age.... As I go on, I find that more of the time, energy, and resources I have for teaching are dedicated to dealing with this issue. I am doing less and less actual teaching, more and more policing. Sometimes I try to remember the last time I actually looked forward to walking into a classroom. It's been a while.


Comments Filter:
  • Expel them... (Score:5, Insightful)

    by Talon0ne ( 10115958 ) on Sunday March 09, 2025 @03:38PM (#65221703)

    If you'd prefer to cheat in college, something you're presumably paying for, then you should be expelled. Make a spot for someone who's serious about learning. You get one warning.

    • Interesting FP, but what if the most valuable skills to learn these years are how to use software tools of various types?

      I actually wonder if there should be two testing tracks, one for people who want to do it in their heads and a separate kind of tests for people who want to use computers to help them work. Should be able to ask much more interesting questions to the students in the second category. But have to use "extreme" AI to grade their tests?

      The tools remain morally neutral. It's what you do with t

      • by DrMrLordX ( 559371 ) on Sunday March 09, 2025 @05:09PM (#65221859)

        If they need to teach a class on how to use AI tools, then let them. If your professor (and/or TA) state clearly at the beginning of the class that AI tools are not allowed for final submissions and that AI submissions will result in a 0, then the student should not be surprised to receive a failing grade.

      • by beelsebob ( 529313 ) on Sunday March 09, 2025 @05:37PM (#65221913)

        But university isn't for teaching skills. That's what a vocational college is for. University is for teaching you how to think, and come up with new ideas. The goal of getting you to write an essay isn't to teach you how to write, or how to use a tool. It's to teach you to research a topic, to think about the things you've researched, and to then communicate that idea. It's one thing to write down the ideas you've had, and then have AI proofread what you write. It's quite another to have AI come up with all the "ideas".

        • by troff ( 529250 )

          > But university isn't for teaching skills. That's what a vocational college is for. University is for teaching you how to think, and come up with new ideas

          THINKING IS A SKILL.

      • I agree with the "expel them" answer. I see what you are suggesting, but the students weren't using the AI as a tool, but as a way to cheat. This isn't any different than using a camera to take a picture and then claiming it was something you "painted". A tool was used, not to help with the work but to do all of it.

        I'm fine with using the AI to brainstorm, make suggestions, or even help with grammar and spelling. But once you tell it "do the whole thing for me", then you aren't doing any work/learning and are just ch
      • The teachers give homework, grade homework, and give the students a grade. By doing so, the teachers are also grading themselves. The student's grade is also a statement about how good a job the teacher is doing teaching the student. No student wants to take a class from the teacher that flunks most of them, and so on.

        The incentives don't line up with the results we want. The same goes for students, most of whom are after a high grade for entirely practical reasons (it translates directly to scholarship

        • by Moryath ( 553296 )

          By doing so, the teachers are also grading themselves. The student's grade is also a statement about how good a job the teacher is doing teaching the student.

          And yet, students in the same class often perform very differently. And often this is not the fault of the teacher or professor, but rather, the student's decision not to engage honestly and put in the needed effort.

      • by taustin ( 171655 ) on Sunday March 09, 2025 @08:05PM (#65222169) Homepage Journal

        Interesting FP, but what if the most valuable skills to learn these years are how to use software tools of various types?

        Then schools should have classes specifically about that.

        In the meantime, unless you're claiming that's the only skill to be taught, it should be banned in classes where it is cheating, and like all cheating, its use should be punished.

      • by troff ( 529250 )

        > but what if the most valuable skills to learn these years are how to use software tools of various types?

        Then, cover that in a f***ing software tools course, not an ethics course.

    • Persistent cheating should result in expulsion.

      And reading the comments below, I now realize the average age of many of the commenters. I never realized that so many were still in high school (judging from their defensiveness about cheating together with their choice of language).

      They are so biased that they think pretending to know about something when you don't is not the problem; the problem is everyone else.

      Well it's not. And sadly so many people don't even pretend to care about personal integrity.

      I will

    • If everyone is cheating and you expel them all, there will be no one left.

      A far better approach for this professor would be to use a tool like RevisionHistory.com

      It's a totally free Chrome plugin that gives stats on how a document was constructed, by analyzing the revision history. It will tell you how many copy/pastes there were, the size of them, and will even let the user do a "replay" to watch the document being written at high speed, to help understand the student's process. The s

      • Unless something has changed in recent years, Google Docs is garbage for writing papers with citations. MS desktop Word, for all of its limitations and quirkiness, at least has a citation manager built in. And it can do all of the "common" styles. AND it can have plugins for external citation managers that still let you insert citations in the document on the fly.

        Last I looked at Gdocs it had zero support for citation management, and no way to insert inline citations other than manually typing them out. Maybe i

        • Yes, but the goal of the tool here is not (necessarily) to give the student the optimal writing experience. It's to give the lecturer a view into how they thought.
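
          As a purely illustrative aside: the kind of revision-history screening described above can be thought of as flagging revision deltas whose single insertion is suspiciously large. The sketch below uses a made-up event format and is not the actual logic of RevisionHistory.com or any real plugin.

```python
# Toy sketch: flag revision events that insert a large block of text at once,
# a crude proxy for "pasted in a big chunk" when replaying a document's history.
# The RevisionEvent format here is hypothetical, not any real plugin's schema.
from dataclasses import dataclass
from typing import List


@dataclass
class RevisionEvent:
    timestamp: float     # seconds since the editing session started
    chars_inserted: int  # characters added in this revision delta


def flag_likely_pastes(events: List[RevisionEvent], threshold: int = 300) -> List[RevisionEvent]:
    """Return events that insert at least `threshold` characters in one go."""
    return [e for e in events if e.chars_inserted >= threshold]


if __name__ == "__main__":
    history = [
        RevisionEvent(timestamp=12.0, chars_inserted=45),    # ordinary typing burst
        RevisionEvent(timestamp=30.5, chars_inserted=1200),  # one-shot insertion: likely a paste
        RevisionEvent(timestamp=95.2, chars_inserted=60),
    ]
    for e in flag_likely_pastes(history):
        print(f"possible paste at t={e.timestamp}s: {e.chars_inserted} chars inserted")
```

          A real tool would also look at timing gaps and deletions, but even this crude threshold shows why revision history gives a lecturer some view into how a document was built.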

    • If you'd prefer to cheat in college, something you're presumably paying for, then you should be expelled. Make a spot for someone who's serious about learning. You get one warning.

      There was a news special several years ago, I think on ABC, about cheating in college and many students interviewed said all they cared about was getting good grades, so they could get a good job and make a lot of money. Actually learning things wasn't a big concern. Not sure how they thought that would work out in the long run, but they didn't seem to care. And in most cases, they weren't paying for college anyway, their parents were...

      I agree with your sentiment. I worked and paid my own way throug

      • Same, only significantly later (2004).

        I wonder if the experience of working your way through college gave us a different perspective on it.
        OK, I'm kidding. I don't really wonder that. But I suppose I do wonder if the perspective is better or worse than that of a person who thinks it's just a step in "fake it until you make it".

        Sure, you and I have knowledge and experience that we can use to further humanity as a whole- but the other cohort may be the better survivors overall... until they need us to st
  • by sound+vision ( 884283 ) on Sunday March 09, 2025 @03:53PM (#65221731) Journal

    The process of writing an essay requires you to organize, turn over, comprehend, and contemplate ideas. I recall writing a comment to similar effect here a few months back, and the replies I got sounded a lot like the professor's students. They didn't comprehend that education isn't about the paper, it's about the process you go through to write it.

    And in the business world it seems like "paper pushing" jobs are being taken to a whole new level. If your job involves writing BS, as lots of corporate jobs do, AI is perfectly suited to that. Some departments might be able to cut staff in half and double output with these tools.

    It seems to be coming to a point where people are just trained to nod their heads while ChatGPT writes something that sounds really cool and professional. That's how it's going in school, that's how it's going in business, and in government. All levels of society.

    Having large parts of the population dumb and essentially worthless is very concerning to me. They're liable to end up in a foreign (or domestic) meat grinder. That eventuality may be closer than we believe. Call it the singularity.

    • by Teun ( 17872 )
      Very well spoken!
    • This is the view of a Luddite. Technology evolves and education inevitably evolves with it. In its infancy, using the Internet for research was prohibited; now no one would suggest all references must come from books. Going back further, you had to turn in hand-written papers, with no spell checking or grammar help.

      LLMs don't spit out quality papers without good prompts and refinement. They can regurgitate facts and references extremely well but to get a well reasoned argument you have to provide some reas
      • But the LLM is not the best tool for the task, which is to develop the skill to collect, collate, absorb and distill information, then present it coherently. Even IF you think letting the LLM write the rough draft while you just polish it is "the wave of the future" because it saves time or whatever, you cannot actually do that if you are not familiar with the material, or with how to present an argument!

      • Judging from your UID and your viewpoint, I think I can now explain why all of our Gen Z new-hires have been dumbfucks.

        I wouldn't be in any way against a "class for the responsible and effective use of LLMs", but teaching someone that an LLM is a "tool with which you can demonstrate your knowledge on a topic" is analogous to cheering on the ingenuity of someone who has Stockfish running on their phone at a chess match.
      • by troff ( 529250 )

        All the other tools you've named are "source of data", or "checking text errors". LLMs are being used to substitute for actual THOUGHT AND TEXT GENERATION.

        And make the "prompt engineering" argument as much as you like; the race for the next LLM is taking care of that as well.

        If you think "processing the knowledge with your own brain" is an "obsolete educational need", then you're demonstrating you haven't got an education or much of a brain.

    • I guess I'm just an unreconstructed intellectual, because I can't understand how anyone could fail to see that teachers/professors give assignments so that you go through a process of learning.

      The learning happens inside of your own brain, and if you don't do the learning your brain won't have either the knowledge in the assignment or the experience of having figured out how to gain and use that knowledge. When you know how to learn things that makes you useful for learning things as needed.

      I guess people r

  • by PseudoThink ( 576121 ) on Sunday March 09, 2025 @04:13PM (#65221763)

    I started teaching computer science and engineering classes in a large, public high school a few years ago. There is a significant fraction of students who regularly rely on ChatGPT (or similar tools) to do their work for them. Never mind the private group chats they use to share their work and tips for cheating. They talk about it openly with each other during class or in the hallways, on a regular basis.

    I've had some students who cheated in similar fashion in my class. Whenever I notice it (so far), I take it seriously and go through the normal (and very time consuming) process to document and report it. Then the parents get involved.

    About half the time, the parents challenge the report, based solely on their child's word. I try to make sure they understand the documented evidence, but in the end I always go with whatever the parent prefers, which is usually to rescind the report. Never mind the significant hassle and legal liability doing otherwise would create for myself and administration. I can make mistakes, and it's not always completely obvious whether something constitutes an academic integrity violation. I imagine that if I were the child's parent, I would want their teacher to listen to me and prioritize what I prefer for my own child. I'm not always happy about it, but I'm okay with that.

    Because of all this, I can understand why I hear so many students regularly talking about cheating. It's probably because their teachers don't even try to detect or correct the issue. It's not worth the hassle to them, and it's far, far easier to just let it go unaddressed. I think the result is that many high school students learn to normalize various degrees of cheating and cunning tactics, and learn to become rather convincing actors. It's only later, in college or on the job, that they learn about the real-world consequences.

    • by philmarcracken ( 1412453 ) on Sunday March 09, 2025 @05:17PM (#65221885)

      The book "Punished by Rewards" by Alfie Kohn addresses this issue. Kohn argues that when we emphasize external rewards like grades and test scores, we often undermine students' intrinsic motivation to learn and create consequences like cheating.

      The educational system has made the reward (the grade or score) more important than the actual learning:

      * The grade becomes the goal rather than a measurement of learning
      * Students learn to value performance over mastery
      * The extrinsic reward (grade) crowds out intrinsic motivation to understand the material

      Kohn would view cheating as a predictable outcome of reward-based educational approaches. When schools emphasize test performance, rankings, and grades, they inadvertently teach students that the number matters more than the knowledge. Students rationally respond to these incentives by finding the easiest path to the reward - through cheating.
      This doesn't excuse cheating, but it helps explain why reward-focused systems tend to produce it. Restructuring educational approaches to emphasize curiosity, meaningful engagement with material, and learning for understanding rather than for performance metrics could happen, but that would take... effort. Too hard basket? Heh

      • It takes more than effort. There needs to be an evidence-based methodology.

        How do we remove grades from the equation? How do we ensure that teachers can effectively adopt this new teaching method? How do we ensure this teaching method does not show preference to higher-income students? How do we make sure we do not fail an entire generation of students by making them guinea pigs for a wildly new pedagogy?

        I wholeheartedly agree that we need to shift the educational paradigm. I just think that if we are going

      • You can't have both compulsory education [wikipedia.org] and an intrinsic-only reward system.

        The implication of expecting both is that everyone must want to learn and think deeply. That's clearly not the case. Ultimately nations use education certificates as verification for some kind of higher-end work positions. That's inherently an extrinsic reward, and something people will try to cheat at.

        So pick one.

      • When a measure becomes a target, it ceases to be a good measure.

        When the number of commits or SLOC changed is the primary metric management uses to measure a programmer's worth, then you get low-quality commits.

        When your top students have to bribe teachers for bonus points above their already better-than-perfect grades to even be in the running for valedictorian, education isn't their goal.

  • The same services that scan for plagiarism also scan for AI. No need to play AI detective when this has already been done and is part of most schools' mechanisms now.

  • by Pinky's Brain ( 1158667 ) on Sunday March 09, 2025 @04:29PM (#65221789)

    Let them use their out-of-class time for reading and dedicate more in-class time to graded work, with a ban on electronic devices; problem solved. It requires balancing the course around fewer hours of graded work, but it's not like out-of-class time is 10x longer than in-class time.

    Or he could keep tilting at windmills, but I don't think that constitutes clear thinking.

    • by debrisslider ( 442639 ) on Sunday March 09, 2025 @11:54PM (#65222407)

      That's totally backwards. You get, what, 3-4 hours a week to have a lecture and in-class discussion with a PhD-level educator, and you think that time would be better spent having him watch them take quizzes? They don't use in-class time for reading; they use it for a structured explanation and discussion of the material they have just read or are about to read, and, if you're reading multiple sources, a demonstration of how you can take aspects of both and combine them, or how one informs the other, or even how two diametrically opposed sources can teach you better than two that share the same method or viewpoint.

      The professor can throw in random references, historical context, and insights that simplify material or explanations that break down a complicated subject into something easier to comprehend as part of a structured plan that will build up a particular area of understanding over dozens of classes, as opposed to an AI that will spit out a one-page explanation of a concept pulled from who knows what sources that might be completely hallucinating a key point that students aren't equipped to notice, and in any case might generate "insights" that are totally irrelevant and unhelpful to the overall educational program while deluding the student that they've done "enough" of the learning to get by.

      It's also hard if not impossible to adequately judge learning based on in-class hand-written finals with no electronic devices. Especially once you get out of 101 level classes with 500 students. It's a tremendous waste of time and money to make a professor do the job of a teaching assistant and sit around watching a bunch of teenagers write in blue books.

      We don't need to ruin students' educational prospects for the sake of their own convenience. They need to learn how to learn, because if they don't, they're gonna be in trouble later on.

  • ChatGPT, like all LLMs, is only a probability mirror of you.

    It is like an advanced universal translator with a probability statistics engine; it will basically try to predict your next sentence and words.
    It will try to get whatever you are searching for right. And what is right isn't always right; that depends on you and the probabilities it has been trained on.

    It's really just as simple as that.
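
    As a toy illustration of the "predict the next word from probabilities" framing above (a made-up bigram table, nothing like how ChatGPT or any production LLM is actually built or trained):

```python
# Toy next-word sampler: picks the next word in proportion to a hand-written
# probability table keyed on the previous word. Real LLMs condition on the
# entire context with a learned neural network; this is only a sketch of the
# "predict the next word from probabilities" idea.
import random

# Hypothetical bigram probabilities, invented for illustration.
NEXT_WORD_PROBS = {
    "the":     {"student": 0.5, "essay": 0.3, "professor": 0.2},
    "student": {"writes": 0.6, "submits": 0.4},
    "writes":  {"the": 0.7, "an": 0.3},
}


def sample_next(word: str) -> str:
    """Sample a continuation of `word` weighted by its table entry."""
    choices = NEXT_WORD_PROBS.get(word)
    if not choices:
        return ""  # no known continuation; stop generating
    words, weights = zip(*choices.items())
    return random.choices(words, weights=weights, k=1)[0]


def generate(start: str, max_words: int = 8) -> str:
    out = [start]
    for _ in range(max_words):
        nxt = sample_next(out[-1])
        if not nxt:
            break
        out.append(nxt)
    return " ".join(out)


if __name__ == "__main__":
    print(generate("the"))  # e.g. "the student writes the essay"
```

    Whether "a probability mirror of you" is the right way to describe that is exactly what the reply below pushes back on.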

    • ChatGPT, like all LLMs, is only a probability mirror of you.

      No.

      It is like an advanced universal translator with a probability statistics engine; it will basically try to predict your next sentence and words.

      Yours? No.
      It is not trained to predict your next word, though the idea of you being locked in a room for several hundred million years, grading completions of seemingly infinite sets of data and serving as a human error function, is amusing as hell.

      It will try to get whatever you are searching for right. And what is right isn't always right; that depends on you and the probabilities it has been trained on.

      I guess this is true-ish, given that the context window and prompts are certainly part of the weighting, but it's still not nearly as straightforward as "it tries to predict your next sentence and words".

      The test is quite simple.
      Find your favorite LLM, prompt it t

  • by Dan East ( 318230 ) on Sunday March 09, 2025 @04:35PM (#65221807) Journal

    I'm sorry, but the ability to write a good essay is a distinct skill independent of whatever it is the student is supposed to be writing about. By default this means that the subject material itself is not the only metric being measured and graded. It's one thing if we're talking graduate level students in a very specific field of study where the generation of research papers becomes a thing, but for many undergraduate courses the essay format is not optimal and is overused.

    • Some good high schools give their students composition classes, either as part of their English curriculum or as a separate course. Those who don't have that background from high school (or junior high) can/should take remedial courses in college to correct that shortcoming. If someone is taking a philosophy/ethics course, being able to compose a simple essay should be a prerequisite.

  • This is not my first rodeo; before this it was Encarta cheats and World Book cheats. Thinking is exercise for the brain, and many people don't like exercise; that's why obesity is so easy to acquire. Soon we will have neural links just telling us all the answers, and colleges will be just museums.
  • As the saying goes, "You can lead a horse to water but you can't make it drink; you can teach a man some knowledge but you can't make him think..."

    I understand where he's coming from, but even in college, a fair portion don't want to be academics, and even fewer want to be intellectuals. They want to be taught what to think, not how.

    It's a classic catch-22 type of problem; professors want to transform students into intellectuals, but intellectuals don't need to be taught (they are self-teaching) Consequ

    • This prof is teaching a philosophy course. It's probably a humanities requirement or an "easy A" elective. Most of his students likely aren't pursuing degrees in philosophy. At least let's hope not! The world can only shoulder so many of them. The rest become bartenders.

      • So I have a funny story about filling this requirement.

        Here I was, 17, picking out my classes based on the reqs sheet, etc. I see Psychology 120. Sign up. I go find the book at the bookstore matching the number and the title. I find the psychology book I need. I get to class. The teacher proceeds to write "Welcome to Philosophy 120".

        I just sat there dumbfounded for a minute. Then I go back and look at the book (I hadn't opened it yet...) and sure as shit, it says Philosophy on the cover. I look at the registratio

  • This wasn't a thing when I was in school. I believe if you had the dough, you could pay known paper mills to write for you, but I never understood why someone would risk it rather than just put it together themselves, since cheating risked expulsion. A hard-earned C was always more rewarding than an easy A. Being called in to answer even once has to be nerve-wracking, especially if it's clear you don't know the material; you aren't going to get a second pass.
  • Are you there to develop skills or to build a network of connections?
    In a Cambridge-style debate I heard on NPR 20 years ago, the proponents of the latter proposition carried the debate.
    This seems to show that this has become the majority view.
    The problem is that the students and the instructors don't agree on the goals.

  • It is no different than using a calculator on your math test. Sometimes the teacher says you are allowed to use a calculator and most of the time you are not allowed. The teacher realizes that you need to know how to do it both ways, but the teacher needs to be clear about when using the tool is appropriate. It sounds like this professor is not being clear and upfront about if and when using AI tools is appropriate.
  • by Berkyjay ( 1225604 ) on Sunday March 09, 2025 @05:38PM (#65221915)

    College professors have always had this conceit. But it's a way of thinking that is long out of date with modern education. No one really goes to college by choice; that died in the '90s. Almost everyone is there because they feel they have to in order to find a well-paying job. I'm old enough to remember math teachers in grade school banning the use of calculators and English teachers demanding all our papers be handwritten. Educators always seem to be the last ones to adapt to newer technologies. I'm not saying that students using AI is a good thing. Frankly, I think it's going to make most people more stupid. But it's up to the educators to adapt and find new ways to get students to actually learn.

  • I stopped giving students homework and I evaluate them in class. It's pointless to give out essay assignments in the age of AI.
    In fact, I had to teach math at some point (not my specialty), and the course admin would give us these weekly homework sheets to hand out to students. I noticed a few of them do the work and the rest just copy. I told them there is this thing called AI. You can ask it to solve the homework and you don't have to copy from your buddies (I don't care, it's not my course). The nex
  • And of good enough quality for OCR.

    Now, if they cheat, well, they at least get the benefit of having written it out by hand (which is considerable).

    This also forces legible handwriting.

    For the record, so far in high school, all of my kid's writing has been hand written (without the OCR requirement).

  • by timholman ( 71886 ) on Sunday March 09, 2025 @07:25PM (#65222109)

    My colleagues and I in academia talk about this all the time. Everyone can see the ChatGPT freight train heading straight towards liberal arts programs. Anyone who thinks that there is any hope of survival for the educational process in its current form is in denial of what is happening.

    Cheating with ChatGPT is becoming pervasive at the junior high school level and above. Teachers and professors won't be able to adapt fast enough, and in many cases will themselves be replaced by AIs within a decade. And shortly after people realize that your typical liberal arts degree isn't worth the paper it's printed on, with students who cannot read, write, or reason beyond the level of a 12-year-old, the educational edifice collapses.

    Ultimately the joke will be on the students themselves once employers realize that they aren't worth paying a penny to hire - just use an infinitely more capable AI for the job. Interesting times are ahead for all of us. I'll be retiring just in the nick of time to avoid the worst of the academic mayhem.

  • Give them weekly assignments, tell them to take no more than 3 hours for the week on it, not too bad for a class.
    Tell them under no circumstances to use the Internet, ChatGPT, or anything. Just the source materials and classes and whatever insights they have.
    Do this every week
    Grade as normal every week.
    For the mid term, have them come in with the books, and have computers with no access to the internet or phones. They type up and submit a 3 hour essay on one of the 5 topics you give them (or whatever).
    Compa

    • I don't think there's a school anywhere that has an (academic integrity) policy that would permit removing students based on that state of affairs.

    • by habig ( 12787 )

      This is the way my intro physics classes often end up.

      The ones who wipe out on the tests complain bitterly about being tested on things that weren't taught.

      When, in reality, the same things are on the exams that were on the homeworks which they cheated their way past without thinking. Or in the in-class problem solving sessions where they waited for the people next to them to figure stuff out and just wrote it down to get it done.

      Not that either of those things gets them a whole lot of points. They're wor

    • and have computers with no access to the internet or phones.

      This is the hard part. Universities don't have computer rooms anymore, or they only have them for a small group, not for an exam day with everyone. You can't go BYOD because there are too many ways students will manage to connect to the internet (e.g. hide a 5G hotspot in the toilet).

      We force students to buy a graphing calculator because it can be set in Exam Mode that prevents cheating. But there isn't comparable hardware for essay writing. We'd need a FreeWrite device without internet, but these days all devices fea

  • by InfiniteBlaze ( 2564509 ) on Sunday March 09, 2025 @09:42PM (#65222261)
    Grade the student on their use of AI. If they present content that isn't relevant, that doesn't exist, or is patently false...grade it as such. There is a certain minimum proficiency with language that is required for skillful curation of AI-generated content. They'll need to learn the rules of grammar and how to validate resources; these are the core skills that a student demonstrates when they write for themselves. They should learn to treat the AI-generated content as their rough draft, or their outline - not as their final draft. This is seemingly the best course of action as it accomplishes the goal of teaching the important life skills while also accommodating advances in technology.
  • by Petersko ( 564140 ) on Monday March 10, 2025 @02:47AM (#65222577)

    Unless you're a school that lives or dies on the sterling reputation of its grads, just wave them through. Let them sink or swim post grad. Take their money and wave them through. But put a disclaimer on the result. "AI usage limits the value of education, and may leave you less useful and more vulnerable in the workforce". Then, when they find themselves cut first when AI encroaches because their value add is zero, it won't be a surprise. But stop trying to protect people from themselves.

    *essays and soft work only.

"They that can give up essential liberty to obtain a little temporary saftey deserve neither liberty not saftey." -- Benjamin Franklin, 1759

Working...