Education AI

Surge in UK University Students Using AI To Complete Work

More than 90% of UK undergraduate students now use AI in their studies, up from two-thirds a year ago, according to a Higher Education Policy Institute survey released Wednesday. The poll of 1,041 full-time undergraduates found 88% used generative AI such as ChatGPT for assessments, compared with 53% in 2024, with science students more likely to use the technology than humanities peers. Half of students cited "saving time" and "improving work quality" as their primary motivations.

The proportion considering it acceptable to include AI-generated text after editing rose to 25% from 17% last year, while only 6% approved using AI content without editing. "Every assessment must be reviewed in case it can be completed easily using AI," said Josh Freeman, policy manager at Hepi. The report identified "persistent digital divides" in AI competency, with men and students from wealthier backgrounds more likely to be frequent users.


Comments Filter:
  • by mukundajohnson ( 10427278 ) on Wednesday February 26, 2025 @10:33AM (#65196415)

    There's a reason we still learn math that we'll never use. It's to learn how to think. I can't imagine AI is contributing much to cognitive development.

    • There's a reason we still learn math that we'll never use. It's to learn how to think. I can't imagine AI is contributing much to cognitive development.

      That's a feature, not a bug. If the AI prophets get their way, human beings will cease thinking at all, and rely completely on them for all information, all decision making, and all life planning. Hail to the new overlords, we're finally saved from ourselves!

      • by GoTeam ( 5042081 )

        That's a feature, not a bug. If the AI prophets get their way, human beings will cease thinking at all, and rely completely on them for all information, all decision making, and all life planning. Hail to the new overlords, we're finally saved from ourselves!

        That sounds about right. It's also concerning that so many of them (50%-ish) claim it is "improving work quality". I guess it depends on the usage context, but it seems like you have to manipulate the output to clean up the results it gives you more often than not.

        • by gweihir ( 88907 )

          That sounds about right. It's also concerning that so many of them (50%-ish) claim it is "improving work quality". I guess it depends on the usage context, but it seems like you have to manipulate the output to clean up the results it gives you more often than not.

          That is one thing I do not get. When I do something myself, do it right and see it working, that gives me a significant amount of job satisfaction. Do these people lack that experience?

    • I don't necessarily see the problem with math because, while AI would help you get past the homework, it's not going to help you on the tests.

      One thing I remember: a buddy of mine is a guitarist, and I remember them telling me that they learned a whole bunch of bad technique from various teachers back in the day, because they just didn't have anyone or anything to tell them not to do that. This fundamentally limited their playing skills because they learned bad technique and it's extremely hard bordering on
      • The problem is that figuring out the steps is part of the learning process, and yes, we have bad teachers. But what happens when the current generation grows up and becomes teachers? We will have even worse teachers, which will in turn teach the AI, which will become worse. The problem will not develop overnight; it will take a generation or two and will be very hard to fix. We are already seeing declines in test scores https://hechingerreport.org/ma... [hechingerreport.org] AI will just make it worse. Anecdotally I go to shops and when the ti

        • With math you're not really figuring out the steps; somebody is telling you the steps and you're memorizing them.

          Although I suppose with some word problems there is a little bit more creative thinking involved.

          But even then 99% of that is still just following patterns. At the end of the day your teacher gives you a whole bunch of word problems that fit specific patterns and you're really just going to get those patterns on the test. I got good at test taking back in the day because I could figure out
          • by gweihir ( 88907 )

            With math you're not really figuring out the steps; somebody is telling you the steps and you're memorizing them.

            Oh? I had to do several proofs each week in linear algebra and calculus to even be allowed to take the exam. And then I had to do more in logic, term rewriting and deduction systems. Well, that was for a Master's in CS, but still.

    • If you built an AI that applied the Socratic method to student questions, I think it would be very helpful in getting students to develop their own problem-solving approaches and to fill in the missing gaps that prevented them from solving the problem themselves in the first place (a rough sketch of the idea follows below). Of course, most students don't want to learn how to learn or to work hard at educating themselves, so they'll look for another AI that spits out the answer.

      AI, like any other tool, can be misused. It's the responsibility of the teachers
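
      A minimal sketch of that Socratic-tutor idea, assuming the OpenAI Python client (openai>=1.0); the model name, prompt wording and example question are illustrative only, and the system prompt is the only part doing any real work:

      # Socratic-tutor sketch: the model is told to respond with guiding
      # questions instead of answers, so the student still does the thinking.
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      SOCRATIC_PROMPT = (
          "You are a tutor. Never state the final answer and never do the work. "
          "Reply with at most three short questions that probe what the student "
          "already knows, what they are assuming, and what the smallest next step is."
      )

      def socratic_reply(student_question: str) -> str:
          """Return guiding questions rather than a worked solution."""
          response = client.chat.completions.create(
              model="gpt-4o-mini",  # illustrative model name
              messages=[
                  {"role": "system", "content": SOCRATIC_PROMPT},
                  {"role": "user", "content": student_question},
              ],
          )
          return response.choices[0].message.content

      # Example:
      # print(socratic_reply("Why does my integration by parts give the wrong sign?"))

      The constraint lives entirely in the prompt, which is also its weakness: a student who just wants answers can open a different chat, which is exactly the caveat above.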
    • You remind me of James H. Schmitz's SF short story "The Pork Chop Tree" (Analog, February 1965). You can read it here:

      https://s3.us-west-1.wasabisys... [wasabisys.com]

      starting on page 42 of the magazine. A remarkably thought-provoking take on predation and its alternatives.

    • There's a reason we still learn math that we'll never use. It's to learn how to think. I can't imagine AI is contributing much to cognitive development.

      LOL. no. We learn math that we'll never use because of the foolish idea that inside every child is a math genius waiting to come out. "Educators" have seen Stand and Deliver too many times.

    • by gweihir ( 88907 )

      It looks like AI will retard or prevent cognitive development for many. At a huge cost for society.

    • by Idzy ( 1549809 )
      And now we have the internet and computers and can do a quick search to get the answers, so why not? All university/college needs to teach is how to learn, not specific knowledge. We can use our brains to tackle the bigger stuff. Yes, if you want to be a theoretical physicist you need to learn advanced algebra and calculus, but how many people actually use it in a way where they need to understand how the math works to get a result they could just google?
  • by Pseudonymous Powers ( 4097097 ) on Wednesday February 26, 2025 @10:37AM (#65196425)
    This is just cheating. It's just cheating. The students doing this are cheating. When, in the past, people "used" other people to do their work for them, we called it cheating. They are not learning anything; they are cheating. Cheaty Cheaty Cheatface McCheatalot. CHEAAAAAAAAA etc.
  • by Roger W Moore ( 538166 ) on Wednesday February 26, 2025 @10:46AM (#65196437) Journal
    How AI is used matters a lot more than the fact it is used. I got my physics degree in the UK and almost all of it was based on exams that were held over a week at the end of each year and tested you on all the material covered in that year which pretty much rules out the possibility of using AI to cheat on assessments that actually matter for your degree.

    We did have lab and project reports which contributed somewhat to our grades, and there AI could have been used (had it been available back then), but my experience with current AI is that it is useless for actually authoring anything scientific: its writing is very vague, leaves out details and would get an extremely poor grade. What it is good at is cleaning up and editing text to improve clarity and succinctness, but for that you do have to write the material first and then check its output to ensure it did not hallucinate extra stuff.

    I've got no problems with that sort of use and even encourage my own students to use it that way. It means that you've done the work yourself and are using AI like an advanced grammar/spell checker to make the text better. Chances are that you may even pick up better writing skills yourself from that. So the fact that students use AI nowadays is not a problem; the question is HOW they are using it. Although, unless the UK university system has shifted significantly from exams as assessment, it's hard to see how AI will have much impact on most of the assessment that matters for your degree.
    • What it is good at is cleaning up and editing text to improve clarity and succinctness, but for that you do have to write the material first and then check its output to ensure it did not hallucinate extra stuff.

      In short, it is completely useless, but sorta fashionable.

      • It sounds a bit like spellcheck back in the day. I grew up when spellcheck was just becoming available, and was a function you had to trigger manually and let it chew through the text for a while. My parents said that it improved my spelling better than any classwork did, because it focused on words I actually used and gave relatively immediate feedback.
        Expanding that to grammar is a lot more difficult, but happening now apparently.

        It sounds a bit like having your parents (educated but not in your degree

      • In short, it is completely useless, but sorta fashionable.

        No, editing text for style, brevity, clarity etc. is being useful. So far I'd say that's the only use for it that I've seen though. So it's not useless, just nowhere near as useful as all the hype suggests... at least not yet.

    • I agree with you that it is how it's used; my problem is that I don't think we as a society will be able to control how it's used. People take the easy option; it's in our very nature.

      Honestly, if you don't care about the student's writing style then just do not mark on that. Of course that depends on what exactly you are trying to teach, but if you want to teach them writing, have a course on writing. That is important for postgrad research papers, but if you are teaching a doctor, it's much more important th

      • I have a daughter studying vet nursing and every essay assignment has to be X words +/- 10%; probably 20% of the time is spent trying to stay within this word limit. This seems to be an institution-wide policy. I am not saying you do this, but I think the people who set this brain-dead policy should not be allowed anywhere near students.

        I mean, what do you do? Artificial limits are crap, but 1) students need to know how much text is expected for a good mark (roughly!), what's considered too little and what's too much. "Too much" is a thing, as some students go wild and produce booklets, either because they want to ensure they get a good mark, or because they like writing. Also, somebody has to assess those writeups fairly, and guess what, we are not allocated all the time in the world to do that. So the +/- 10% or 20% is a pragmatic choice

  • by fleeped ( 1945926 ) on Wednesday February 26, 2025 @10:48AM (#65196451)
    I've worked in a couple such universities, post-AI era. First, the message from the universities is mixed: there's no coherent university-wide applied strategy, and really if there is one, it changes frequently. Use it, don't use it, translation is ok, editing is ok, editing is not ok, style is ok, style is not ok, cite how you used it, etc. Of course, only the most inept/lazy students get caught, if they copy-paste AI stuff wholesale and then, IF there is in-person assessment, they get caught knowing little about what they submitted.

    Post-covid there has been a shift towards coursework, which is far easier to cheat on. Attendance, if not enforced, cannot be used to gauge a student's knowledge or progress, so they can be away for a long time, then return with completed coursework.

    Because of the chaotic handling, no surprise students adopt a "whatever, I'm just going to use it and lie about it" strategy as much as possible, as that definitely gives them the edge. And if you think "Ah design your assessments/coursework differently so they can't be gamed with AI" well that just doesn't work anymore. The only reasonable way of assessing is in-person, in a controlled environment (in-class tests with limited internet access or oral exams), anything else is a joke. Not just in the UK of course.
  • Any test that is a bunch of text is sus to begin with.

    We need more doers, less talkers. IE no more sales and marketing people in charge of anything but taking out the trash.

  • by RobinH ( 124750 ) on Wednesday February 26, 2025 @11:11AM (#65196491) Homepage
    We were talking to some university students recently, and they were openly wondering what the point of learning anything was, since you could just look everything up on your phone at an AI prompt. Aside from the obvious fact that AI is often wrong, we pointed out to them that when we interview candidates for a job, we give them a pen and paper test, with no access to a computer, to see if they actually know anything. Their eyes got a little wider when we told them that. We don't ask trick questions, but if you have an electrical engineering degree on your resume, you'd better be able to apply Ohm's Law, or... and I'm not making this up... you need to know how many milliamps are in an amp. Honest to God, I interviewed an electrical engineering graduate in 2020 who didn't know that. She didn't get hired.
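
    For a sense of the level involved, the pen-and-paper check being described amounts to no more than something like this (an illustrative question, not one from the actual interview):

    $1\,\mathrm{A} = 1000\,\mathrm{mA}; \qquad V = IR \;\Rightarrow\; I = \frac{V}{R} = \frac{5\,\mathrm{V}}{250\,\Omega} = 0.02\,\mathrm{A} = 20\,\mathrm{mA}$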
    • by jp10558 ( 748604 )

      I always feel 2 ways on this. At least in IT, I don't necessarily expect you to know much / anything specific. It changes all the time. What I want to know is can you learn stuff or figure it out. I don't much care *how* you accomplish that stuff as long as the output is effective.

      I guess if I had to test someone, I'd mostly be interested in troubleshooting technique - but if AI leads you through that well - IDK, $20 a month or so is probably less than quite a few other tool costs for an employee anyway.

      • by RobinH ( 124750 )
        This wasn't for IT. It was for engineering. And if you don't understand Ohm's Law, how to calculate power, and how to do some SOH-CAH-TOA level trig, after 4 years of an electrical engineering degree and a couple co-op terms, then I'm not interested in hiring you because it means you have no grasp of the purpose of all the stuff you supposedly learned. I'm happy to let some other company take a chance on you. And that's just reality. Employees are open about having no loyalty to a company, and companies
        • by jp10558 ( 748604 )

          Sure - I just mean there's probably a lot of places that just want the output at a cheap rate in any field. It's not good - just what the economy seems to push.

          I'm also not sure at the learning level that "old learning" is inherently better than using new tools. It would seem silly to me to be against EEs using Cadence tools say - because you didn't have to learn "how to do it by hand". You're right - if a very basic sort of quiz is useful to you - great. I just figure that in many fields you're going to be

  • ... A generation of editors with no authors. A whole lot of students that are getting really good at correcting and re-voicing without developing any ability to be truly creative. These will be the same people complaining about "everything being the same" in the latest video game or TV series. If you don't build those muscles, you won't have them when you need them.

    • You just described the thoughts about structural engineering students in the early 1980s. With this newfangled AutoCAD and parametric parts, the figuring-it-out skills at drawing time will disappear; people will just cut and paste to deliver faster. Never mind that those parts can be produced in a factory and not have days where the weather stops construction. Junior engineers smugly state, "I know how to figure the flex in a beam, I just don't spend time doing it by hand for 1023 cases the way I did for 7 with a slide rule in 1974."
  • ... to attending graduation ceremonies. And watching my child's AI receive its diploma.

  • What's interesting is that if people put "generative AI" into a statement, many people seem to read it as actually making sense.

    Basically, the idea is the same as if you were going to the gym and while there paying people to exercise for you. That would sound stupid to people. But because doing the same thing with learning and AI has buzzwords in it, it sounds reasonable to many.

  • ... if they hand in AI-generated text, but have no clue what is in there. Takes a bit more work on my side because I not only have to grade their submissions, but also have a 5-10 minute discussion with them about it to verify they actually know what is in there. Yes, work for me, work for them, but education done right is work. And those that pass will eventually see the wisdom of this.

  • The more AI is used in your work, the less you will be required to perform it in the future. If your work is primarily done using AI, don't be surprised when somebody uses AI to do your work. Students should be worried about "their own" value-add.

"If a computer can't directly address all the RAM you can use, it's just a toy." -- anonymous comp.sys.amiga posting, non-sequitir

Working...