

AI Tool Usage 'Correlates Negatively' With Performance in CS Class, Estonian Study Finds (phys.org)
How do AI tools impact college students? 231 students in an object-oriented programming class participated in a study at Estonia's University of Tartu (conducted by an associate professor of informatics and a recently graduated master's student).
They were asked how frequently they used AI tools and for what purposes. The data were analyzed using descriptive statistics, and Spearman's rank correlation analysis was performed to examine the strength of the relationships. The results showed that students mainly used AI assistance for solving programming tasks — for example, debugging code and understanding examples. A surprising finding, however, was that more frequent use of chatbots correlated with lower academic results. One possible explanation is that struggling students were more likely to turn to AI. Nevertheless, the finding suggests that unguided use of AI and over-reliance on it may in fact hinder learning.
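For readers curious what the analysis step looks like, Spearman's rank correlation measures how well a monotonic relationship holds between two variables by correlating their ranks rather than their raw values. A minimal stdlib-only Python sketch, using made-up frequency codes and grades (illustrative only, not the study's data):

```python
# Spearman rank correlation from scratch (stdlib only).
def ranks(xs):
    """Return 1-based ranks; tied values get the average of their ranks."""
    sorted_vals = sorted(xs)
    rank_of = {}
    i = 0
    while i < len(sorted_vals):
        j = i
        while j < len(sorted_vals) and sorted_vals[j] == sorted_vals[i]:
            j += 1
        # positions i..j-1 (0-based) hold this value; 1-based ranks i+1..j
        rank_of[sorted_vals[i]] = (i + j + 1) / 2
        i = j
    return [rank_of[x] for x in xs]

def spearman(x, y):
    """Pearson correlation computed on the ranks (handles ties)."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: self-reported AI-use frequency (0 = never ... 4 = weekly)
# and final course scores. These numbers are invented for illustration.
ai_use = [0, 1, 3, 4, 2, 0, 4, 1]
scores = [88, 82, 65, 60, 75, 90, 58, 80]
print(round(spearman(ai_use, scores), 2))  # → -0.98
```

A negative coefficient, as in this toy data, is the pattern the study reports: higher self-reported AI use pairing with lower grades. Note that Spearman's rho, like any correlation, says nothing about the direction of causation.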
The researchers say their report provides "quantitative evidence that frequent AI use does not necessarily translate into better academic outcomes in programming courses."
Other results from the survey:
- 47 respondents (20.3%) never used AI assistants in this course.
- Only 3.9% of the students reported using AI assistants weekly, "suggesting that reliance on such tools is still relatively low."
- "Few students feared plagiarism, suggesting students don't link AI use to it — raising academic concerns."
Surprising? (Score:4, Insightful)
Re:Surprising? (Score:4, Insightful)
Well, as most people cannot fact-check, they are forever surprised by actual facts. And I have a nagging suspicion that AI, dumb and without insight as it is, may actually do better than many people at "insight" about common and well-understood topics. But a good encyclopedia basically does the same.
Functionally illiterate (Score:4, Interesting)
The surprising statistic is that the UK has 260 colleges and universities and 2,053,520 students (undergraduate and postgraduate), with 663,355 international students not from the UK and not from the EU. https://www.universitiesuk.ac.... [universitiesuk.ac.uk]
When the top universities are excluded, the student enrollment from non-UK and non-EU countries is much higher than the 30% overall number.
A guess is that the lower tier universities/colleges will have a much higher AI use.
A second guess is that there are many colleges/universities who are fully dependent on international student tuition to keep their doors open.
Re: (Score:2)
Was that an AI-generated answer? Because there seems to be no logic or point to it.
Such a surprise (Score:5, Interesting)
I have two personal data points:
1. My IT security students (several different classes and academic institutions) all view AI as a last resort or something to be used only after they have solved a task, to verify they got it all. This comes from negative experiences they have had. They say AI misses important aspects, prioritizes wrongly, hallucinates (apparently IT security is niche enough that this happens often), and generally it takes more time to check its results than to come up with things directly. They also dislike that you often do not get references and sources for AI claims.
2. I taught a Python coding class in the 2nd semester for engineering students (they needed a lecturer and I had time). The students there told me that AI can at max be asked to explain one line of code, it routinely already failed at two connected ones. And for anything larger it was completely unusable. They also found that AI was often clueless and hallucinated some crap.
Hence I conclude that average-to-smarter students are well aware of the dangers and keep a safe distance. Below average ones are struggling anyways and may just try whatever they can get their hands on. And at least here, 30-50% of the initial participants drop out of academic STEM courses because it is too much for them. AI may have the unfortunate effect of having them drop out later, but overall I do not think it will create incompetent graduates. Oh, and I do my exams on paper whenever possible or online-no-AI for coding exams (so they can use the compiler). The latter gets more problematic because of integrated AI tools. I expect we will have to move coding exams to project work (on site) or something in the near future and have them take a full day or the like and maybe group work and pass/fail grading. As I do not teach coding anymore from this year on, I am not involved in any respective decisions though.
Re:Such a surprise (Score:4, Insightful)
The students there told me that AI can at max be asked to explain one line of code, it routinely already failed at two connected ones.
The smart students are saying this because they can get through your busy work faster with AI and don't want you to know. The dumb students are saying this because they aren't smart enough to manage the AI. You are believing this because why?
Re:Such a surprise (Score:4, Interesting)
I am believing this because these were discussions about non-graded exercises and they demonstrated both failed and successful AI results and they can get as much help as they want from me on these exercises. Also, I have a good relationship with most of my students and discussions are frank and open. And on top of that, about 50% (in the coding class 100%) of my students are working while studying and have professional experience and this type of student is primarily interested in learning things.
Your level of mistrust, on the other hand, seems to indicate something completely different.
Re: Such a surprise (Score:2)
Re: (Score:1)
Your level of mistrust, on the other hand, seems to indicate something completely different.
The word you're looking for is experience. Something you've demonstrated a lack of in regards to AI and CS competency.
Re: (Score:2)
Ah, yes. 35 years in the field with an on-target engineering PhD and teaching all along clearly has me completely inexperienced. Not.
Re: (Score:2)
For what it's worth, my own experiences match your groups 1 and 2. Though I'm self-taught with no CS background whatsoever. The guy in my signature claims to be, and so does this guy:
https://slashdot.org/comments.... [slashdot.org]
You gotta love the way he asserts that he will code four times as fast, and how he relies on his programs to crash and burn when he makes a mistake, but then proceeds to talk about how his potential semantic error is superior because it involves typing a whole four less characters. And that's wi
Re: (Score:2)
Does not surprise me. The biggest preventer of good engineering is big egos. Other engineering disciplines try their best to filter these idiots out. CS is not quite there yet and a lot of coders do not have a CS education in the first place.
I have found uses for AI in just two areas though: Assisting in reverse engineering code output as they tend to be able to spot patterns in ways I occasionally overlook, and converting documentation into usable data structures. And that to me makes them basically just the next evolution of the search engine, ...
Agreed. "Better search" is basically the main thing LLMs can do. Verifying whether you overlooked something is low risk because it can only improve your work (if you did it honestly). Finding and combining of essentially generic data structures from a description is pre
Re: (Score:1)
You know how you know that I'm qualified and you're not? Because when I troll you, I do it in plain sight. You don't because you already know that what I'm saying is too easy to prove.
Re: (Score:1)
Your issue is that the state of the art is advancing so rapidly that even 6-12-month-old anecdotal experiences are stale. I'm having AI build something for me while I type this, and I guarantee you it's stringing more than a few lines together. I would not rely on second-hand data for this; I'd go figure it out for yourself. Every single person that hacks out code for a living right now needs to go see what the state of the art is and figure out how to deal with it. To me it sounds like you've chosen "door ost
Re: (Score:2)
The tech really is not advancing. It just gets better at hiding the defects. Incidentally, this is several decades old tech with known fundamental limitations. Oh and there are really bad drawbacks from letting LLMs do your coding, like a high prevalence of security problems. But I guess people like you need to run into a wall before they see it is there. Well, good luck in the next few years, you are going to need it.
Re:Such a surprise (Score:4, Insightful)
It could be as simple as students using AI where they used to use someone else.
After all, in the "pre-AI" age, students would routinely copy code and other things from other students, and you can tell they did it because their grades generally were worse.
These days, I'm sure that instead of asking other students, they are asking AI, and all we're seeing is the same thing: the students of the past simply copied one another, while the students of today ask AI. Just like students of the past used paper mill sites to write their homework, students now use ChatGPT to do the same.
Of course, I can't say I am completely innocent of the practice. We were in a group doing a group project. I and someone else smarter than me were working on something extremely complex for the rest of the group while they worked on something much simpler for another class. In the end we figured out the complex work and presented it to the rest of the group to learn from, and they submitted the group project with all our names on it. I saw what they submitted and studied from it, so I did learn, and the rest of the group learned from us, and we all got good grades in the end. It's just that if the work had been divided equally among everyone, the two projects would've been a mess; it was simpler to split the work the other way.
Re: (Score:2)
It could be as simple as students using AI where they used to use someone else.
Sure. But AI cuts out the social aspect, like a comment "you do know that you need to pass an exam, right?" from the person you copy from. And AI makes this far too easy to do. Also, I had students angry at me when I graded copied "solutions" with no points and they asked "how can you know". Then I showed them the example where over several copying generations an index became a factor and then became an exponent. Never got any more complaints after that.
Bottom line, some students will self-sabotage and AI m
Re: (Score:2)
Of course. When I was in university (pre dot-com era) the Internet was around to communicate with (everyone exchanging ICQ numbers was a thing), but if you wanted to copy off someone you either had to bump into them on campus, or phone them at home, and if you did that, you had to do it at reasonable times because parents and other things
Re: (Score:2)
Re: (Score:2)
That is possible. I am also doing this in a country where only about 20% of all people get the qualification for academic studies at all. This does not invalidate my experience though, just the generalization.
Re: (Score:1)
What? (Score:2)
Having a machine do something for you doesn't make you as proficient when doing it yourself? Shocking!
Next you'll tell me that using a calculator doesn't help you remember your times tables.
Re: (Score:2)
Nice failure to understand you have there. As it is about building or not building understanding, that is pretty ironic. You nicely demonstrated why _you_ are getting dumber using AI. And here is a hint (although I expect this will be flying right over your head): Information, like tables, you can carry around with you; learning it by heart is pretty worthless. Insight you cannot substitute with a book or a somewhat better search mechanism like an LLM. Insight you either have or you do not.
Learning tables is learning patterns. Are you really saying that learning patterns does not extend your knowledge?
Re: (Score:2)
I think you may have failed to understand what times tables are.
Information, like tables, you can carry around with you; learning it by heart is pretty worthless.
Good luck finding a single math teacher on the planet that agrees with you on this topic.
Re: What? (Score:2)
Re: (Score:2)
Indeed. And tables have errors. I have an old table book here that comes with about 4 pages of corrections on loose sheets. Be able to do it manually, get a sense for what you expect to see (slide rules help a lot, but sadly nobody even makes them anymore as far as I can find out), and then use a calculator. Except for some special cases, tables are entirely obsolete, and all they do is make you a bit faster when you do it manually. Which is not going to happen often.
Re: (Score:2)
I think you may have failed to understand what times tables are.
Information, like tables, you can carry around with you; learning it by heart is pretty worthless.
Good luck finding a single math teacher on the planet that agrees with you on this topic.
That will be easy. What competent math teachers disagree with is doing away with slide-rules. Because these do teach really good estimation skills and the process to get a result step-by-step. Tables are just something that got used because nothing better was available.
Re: (Score:2)
Replied to wrong comment.
Re: (Score:2)
I am saying it does not extend your _insight_. Knowledge is of limited value.
Re: (Score:2)
Correlation does not mean causation... (Score:3, Insightful)
Re: (Score:3)
it's really simple: if you want to get more shit done, use more ai. if you want to learn, use less.
Re: (Score:2)
It's kinda like laying work off on bottom feeder outsourced labor consultancies. You'll absolutely generate a lot of "activity," but generating better outcomes???...not so much.
Re: (Score:1)
The question is how much outsourcing actually helps. From my experience, as soon as you need to get actual work done, you need to use local, small outsourcers, because they will care about you and will actually try to do good work. Forget about any well-known names. They just want your money and want to string you along for as long as possible. Actually solving your problems is not part of their business model and would decrease their profits.
Re: (Score:2)
Apparently there are blithering idiots that have not noticed this blatantly obvious problem yet and disagree enough to mod this down. No surprise.
Re: (Score:2)
it's reall[y] simple: if you want to get more shit done, use more ai. if you want to learn, use less.
I will modify that a bit to fit reality. If you want to get more shit done, but results don't matter, use AI. If you want to learn, or if you want a higher correctness ratio, forego AI.
Re: (Score:2)
Indeed. There is nothing wrong with using AI as an additional plausibility check, but that is essentially it for any work that requires insight for good results. Note that a lot of big names get filthy rich by not caring about the quality of their work. These are clearly evil (doing massive damage to others for a comparatively much smaller gain to themselves), but that seems to be something the human race can well afford at this time. Or not.
Re: (Score:2)
fair enough, but that reality isn't really in the scope of the article. it isn't about work being of good quality or not, but about students doing said work with ai apparently not having learned enough in the process. the reason is unclear, but this is the logical outcome when the process involves less mental exercise and intimate contact with the subject matter. still, while this is true in general i don't know it is exactly what happened here (as gp points out).
then again ai is just a tool. if not used jud
Re: (Score:2)
For the first case, that really depends on what you want to get done. For the second part, yes.
Re: Correlation does not mean causation... (Score:2)
Re: (Score:2)
Re: (Score:2)
But this is *negative* correlation, maybe that does mean causation???
Re: Correlation does not mean causation... (Score:2)
Hinges Strongly on "HOW" They Use AI (Score:5, Informative)
Initially, I found the same in myself--a real degradation overall in my productivity. I am a software engineer. It has not been easy learning how to use generative AI to actually increase and improve productivity. At first, you think it can do almost anything for you, but gradually over time you realize it greatly over-promises.
Overall, the key is that you need to remain in charge of your work. It is an assistant that can be entrusted, at best, with small tasks under oversight. For example, frame out your project and clearly define your rules and preferences in fine detail. Then:
It's good at:
- Researching, summarizing, and giving examples of protocols, best practices, etc.
- Helping you identify considerations you might have overlooked.
- Writing bits of code where the inputs/outputs and constraints are clearly defined.
It's bad at:
- Full projects
- Writing anything novel (it knows common patterns and can't work much beyond them).
- Being honest and competent -- it cheats on writing code and on writing tests for that code; when you catch it red-handed, it will weasel its way out.
The bottom line: you are in charge. You need to review what it gives you. You need to work back and forth with it.
Also -- I am still learning.
--Matthew
Re: (Score:2)
While I mostly agree, there is another aspect: Coding is a skill that needs to be practiced. If you stop writing the simple things yourself, you will get worse at writing the harder stuff.
Re: (Score:2)
I feel like a lot of these same observations can be made about people that over-rely on libraries and frameworks and don't actually understand or have any control over what they're doing. At a certain point, the AI is just patching frameworks together and implementing libraries without understanding them ... just like some programmers do.
Re: (Score:2)
In Other News (Score:2)
It turns out that even though you can cover 5 miles quicker in a car, it negatively correlates with health outcomes compared to running or cycling the same distance. Using AI is like taking a taxi.
Re: (Score:3)
That is actually a very good comparison. Skills and insights need to be used to maintain them and even more so to improve them.
This is the Google calculator effect (Score:2)
Re: (Score:1)
Funny, Seems like a given to me! (Score:2)
These typewriters will destroy writing. (Score:2)
My grandmother talked about her grandfather when I was a boy. She described how he was a scribe at the local courthouse. He would take the judgements and write them up on parchment in beautiful handwriting. It seems he lamented the introduction of typewriters and how they would cheapen the documents, with people using soulless mechanical machines rather than the love and precision of the human hand. People would lose the ability to write, he would say.
At school we were not allowed calculators at first, as i
Education or degree? (Score:2)
I never look at grades when evaluating CVs. I look at GitHub.
oh no (Score:2)
You mean I am supposed to fit all this information in my head, practice it, and eventually understand it? Sounds really hard
In a related story, (Score:2)
Runners who strap their fitbit to the dog's tail for an hour each day have less aerobic fitness.
Garbage In Garbage Out (Score:2)
AI is total rubbish at logic.
If you want AI to write code for you and you're doing anything beyond pseudocode (A does B, returns C), you're going to get rubbish. AI is great for boilerplate, repetitive stuff.
CS students in 101 classes shouldn't be using AI for assignments, for the same reason you don't give 10-year-olds calculators: you want them to learn the process first before using other tools.
You need to be able to recognize when it's spitting garbage back out at you.