Education Technology

The Student and the Algorithm: How the Exam Results Fiasco Threatened One Pupil's Future (theguardian.com) 174

Josiah Elleston-Burrell had done everything to make his dream of studying architecture a reality. But, suddenly, in the pandemic summer of 2020, he found his fate was no longer in his hands -- and began a determined battle to reclaim his future. From a long read at The Guardian: The algorithm did what it was supposed to do. Humans, in the end, had no stomach for what it was supposed to do. Algorithms don't go rogue, they don't go on mutant rampages, they only sometimes reveal and amplify the cruddy human biases that underpin them. Ofqual's mistake was to think this exercise -- which made plain our usual tricks for filtering and limiting young lives -- would be morally tolerable as it played out in public view.
  • What? (Score:5, Informative)

    by Jarik C-Bol ( 894741 ) on Saturday February 20, 2021 @07:15AM (#61082894)
    The summary is piss-poor, which I expected, but the article is written like a novel, with miles and miles of paragraphs describing what people are wearing, the scenery, and a host of other irrelevant details. I gave up.
    • Yep, the summary might as well be "RTFA" and the article is what you get when someone wants to write documentaries for a living but can't be bothered to do more research than sitting down for an hour each with one or two people.

      • Can someone take one for the team and summarize it for us? TL;DR. Thanks. Your help is appreciated. You are saving an endless amount of person-hours.
        • Re:What? (Score:5, Informative)

          by AleRunner ( 4556245 ) on Saturday February 20, 2021 @09:08AM (#61083116)

          It's a bit too long for me too, but I can give you the summarised story based on what actually happened (I read the last paragraph to check):

          * In the UK we cancelled exams last year.
          * Instead of exams, they decided to have an algorithm guess what each student got
          * They designed the algorithm based on previous years' data and tested it on the same data, which is a total fuck-up (sketched below).
          * The algorithm ended up giving students grades based on the school they came from rather than anything they did
          * This was very bad for good students in bad schools and good for bad students in good schools
          * Then people complained
          * Then the algorithm was cancelled but it was too late for universities to take into account new grades
          * Some people managed to fight on and get into university based on other evidence than their grades
          * This is the story of one of those people
          * He happens to be black so both the left and the right are out to make specific, irrelevant and wrong comments about the story

          The main deviation from the facts in this story seems to be that they want to forgive the computer programmers and the computer, which was "just doing as it was told". As nerds we know that computers are inanimate and so don't need to be excused, and that the job of computer programmers is to know that computers do what they are told and to make sure that it doesn't have harmful results. This they failed to do.
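
          A minimal, purely illustrative Python sketch of the "designed on previous years' data, tested on the same data" problem from the list above. Nothing here is Ofqual's real code or data; the schools and numbers are invented.

          import random

          random.seed(0)

          # Made-up historical grades (on a numeric scale) for two fictional schools.
          history = {
              "School A": [random.gauss(4.0, 1.0) for _ in range(100)],
              "School B": [random.gauss(6.0, 1.0) for _ in range(100)],
          }

          def predict(school):
              """Toy model: predict every student as their school's historical mean."""
              past = history[school]
              return sum(past) / len(past)

          # "Validating" against the very records the model was built from makes the
          # error look small, because the mean was fitted to exactly this data.
          for school, past in history.items():
              in_sample_error = sum(abs(g - predict(school)) for g in past) / len(past)
              print(school, "in-sample error:", round(in_sample_error, 2))

          # A new student who improved far beyond the school's past record is still
          # dragged back to the historical mean -- the failure mode described above.
          print("Improved student at School A predicted as:", round(predict("School A"), 2))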

          • by quenda ( 644621 )

            * This is the story of one of those people
            * He happens to be black

            It is "The Guardian". His colour is no accident.

            • * This is the story of one of those people
              * He happens to be black

              It is "The Guardian". His colour is no accident.

              Kind of; they definitely prefer "ethnic minority" stories. Partly I'd say it's a distraction to get you thinking about whether it's racism, so that you don't notice as they nail the head of Ofqual and aim at the useless minister, trying to take the blame away from the developers. In fact this should be between the software developers and the minister, with the head of Ofqual being the blameworthy stooge/victim in the middle.

              From our point of view it's entirely arbitrary because there are a bunch of white kids

            • There were six pictures accompanying the article.
              Three were of the student, one of them including his mother.
              One was of a page from his art sketchbook.
              Two were of other students demonstrating against what was going on, about half of those students happened to be white. Pretty much par for the course.

          • by AmiMoJo ( 196126 )

            Universities were actually swamped. The offers they made were legally binding, so once grades were revised they suddenly had too many students.

            To make things worse, this year has been a disaster for students. Universities can't provide the courses students are paying for, and students can't use the accommodation they rented.

            All the kids at school are basically a year behind now too.

            • Universities were actually swamped. The offers they made were legally binding, so once grades were revised they suddenly had too many students.

              To make things worse, this year has been a disaster for students. Universities can't provide the courses students are paying for, and students can't use the accommodation they rented.

              All the kids at school are basically a year behind now too.

              Yeah, but it's okay because the "private school" kids and "public school [wikipedia.org]" (dear American friends, when I use it, that word does not mean what you think it means) kids mostly benefited from the situation so none of the ruling class are losing out badly on this. And when you say "all the kids", any kids I heard of in private schools have been having direct lessons on Zoom all the way through. I doubt they are more than a month behind, if that.

          • So a statistical method is not accurate for 100% of individuals, only most of them. Not exactly surprising. One could have written the story, then gone to find an individual that demonstrates it. The fallacy is to pretend that there wasn't a global catastrophe going on last year that was criminally mismanaged by the UK government. The solution is not to "fix" some algorithm that was never intended to be 100% accurate, but to elect politicians that are actually interested in helping people not grifting o
          • Thanks for the summary
          • Thank you for posting this, and doing what TFA, TFS, the editors, and every other poster here utterly failed to do.

            I'm sorry the burden of fixing everyone else's fuck-up fell on your shoulders, but thank you for bearing the burden.

          • computers do what they are told and [it is the designer's job] to make sure that it doesn't have harmful results. This they failed to do.

            True. However, the fault wasn't so much in the algorithm as in pretty much everything around the algorithm.

            • The designers came up with a bunch of flawed schemes ... largely because "a plan without any flaws" does not exist in nature. Even more so in AI. The "winning" algorithm was the "least bad" one.
            • The producer of the product (a director-level guy) refused
    • I gave up.

      Sadly, so did journalism in favor of hype and clickbait.

    • It is called long-form journalism.

      • It literally says at the top of the article "The long read" which I gather is a series of long articles of dubious interest.

        https://www.theguardian.com/ne... [theguardian.com]
        • by shilly ( 142940 )

          Yep. I mean, it's one thing to find long-form journalism tedious or annoying or turgid or what have you, but it's quite another to complain that a long-form article has failed because it's not written short-form, and it's one step beyond that to see all these complaints from people who failed to read the words "The long read" right at the top...

    • We've moved on to the next level, where the headline is no longer the clickbait; the summary is.
    • Re:What? (Score:5, Informative)

      by TheNameOfNick ( 7286618 ) on Saturday February 20, 2021 @09:06AM (#61083114)

      It's really not that complicated or surprising. Instead of grading students through exams, which had been cancelled due to a pandemic, the schools deferred grading to an algorithm which took prior performance into account as well as rankings created by teachers based on expected exam performance, but also parameters like grading averages for the school and other inputs not strictly based on the student's individual performance. This algorithm apparently did poorly for a relatively small number of students (the article says half a percent) who were outliers of some sort. One year before, the protagonist of the story expected to get a result better than three Bs, which was the threshold for his planned architecture studies, but through no fault of anyone else he got an A, a B and, crucially, a U (ungraded) in maths, which prevented him from getting into college. When he repeated his final year for another shot at the exams, the exams were cancelled and the algorithm gave him A, C and E. Apparently an algorithm which grades mostly on past performance doesn't do too well for students who strive to improve on their past performance. Had the algorithm graded him the year before, he probably would have been graded a lot better than in his actual exams.
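
      To make the mechanism above concrete, here is a rough, hypothetical Python sketch (the function, grades and cohort sizes are all invented rather than taken from the article or from Ofqual): a teacher-assigned rank is mapped onto the grade distribution the school achieved in earlier years, so a student who out-performs the school's history cannot be awarded a grade the school has not previously produced.

      def grade_from_rank(rank, cohort_size, historical_grades):
          """Map a rank (1 = best) onto the school's historical grades, best first."""
          historical = sorted(historical_grades)  # letter grades sort alphabetically, best first
          percentile = (rank - 1) / max(cohort_size - 1, 1)
          index = round(percentile * (len(historical) - 1))
          return historical[index]

      # A school whose past cohorts mostly got Cs, Ds and the odd E:
      past = ["B", "C", "C", "C", "D", "D", "D", "E"]

      # Even the student ranked first in the cohort can do no better than the best
      # grade the school produced before, however much they improved this year.
      print(grade_from_rank(1, cohort_size=20, historical_grades=past))   # -> B
      print(grade_from_rank(20, cohort_size=20, historical_grades=past))  # -> E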

      • The point is: if he had written the exams, he would have received AAA.

        That an algorithm gave him an E (worst possible grade) and an Ungraded is just absurd!

    • by e3m4n ( 947977 )
      It's goddamn clickbait, designed to display more ads. They never get to the fucking point.
    • by rebill ( 87977 )

      also known as "Too long; Fell Asleep".

  • The Algorithm (Score:5, Informative)

    by Anonymous Coward on Saturday February 20, 2021 @07:39AM (#61082922)
    Thanks Wikipedia: https://en.wikipedia.org/wiki/... [wikipedia.org]

    Schools were asked to make a fair and objective judgement of the grade they believed a student would have achieved, but in addition to rank the students within each grade. This was because the statistical standardisation process required more granular information than the grade alone

    +

    The normal way to test a predictive algorithm is to run it against the previous year's data: this was not possible as the teacher rank order was not collected in previous years.

    This is worse than stack ranking in the corporate world. With stack ranking, it's based on the performance of the individual vs. the group within the past year, with opportunities for managers to argue for or against ratings. This algorithm used an incomplete historical dataset to predict a specific grade distribution for this year. It also assumes a bell-curve distribution for everybody, which may not be the correct statistical distribution model. Worst of all, it did the group rankings based on school, which for a standardized test has no bearing on an individual's output.

    The algorithm is horribly flawed, and stories like this are the sad result.
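
    As a rough illustration of the missing validation step quoted above (the field names and numbers here are invented; this is not Ofqual's code): a normal back-test would predict the previous year's grades and compare them with what students actually got, but it falls over immediately because the teacher rank order the 2020 model depends on was never collected in earlier years.

    def backtest(model, previous_year_records):
        """Compare the model's predictions with the real grades from a past year."""
        errors = []
        for record in previous_year_records:
            if record.get("teacher_rank") is None:
                # The input the 2020 model needs simply does not exist in the
                # historical records, so no honest back-test is possible.
                raise ValueError("cannot back-test: teacher rank order was not collected")
            predicted = model(record)
            errors.append(abs(predicted - record["actual_grade_points"]))
        return sum(errors) / len(errors)

    # 2019-style records lack the rank field, so the back-test cannot run:
    records_2019 = [{"school_mean": 5.1, "teacher_rank": None, "actual_grade_points": 6.0}]
    try:
        backtest(lambda r: r["school_mean"], records_2019)
    except ValueError as err:
        print(err)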

  • Summary (Score:5, Insightful)

    by enriquevagu ( 1026480 ) on Saturday February 20, 2021 @08:05AM (#61082956)

    This is one of the worst summaries in a long time. You cannot even discern what the article is about, other than a student and an exam.

    • by ranton ( 36917 )

      This is one of the worst summaries in a long time. You cannot even discern what the article is about, other than a student and an exam.

      The summary writer didn't want to read the whole article any more than the rest of us, so I get it.

    • This is one of the worst summaries in a long time. You cannot even discern what the article is about, other than a student and an exam.

      Then it's a very good summary, because you can't discern that from the actual article either.

  • by rsilvergun ( 571051 ) on Saturday February 20, 2021 @08:41AM (#61083038)
    choices and mistakes you make in your early teens (when your decision making skills are still pretty terrible and you lack the perspective to make them anyway) chart the course for your entire life.
  • Goddamn clickbait (Score:4, Insightful)

    by e3m4n ( 947977 ) on Saturday February 20, 2021 @09:27AM (#61083166)
    This shitty article has all the signs of clickbait. They entice you with a headline, and then drivel on and on, never getting to the point. You try to skip ahead, but every time you think they are finally going to tell you what happened, they insert another backstory and beat that horse to death as well. It has only one purpose: to display as many fucking ads as possible. The topic needs to be downmodded and hidden. How did this submission pass peer review?
  • TL;DR He got in

    The previous year he got ABU; the algorithm downgraded him to ACE. He appealed and was awarded AAC by the exam board, which was not quite enough. So he asked for a concession from the university, and it was granted.

    Story told in three sentences, not three excessively worded pages.

  • Half of the article is irrelevant details about a person.

    The rest of the article is irrelevant details about a grading system that ultimately did not apply to this same person.

    "On 16 August, after Roger Taylor acknowledged âoea situation that was rapidly getting out of controlâ, a decision was made that the Approach-1 algorithm was by now so tarnished it would be better if they abandoned it. Elleston-Burrell was at work the next day, on 17 August, when he heard. Ofqual and the gove

    • There's a bit more than that. In most years, lots of people fail their conditional result but then are either allowed in directly by the university they applied to since it has enough space for them or are offered a space in the same course in a different university. In 2020 the algorithm knocked down some people but also pushed up lots of other people (obviously impossible to be sure exactly who, but mostly students from small private schools). That meant that more students from small private schools an
