
Yale-Harvard Snub of US News Rankings Opens Way for More Exits (bloomberg.com) 35

First, Yale Law School. Now, Harvard Medical School. One by one, some of the nation's top graduate programs are quitting the great who's-up-who's-down scorecards of higher ed: US News & World Report's rankings. From a report: Harvard, No. 1 on the publication's latest medical-school list for research, joins a growing boycott of the most famous name in US college rankings. This week, the medical schools of Stanford University and the University of Pennsylvania announced they will no longer participate. Yale kicked off the movement in November, and was followed soon after by Harvard, Penn and Georgetown University law schools. The big question now is whether the movement will trickle down to undergraduate institutions. Critics of the rankings say their methodology is flawed and fails to represent the student experience, while supporters argue the lists are valuable guides for students. While this may put pressure on undergraduate colleges to reconsider their participation, those who study the rankings say the exodus might take some time.

Love 'em or hate 'em, they exert a powerful hold over institutions, students, parents and even recruiters. For some schools, sliding in the rankings can mean lost funding. Undergraduate schools have been tight-lipped about what happens next, although many admissions officers privately question the rankings' value. The criticism has been mounting for years. "I am convinced that the rankings game is a bit of mishegoss -- a slightly daft obsession that does harm when colleges, parents, or students take it too seriously," Princeton University President Christopher L. Eisgruber wrote in a 2021 op-ed in the Washington Post. In August, US Education Secretary Miguel Cardona called rankings "a joke."

  • by timholman ( 71886 ) on Friday January 27, 2023 @12:05PM (#63244855)

    The worst mistake that every university in the U.S. ever made was in passively allowing U.S. News & World Report to become the arbiter of quality in higher education. USN&WR rankings are based entirely on self-reported data by the institutions, so the temptation to cheat is enormous. Columbia University got caught, but many other schools cheat and don't get caught.

    On top of that, universities find themselves doing entirely arbitrary things to game the ranking system. As an example, having one or more faculty members who are members of the National Academy of Engineering can create a huge bump in an engineering school's ranking, even though NAE membership by itself has no effect whatsoever on the school's overall teaching and research. But thanks to USN&WR, universities will bend over backward to recruit and hire NAE members to their faculty.

    Everyone is fed up with USN&WR. They're tired of jumping through USN&WR's hoops just so USN&WR can sell more magazines. I predict that the trickle will grow into a flood as more schools exit the ranking process in the next few years.

    • What will take its place and how will it be better?

      The "market" for higher ed is about as efficient as the "market" for healthcare.

    • Who rates schools as the top "Party Schools" now that Playboy is pretty much off the radar?
    • As someone who has done recruiting of hundreds of college students from dozens of colleges, I would say that the USNWR ranking serves as a rough predictor of the quality of graduates from those programs. There's a lot of variation, of course: I interviewed some really poor candidates from my alma mater (UMich) and some really good candidates from schools that probably rank worse than #100 on their list. It's not perfect, but it's a helpful resource for parents and students as they think about spending an enormous sum on an education.

      • by Shaitan ( 22585 )

        USNWR refused to incorporate woke diversity factors into the rankings. It initially refused to skew the data and then made some concessions that those universities said weren't good enough. That is what started this exodus, not genuine doubts about the rankings.

        USNWR has said it will continue to rank the schools whether or not they provide voluntary data.

    • Many criteria that are valued by USN&WR are negatives for students.

      For instance, "selective" institutions with very low acceptance rates are rated higher. So schools encourage lots of students to apply by advertising and visiting high schools and then reject nearly all of them to boost their scores.

      Since acceptance rates have become so low, students have to apply to more and more schools, hoping to win the acceptance crapshoot. That means more time filling out forms and writing essays. It also means the few schools that accept a student are less likely to be those the student most wanted to attend.

      • by Shaitan ( 22585 )

        "It also means the few schools that accept a student are less likely to be the those the student most wanted to attend."

        Right... but why did they want to attend that difficult-to-get-into school? Because it is hard to get in and thus an achievement if they do. The fact is that these formerly elite universities are lowering their standards as part of diversity initiatives, which increases their acceptance rate. This means it is no longer an achievement to get in. For now their previously earned reputations continue to carry them.

    • by fermion ( 181285 )
      The value in the ranking is that a highly ranked school can inflate the price charged to students. The flaw is the idea that any school is a match for any qualified student. At a certain level of education the student and school should mesh. A lower quality school on paper might be the best for a student. And the ethical school will give a scholarship.

      Obviously for the average student the school is not likely to matter. But if the student has money to burn, going to a higher ranked school might result in some marginal benefit.

      • by Anonymous Coward

        There are 5 law schools within a short bus ride of me, and people will pay extra to attend the private independent school even though its pass rate is no better than the HBCU school's.

        I can't see how you would qualify for law school if you ride the short bus.

    • Actually, for many of the rankings (the methodology varies somewhat by discipline area), surveys of an institution's "reputation" are the single heaviest-weighted elements. The surveys are conducted of institutional leaders (who obviously have a conflict of interest), sometimes hiring officials, sometimes high school counselors. It's a black box because USNWR will not release much information about these surveys other than the determined "score".
    • by Shaitan ( 22585 )

      This isn't the real issue at all. The schools pulled out of the ranking because they went woke on admissions and USNWR wouldn't incorporate woke factors into the rankings determination... instead it insists upon the real, unskewed outcomes.

      That isn't to say that the issues you and TFS mention aren't real, but they aren't what started and motivates this exodus. And the voluntary data is only one aspect of the rankings. USNWR has stated it will continue to provide rankings on the schools whether they provide voluntary data or not.

    • Rankings are not entirely based on self-reported data, and haven't been for quite some time.

    • by Potor ( 658520 )
      As a professor I am much more worried about the increasingly intrusive effects of the accreditation process than USN&WR rankings, although I definitely do not mourn the latter's slide from significance.
    • by shanen ( 462549 )

      Nice FP, but they weren't passive about it. The schools routinely provided the requested information.

      I don't blame the raters, but the extremely unequal educational system. Also the popular belief that a degree from a "top school" means something.

      (I have two, and prove otherwise? (Going for funny?))

  • The real reason for these exits is that the rankings show the consequences of the universities' D.I.E. efforts, both in admission and hiring.

  • UW, WSU, UPS, SPU, SU are all dropping as well.

    • UW, WSU, UPS, SPU, SU are all dropping as well.

      I didn't know that. I'll be curious to see how that plays out... my department has been heavily invested in improving its US News ranking.

      • Replying to myself... looks like it's just the Law School and School of Medicine that are pulling out. Apparently this is happening all across the US.

  • by Tony Isaac ( 1301187 ) on Friday January 27, 2023 @01:19PM (#63245041) Homepage

    The rankings won't stop. US News says they will continue to rank these schools based on publicly accessible data.

    This might be a good thing, because until now, these schools have specifically targeted higher rankings. It's kind of like SEO, where marketers specifically manipulate their web site contents to get a better ranking.

    If the schools are no longer trying to massage their own ratings, the ratings might reflect a more honest view of the quality of various schools' medical programs.

  • by VeryFluffyBunny ( 5037285 ) on Friday January 27, 2023 @01:25PM (#63245051)
    There are problems with whole-institution rankings, or even rankings by department:

    The variation in measurable educational quality is much greater from teacher to teacher than between departments or institutions.

    Being a prestigious research institution doesn't mean you're a good teaching institution - there are no meaningful correlations in that respect. Also unsurprisingly, there is no correlation between research expertise/ability and teaching expertise/ability.

    Elite institutions have selective enrollment. This means they enroll the students who are already higher performing & they're likely to continue to do so. Less prestigious institutions have to take whatever students they can get & so have a more diverse range of academic ability to work with (Interestingly, they tend to be better at teaching & supporting because they typically have to be for the less "academic" students). Academic achievement is not a good proxy for the quality of teaching.

    Some of these rankings stats rely on student questionnaires to rate teachers' teaching efficacy & general classroom qualities. Students typically have no idea of how to judge teaching; it's difficult for even educational experts to judge. Additionally, students' judgments are often sexist & racist. (There are also strategies & techniques that teachers can use to get better ratings from their students without resorting to any kind of overt coercion; they're pretty much invisible to everyone unless you know what they are.) Is that how you want to judge institutions?

    There are no useful empirical stats available for HE institutions' teaching quality, i.e. they rarely if ever measure effect sizes/learning gains of their own teaching & if they do it's pretty much impossible to get access to those stats. Even getting samples of students' submitted & graded written assignments is pretty much impossible. The only accessible collection of graded HE student writing I know of is the British Academic Written English (BAWE) corpus, for which the researchers had to approach individual students & offer to pay them to hand over their graded writing, i.e. they couldn't get any institutions to give them access to their students' writing.

    Finally, there's Goodhart's law: "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes." In other words, if the rankings are consequential to the institutions, they'll find ways to maximise their stats without necessarily changing/improving the things that they're supposed to be measuring.
    • I would tend to agree with the comment about teaching quality. I've attended everything from Community Colleges to prestigious ones. Some of the best teachers, in my opinion, were at the humble Community College. The high ranking schools can have absolute tyrants, who are more interested in getting students to figure out and jump through their hoops.

      If I'm paying big bucks to learn, I don't care about the reputation. I want to be presented the material in a manner that I can make use of in the future.

    • The recent move by top graduate programs to quit participating in US News & World Report's rankings has made me realize that these rankings do not accurately reflect the overall quality of a school or the student experience. I do my best in studying to enter the best university; sometimes it's hard, but such services as https://essays.edubirdie.com/w... [edubirdie.com] help to cope with all necessary tasks. The decision of institutions like Yale Law School, Harvard Medical School, Stanford University, and the University of Pennsylvania only reinforces that view.
  • I'm from Canada. When good students graduate high school, they typically apply to one school, the biggest school in their region.

    And when top students graduate high school they typically apply to one school, the biggest school in their region.

    And when weaker students graduate, well then they may need to apply to a smaller school for weaker students, but there's none of this "apply to 50 Universities so you can get into the most prestigious one... oh I mean 50 Colleges because we don't want to sound elitist."

  • These rankings are predicated on a flawed idea that has been disproven empirically: That certain "selective" colleges give one a better education, and lead to better career outcomes. The data show that choice of undergraduate institution has almost no impact on a person's eventual career success; that correlates with one's choice of major, and how hard one studies in college. (The same isn't true of graduate institutions, where there is some evidence that institution does have a small impact on career outcomes.)

  • Rankings are a joke because:

    (1) It seems obvious that "quality" (or whatever the rankings are supposedly representing) is not amenable to ranking because quality is (a) not unidimensional or scalar and (b) not definable in a way that is agreeable to most readers.

    (2) It's not obvious how "quality" (or whatever the rankings are supposedly representing) is desirable or translates to metrics that are desirable (such as financial return on investment or probability of obtaining a job).

    (3) Even if "quality" were

  • by zkiwi34 ( 974563 ) on Friday January 27, 2023 @02:34PM (#63245225)
    If there is no measure (or even something approximating a measure) of school/graduate quality, then when equity etc inevitably fails, there will not be an obvious smoking gun around.
  • In order to vote on the rankings, one must register with Doximity. It's a money grab by USN&WR, partnered with Doximity: data on hundreds of thousands of physicians. I did it, under pressure from the medical school.
