Yale-Harvard Snub of US News Rankings Opens Way for More Exits (bloomberg.com) 35
First, Yale Law School. Now, Harvard Medical School. One by one, some of the nation's top graduate programs are quitting the great who's-up-who's-down scorecards of higher ed: US News & World Report's rankings. From a report: Harvard, No. 1 on the publication's latest medical-school list for research, joins a growing boycott of the most famous name in US college rankings. This week, the medical schools of Stanford University and the University of Pennsylvania announced they will no longer participate. Yale kicked off the movement in November, and was followed soon after by Harvard, Penn and Georgetown University law schools. The big question now is whether the movement will trickle down to undergraduate institutions. Critics of the rankings say their methodology is flawed and fails to represent the student experience, while supporters argue the lists are valuable guides for students. While this may put pressure on undergraduate colleges to reconsider their participation, those who study the rankings say the exodus might take some time.
Love 'em or hate 'em, they exert a powerful hold over institutions, students, parents and even recruiters. For some schools, sliding in the rankings can mean lost funding. Undergraduate schools have been tight-lipped about what happens next, although many admissions officers privately question the rankings' value. The criticism has been mounting for years. "I am convinced that the rankings game is a bit of mishegoss -- a slightly daft obsession that does harm when colleges, parents, or students take it too seriously," Princeton University President Christopher L. Eisgruber wrote in a 2021 op-ed in the Washington Post. In August, US Education Secretary Miguel Cardona called rankings "a joke."
From a trickle to a flood (Score:5, Interesting)
The worst mistake that every university in the U.S. ever made was in passively allowing U.S. News & World Report to become the arbiter of quality in higher education. USN&WR rankings are based entirely on self-reported data by the institutions, so the temptation to cheat is enormous. Columbia University got caught, but many other schools cheat and don't get caught.
On top of that, universities find themselves doing entirely arbitrary things to game the ranking system. As an example, having one or more faculty members who are members of the National Academy of Engineering can create a huge bump in an engineering school's ranking, even though NAE membership by itself has no effect whatsoever on the school's overall teaching and research. But thanks to USN&WR, universities will bend over backward to recruit and hire NAE members to their faculty.
Everyone is fed up with USN&WR. They're tired of jumping through USN&WR's hoops just so USN&WR can sell more magazines. I predict that the trickle of exits will grow into a flood as more schools leave the ranking process in the next few years.
Re: (Score:2)
The "market" for higher ed is about as efficient as the "market" for healthcare.
Re: (Score:3)
Re: (Score:2)
As someone who has done recruiting of hundreds of college students from dozens of colleges, I would say that the USNWR ranking serves as a rough predictor of the quality of graduates from those programs. There's a lot of variation of course, I interviewed some really poor candidates from my alma mater (UMich) and some really good candidates from schools that probably rank worse than #100 on their list. It's not perfect, but it's a helpful resource for parents and students as they think about spending an e
Re: (Score:3)
They refused to incorporate woke diversity factors into the rankings. USNWR initially refused to skew the data and then made some concessions that those universities said weren't good enough. That is what started this exodus not genuine doubts about the rankings.
USNWR has said it will continue to rank the schools whether or not they provide voluntary data.
Re: (Score:3)
Many criteria that are valued by USN&WR are negatives for students.
For instance, "selective" institutions with very low acceptance rates are rated higher. So schools encourage lots of students to apply by advertising and visiting high schools and then reject nearly all of them to boost their scores.
Since acceptance rates have become so low, students have to apply to more and more schools, hoping to win the acceptance crapshoot. That means more time filling out forms and writing essays. It also means the few schools that accept a student are less likely to be those the student most wanted to attend.
Re: (Score:3)
"It also means the few schools that accept a student are less likely to be those the student most wanted to attend."
Right... but why did they want to attend that difficult-to-get-into school? Because it is hard to get in and thus an achievement if they do. The fact is that these formerly elite universities are lowering their standards as part of diversity initiatives, which increases their acceptance rate. This means it is no longer an achievement to get in. For now their previously earned reputations con
Re: (Score:2)
Obviously for the average student the school is not likely to matter. But if the student has money to burn going to a higher ranked school might result in some marg
Re: (Score:1)
There are 5 law schools within a short bus ride of me, and people will pay extra to attend the private independent school even though their pass rate is no better than the HBCU school's.
I can't see how you would qualify for law school if you ride the short bus.
Re: (Score:3)
Re: (Score:2)
This isn't the real issue at all. The schools pulled out of the rankings because they went woke on admissions and USNWR wouldn't incorporate woke factors into the rankings determination... instead it insists upon the real unskewed outcomes.
That isn't to say that the issues you and TFS mention aren't true, but that isn't what started and motivates this exodus. And the voluntary data is only one aspect of the rankings. They've stated they will continue to provide rankings on the schools whether they provide voluntary data or not.
Re: (Score:2)
True, yet you keep doing it. While I agree what these schools are doing could be fairly characterized as bullshit, it is still happening.
https://www.theepochtimes.com/former-medical-educator-criticizes-harvards-withdrawal-from-college-ranking-system_4997578.html
Re: (Score:1)
Rankings are not entirely based on self-reported data, and haven't been for quite some time.
Re: (Score:2)
Re: (Score:1)
Nice FP, but they weren't passive about it. The schools routinely provided the requested information.
I don't blame the raters, but the extremely unequal educational system. Also the popular belief that a degree from a "top school" means something.
(I have two, and prove otherwise? (Going for funny?))
The real reason (Score:1, Troll)
The real reason for these exits is that the rankings show the consequences of the universities' D.I.E. efforts, both in admission and hiring.
All the Washington State universities (Score:2)
UW, WSU, UPS, SPU, SU are all dropping as well.
Re: (Score:2)
UW, WSU, UPS, SPU, SU are all dropping as well.
I didn't know that. I'll be curious to see how that plays out... my department has been heavily invested in improving its US News ranking.
Re: (Score:2)
Replying to myself... looks like it's just the Law School and School of Medicine that are pulling out. Apparently this is happening all across the US.
This might improve the quality of the rankings (Score:5, Insightful)
The rankings won't stop. US News says they will continue to rank these schools based on publicly accessible data.
This might be a good thing, because until now, these schools have specifically targeted higher rankings. It's kind of like SEO, where marketers specifically manipulate their web site contents to get a better ranking.
If the schools are no longer trying to massage their own ratings, the ratings might reflect a more honest view of the quality of various schools' medical programs.
Problems with whole institution rankings (Score:5, Interesting)
The variation in measurable educational quality is much greater from teacher to teacher than between departments or institutions.
Being a prestigious research institution doesn't mean you're a good teaching institution -- there are no meaningful correlations in that respect. Also unsurprisingly, there's no correlation between research expertise/ability and teaching expertise/ability.
Elite institutions have selective enrollment. This means they enroll the students who are already higher performing & they're likely to continue to do so. Less prestigious institutions have to take whatever students they can get & so have a more diverse range of academic ability to work with (Interestingly, they tend to be better at teaching & supporting because they typically have to be for the less "academic" students). Academic achievement is not a good proxy for the quality of teaching.
Some of these rankings stats rely on student questionnaires to rate teachers' teaching efficacy & general classroom qualities. Students typically have no idea of how to judge teaching; it's difficult for even educational experts to judge. Additionally, students' judgments are often sexist & racist. (There are also strategies & techniques that teachers can use to get better ratings from their students without resorting to any kind of overt coercion; it's pretty much invisible to everyone unless you know what they are.) Is that how you want to judge institutions?
There's no useful empirical stats available for HE institutions' teaching quality, i.e. They rarely if ever measure effect sizes/learning gains of their own teaching & if they do it's pretty much impossible to get access to those stats. Even getting samples of students' submitted & graded written assignments is pretty much impossible. The only accessible collection of graded HE student writing I know of is the British Academic Written English (BAWE) corpus, for which the researchers had to approach individual students & offer to pay them to hand over their graded writing, i.e. they couldn't get any institutions to give them access to their students' writing.
Finally, there's Goodhart's law: "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes." In other words, if the rankings are consequential to the institutions, they'll find ways to maximise their stats without necessarily changing/improving the things that they're supposed to be measuring.
Re: (Score:3)
I would tend to agree with the comment about teaching quality. I've attended everything from Community Colleges to prestigious universities. Some of the best teachers, in my opinion, were at the humble Community College. The high-ranking schools can have absolute tyrants who are more interested in making students figure out and jump through their hoops.
If I'm paying big bucks to learn, I don't care about the reputation. I want to be presented the material in a manner that I can make use of in the future.
Re: (Score:2)
Re: (Score:1)
Rankings (Score:2)
I'm from Canada, when good students graduate high school they typically apply to one school, the biggest school in their region.
And when top students graduate high school they typically apply to one school, the biggest school in their region.
And when weaker students graduate, well then they may need to apply to a smaller school for weaker students, but there's none of this "apply to 50 Universities so you can get into the most prestigious one... oh I mean 50 Colleges because we don't want to sound elitist"
Ranking should be based on selectivity (Score:2)
These rankings are predicated on a flawed idea that has been disproven empirically: That certain "selective" colleges give one a better education, and lead to better career outcomes. The data show that choice of undergraduate institution has almost no impact on a person's eventual career success; that correlates with one's choice of major, and how hard one studies in college. (The same isn't true of graduate institutions, where there is some evidence that institution does have a small impact on career outco
Rankings are a joke because ... (Score:2)
Rankings are a joke because:
(1) It seems obvious that "quality" (or whatever the rankings are supposedly representing) is not amenable to ranking because quality is (a) not unidimensional or scalar and (b) not definable in a way that is agreeable to most readers.
(2) It's not obvious how "quality" (or whatever the rankings are supposedly representing) is desirable or translates to metrics that are desirable (such as financial return on investment or probability of obtaining a job).
(3) Even if "quality" were
It is all CYA stuff (Score:3)
Doximity required (Score:1)