Mathematicians Deconstruct US News College Rankings

An anonymous reader writes "US News makes a mint off its college rankings every year, but do they really give meaningful information? A pair of mathematicians argues that the data the magazine uses is all likely to be at least somewhat relevant, but that the way the magazine weights the different statistics is pretty arbitrary. After all, different people may have different priorities. So they developed a method to compute the rankings based on any possible set of priorities. To do it, they had to reverse-engineer some of US News's data. What they found was that some colleges come out on top pretty much regardless of the prioritization, but others move around quite a lot. And the top-ranked university can vary tremendously. Penn State, which is #48 using US News's methodology, could be the best university in the country, by other standards."
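
The computation at the heart of this is easy to sketch: normalize each metric to a common scale, then rank schools by a weighted sum using whatever priority weights a reader supplies. A minimal illustration in Python follows; the school names, metrics, and numbers are all invented for illustration, not US News data or the mathematicians' actual method.

    # Rank schools under an arbitrary set of priority weights.
    # All names and numbers below are made up for illustration.

    def rank(schools, weights):
        """Return school names sorted by weighted score, best first.

        schools: name -> {metric: value}, metrics pre-normalized to [0, 1]
        weights: metric -> nonnegative priority weight
        """
        def score(metrics):
            return sum(weights.get(m, 0.0) * v for m, v in metrics.items())
        return sorted(schools, key=lambda name: score(schools[name]), reverse=True)

    schools = {
        "Alpha U":   {"graduation_rate": 0.95, "selectivity": 0.90, "faculty_resources": 0.60},
        "Beta Tech": {"graduation_rate": 0.80, "selectivity": 0.70, "faculty_resources": 0.95},
    }

    # Different priorities can produce a different "best" school:
    print(rank(schools, {"graduation_rate": 0.6, "selectivity": 0.3, "faculty_resources": 0.1}))
    # ['Alpha U', 'Beta Tech']
    print(rank(schools, {"graduation_rate": 0.1, "selectivity": 0.1, "faculty_resources": 0.8}))
    # ['Beta Tech', 'Alpha U']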
  • by j1mmy ( 43634 ) on Wednesday October 08, 2008 @05:05PM (#25305555) Journal

    At best, they provide a filter for individuals of a certain level of ability or competence; e.g., the average graduate from school #1 is going to be more capable than the average graduate from school #100.

  • Playing the numbers (Score:5, Interesting)

    by timholman ( 71886 ) on Wednesday October 08, 2008 @05:30PM (#25305847)

    Several of the metrics that U.S. News uses do seem to be arbitrarily weighted, leading to some bizarre contortions on the part of the various schools to enhance their ratings. Most of the data is self-reported by the universities, which clearly provides a powerful motivation to spin or "enhance" the numbers to one's advantage. I have no doubt that several colleges fudge the numbers to raise their rankings, leading to a lot of frustration at schools that play by the rules but feel they're being cheated.

    And some of the metrics make little sense. For example, engineering schools can raise their rankings by several places just by having one or more faculty members in the National Academy of Engineering. Yet NAE membership is essentially meaningless in terms of research and teaching, and hardly more prestigious than having faculty members who are Fellows in other established engineering societies. Even so, U.S. News ignores the number of Fellows in the IEEE, ASCE, ASME, etc., and focuses on NAE membership. Why the emphasis on the NAE? Probably because the NAE told U.S. News that it was the most important engineering society, and U.S. News never questioned it, when in fact the NAE has almost negligible impact on higher education.

  • by fishbowl ( 7759 ) on Wednesday October 08, 2008 @05:39PM (#25305969)

    Having been to Stanford as a visiting scholar recently, I have to say I am very glad I never went there (tuition and housing being orders of magnitude out of my reach or not).

    Things I take for granted on a university campus, such as being able to walk into a library, or to use public wi-fi while sipping coffee somewhere on the grounds... These things are actually difficult or impossible for a visitor to do -- even a visitor with credentials who is there on academic business! I was *amazed* at the difficulty of getting into the Green Library, for instance, and my week was pretty much destroyed by the fact that if you want to use on-campus wi-fi, "you can't", simple as that.

    At every turn, everything that could have been convenient for a visitor was hostile. I ended up rushing through my research and spending all my time at a coffee shop in Palo Alto (where the wi-fi was free and nobody minded if you hung out and worked).

    Thanks Stanford, you're awesome.

  • by crashcodesdotcom ( 813209 ) on Wednesday October 08, 2008 @05:44PM (#25306023)
    Excellent idea. I'd take it one step further...

    Offer the service on the site as you suggested. After some period of time, assign default weights based on the priorities people actually used. Ta-da! Maybe then you get a list worth printing?
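
    As a rough sketch of that idea (the metric names are invented, and plain averaging stands in for whatever weighting scheme a real site might use):

        from collections import defaultdict

        def consensus_weights(user_weight_sets):
            """Average the weight each user assigned to each metric."""
            totals = defaultdict(float)
            for weights in user_weight_sets:
                for metric, w in weights.items():
                    totals[metric] += w
            n = len(user_weight_sets)
            return {metric: total / n for metric, total in totals.items()}

        users = [
            {"graduation_rate": 0.7, "selectivity": 0.2, "faculty_resources": 0.1},
            {"graduation_rate": 0.3, "selectivity": 0.3, "faculty_resources": 0.4},
        ]
        print(consensus_weights(users))
        # {'graduation_rate': 0.5, 'selectivity': 0.25, 'faculty_resources': 0.25}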
  • by DeadDecoy ( 877617 ) on Wednesday October 08, 2008 @06:34PM (#25306649)
    That's actually a pretty sad indication of how much a college name matters over what you actually do. I had a friend who did his undergrad at MIT, and when applying for jobs, he was insta-accepted to various tech jobs. No interview. No background checks. Just an open door. He refused those jobs, because that easy entry gave him some indication of who those companies hired and on what criteria. At the other extreme were a couple of people who had to work twice as hard because they had to sell the college they attended. It's a little sad, but it's the reality.
  • Anecdotal (Score:4, Interesting)

    by Anonymous Coward on Wednesday October 08, 2008 @06:42PM (#25306721)

    I'm a law student. I also attend one of the most maligned law schools in the country. Not entirely by choice.

    Oh sure, I wanted to go to the University of Michigan. I wanted to go to Georgetown. I applied to a number of elite law schools, and was surprisingly accepted by most of those that I applied to. The problem was money. Law school, as you can imagine, is pretty expensive. It's typically a 3-year program that runs anywhere from $25-50k/year for tuition alone. Build in the cost of books, rent, food, etc. and you're looking at another $15-20k/year. Federal student loans aren't that generous, and the terms on the private loans make them rather detestable. While my grades were good enough to get me into those high-end schools, they weren't good enough to make me stand out enough to get much in the way of scholarships. And since I was paying for school by myself, I had to take a look at my safety schools. So I started researching the various ranking systems and what criteria they used.

    One of the major ranking indexes I looked at, for example, heavily weighted entrance requirements as well as the attrition rate. The result was that the schools that only accepted people with the best GPAs and LSAT scores ranked high. That was expected. But the attrition rate? By its rankings, if two schools accepted students with the exact same criteria, the one with fewer failures/drop-outs after the first year ranked higher. That struck me as really odd: a more rigorous program is desirable, and will likely result in more failures. Meanwhile, the school I go to will take in very average students the first year, and has a huge failure rate: anywhere from 20-50%, depending on who you ask. The first-year professors are brutal, and the whole year is designed not only to teach you, but to weed out the people who don't really want or deserve to be there. Consequently, it gets hammered in almost every ranking except "most competitive students," where it's in the top 10 in the country.

    Then I started noticing some other oddball problems. That same ranking service listed an average undergraduate GPA and LSAT score that were below the school's minimum requirements. At several schools, I noticed that, if they offered part-time programs, it looked like an incredibly low proportion of the students were enrolled full-time. Then I realized how they were figuring that out: it wasn't by graduation, it was by sampling year-to-year enrollment.

    Example: Say a normal student graduates in 3 years. A part-time student graduates in 6. Over 6 years, the school graduates 36 full-time students (let's say they're spread out evenly at 6 per year) and 10 part-time students. The thing is, because of their sampling, those part-time students wind up being counted for twice as long. So at any given time in that 6-year period, you have 18 full-time students and 10 part-time students enrolled. The sample is going to show that more than 1/3 of the student body is part-time, even though the school is graduating more than three times as many full-time students. It's rather misleading.
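
    The arithmetic is easy to check. A quick sketch in Python, assuming steady-state enrollment (headcount equals graduation rate times years spent enrolled); all numbers are the ones from the example above:

        # Enrollment snapshots overcount students who stay enrolled longer.
        ft_years, pt_years = 3, 6            # years to graduate
        ft_grads_per_year = 6                # full-time graduates per year
        pt_grads_per_year = 10 / 6           # 10 part-time graduates over 6 years

        # Steady state: enrolled headcount = graduation rate * years enrolled
        ft_enrolled = ft_grads_per_year * ft_years   # 18
        pt_enrolled = pt_grads_per_year * pt_years   # 10

        years_observed = 6
        ft_total_grads = ft_grads_per_year * years_observed   # 36
        pt_total_grads = pt_grads_per_year * years_observed   # 10

        pt_share_of_enrollment = pt_enrolled / (ft_enrolled + pt_enrolled)
        pt_share_of_graduates = pt_total_grads / (ft_total_grads + pt_total_grads)

        print(f"part-time share of enrollment snapshot: {pt_share_of_enrollment:.0%}")  # 36%
        print(f"part-time share of actual graduates:    {pt_share_of_graduates:.0%}")   # 22%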

    I noticed a number of other glaring issues, too. For example, prestigious schools have loads of information published, while the less prestigious schools usually have little more than a few out-of-date statistics. Self-reinforcing, no?

    In the end, it felt like the ranking systems were a complete waste. They rank everything but the quality of the education. I don't mean to play to the cliche, and I know it's not universally true, but I actually flew around the country and visited a couple of those "elite" schools that I was accepted into. They don't let you forget how "elite" they are. At all. The snobbery was utterly overwhelming. One of them told me that their students were "the Maseratis of law school." /gag

    I wound up going to the school that offered me the biggest scholarship.

  • Re:Reputation (Score:5, Interesting)

    by Abreu ( 173023 ) on Wednesday October 08, 2008 @06:50PM (#25306833)

    Something very weird happens here in Mexico.

    According to several international studies*, the National Autonomous University of Mexico (UNAM) and the National Polytechnic Institute (IPN), the two largest public universities in the country, are its best institutions of higher learning.

    Yet it is very common to see "UNAM, IPN graduates need not apply" in job listings. Why?

    Because employers seem to believe that the networking and prestige of the exclusive private schools are worth more than being a graduate of the two institutions that generate 90% of the scientific research in the country!

    Sources:
    http://www.topuniversities.com/worlduniversityrankings/results/2007/overall_rankings/top_400_universities/ [topuniversities.com]
    http://www.arwu.org/rank2008/ARWU2008_TopAmer(EN).htm [arwu.org]

  • Re:Reputation (Score:3, Interesting)

    by mikael ( 484 ) on Wednesday October 08, 2008 @08:19PM (#25307577)

    From what I've seen in the UK, many company directors seem to have a preference for graduates from the university that they went to, rather than by any other selection method. But with so many qualified people chasing the same well-paying jobs, you can't really blame them. Otherwise they start using techniques like handwriting analysis, psychometric questionnaires and pop quizzes to divine who is the "safe bet".
