ken explains the hidden curriculum: what goes into US News & World Report rankings?

If you applied to med school, you surely took a look at this website.



The 2014 rankings just came out. The short summary, if you're interested: MUSC is #59. There were two public schools in the top 10 (UCSF and Univ. of Michigan). UNC is #1 for primary care. Whether or not you think these rankings are important, everyone else does, so you might as well know how they work. If you've ever wondered why med schools do certain things, it's either to satisfy LCME demands or to move up in the US News & World Report rankings. These are the incentives that drive med schools.


Let's start with the biggest chunks:

20% of the ranking is the opinion of other deans.

20% of the ranking is the opinion of residency directors. 

So 40% of the rankings are determined by the opinions of extremely busy people. These are individuals who might be keeping up on the latest activity at Stanford med, but they certainly aren't keeping up with the newest moves made by a school like the University of Vermont. At a recent panel of med school deans, one dean admitted to ranking schools purely on past reputation because she didn't know how good they were recently. Do you think a top-5 school could get by on reputation alone? I'd guess yes. This is a rich-stays-rich scheme.

30% of the ranking is faculty research activity. 

First off - how many med students are interested in research? Not many. So why does this even matter? Second - do you really think med students work in these high-powered labs? Unlikely. Maybe the rare gunner MD/PhD student, but otherwise these labs aren't wasting their time training you to hang out for a summer. They're busy recruiting post-doc machines instead.

So we've already established that 70% of the pie is worthless. Moving on. 

13% of the ranking is MCAT scores. 

Another 6% is undergrad GPA. 

And 10% is faculty:student ratio. 

Arguably, this last 29% is the best current measure of the best med schools. It at least tells you which schools have lots of potential teachers, and which schools are bringing in the best-performing undergrads. I'd be interested to see the rankings based solely on this last 29%. The first 70% basically gives the current big ponds a huge buffer to keep doing what they do, while all the small ponds are left to fight for scraps. It's a system designed to keep lower- and mid-tier schools where they are. Essentially, the only way those mid-tier schools can move up is by improving undergrad GPA/MCAT numbers or recruiting big grant-getters - neither of which actually improves the quality of med school education.
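To make the arithmetic concrete, here's a minimal sketch in Python of the weighted sum the rankings effectively compute. The weights are the percentages listed above; the component names and example scores are made up for illustration. (Note the listed weights only add up to 99%, so whatever minor factor fills the last 1% is left out here.)

```python
# Weights from the US News & World Report components listed above.
# Component names are my own labels; example scores are invented.
WEIGHTS = {
    "dean_opinion": 0.20,
    "residency_director_opinion": 0.20,
    "research_activity": 0.30,
    "mcat": 0.13,
    "undergrad_gpa": 0.06,
    "faculty_student_ratio": 0.10,
}

def composite(scores):
    """Weighted sum of normalized component scores (0-100 scale)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A hypothetical school that's strong on stats but has little
# reputation buffer - the 70% "opinion + research" chunk drags it down.
example = {
    "dean_opinion": 50,
    "residency_director_opinion": 55,
    "research_activity": 40,
    "mcat": 90,
    "undergrad_gpa": 85,
    "faculty_student_ratio": 80,
}
print(round(composite(example), 1))  # 57.8
```

You can see the mechanism: even near-perfect stats in the 29% chunk can't overcome a mediocre showing in the reputation-and-research 70%.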

Overall - this is terribly disappointing. When motivated and talented 20-year-olds get on the internet to decide where they want to do their medical education, these are the rankings they look up. I'm sure you've been told these rankings are meaningless, but you sure as hell still looked at them, and it meant something to see which schools were high on the list. Really, the only thing these rankings tell you is that Harvard will continue to be Harvard until we start coming up with better metrics.

see you on the other side,

from ken

enjoy sidenote in 140 characters or less @kensidenotelife.

7 comments:

  1. I don't think MCAT or GPA are good metrics to rank med schools by. These stats aren't evaluating the school; they're only evaluating the students who choose to attend it. Not that I think grades and tests are the best way to evaluate students or schools, but wouldn't it make sense to incorporate board scores into the rankings instead of undergrad stats???

    I agree that these rankings are meaningless, but I'm not sure how to make them better without med schools investing money to hire outside, unbiased evaluators, which would probably be difficult to find anyway and might cost students more in tuition. How do you suggest we change the metrics?

    Replies
    1. Yeah, I definitely agree that MCAT/GPA are not good metrics. I just think they're better than evaluations by reputation.

      I'd think better metrics would have to do with students' performance and satisfaction - like Step 1 and Step 2 scores.

      Off the top of my head, here are some qualities that would make a med school good: real relationships between students and faculty, quality clinical experiences, student involvement, percentage of students staying there for residency, cost, how long relationships between students last post-med school. Anything else you can think of?

    2. Great post! People like to look at numbers and rankings because it's quick and easy, but they don't really look behind the numbers. One big reason I don't like rankings is that you have to have something definable in order to scale it. MCAT/GPA are definable things, so it's easy to scale them. I didn't know about those other factors that go into the USN & WR rankings - pretty bs standards. That makes me think of the other reason I don't like rankings: factoring in someone's bs, probably very biased, opinion.

      I like the qualities you laid out for a good med school; they're just hard to make definable and scalable. Do you think there is some way to make those rankable?

    3. Yeah, it really surprised me to see how full of shit the rankings were. It's basically an old boys' club.

      The only way I can think of to make those qualities rankable is to survey students and faculty in some sort of unbiased way. I'll try to come up with a half-baked idea. Do you have any ideas?

  2. Couldn't agree with you more... I actually intentionally did not look at the rankings. I didn't know those other components were in there besides just research $, though, so thanks for the info. I think the ideas you listed up there are great. We're sitting at #31, but the things you listed are almost all things that Keck is strong in, and we wind up with a lot of very satisfied students who match well, with probably about 1/3 staying here. It's unfortunate that the factors going into the current rankings are what drive change, though, because it results in things like students being pushed into required research projects that are more for school appearances than our own real benefit.
    I think they could do away with the opinion of other deans completely. That doesn't even make sense as a component. It does make a little sense to keep the opinion of residency directors, though, if you want the rankings to be a useful tool for students choosing a school. I completely agree that it's a mechanism for the top schools to stay on top, but if you want your school's image to help you match well, it's something to consider.

    Replies
    1. Yeah, it's a sad reality that things so unrelated to doctoring drive med education decisions.

    2. Also 1/3 of students staying is awesome.
