The Problem with U.S. News Medical School Rankings (& Why Schools Are Leaving)

The U.S. News Medical School Rankings have long been an important resource for premeds deciding where to go to medical school; however, this may no longer be the case. Here’s why nearly a dozen medical schools, including top-ranked institutions like Harvard, Stanford, and Columbia, have left the U.S. News Rankings.

Over the last couple of months, many notable medical schools have publicly announced that they will no longer submit information to the U.S. News Best Medical School Rankings. This may come as a surprise to students, but many argue that this situation has been decades in the making.

Let’s discuss how the U.S. News medical school rankings work, and why so many medical schools are choosing to leave.

 

How the Best Medical School Rankings Work

To understand the issue, we must first understand what the U.S. News Rankings are and how they’re calculated.

Each year, U.S. News publishes two medical school lists – the Best Medical Schools by Research and the Best Medical Schools by Primary Care. These lists are meant to provide an objective comparison between medical schools to help students decide which schools would be a good fit for them.

In this article, we’ll be focusing on the Best Medical Schools by Research list as this is the one that students often consider when choosing which schools to apply to. Let’s start by breaking down the methodology.

There are five main factors that determine a school’s ranking: peer assessments, residency program director assessments, student selectivity, faculty resources, and research activity.

Peer assessments account for 15% of a school’s ranking and are based on survey data from various MD and DO school deans, deans of academic affairs, heads of internal medicine, and directors of admissions. These individuals are asked to evaluate each medical school on a scale of 1-5, from marginal to outstanding. If they don’t know enough about a school to make an assessment, they have the option to mark “don’t know.”

Residency program director assessments account for 15% of a school’s ranking. Program directors are asked to rate each school on the quality of its research on a scale of 1-5, from marginal to outstanding. A school’s score is then based on a weighted average of ratings from the three most recent years.
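U.S. News doesn’t publish the exact year-by-year weighting it uses, but the idea of a recency-weighted, multi-year average can be sketched in a few lines. The 0.5/0.3/0.2 weights below are purely illustrative assumptions, not the published methodology:

```python
def three_year_weighted_average(ratings, weights=(0.5, 0.3, 0.2)):
    """Recency-weighted average of program director ratings.

    `ratings` lists the three most recent years, most recent first.
    The weights are illustrative; U.S. News does not publish the
    exact weighting it applies.
    """
    return sum(w * r for w, r in zip(weights, ratings))

print(round(three_year_weighted_average([4.2, 4.0, 3.8]), 2))  # 4.06
```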

Student selectivity accounts for 20% of a school’s ranking and is based on three factors. The median MCAT score of matriculants accounts for 13% of the ranking, the median GPA of matriculants accounts for 6%, and the acceptance rate accounts for 1%. In short, schools that accept a smaller share of their applicants, or whose matriculants have higher GPAs and MCAT scores, are ranked higher than schools that are less selective.

Next is faculty resources, which accounts for 10% of a school’s ranking. This is measured as the ratio of full-time faculty to full-time MD or DO students. The more full-time faculty a school has relative to the number of medical students, the higher its score.

Lastly, there’s research activity, which accounts for the remaining 40% of a school’s ranking. This can be further divided into total federal research activity, which is the total dollar amount of federal grants and contracts that each medical school and its affiliates received, and the average federal research activity per faculty member. Total federal research activity accounts for 30% of a school’s ranking and the average federal research activity per faculty member accounts for 10%.

In total, 40% of the ranking comes from federal research funding, 30% from subjective assessments, 20% from student selectivity, and 10% from faculty resources.
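Putting those weights together, a school’s overall research-ranking score is essentially a weighted sum of its component scores. Here is a minimal Python sketch of that calculation; the component names are mine, and it assumes each component has already been normalized to a 0-1 scale, since U.S. News doesn’t publish its exact normalization steps.

```python
# Illustrative sketch of how the published weights combine.
# Assumes each component is pre-normalized to a 0-1 scale; U.S. News
# does not publish its normalization, so treat this as a model only.

WEIGHTS = {
    "peer_assessment": 0.15,            # dean/admissions survey ratings
    "program_director_assessment": 0.15,
    "median_mcat": 0.13,                # student selectivity: 20% total
    "median_gpa": 0.06,
    "acceptance_rate": 0.01,
    "faculty_student_ratio": 0.10,      # faculty resources
    "total_federal_research": 0.30,     # research activity: 40% total
    "research_per_faculty": 0.10,
}

def composite_score(components: dict) -> float:
    """Weighted sum of normalized (0-1) component scores."""
    return sum(WEIGHTS[name] * components[name] for name in WEIGHTS)

# Made-up, normalized inputs for a hypothetical school:
example = {
    "peer_assessment": 0.80,
    "program_director_assessment": 0.75,
    "median_mcat": 0.90,
    "median_gpa": 0.85,
    "acceptance_rate": 0.95,
    "faculty_student_ratio": 0.60,
    "total_federal_research": 0.70,
    "research_per_faculty": 0.65,
}
print(round(composite_score(example), 3))  # 0.745
```

Notice how heavily the two research-funding terms (30% + 10%) dominate the total; that imbalance is exactly what critics point to below.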

 

Problems with U.S. News Methodology

Now that we understand how the medical school rankings work, let’s talk about the issues people have with this methodology.

Perhaps the biggest issue is that people view these rankings as a reflection of the quality of a school’s education – despite none of the criteria directly assessing this.

The largest proportion of a school’s ranking comes from federal research funding. This gives a disproportionate advantage to larger schools and doesn’t consider how much a school’s research actually furthers our medical knowledge.

Next, 30% of the ranking comes from subjective assessments from medical school administration and residency program directors — meaning that it comes from an individual’s perception of a school. In addition to introducing bias into the rankings, there’s the issue that not every dean and program director is knowledgeable about each medical school. As such, this heavily favors schools that already have a strong reputation and negatively impacts smaller schools – even if they may be providing a quality medical education and performing groundbreaking research.

Next, there’s student selectivity. This metric only assesses a student’s academic achievement before starting medical school, which has little to do with the quality of a school’s education.

Lastly, faculty resources only consider the number of full-time faculty per medical student, without assessing the quality of instruction or how involved those faculty are in student education. As such, a school can have hundreds of research or administrative faculty who aren’t directly involved in teaching, yet they will still positively affect its ranking.

Although one could certainly argue that research funding, student selectivity, and faculty resources may impact the quality of a student’s education indirectly, they are not direct measurements. The weaknesses of the U.S. News methodology aren’t new either. People have been questioning these measurements for decades. In fact, one researcher wrote about the issues with this ranking system over 20 years ago, only to come out with a similar paper in 2019 noting that “little has changed” in the past two decades.

 

Other Problems with the U.S. News Rankings

The methodology isn’t the only issue either. If this were the primary factor driving schools to leave the U.S. News Rankings, we likely would have seen schools start to leave a long time ago. Instead, the poor methodology is just the tip of the iceberg. Let’s explore the other reasons why schools are leaving.

Inaccurate Reporting of Data

First, there’s the issue of inaccurate reporting. There have been multiple scandals in recent years regarding schools not being truthful about the data they submit to U.S. News. Just last year, Columbia University came under fire when one of its mathematics professors, after noticing the university’s significant rise in the rankings over the past two decades, published a paper describing the statistics it submitted to U.S. News as “inaccurate, dubious, and highly misleading.”

Columbia initially defended its numbers before ultimately admitting the professor was correct, stating, “we deeply regret the deficiencies in our reporting and are committed to doing better.” The professor suggested that the motivation behind manipulating the numbers was to use the higher ranking to justify rising tuition fees, which help pay for a growing administrative staff.

Focus on Ranking Over Education

Next, there’s the fact that these rankings incentivize schools to focus on metrics that improve their standing, even when those metrics run counter to medical student education. As discussed previously, none of the factors used to calculate a school’s ranking measure the quality of its education. Instead, they are more indicative of which schools have the most funding, the best reputation, and the most selective admissions.

In addition, the inclusion of a school’s acceptance rate in its ranking encourages schools to attract as many applications as possible – even from students they wouldn’t seriously consider for a spot – because a larger applicant pool drives the acceptance rate down. And because MCAT and GPA are factored into a school’s ranking, schools are incentivized to choose students with the best stats instead of taking a more holistic view of applicants. A better system would incentivize schools to choose the students they believe will make the best doctors instead of prioritizing those who will help their rankings.

That being said, there is a balance. Minimizing the importance of objective metrics such as GPA and MCAT scores can open the door to nepotism and risks medical school admissions becoming more about who you know than about individual merit. Even with the influence of these objective metrics on the rankings, there have still been many scandals in which top institutions accepted students based on factors such as pedigree and familial donations instead of merit. By removing these objective metrics, it would theoretically become easier for schools to hide nepotism.

 

How Will This Affect Premeds and Medical Students?

Now that we’ve discussed the issues with these rankings, let’s talk about how schools leaving the medical school rankings will affect premeds and medical students. Although medical school prestige does matter to a degree, it ultimately won’t be the primary determinant of your success in medical school, residency, or as an attending physician. Your individual performance will always be much more important than the name of your school.

For instance, when I was choosing which medical school to attend, I was considering two top 5 medical schools and one top 15 school. Although the prestige of the two top 5 schools was alluring, I ultimately chose to attend UCSD, which ranked 13th at the time.

The main reason I was even considering the other two schools was their rankings, but I knew there were other, more important, factors to consider. Things like location, competitiveness, tuition, financial aid, cost of living, curriculum, specialty interest, personal opportunities, culture, and fit are all important when choosing a medical school. For this reason, getting rid of the rankings may be positive in that it will take the focus away from prestige and shift it to other, arguably more important, factors.

Furthermore, without these rankings, there is less incentive for schools to prioritize GPA and MCAT scores. Although this may help them take a more holistic approach when reviewing applicants, it may also open the door to preferential admissions and nepotism. Getting rid of these rankings may also remove an incentive for schools to hire additional administrative staff, which could help decrease tuition costs for students. That being said, this is most likely just wishful thinking.

Although “Best Medical School” lists may seem like a helpful tool to narrow down your choices, it’s always best to do your own due diligence and research when deciding which school is the best fit for you. Choosing a school based on prestige or reputation, and blindly assuming its quality of education is better, is a foolish way to approach the decision.

Instead, your goal should be to find the school that provides you with the most opportunities for personal growth and development so you can become the best and most competent physician you can be.

I believe that getting rid of an arbitrary ranking system has the potential to have some positive impact on students; however, as with most things in life, there may be unintended consequences as well. Only time will tell how this actually impacts medical school admissions and whether or not it’s a net positive for students. I’d like to be optimistic and say that it is; however, it’s impossible to know for certain.

What’s important is to focus on the things that you can control, like becoming a stand-out premed with a high GPA and MCAT, stellar extracurriculars, and glowing letters of recommendation.

“Life is 10% what happens to you and 90% how you react to it.” So instead of worrying about the 10% you can’t control, figure out how you can make the other 90% work in your favor.

If you find yourself getting discouraged because your dream school is highly competitive, stop right there. Your ability to crush your MCAT, achieve a high GPA, and earn glowing letters of recommendation is less a function of your intelligence and more a function of proper preparation, constant improvement, and putting in the work. Having a stellar medical school application is no different.

At Med School Insiders, our mission is to empower a generation of happier, healthier, and more effective future doctors. From medical school or residency application help to crushing your MCAT or USMLE, we’ve got your back. We know what you’re going through and can help you excel on your exams and present yourself in the best light possible in your medical school and residency applications. And our results speak for themselves: we’ve become the fastest-growing company in the space with the highest satisfaction ratings.

If you enjoyed this article, check out our piece on Does Medical School Prestige Matter? 

