A Critical Look At The Doximity Residency Navigator

medical education Jan 23, 2024

It’s officially fall here in Mississippi. While most people think of this as a time to enjoy college football, indulge in pumpkin spice everything, and start planning for the holidays, it means something else for those of us involved in medical education and training. For us, it’s peak Interview Season, and medical students are interviewing with programs around the country in hopes of matching with a residency in their chosen specialty.

“Interview Season” is kind of like the holidays — some people love it, some hate it, some look forward to it, some dread it, and some end up completely stressed out and ready to scream.

Most physicians who have survived the residency application, interview, and matching process prefer never to think about it again, but I don’t have that luxury. As a residency program director and an assistant dean for a college of osteopathic medicine, I’m still right in the thick of it. Every summer I watch students start the research and application process, and every fall I interview candidates for the internal medicine residency program I direct.

I have a vested interest in the whole process for two reasons: I want to help my students find and match with excellent residency programs, and I want to attract the best possible candidates to my residency program. So when someone tells me about a helpful tool or strategy for navigating the system, I pay attention.

One of the most popular residency research tools with current medical students is the Doximity Residency Navigator. Doximity claims the tool is “built to help medical students make informed residency decisions and to increase transparency in the residency match process,” so I decided to check it out and write an honest (and hopefully helpful) review of what I found.

So here goes — my unfiltered thoughts on the Doximity Residency Navigator — complete with the good, the bad, and the “come on guys, that’s really got to change.”

How does Doximity rank residency programs?

First things first: how does Doximity rank residency programs, and which programs do they rank? According to their methodology documentation, the ranking process is relatively straightforward. Doximity bases its rankings on three major criteria:

1. Current resident and alumni satisfaction data

Doximity uses a Resident Satisfaction Survey to assess how satisfied current residents and recent alumni are with their residency programs. The survey covers topics like work-life balance, career guidance, culture, clinical diversity, and preparation for board certification in an attempt to give prospective residents insight into how well programs will treat them and how well they’ll prepare them for success as physicians.

2. Reputation data

Reputation data is gathered by surveying board-certified physicians and asking them to list the residency programs that offer the best clinical training in their chosen specialty. Responses are aggregated and weighted based on alumni status, program size, graduation year, and Program Director input.

3. Research output

The third major determining factor in Doximity’s residency rankings is research output.

Participation in clinical trials, the number of research grants awarded, the percentage of a program’s current residents and recent graduates who publish, and the number and quality of published works by recent alumni (within the last 10 years) all factor into a program’s research output score.

As for which programs are included and who has input: Doximity ranks U.S.-based “M.D. programs” in a variety of specialties, and ABMS board-certified physicians can nominate programs for consideration as top residency programs.
 
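Before moving on, it may help to see how criteria like these can turn into a single ranked list. Below is a minimal sketch of a composite score, written in Python. To be clear, Doximity does not publish its exact formula; the 40/40/20 weights, the normalization scheme, and every number in the example are my own assumptions, chosen purely for illustration.

```python
# A hypothetical sketch of how satisfaction, reputation, and research
# output could be combined into one score. The weights and scales are
# assumptions for illustration; Doximity does not publish its formula.

from dataclasses import dataclass

@dataclass
class ProgramStats:
    satisfaction: float  # mean satisfaction survey score on a 0-5 scale (assumed)
    reputation: float    # weighted nomination count from surveyed physicians (assumed)
    research: float      # research output index: trials, grants, publications (assumed)

def composite_score(p: ProgramStats, max_reputation: float, max_research: float) -> float:
    """Return a 0-100 composite using assumed 40/40/20 weights."""
    sat = p.satisfaction / 5.0                                   # normalize to 0-1
    rep = p.reputation / max_reputation if max_reputation else 0.0
    res = p.research / max_research if max_research else 0.0
    return 100.0 * (0.40 * sat + 0.40 * rep + 0.20 * res)

# Toy comparison: a community program with very happy residents vs. a
# large academic center with heavy research output.
community = ProgramStats(satisfaction=4.6, reputation=120.0, research=5.0)
academic = ProgramStats(satisfaction=4.0, reputation=400.0, research=95.0)

for name, prog in [("community", community), ("academic", academic)]:
    print(f"{name}: {composite_score(prog, max_reputation=400.0, max_research=95.0):.1f}")
```

Even in this toy version, notice how the research and reputation terms reward the large academic center (92.0 vs. 49.9) despite its lower resident satisfaction — which foreshadows the problems discussed below.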

And there you have it — Doximity’s rankings explained. Now let’s talk about how effective this methodology is in assessing residency program quality.

Three major flaws in the Doximity Residency Navigator methodology

Let me start by saying I don’t hate everything about the Doximity ranking methodology. The first two criteria — resident satisfaction and reputation — are spot-on, and both matter a great deal. Resident satisfaction matters because good programs should offer both excellent training and respect for the work-life balance and well-being of program participants.

Similarly, reputation is an excellent measuring stick for the quality of a program. If other doctors recognize that graduates from a particular residency program do well and take good care of patients, it reflects well on the program — and the reverse is true as well.

But that’s where my praise ends — because I think there are some big problems with the rest of the Residency Navigator methodology. To be blunt, I’m pretty fired up about them because I think they seriously undermine the credibility of Doximity’s residency rankings and do a disservice to medical students trying to find quality residency programs. Here are the three major flaws I see:

1. No patient input

The Doximity rankings include input from residents and alumni, but patients are totally excluded. That’s a big problem because the whole point of residency programs is to help young doctors develop the skills necessary to competently take care of patients.

Without patient input, there’s no way to measure how well programs are performing this critical task. I mean, come on — how the hell can we assess residency programs without considering how well they prepare physicians to take care of patients?

2. DO exclusion

Patients aren’t the only ones excluded from consideration in the Doximity rankings — DOs get left out, too.

To start with, Doximity states that only MD residency programs are eligible for inclusion. Since all physician residency programs in the United States are now accredited under the ACGME, this language is flawed and implies a distinction between MD and DO programs that no longer exists.

Even worse, only ABMS board-certified physicians can nominate programs for consideration as top residency programs in the Doximity rankings. I have several problems with this.

First, it leaves out tens of thousands of competent DOs who have been certified under the AOA instead of the ABMS. In the US, both the ABMS and the AOA can certify physicians, and their certifications are recognized as equivalent by the ACGME, insurers, regulators, governments, and other entities nationwide. DOs can be certified under either board, and so can MDs, as long as they complete a residency with osteopathic recognition. Doximity’s choice to exclude AOA-certified doctors is simply indefensible.

Second, by ignoring input from AOA-certified physicians, the Doximity rankings skew heavily in favor of large academic centers. Here’s what I mean: most DOs (and MDs) who are certified under the AOA train for community-based practice — often in residency programs outside large cities or academic centers. Because these physicians aren’t allowed to nominate programs, residencies geared toward community-based practice don’t get the same recognition as large academic programs in the rankings.

Third, I think it’s a mistake to focus on board certification when determining who gets to contribute to the rankings.

Why?

Because the goal of residency training is not board certification — it’s to produce doctors who are qualified and competent to take care of patients.

I realize this might seem a little nitpicky, but qualified and certified are not the same thing, and I think it’s important not to lose sight of the fact that training residents to provide excellent patient care is more important than preparing them to pass a test.

Qualified physicians will have no trouble passing the certification exam, but passing a test doesn’t make someone a good doctor. And I know some damn good docs who aren’t board-certified simply because they choose not to participate in what a growing number of physicians perceive to be an onerous, expensive, and non-value-added certification process.

3. Research vs. patient care bias

The third major flaw I see in the Doximity residency ranking methodology is the weight placed on residency programs’ research output. Let me be clear — I am NOT anti-research. I believe research is important — it’s how we get better and improve the care we offer our patients — but I don’t believe research output should be such a major factor in evaluating the quality of a residency program.

I’ve said it before and I’ll say it again: the main point of residency programs is to train doctors to take good care of people — not to produce research or anything else. The residency programs with the highest research output tend to be in large academic centers where residents move on to fellowships and then to clinical practice in academic settings (often remaining in those large academic centers).

Physicians who train for and dedicate themselves to community-based practice produce much less research, but the high-quality patient care they provide is just as valuable as the research produced by their academic colleagues.

Here’s my point: by heavily emphasizing research output, the Doximity rankings are effectively saying that residency programs based in research or academic settings are better than those focused on community-based practice — and that’s just not true. The two kinds of programs simply have different missions and aims.

Some final thoughts on the Doximity Residency Navigator rankings

With a few changes, the Doximity rankings could be a really useful tool for helping students navigate the residency application process. As they stand, medical students should take the rankings with a big grain of salt.

They leave out vital factors like patient satisfaction and DO input, and they skew heavily in favor of large research-focused programs. On the other hand, they’re a great source of information about resident satisfaction and overall reputation as perceived by MDs.

And that, in a nutshell, is my two cents on the Doximity Residency Navigator rankings. As promised, I’ve given you my take on what’s good, what’s bad, and what’s completely ridiculous.

Now I’d love to hear what you think. Have questions? Disagree? Need help navigating the world of medical residency? Whatever’s on your mind, I want to hear from you.

Continue the conversation by leaving a comment or sending me a message.

Click here to subscribe and stay up to date with all my musings on physician finance, healthcare reform, personal health, medical education, and more.