
SLOEs: Leaving Students Blind

Standardized evaluations have been a staple of emergency medicine residency applications since 1995, when the Council of Residency Directors in Emergency Medicine (CORD) established the Standardized Letter of Recommendation (SLOR).

The SLOR was revamped in 2014; the name became Standardized Letter of Evaluation (SLOE) to better reflect its purpose of evaluating potential EM physicians.

Perhaps the best-known element of a SLOE is the global assessment, in which a candidate is ranked as top 10%, top third, middle third, or bottom third relative to their peers also pursuing an EM residency. A review of all SLOEs from 2016-2017 indicated that 18% of students were ranked in the top 10%, 37% in the top third, 35% in the middle third, and 10% in the bottom third.1 These rankings are skewed toward the more favorable levels. Still, they are a significant improvement over the whopping 40% of students ranked in the top 10% in a study examining a subset of 2011-2012 SLORs. This improvement likely can be attributed to the name change and increased training of SLOE writers.2 While the skew toward grade inflation still exists, it is theoretically mitigated by each SLOE reporting how many students the institution placed in each ranking tier the previous year.

EM-bound medical students may not learn about SLOEs until their second, third, or even fourth year of medical school, yet SLOEs are the most critical element of the ERAS application. The 2020 Program Director Survey shows that letters of recommendation in EM (SLOEs) are the most important factor in an application when deciding whether to extend interview offers.3 SLOEs remain critical when programs rank applicants, with every program director who responded to the survey citing them as a factor when making rank lists.

SLOEs are arguably the most influential piece of a student's application to an EM residency. Yet there is an important distinction from other factors such as board scores, grades, and experiences: applicants usually do not know how they were rated. Even if an applicant did not waive their right to see the letter in ERAS, that right only grants them the opportunity to see their SLOE after residency has started. Of course, students are historically expected to waive this right anyway and may open themselves to bias if they choose to retain their right of review. While some programs share components of the SLOEs they write with students, it is far from standard practice. Many students are left to guess at their competitiveness, blind to the most crucial piece of their application.

As medical student leaders on EMRA's Medical Student Council, we feel some content of SLOEs should be shared with students who complete audition rotations with EM residency programs. Even if only the results of question one in the global assessment section, along with the program-specific evaluation distribution, were shared with applicants, students could make better-informed decisions when applying to EM residency programs. In this way, SLOE results would function much more like a board score report, with the remaining evaluation components and comments within the SLOEs staying hidden from students. This transparency would alleviate significant stress and uncertainty among anxiety-ridden applicants and allow students to develop appropriate application strategies.

However, it would be irresponsible to question a decades-old practice of keeping SLOE results from students without considering the potential repercussions of changing the system. We will consider those ramifications below.


Point 1: A blinded evaluation is more honest, and unblinding could lead to grade inflation.

Counterpoint: Students invest time, money, and professional opportunities in the medical training process. They want and deserve transparency in evaluations, even if that is uncomfortable for evaluators. Students should have an accurate understanding of how they are performing relative both to their peers and to the expectations of senior colleagues. Without this, students lose opportunities to improve and cannot effectively manage their career planning. Grade inflation would be counterproductive to these goals, but we do not believe unblinding would contribute to it.

While there is evidence that SLOEs suffer some inflation, with only 10% of applicants receiving lower-third rankings, the trend has improved.1,2 A study of SLOE writers found that the most significant contributor to not strictly adhering to SLOE guidelines was fear of adversely affecting a student's opportunity to match.4 That fear would not change if the evaluation became unblinded. SLOE guideline adherence has improved thanks to better training of letter writers and further education surrounding the stigma of a lower-third ranking, and it will be essential to continue these efforts. It feels disingenuous, however, to claim that SLOE authors would abandon the guidelines if SLOEs became unblinded. Rather, under scrutiny from students, we believe the most logical step for a program would be to adhere strictly to SLOE guidelines in order to justify its ranking system.

As we enter a world of a pass/fail USMLE Step 1 and COMLEX Level 1, it will be more important than ever for SLOEs to maintain their integrity as a tool to stratify applicants. While we respect the concern that grade inflation could be an unintended consequence of unblinding, we feel this fear is overstated and does not outweigh the benefit of respecting students as future colleagues with honest, transparent evaluations.


Point 2: This is a high-stakes evaluation, and students might resent a “poor” evaluation without the proper context.

Counterpoint: Students have long been able to view subjective evaluations of their performance on core and elective rotations. Although resentment and a failure to accept a grade with "grace" are legitimate concerns, there are better ways to address them than leaving students blinded to their evaluations. Further, student experiences suggest that students receive feedback like professionals so long as they are treated as professionals.

Although students spreading negative reviews of an institution online or harboring resentment toward a program after receiving a poor evaluation is a legitimate concern, we already have some insight into this experience. Social media channels connect us and give us valuable information that would otherwise be unavailable, and students already communicate about these matters without being able to view their SLOEs. Contrary to popular belief, students who are given clear expectations respect established program policies and acknowledge that a middle-third ranking is still a credible one from a program that adheres to SLOE guidelines. From these medical student-created online gathering spaces, it is clear that students crave transparency and respect programs that follow SLOE guidelines, so long as they are given insight into their evaluation.

Of course, if a student completes a full-month EM rotation and only learns during an exit interview that they may receive a lower-third ranking, it is reasonable to be concerned about how they will take that evaluation. This concern points to a more significant issue: students and programs alike need a more robust feedback structure with clear expectations for giving and receiving feedback and evaluations. While many programs offer students feedback and meetings with a clerkship director throughout the sub-internship, this practice is not standardized, and despite a whitepaper on feedback structure during an EM clerkship, implementation is not widespread.5

Performing well on the EM rotation is understood to be necessary. This means excelling on objective measures of performance as well as actively seeking out feedback and improving throughout the rotation. However, even when students seek out and receive feedback, they cannot accurately predict their SLOE rankings.6 This suggests students should have a mid-rotation meeting with a clerkship director or SLOE author to address any weaknesses the student may not perceive, priming them for the evaluation they are working toward and giving them room to improve. We posit that this communication will not only help determine whether EM is the right fit for a candidate, but will also reduce the chance of a student reacting poorly to an evaluation or resenting the rotating institution.


Point 3: If students know their SLOE ranking, they might choose not to include it in their application.

Counterpoint: Medical education is a long journey that pushes students to operate at their best at all times. Students are conscious of every activity and opportunity they undertake and how it could affect their future careers. The SLOE plays a critical role for medical students, and a poor SLOE has the potential to 'SLOEpedo' an application, as evidenced by the significantly increased risk of going unmatched for applicants with any lower-third global assessment ranking.7,8 A student with a lower-third ranking may therefore choose not to assign that SLOE as one of their letters. While this could certainly be a consequence of unblinding SLOEs, several potential solutions exist.

First and foremost, not including a SLOE from an EM rotation is already considered a red flag, and that scrutiny would only increase if SLOEs became unblinded. Most schools report early elective clerkships in the transcripts provided through ERAS, so it is easy to cross-reference EM rotations on the transcript against the SLOEs submitted. However, not all rotations are reported, especially those completed after the September ERAS deadline, so another potential solution is a central system through which programs report away rotations or SLOEs. Given that CORD already houses a centralized letter-writing system for SLOEs, this could be a logical next step. Even without any infrastructure change, programs always have the option of directly asking students whether they received a SLOE from each EM rotation they completed. While student transparency is a reasonable hesitation about unblinding SLOEs, these alternatives afford students the right to their evaluations while maintaining the integrity of the process.

CONCLUSION
As EM-bound medical student leaders, we are uniquely positioned to speak for the population we represent: all students interested in EM, osteopathic and allopathic, U.S. and international. We fervently believe students have the right to know their SLOE global assessment ranking, and we are pushing to allow students to gauge their competitiveness with clarity. Unblinding SLOEs will have many downstream positive impacts for students and programs alike. While there are valid concerns with this change, solutions exist to mitigate them while respecting students as soon-to-be colleagues with the transparency they deserve.


REFERENCES

  1. Jackson JS, Bond M, Love JN, Hegarty C. Emergency Medicine Standardized Letter of Evaluation (SLOE): Findings From the New Electronic SLOE Format. J Grad Med Educ. 2019;11(2):182-186. doi:10.4300/JGME-D-18-00344.1
  2. Love JN, Deiorio NM, Ronan-Bentle S, et al. Characterization of the Council of Emergency Medicine Residency Directors' standardized letter of recommendation in 2011-2012. Acad Emerg Med. 2013;20(9):926-932. doi:10.1111/acem.12214
  3. NRMP. Results of the 2020 NRMP Program Director Survey. Published August 2020. Accessed November 22, 2021.
  4. Pelletier-Bui A, Van Meter M, Pasirstein M, Jones C, Rimple D. Relationship Between Institutional Standardized Letter of Evaluation Global Assessment Ranking Practices, Interviewing Practices, and Medical Student Outcomes. AEM Educ Train. 2018;2(2):73-76. Published 2018 Jan 31. doi:10.1002/aet2.10079
  5. Bernard AW, Kman NE, Khandelwal S. Feedback in the emergency medicine clerkship. West J Emerg Med. 2011;12(4):537-542. doi:10.5811/westjem.2010.9.2014
  6. Kukulski P, Ahn J, Babcock C, et al. Medical student self-assessment as emergency medicine residency applicants. AEM Educ Train. 2021;5(3):e10578. Published 2021 Feb 19. doi:10.1002/aet2.10578
  7. CORD EM. The Emergency Medicine Re-Applicant Applying Guide. https://www.cordem.org/siteassets/files/committees/student-advising/2021-updated/asc-em-advising-the-re-applicant-2021-22-application-cycle.pdf
  8. Hansroth JA, Davis KH, Quedado KD, et al. Lower-Third SLOE Rankings Impede, But Do Not Prevent, A Match in Emergency Medicine Residency Training. J Med Educ Curric Dev. 2020;7:2382120520980487. Published 2020 Dec 17. doi:10.1177/2382120520980487
