ChatGPT is 'robbing applicants of their voices' in residency applications

The emergence of OpenAI’s large language model ChatGPT has paved the way for an influx of medical students using the chatbot to help write their personal statements for residency applications. 

With the right prompts, ChatGPT can easily generate entire essays, and its medical writing has improved so much since its initial release that even software developed to spot AI-authored material sometimes fails to flag the LLM’s work. This has prompted concerns about plagiarism and misrepresentation, especially when ChatGPT makes its way into residency applications, the authors of a new analysis in the Journal of the American College of Radiology suggest. 

“Considering that up to one-third of college students and up to one in five U.S. teens aware of ChatGPT report using artificial intelligence for schoolwork, it is unsurprising that medical students might leverage this technology for their applications for residency,” Emile B. Gordon, with the Department of Radiology at Duke University Health System, and co-authors note, adding that "up to 20% of medical students report using it for written assignments and clinical work.” 

Researchers recently sought to determine how radiology residency program directors perceive the use of ChatGPT in student applications. The team had eight program directors blindly review four personal statement variations for each of five applicants, one original and three GPT-4 versions generated with different prompts, to gauge their ability to distinguish student-written statements from ChatGPT’s. Participants also completed surveys about their opinions on AI-generated statements. 

Survey responses revealed that the program directors were not confident in their ability to spot statements authored by LLMs. Despite this, they correctly identified 95% of the materials written by human applicants as “probably or definitely” original. They also consistently rated GPT’s statements as below average compared to student-written work.  

The group acknowledged that the growing use of AI in residency applications is likely inevitable, though they expressed concern over how this would affect the authenticity and value of personal statements in the future. 

“Focus group discussions elucidated a growing sense of depersonalization of the application process with the increased use of AI for personal statement creation. Participants agreed that, historically, residency applicants could occasionally distinguish themselves through their essays but feared that the influence of personal statements in the era of AI would be diminished.” 

Though the directors had varying opinions on how to best regulate AI use to help with the application process, they agreed that overuse could “rob applicants of their voices” and potentially render personal statements on applications non-contributory in the future. 

Learn more about the findings here. 

In addition to her background in journalism, Hannah also has patient-facing experience in clinical settings, having spent more than 12 years working as a registered rad tech. She joined Innovate Healthcare in 2021 and has since put her unique expertise to use in her editorial role with Health Imaging.

