ChatGPT is 'robbing applicants of their voices' in residency applications

The emergence of OpenAI’s large language model ChatGPT has paved the way for an influx of medical students using the chatbot to help author their personal statements for residency applications. 

With the proper prompts, ChatGPT can easily generate entire essays, and its medical writing has improved so much since its initial release that even software developed to spot AI-authored material sometimes fails to flag the LLM’s work. This has prompted concerns about plagiarism and misrepresentation, especially when ChatGPT makes its way into residency applications, authors of a new analysis in the Journal of the American College of Radiology suggest. 

“Considering that up to one-third of college students and up to one in five U.S. teens aware of ChatGPT report using artificial intelligence for schoolwork, it is unsurprising that medical students might leverage this technology for their applications for residency,” Emile B. Gordon, with the Department of Radiology at Duke University Health System, and co-authors note, adding that “up to 20% of medical students report using it for written assignments and clinical work.” 

Researchers recently sought to determine how radiology residency program directors perceive the use of ChatGPT in student applications. The team had eight program directors blindly review four personal statement variations for each of five applicants—one original and three authored by GPT-4 using different prompts—to gauge their ability to differentiate between statements written by students and those generated by ChatGPT. Participants also completed surveys on their opinions of AI-generated statements. 

Survey responses revealed that the program directors were not confident in their ability to spot statements authored by LLMs. Despite this, they correctly identified 95% of the materials written by human applicants as “probably or definitely” original. They also consistently rated the GPT-generated statements as below average compared to student-written works. 

The group acknowledged that the growing use of AI in residency applications is likely inevitable, though they expressed concern over how this would affect the authenticity and value of personal statements in the future. 

“Focus group discussions elucidated a growing sense of depersonalization of the application process with the increased use of AI for personal statement creation,” the authors wrote. “Participants agreed that, historically, residency applicants could occasionally distinguish themselves through their essays but feared that the influence of personal statements in the era of AI would be diminished.” 

Though the directors had varying opinions on how best to regulate the use of AI in the application process, they agreed that overuse could “rob applicants of their voices” and potentially render personal statements on applications non-contributory in the future. 

Learn more about the findings here. 

Hannah Murphy

In addition to her background in journalism, Hannah also has patient-facing experience in clinical settings, having spent more than 12 years working as a registered rad tech. She began covering the medical imaging industry for Innovate Healthcare in 2021.
