ACR Assess-AI national data registry tracks performance of clinical algorithms
The American College of Radiology (ACR) recently unveiled the Assess-AI national radiology data registry designed to monitor and evaluate the performance of clinical medical imaging artificial intelligence (AI) algorithms. This initiative aims to ensure the sustained accuracy, safety and efficiency of AI tools across diverse radiology departments.
"As the implementation of AI progresses, what has become really clear is that legacy PACS systems were built in a time when AI was not a thing. What is becoming clear is this technology doesn't perform the same over time, because our departments change, so you need to monitor the performance. We change scanners and we change protocols over time, so even though the product may have worked well when you first put it in, you really need to keep an eye on that over time," explained Christoph Wald, MD, PhD, MBA, FACR, vice chair of the ACR Board of Chancellors and chair of the ACR Commission on Informatics.
He emphasized the importance of continuous monitoring to account for variables such as AI drift caused by changes to data inputs. This was a big topic in sessions and in conversations with radiology AI experts at the Radiological Society of North America (RSNA) 2024 meeting in December, where Wald spoke with Health Imaging.
The ACR has created numerous online tools to help its members become educated about radiology AI, which accounts for more than 75% of the 1,016 algorithms now cleared by the U.S. Food and Drug Administration (FDA). The new registry is designed to help test algorithms against datasets to determine how they perform now and how they change over time, so IT teams can make adjustments as necessary.
How Assess-AI works
Assess-AI operates through ACR Connect, a free data connection platform for ACR members. Departments can send performance data from their AI tools to the registry, enabling detailed analysis and benchmarking against similar technologies in other institutions. This feedback loop helps practices identify performance variations, troubleshoot potential issues and optimize protocols.
"The AI gets trained on a particular dataset, which is coming from a hospital or a small number of hospitals. And so the data reflects the people that are being taken care of by those institutions. If that data is very similar to the data you have in your hospital because you happen to take care of a similar population with a similar disease prevalence, then it's likely that the model will work reasonably well. But if that's not the case, then it might not," Wald said. "So the registry is really one building block of a program that we recommend to practices. It is the real world monitoring piece so that people don't have to reinvent the wheel in every practice, and we make it quite cost effective."
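The real-world monitoring Wald describes can be illustrated with a small sketch. The Assess-AI registry's internals are not public, so the code below is a hypothetical example, not ACR tooling: it tracks how often an algorithm's findings agree with radiologist feedback and flags drift when recent agreement falls well below the rate measured at go-live. The class name, window size and tolerance are all assumptions for illustration.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class DriftMonitor:
    """Hypothetical rolling monitor: flags when recent agreement
    with radiologist feedback drops well below a baseline."""
    baseline: float          # agreement rate measured at acceptance testing
    window: int = 100        # number of recent cases to average over
    tolerance: float = 0.05  # allowed drop before flagging drift
    results: list = field(default_factory=list)

    def record(self, radiologist_agreed: bool) -> None:
        self.results.append(1.0 if radiologist_agreed else 0.0)

    def current_rate(self) -> float:
        recent = self.results[-self.window:]
        return mean(recent) if recent else self.baseline

    def drifted(self) -> bool:
        # Only judge once a full window of cases has accrued
        if len(self.results) < self.window:
            return False
        return self.current_rate() < self.baseline - self.tolerance

# Example: a tool accepted at 92% agreement, but recent cases
# agree only ~80% of the time (e.g., after a scanner change)
monitor = DriftMonitor(baseline=0.92)
for i in range(200):
    monitor.record(i % 5 != 0)  # four out of five cases agree
print(monitor.drifted())  # True: performance fell outside tolerance
```

In practice the registry aggregates this kind of data across institutions, which is what makes benchmarking against peer departments possible rather than each site judging drift in isolation.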
He said responsible implementation of AI in a department starts with understanding, discussing, defining and documenting what problems a practice is trying to solve. The ACR then offers tools to help select technology, with cleared AI products cataloged in its online AI Central repository.
A comprehensive AI management framework
Wald said Assess-AI is part of a broader ACR program to promote responsible AI implementation. The ACR Recognized Center for Healthcare-AI (ARCH-AI) designation outlines a systematic approach, including:
• Problem definition: Identifying clinical needs AI tools aim to address.
• Technology selection: Resources like the ACR’s AI Central repository help institutions evaluate and choose from over 330 FDA-cleared AI products based on factors such as disease type, anatomy and vendor transparency.
• Acceptance testing: Ensuring AI tools work effectively within local data environments.
• End-user education: Training radiologists on proper usage and troubleshooting of AI systems.
• Real-world monitoring: Leveraging the Assess-AI registry for ongoing performance evaluations.
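To make the acceptance-testing step above concrete, the sketch below shows a generic go/no-go check of a tool's binary findings against a locally collected, radiologist-labeled validation set before deployment. This is an illustration of the concept only, not an ACR procedure; the function name, metrics and thresholds are assumptions.

```python
def acceptance_test(predictions, ground_truth,
                    min_sensitivity=0.85, min_specificity=0.85):
    """Hypothetical pre-deployment check: compare an AI tool's binary
    findings against local radiologist labels (1 = finding present)."""
    tp = sum(p and g for p, g in zip(predictions, ground_truth))
    tn = sum(not p and not g for p, g in zip(predictions, ground_truth))
    fn = sum(not p and g for p, g in zip(predictions, ground_truth))
    fp = sum(p and not g for p, g in zip(predictions, ground_truth))
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    passed = sensitivity >= min_sensitivity and specificity >= min_specificity
    return {"sensitivity": sensitivity, "specificity": specificity,
            "passed": passed}

# Toy local validation set: 6 positive and 6 negative cases
preds = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
truth = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
report = acceptance_test(preds, truth)
print(report["passed"])  # False: one miss drops sensitivity below 85%
```

The point of the step is exactly what Wald raises: a tool validated on another institution's population may fall short on local data, and this check surfaces that before the tool goes live.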
"Many institutions have run ahead with AI and are very sophisticated, but a lot of departments are struggling with what should they be doing, in what order, and what are the building blocks of success. So we help navigate that," Wald explained.
He cautioned that AI is not a "set-it-and-forget-it" technology. Just as a CT scanner needs regular calibration, AI tools require regular quality assurance to ensure consistent and accurate performance.