Radiologists want ‘immediate’ action to ensure AI is safe, effective in younger patients
Congress and the U.S. Food and Drug Administration must take urgent action to ensure young patients are not harmed by artificial intelligence tools designed specifically for adults, radiologists warned Wednesday.
AI is quickly becoming part of everyday medical care, particularly in radiology. But as it stands, only one of the more than 100 FDA-cleared algorithms in medical imaging is approved for use in children. This gap leaves young patients behind and opens them up to potential harm, physicians wrote in a viewpoint published July 28 by AJR.
“There is an urgent need to remedy the disparity between FDA-cleared medical imaging AI software for use in adults versus children,” Marla Sammer, MD, a radiologist at Texas Children’s Hospital in Houston, and co-authors wrote, adding that “we believe both immediate Congressional action and FDA changes are necessary to ensure pediatric medical imaging AI software development.”
In the past, Congress has enacted legislation granting the FDA authority to incentivize and require companies to develop drugs for children. Before those 2002 and 2003 laws, the high cost of pediatric trials and lower return on investment dissuaded developers.
Sammer and co-authors at Cincinnati Children’s Hospital now want lawmakers to take a similar approach to medical imaging AI devices.
For one, they suggest improving the FDA’s existing regulatory framework. Consumers currently have no way to know whether software was tested and cleared for use in younger patients, and “at a minimum,” public documents should explicitly state the intended age range for each device. Labels should also detail benefits, risks and risk-mitigation strategies, just as drug labels do, the authors explained.
“In summary, Congress has provided legislation to require and incentivize drug development for children. This same precedent should also be applied to FDA-cleared AI medical devices,” the group concluded. “Without these actions, children’s healthcare needs will be left behind and potentially impaired.”