FDA Shoots Down Bid for Review Exemption on Company’s Radiology AI Devices, Citing Safety Gaps
The U.S. Food and Drug Administration (FDA) has officially shut down an attempt by Harrison.ai to bypass premarket review requirements for a defined group of radiology AI-enabled devices. In an April 1 letter, the agency made it clear that 510(k) reviews remain necessary to prove these devices work safely before they hit the market.
"Holding a 510(k) clearance may not reflect that a manufacturer is proficient in, or even has experience with, the processes used in the development of the cleared device, let alone in processes that would necessarily be appropriate for all future devices of the subject types," the FDA stated in its decision letter to Rubrum Advising, LLC, the consulting firm that filed the petition on behalf of Harrison.ai.
The exemption petition targeted a broad range of radiology AI devices, covering four generic device categories and six distinct product codes. Specifically, Harrison.ai sought to exempt radiological computer-assisted diagnostic (CADx) software for suspicious cancer lesions, medical image analyzers, and radiological computer-aided triage and notification (CADt) tools. The request also included computer-assisted detection and diagnosis (CADe/CADx) software, which is often used to flag everything from fractures to pneumonia. By including these categories, the proposal would have deregulated the primary tools clinicians rely on for automated "second opinions" and emergency case prioritization.
Harrison.ai proposed a shortcut: a manufacturer that already held a 510(k) clearance for one of these devices could bypass the usual premarket notification for similar future products, subject to certain conditions.
The pitch was simple: if a company has already proven it can play by the rules with one device, its internal safeguards and ongoing postmarket monitoring should be enough to skip the repeat paperwork. Harrison.ai argued the current system creates an “innovation gap,” essentially a regulatory bottleneck, that keeps potentially life-saving AI trapped in the approval phase for too long.
The Case for Oversight
The agency was particularly critical of letting companies monitor themselves. It rejected the proposal’s reliance on manufacturer-designed postmarket plans as a substitute for premarket oversight, questioning whether companies could credibly assess their own products’ risks without any mechanism for independent evaluation.
One major sticking point for the FDA was the company’s attempt to group different AI tools together. The agency maintained that these devices simply are not interchangeable and rejected grouping detection (CADe) and diagnosis (CADx) software into a single exemption category.
Perhaps most significantly, the FDA doubted that doctors could catch AI errors in real time. Using screening mammography as an example, the agency pointed out that a radiologist often cannot tell a true-positive from a false-positive without a biopsy. This directly challenged Harrison.ai’s claim that clinicians could act as a safety net, effectively weakening the argument for shifting oversight from the lab to the real world.
But the FDA did not act in a vacuum. The agency received 47 comments on the proposal, and the feedback was overwhelmingly negative. Industry heavyweights and safety advocates warned that the plan was too broad, raising alarms on everything from patient safety to the fact that postmarket AI monitoring is still in its infancy. For example, the American College of Radiology urged the FDA to “ensure patient safety and device effectiveness” as it considered the petition. For many who weighed in, the idea of removing federal oversight from high-risk diagnostic software felt like a premature leap.
In its denial, the FDA highlighted Predetermined Change Control Plans (PCCPs) as a preferred path forward. These plans allow companies to get pre-approval for future AI updates once an initial review has been completed, offering a middle ground between the rigid 510(k) process Harrison.ai tried to bypass and the unregulated landscape of total exemption.
A High-Stakes Decision
The denial also arrived against a deregulatory backdrop: in a January 2025 executive order, President Trump explicitly called on agencies to scrap regulations that might hinder America’s “global AI dominance.”
Adding to the tension surrounding the rejection was a potential conflict of interest within the FDA itself. Earlier this year, STAT reported that Rick Abramson, the head of the office overseeing AI policy, was previously an executive at a Harrison.ai subsidiary.
In a blog post published after it received the rejection letter, Harrison.ai said the petition had nonetheless succeeded in pushing the industry to debate the exemption question and confront the innovation gap.
ASCO AI in Oncology is published by Conexiant under a license arrangement with the American Society of Clinical Oncology, Inc. (ASCO®). The ideas and opinions expressed in ASCO AI in Oncology do not necessarily reflect those of Conexiant or ASCO. For more information, see Policies.