Concerns Regarding Unregulated Clinical Decision Support Software

"While we do not know exactly which data elements are included in NarxCare, evidence suggests the algorithm differentially identifies individuals on the basis of work status, disability, and insurance, among other characteristics.42 Cochran et al.34 found patients with higher pain severity or interference, those who were widowed, on leave, retired, or disabled, were most likely to have artificially elevated Narx Scores (“false positives”). As the authors hypothesize, it is possible that patients with unmanaged pain related to work status or disability now also experience challenges accessing care due to having an artificially elevated Narx Score. Furthermore, individual proxy measures likely used in NarxCare’s algorithm target patients with complex medical histories. For example, one study found that 20% of patients identified as “doctor-and-pharmacy shopping” according to a commonly accepted definition using prescription-level data—a factor likely included in NarxCare’s algorithm—were in fact diagnosed with cancer, thus necessitating visits with multiple providers to adhere to complex treatment regimens.43 Patients without a primary care clinician, and/or those who see multiple different clinicians to manage pain associated with a complex medical condition (such as cancer), may be incorrectly flagged as high risk by NarxCare’s algorithm and create barriers to getting the care they need.

"Perhaps most concerning is that proprietary CDS tools carry the potential to incorporate additional data sources that reflect systemic inequalities in society, yet are unrelated to clinical care. Any company with access to various types of data (ex., credit history, zip code, educational attainment, or criminal justice history, among others) has the capacity to include these data elements in their proprietary algorithms without announcing it publicly. In the case of predictive modeling formulaic algorithms such as NarxCare, the harms of doing so may be amplified. By using retrospective data elements as proxy measures, predictive modeling algorithms predict future patterns of behavior by modeling past patterns of behavior. They are not designed to identify or address the root causes of the patterns predicted, and in this sense, they can perpetuate, rather than address, the disparities in place when the algorithm was created.

"This context is alarming as the medical community grapples with an increased awareness that clinical algorithms have the potential to discriminate—exemplified by the race adjustment embedded in the formula calculating glomerular filtration rate (GFR) that was used in clinical care and accepted by clinicians for decades without question, despite causing racial inequities in access to life-saving treatment.44, 45 Although Bamboo Health’s current promotional materials state only data from PDMP registries are included in NarxCare’s algorithm,19 state PDMP user manuals as recently as May 2020 stated that the platform could and will incorporate non-PDMP data sources in the future.20 It is already known that several states incorporate criminal justice histories into their PDMPs.10, 42 Furthermore, as recently as 2021 Bamboo Health belonged to an umbrella organization—Appriss—responsible for designing software leveraging outstanding warrants and incarceration status to predict patterns of criminal activity for law enforcement.44, 46 Regardless of whether criminal justice information—or other types of data unrelated to clinical care—are included in NarxCare, Bamboo Health has not sufficiently reassured clinicians that its algorithm is free from potentially discriminating sources of data. Furthermore, its opaque nature does not allow us to determine for ourselves whether this is the case."

Source

Buonora, M.J., Axson, S.A., Cohen, S.M., et al. Paths Forward for Clinicians Amidst the Rise of Unregulated Clinical Decision Support Software: Our Perspective on NarxCare. J Gen Intern Med (2023). https://doi.org/10.1007/s11606-023-08528-2