Document Type

Article

Publication Date

Summer 2023

Abstract

This Article argues that unconsented access to data is not ethically problematic in and of itself. What has made it problematic is the weak framework of alternative protections that the Privacy Rule prescribes when data are used without individual consent. This Article proposes specific measures to strengthen those protections so that the pursuit of greater health care equity need not imply a loss of meaningful privacy standards.

Part I describes two competing visions of how to protect data privacy, examining the roots of ongoing discontent with the Privacy Rule and tracing policymakers’ original rationale for fashioning a major federal privacy regulation that allows so much unconsented access to health data.

Part II briefly introduces AI/ML clinical decision support (CDS) tools and explains why they are poised to occupy a central position in twenty-first-century AI-enabled health care. The growing use of these tools creates an unfamiliar landscape in which past insights about the “right” way to protect data privacy may need revisiting. Part II then levels three critiques at popular, post-1970s privacy policies that rely on individual consent rights and simple data de-identification strategies as their main tools of data privacy protection. First, such policies can have disparate impacts that threaten to exacerbate health inequities in an AI-enabled health care system. Second, notice-and-consent privacy policies rest on philosophical and scientific assumptions that deny the reality of human diversity, placing them at odds with a twenty-first-century health care system tasked with serving ever more diverse patient populations. The third, and possibly most damning, critique is that widely favored consent norms and data de-identification methods often fail at their central mission: they do not provide strong privacy protection.

Part III identifies five legal pathways available under the Privacy Rule that could enhance access to diverse, inclusive data sets to train a new generation of more-equitable AI/ML CDS tools. Part IV explores why, twenty-five years after HIPAA’s inception, these data access pathways remain underutilized, contributing to the observed pattern of CDS tools that tend to work better for cisgender white males treated at leading academic medical centers than for everyone else. The Privacy Rule enables data acquisition practices that could enhance health equity while affording stronger privacy protections than patients enjoy today, yet data gatekeepers hesitate to embrace these practices amid lingering concerns about gaps in the Rule’s privacy framework. Part IV concludes that these concerns are valid and proposes specific measures to address them.
