Attorney General Bonta Launches Inquiry into Racial and Ethnic Bias in Healthcare Algorithms

Wednesday, August 31, 2022
Contact: (916) 210-6000, agpressoffice@doj.ca.gov

Sends letters to 30 hospital CEOs across the state requesting information regarding the use of commercial healthcare decision-making tools 

OAKLAND – California Attorney General Rob Bonta today sent letters to hospital CEOs across the state requesting information about how healthcare facilities and other providers are identifying and addressing racial and ethnic disparities in commercial decision-making tools. The request for information is the first step in a DOJ inquiry into whether commercial healthcare algorithms – types of software used by healthcare providers to make decisions that affect access to healthcare for California patients – have discriminatory impacts based on race and ethnicity.

“Our health affects nearly every aspect of our lives – from work to our relationships. That’s why it’s so important that everyone has equal access to quality healthcare,” said Attorney General Bonta. “We know that historic biases contribute to the racial health disparities we continue to see today. It’s critical that we work together to address these disparities and bring equity to our healthcare system. That’s why we’re launching an inquiry into healthcare algorithms and asking hospitals across the state to share information about how they work to address racial and ethnic disparities when using software products to help make decisions about patient care or hospital administration. As healthcare technology continues to advance, we must ensure that all Californians can access the care they need to lead long and healthy lives.”

Healthcare algorithms are a fast-growing category of tools used across the healthcare industry, from administrative work to diagnostics. In some cases, algorithms may help providers determine a patient's medical needs, such as the need for referrals and specialty care. They may be based on simple decision trees or on more complex programs driven by artificial intelligence. These tools are not fully transparent to healthcare consumers, or even, in some circumstances, to healthcare providers themselves. Healthcare algorithms can help streamline processes and improve patient outcomes, but without appropriate review, training, and guidelines for usage, they can have unintended negative consequences, especially for vulnerable patient groups.

While there are many factors that contribute to current disparities in healthcare access, quality, and outcomes, research suggests that algorithmic bias is likely a contributor. This may occur in a number of ways. For example, the data used to construct a commercial algorithmic tool may not accurately represent the patient population for which the tool is used, or the tool may be trained to predict outcomes that do not match the corresponding healthcare objectives. In one well-documented case, researchers found that a widely used algorithm referred white patients for enhanced services more often than Black patients with similar medical needs. The problem was that the algorithm made predictions based on patients’ past record of healthcare services, despite widespread racial gaps in access to care. Whatever the cause, these types of tools perpetuate unfair bias if they systematically afford increased access to white patients relative to patients who are Black, Latino, or members of other historically disadvantaged groups.
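The mechanism described above can be illustrated with a minimal simulation. This sketch uses entirely hypothetical numbers (the group labels, need distribution, and access penalty are illustrative assumptions, not data from the study referenced above): two groups have identical medical need, but one group's past spending, which serves as the training label, is suppressed by access barriers, so ranking patients by the spending proxy under-selects that group for enhanced services.

```python
# Hypothetical illustration of proxy-label bias. Groups "A" and "B" have
# identical distributions of true medical need, but Group B's past
# healthcare spending (the proxy the tool ranks on) is reduced by an
# assumed access barrier. Selecting the top 20% by the spending proxy
# then refers far fewer Group B patients than their needs warrant.
import random

random.seed(0)

patients = []
for i in range(1000):
    group = "A" if i < 500 else "B"
    need = random.gauss(50, 10)            # true medical need: same for both groups
    access = 1.0 if group == "A" else 0.6  # assumed access gap suppresses Group B spending
    past_cost = need * access              # proxy label understates Group B's need
    patients.append((group, need, past_cost))

# "Refer" the top 20% of patients ranked by the cost proxy.
patients.sort(key=lambda p: p[2], reverse=True)
referred = patients[:200]
share_b = sum(1 for p in referred if p[0] == "B") / len(referred)
print(f"Group B share of referrals: {share_b:.0%}")  # well below the 50% its needs warrant
```

Because the proxy, not the underlying need, drives the ranking, the disparity appears even though no group variable is used in the "model" itself.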

Attorney General Bonta is committed to addressing disparities in healthcare and assuring compliance with state non-discrimination laws in hospitals and other healthcare settings. To that end, today’s letter to hospital CEOs seeks information to help determine whether the use of healthcare algorithms contributes to racially biased healthcare treatment and outcomes. In the letter, Attorney General Bonta requests:

  • A list of all commercially available or purchased decision-making tools, products, software systems, or algorithmic methodologies currently in use that assist or contribute to the performance of any of the following functions: 
    • clinical decision support, including clinical risk prediction, screening, diagnosis, prioritization, and triage;
    • population health management, care management, and utilization management;
    • operational optimization, e.g., office or operating room scheduling;
    • payment management, including risk assessment and classification, billing and coding practices, prior authorization, and approvals; 
  • The purposes for which these tools are currently used, how these tools inform decisions, and any policies, procedures, training, or protocols that apply to use of these tools; and
  • The name or contact information of the person(s) responsible for evaluating the purpose and use of these tools and ensuring that they do not have a disparate impact based on race or other protected characteristics. 

A sample copy of the letter is available here.

# # #