Ziad Obermeyer and colleagues at the Booth School of Business release health care Algorithmic Bias Playbook


Over the past three years, Berkeley Public Health Professor of Health Policy and Management Ziad Obermeyer, MD, has published a number of papers—in collaboration with colleagues at other institutions, including the University of Chicago Booth School of Business—that show both racial bias in a commonly used health care algorithm and how properly calibrated algorithms can help physicians treat patients more equitably. He has also consulted with dozens of health care organizations to help reduce or eliminate algorithmic bias.

Now Obermeyer and his Booth School colleagues have created an Algorithmic Bias Playbook.

“Algorithmic bias is everywhere,” the authors posit. “Our work with dozens of organizations—health care providers, insurers, technology companies, and regulators—has taught us that biased algorithms are deployed throughout the health care system, influencing clinical care, operational workflows, and policy.”

An algorithm is a step-by-step procedure for solving a problem; algorithms can be analog, but in the contemporary workplace they are most often computer-based. When used in health care organizations, they can support decision-making and can be used to identify and help patients with complex health needs.
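As a toy illustration (not any real clinical tool), such an algorithm might be nothing more than a rule-based score that flags patients who may benefit from care coordination. All field names and thresholds below are hypothetical:

```python
def flag_complex_needs(patient):
    """Return True if a patient record suggests complex health needs.

    A simple additive score over a patient dict; every field name and
    weight here is invented for illustration only.
    """
    score = 0
    score += 2 * patient.get("chronic_conditions", 0)   # weight chronic illness heavily
    score += patient.get("er_visits_last_year", 0)      # frequent ER use adds risk
    score += 3 if patient.get("polypharmacy", False) else 0  # many concurrent medications
    return score >= 6

# A patient with three chronic conditions and one ER visit is flagged.
print(flag_complex_needs({"chronic_conditions": 3, "er_visits_last_year": 1}))  # True
```

Even a rule this simple embeds choices about what counts as "need," which is exactly where bias can enter.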

The playbook, which is available at the Booth website, is aimed at an audience of health care leaders, technical teams working in health care, and policymakers and regulators. It offers a guide to defining, measuring, and mitigating racial bias in live algorithms.

“Many organizations are looking for something concrete to do about racial bias,” says Obermeyer. “Addressing the bias in their algorithms should be at the top of the list. But it’s often not clear where to begin — how to define and measure bias in real-world settings, and what to do about it. After working with organizations to do exactly that, we realized we could distill our process down into a clear set of steps that worked for our partners.”

The playbook offers four steps for combating bias: “First, getting a handle on the algorithms that are live within an organization is critical,” says Obermeyer. “Many groups we worked with had no central place where algorithms could be tracked, or any clear accountability structure for them. That’s a huge problem for bias — but it’s also just a huge problem in general, given the scale at which algorithms are impacting their patients and customers.

“Second, measuring bias based on a simple test: is the algorithm predicting what we want it to be predicting?

“Third, if it’s not, try to fix it. Often the process of measuring the bias implies a clear way to retrain the algorithm, and align its predictions more closely to its true purpose. 

“Fourth, put in place the accountability structures and organizational practices needed to prevent biased algorithms from ever touching a patient or customer.”  
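The “simple test” in step two can be sketched in code. The following is a minimal, illustrative check (not the playbook’s own code, and run here on synthetic data) of whether an algorithm’s scores track the outcome an organization actually cares about: at the same score, does true need look the same across groups? All variable names are hypothetical.

```python
import random
from statistics import mean

def mean_need_by_score_bin(rows, bins=5):
    """rows: (score, true_need, group) tuples.

    Sorts patients into score quantile bins and returns, for each bin,
    the mean true need per group. If the score predicted need equally
    well for all groups, per-bin means would be roughly equal.
    """
    rows = sorted(rows, key=lambda r: r[0])
    size = len(rows) // bins
    out = []
    for b in range(bins):
        chunk = rows[b * size:] if b == bins - 1 else rows[b * size:(b + 1) * size]
        out.append({g: mean(need for _, need, grp in chunk if grp == g)
                    for g in ("A", "B")})
    return out

# Synthetic data: the score proxies something (e.g. cost) that
# systematically understates group B's true need.
random.seed(0)
rows = []
for _ in range(4000):
    g = random.choice("AB")
    need = random.gauss(5, 1)
    score = need + (1.0 if g == "A" else 0.0) + random.gauss(0, 0.5)
    rows.append((score, need, g))

table = mean_need_by_score_bin(rows)
for i, row in enumerate(table):
    print(f"bin {i}: A={row['A']:.2f}  B={row['B']:.2f}")
```

In this synthetic setup, group B shows higher true need than group A at every score level, the signature of the label-choice bias the playbook teaches organizations to look for; step three would then retrain the algorithm against the better-aligned label.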

Obermeyer hopes that the playbook will help health care organizations take algorithmic oversight seriously. “They can do a lot of good — or create a lot of problems, and they need to be treated with extreme care,” he says. 

“If organizations don’t start taking this seriously, regulators are going to do it for them,” he continues. “We’ve been working with state and federal regulators and law enforcement agencies, who are increasingly committed to making sure algorithms in use aren’t biased. Some organizations think that they will be protected if they just don’t look into algorithmic bias — that is definitely not the case, and ignorance is not a very good defense.”
