Onur Bakiner

"Always Look a Gift Horse in the Mouth": Interview with Liz Chiarello

Liz Chiarello, associate professor of sociology at Saint Louis University, visited Seattle this week to present her book Policing Patients: Treatment and Surveillance on the Frontlines of the Opioid Crisis (Princeton University Press, 2024). The book draws on hundreds of interviews with physicians, pharmacists, and enforcement agents to provide a nuanced explanation of the opioid story – one that refuses to lay the blame solely on the pharmaceutical industry and healthcare professionals. Technology Ethics Initiative director Onur Bakiner interviewed Chiarello about the book (the interview has been edited for length and formatting).

 

Onur Bakiner (OB): What is Policing Patients about?

Liz Chiarello (LC): Policing Patients is about how efforts to curb the opioid crisis are turning doctors and pharmacists into cops, and why that’s bad for patient care. The prevailing narrative is that the opioid crisis is a result of overprescribing, and doctors have been blamed for the worst drug crisis in U.S. history. They’ve been called drug dealers in white coats. The idea has been that if we can bring down prescribing rates, we can also bring down overdose rates.

Now, that is simply not true because prescribing rates have been going down since 2012, while overdose rates have been skyrocketing in the meantime. If bringing down prescribing rates were going to solve the problem, we would have solved it already. Instead, what’s happening is that we’ve really restricted what healthcare providers can do, and as a result, patients have been abandoned and left to turn to the illegal drug market, which is more dangerous than ever.

OB: What is the role of technology in deepening (or failing to resolve) the opioid crisis?

LC: We have to put technology in perspective. Technology promises to do a lot of things and often fails to deliver, but there are many sectors that are really interested in a technological solution to social problems. We have to remember, as Sarah Brayne and other social scientists have argued, that technology is social. The Prescription Drug Monitoring Program (PDMP) is a two-tiered surveillance technology that was designed for law enforcement and is being used in healthcare. It was given as a gift from law enforcement to healthcare. Essentially, because our leaders have embraced the idea that if you could cut down on prescribing, you would cut down on overdoses, they invested in prescription drug monitoring programs, which are statewide programs that track all of the controlled substances (not just opioids) in a state. It’s pharmacy dispensing data. If you go to your pharmacist and you have your prescription for Vicodin, the pharmacist dispenses the drug to you, but they send that information to a private company called Bamboo Health. Bamboo compiles all of the information and then feeds that information back to healthcare providers, who can use it to monitor and make decisions about patients, and to law enforcement, who use this same data to monitor providers and patients.

So, PDMPs are two-tiered surveillance technologies, but they have not brought down overdose rates. They have helped bring down prescribing rates, but that’s where the problem lies because the problem is not prescribing; the problem is overdose. While those were positively correlated for some period of time, they’re now negatively correlated. What has happened is healthcare providers, with this technology in hand, have been motivated to route patients out of the healthcare system, leaving them vulnerable to the illegal drug supply.

OB: The book is critical of inaction on the part of regulatory agencies like the FDA or DEA. What is needed to make them respond more effectively to future crises?

LC: I don’t know that I’m so critical of these two agencies specifically, although certainly the FDA played a pretty major role in giving Purdue Pharma, the maker of OxyContin, a competitive advantage. Basically, the FDA allowed Purdue to label OxyContin as less addictive than other opioids because it had this extended-release formula. The idea was that people with substance use disorders would not use an extended-release formula because they were after a quick high. Well, that turned out not to be true at all, but by the time that became evident, OxyContin was already widespread.

I think we have to think about this more in terms of what we want our agencies to do. Certainly, with the overturning of the Chevron Supreme Court decision, we’re in a moment where our administrative agencies are going to have a reckoning. This raises the question of who gets to make decisions, who are the experts, and who has to defer to whom. Even in the best of times, the FDA is heavily underfunded. They don’t do their own research; they have to rely on the research that companies bring to them, and companies aren’t always acting in the public’s best interest. So, I think it’s really critical to consider whether we want agencies like the FDA to act as gatekeepers, and if so, decide how we give them the resources they need. If we don’t want them to act as gatekeepers and we’re going to allow the political process to supersede what those agencies are doing, that’s a different conversation.

With regards to the DEA, at an individual, interpersonal level, I have talked to a lot of DEA agents who I think are doing the best they can in the circumstances that they’re in. At an agency level, I think it gets complicated because the agency, like other organizations, is committed to its survival. The war on drugs has helped to boost the DEA as an organization and criminalization of drugs as a way to solve social problems. However, the DEA and other criminal justice agencies are very committed to what drug researchers call a supply-side story, where the idea is if we can just stop the drugs from coming into the country or into our communities, if we can just stop people from selling drugs, we wouldn’t have a drug problem. That’s where we see all of these interdiction efforts. You get to see a lot of images of law enforcement who have seized piles and piles of drugs and guns, and all of this makes eye-catching news, but I don’t know that it solves the problem.

We tend to invest too much power in drugs themselves, as if the drug itself is what causes all kinds of social problems. For example, homelessness is often blamed on drugs instead of unaffordable housing markets. This over-focus on eliminating drugs is misguided. People get really creative with drug use. When I was in high school, people used to snort the gas that was in the bottom of whipped cream cans. We can’t eradicate all drugs from our country, so we have to figure out how to live good lives with drugs in our ecosystem. And, as the cultural studies scholar Ingrid Walker points out, we also have to grapple with how our society constructs drugs as “problem-solving versus problem-causing.” We have to recognize that some people use drugs chaotically and some people don’t. It’s only a small percentage of people who use drugs who actually have a substance use disorder. For that reason, drug policy researchers, myself included, argue that we need to get at the demand-side story—the question of why people use drugs chaotically in the first place. We have to reckon with the people who are using drugs to treat pain, the people who are using drugs for emotional and psychological reasons, and the people who are using drugs to get high. We have to really think about the demand and then find a way to make drug use safer because the bottom line is drugs are not going away.

OB: What is a Trojan Horse technology?

LC: A Trojan Horse technology is a technology that was designed for one field but is used in another, and in that process, it smuggles in the logics of its field of origin. It’s easier to explain if we go back to the original Trojan Horse story. The Trojans were horse people, which is why the wooden horse was so appealing to them; it was kind of their thing. So, they look at this huge wooden horse and think, of course, what better symbol to commemorate our victory than this enormous wooden horse? But they had no idea that when they wheeled it behind their city walls, hiding inside were Greeks who planned to kill them. And kill them they did; the Trojans snatched defeat from the jaws of victory.

PDMPs work the same way by providing a similar kind of threat to our social fields and to institutions that we hold dear. In this case, the Trojan Horse is the prescription drug monitoring program, which looks a lot like other healthcare tools. Specifically, it looks like the electronic health record (EHR). The EHR is protected under HIPAA, has all kinds of privacy protections, is built for healthcare, and is used inside healthcare. It exists within this walled garden where other agencies outside can’t easily get access.

The PDMP was designed for law enforcement; it’s meant to engage in surveillance. It was handed to healthcare as a gift, literally, through these federal grants that came out of the Department of Justice. But what healthcare leaders and legislators didn’t realize as they started to adopt these technologies was that hiding inside were enforcement logics that would help to transform the healthcare field, shifting healthcare providers from treatment to surveillance and punishment.

One of the most interesting things that this book shows theoretically is the way that micro-level changes to behaviors fostered by technology use can help change the entire field. You can imagine the technology working in a couple of different ways. You could imagine that if a technology is designed for one field and then moved into another field, it could become a tool of the destination field. A technology designed for law enforcement that moves into healthcare would become a healthcare tool. But what I’m showing is that this technology started as an enforcement tool and stays an enforcement tool, even though it’s being used in healthcare. In fact, it starts to undermine the logics, practices, and commitments of the healthcare field. That’s why I conceptualize it as a Trojan Horse technology. It ushers in enforcement logics and makes changes to healthcare fields in pretty fundamental ways.

OB: The book is a critique of surveillance technology (and rightly so). Can technology, broadly understood, play a positive role in providing care to people in need without criminalizing them?

LC: Absolutely. I mean, this work is not designed to condemn any and all technology, but it is designed to argue that we need to take a critical eye to the technologies that we allow into our lives and into our social fields, especially technology that seems to mean well and promises to solve social problems. We’ve seen this in many different areas. For example, Virginia Eubanks’s book Automating Inequality is a really good example of these underfunded social service programs that take this technology to try to make their very burdensome work easier and more efficient, and instead end up doing a lot of harm.

I think the other thing that’s important to remember is that tech companies developing health tech and healthcare do not necessarily have the same logics or the same commitments. Health tech is interested in making money for shareholders. Healthcare is ideally interested in serving patients, and a lot of times those interests are in conflict.

What happens is we put a lot of pressure on individual patients to be very careful about their data and how their data is used. We live in an age of surveillance capitalism where it’s not just the information that we give to companies, but in fact the information that they take from us that can be incredibly profitable and monetized.

There is a study that Shoshana Zuboff cites which found that to install a single Nest thermostat, you have to go through a minimum of 1,000 different privacy contracts. We don’t have time for that. That kind of individual responsibilization around our technology and how it’s used is incredibly problematic.

Tech can be helpful; it can help to solve social problems, but it’s really important for there to be guardrails placed around that technology to make sure that it’s being used in the public interest and not just to help shareholders make money.

OB: Anything else you would like to add?

LC: Fundamentally, this is a book about the relationship between law, technology, organizations, and institutions in an attempt to solve social problems. I think it offers a very strong cautionary tale that can be summarized as essentially: “always look a gift horse in the mouth.”

You know the saying “never look a gift horse in the mouth” actually has nothing to do with the Trojan Horse. It has to do with horses’ teeth growing longer as they age. Older horses are less valuable, so looking at a gift horse’s mouth is essentially confirming how much something is worth before you decide to accept the gift, which is certainly a faux pas.

But I wonder what would have happened if healthcare had looked this gift horse in the mouth. If they had anticipated how that technology could change our healthcare fields and undermine doctor-patient and pharmacist-patient relationships, would they have accepted the gift? Would it have been worth the cost? Other institutions that are similarly underfunded and grappling with social problems also need to be cautious about technology because what seems easy and efficient on the surface can actually do a lot of harm to social fields if it is not properly couched in the public interest.

Onur Bakiner

October 16, 2024