Podcast 270: Is healthcare privacy possible if “all data are health data”?


Don’t expect HIPAA regulations to protect your “digital health footprint” from prying eyes.

Every time you swipe your card to buy goodies at the supermarket (are you risking diabetes with all that ice cream?), or binge-watch that kinky series (how’s your mental health these days, really?), or let your step-tracker show you’ve fallen off the pace (can you afford those extra pounds?), there’s another little distinguishing feature added to your footprint.

This week’s guest, Dr. David Grande, and his associates asked a group of experts what they thought about all this accumulating personal data that’s outside HIPAA’s purview.

Listen in.

(Running time: 20 minutes)

Dr. Grande’s paper in JAMA Network Open

Franklin Foer’s Atlantic essay on the downside of Big Data’s help in the pandemic

New York Times’s Privacy Project

TRANSCRIPT:

Joe Elia: You’re listening to Clinical Conversations. I’m your host Joe Elia. This week’s chat isn’t about COVID-19 (or maybe it is really, but we’ll get back to that).

What we’re pursuing here is this: You know all that data you’re constantly contributing to what we call the “digital realm”? You know, your Whole Foods discounts from your Amazon Prime account, your Netflix records from all that binge watching, the GPS navigation stuff? Well, who’s guaranteeing the privacy of all those data points? Not HIPAA. It’s got your medical records covered, but what about the other stuff that points at your health status? The groceries, the streaming selections, the places you visit in your free time.

Our guest this week has concluded that you’re creating what he calls a “digital health footprint” with all that data — that, in fact, all data are health data. Dr. David Grande is the first author on a recent JAMA Network Open paper on all this. In it, his team surveyed a couple of dozen experts on privacy and data mining to gauge their thoughts and their concerns.

Dr. Grande is a physician at the Leonard Davis Institute of Health Economics at the University of Pennsylvania and in the Division of General Internal Medicine at the Perelman School of Medicine there. Welcome, Dr. Grande.

Dr. David Grande: Thank you.

Joe Elia: I hope I didn’t overstate or sensationalize your conclusions in the introduction. What got you interested in this whole question?

Dr. David Grande: You know going back years I’ve been interested in some of the ways in which marketing to doctors happens, marketing to patients happens; and what’s really happened over the last 10 years is all of that’s become supercharged by data. And a lot of it, when you’re talking about patients and consumers these days, marketing in health and healthcare is becoming supercharged by all those digital footprints that people are leaving behind. And while we may not necessarily think about our trip to the grocery store or the things we post on social media as being things that are fundamentally about health, when you talk to data scientists they’ll tell you oh it’s very much about health. Our ability to make inferences about health from all of that is extremely powerful today.

Joe Elia: Can you tell us briefly how you went about interviewing a couple of dozen experts, as I’m recalling, and why you undertook the study?

Dr. David Grande: Sure. Well, I think everybody — at some high level, consumers have some awareness now of the fact that people are being tracked in various ways. But frankly, it’s really hard to understand; every time you sign up for something or use a new piece of technology, it asks you to turn things on and off and you really don’t know what to make of it. And so we really wanted to talk to experts in the field, the data scientists and the people who understand regulation and whatnot, to try to understand what is going on in the technology industry: how should we understand what’s going on out in the world in terms of data mining and the collection of people’s digital footprints?

And not surprisingly, you know, what we heard is it’s everywhere. It’s what we expected to hear to some extent.

But then the second part of that conversation was really focused on two things. One is, is it possible anymore to draw a line between health and non-health or has that become an artificial distinction? And then second, what are some of the, like, key big problems and challenges that we see that may need to be addressed by regulators? What is unique about this space? And that really was the focus of our conversation.

Joe Elia: And you interviewed a variety of people. Can you give us a sense of what fields they came from?

Dr. David Grande: Yeah. They’re certainly people who have been in the area of kind of thinking about privacy law and ethical issues around privacy. But then we also talked to people who are involved in digital technology and designing products in various ways. And trying to understand some of the decision-making that goes on and how those products are designed and engineered and where data comes into play. We talked to people who do sort of like predictive analytics in healthcare and how that world’s kind of vastly expanding now in terms of thinking about all the data inputs that go into predictive algorithms. We see that in healthcare delivery, but then of course we certainly see it in consumer products and advertising. So we tried to go pretty broad to really, you know, get a lot of perspectives.

Joe Elia: So you talked with people dealing with ethics as well as data mining, so you got a wide range of expertise there. Having done all of that interviewing what were the principal conclusions that you arrived at?

Dr. David Grande: Sure. So, that first question that I mentioned about can we draw a line anymore between health and non-health data? It was interesting. So even these experts in the field that we talked to, we actually had them answer some questions and kind of rating different kinds of sources of data and say how do you think about this one? Does it sound very health-related or not very much so? And actually [they] did rate some very high and some very low. So things like your Fitbit or food you buy at the grocery store people could say yeah I can see that’s health-related. And then, other things like your E-ZPass in your car or your other things with your travel habits or maybe things about your email or texting habits and things like that people were like “Maybe that’s not as health-related.”

But then when we actually dug into the conversation, almost everyone we talked to was like “You really can’t draw a line.” I mean, when you think about modern data science today, it’s really not about a single piece of data anymore. It’s not about just exactly what your language was on social media. It’s linking that to a whole range of other things, which creates very powerful predictive capabilities. I mean now, we hear a lot of people talking about being able to make clinical diagnoses, you know? And so, I think one of the people said something like taking consumer-grade data and turning it into a medical-grade diagnosis, and that that’s quite possible now with a lot of data that people don’t really think of that way.

And so that was really an interesting part of the conversation. Again, I think if you ask consumers the same thing they would probably say yeah I can see these things as very health-related or not very health-related. But then again when you really look under the hood it’s really an artificial distinction.

Joe Elia: Let’s do a thought experiment. If I had enough data points, if you had enough data points about me and you knew my dietary habits, my exercise habits from my bicycle odometer, you had access to my travel through E-ZPass, et cetera, and you were a hospital administrator in the marketing department and you said “You know this guy Joe Elia? I’ve looked at his CVS records he’s not on any antidiabetic drugs, he’s sort of fallen off on his bicycling and he’s tending to drive himself places now. Maybe what we could market to him is a kind of a screening exercise for type 2 diabetes.”

And before I knew what was going on I’d have an email from David Grande saying “Joe, why don’t you come in for free screening?” I mean, that’s quite possible isn’t it?

Dr. David Grande: Oh, absolutely. I mean, again, there’s not a lot of transparency behind a lot of what’s going on in the companies who aggregate this data and use that to generate ads. But people are seeing it, you know, the output of that has become more evident, I think, to a lot of consumers. They say, “Hey it’s kind of weird I went on the internet yesterday, you know, looking for X, Y or Z and gee-whiz like suddenly I’m being approached today.” How else would that have happened? So, it’s becoming much more commonplace, I think, for people to have the experience that you just described.

I think what’s really hard to know is the accuracy of those analytics and those predictive models that are happening these days. We have some examples in research where people have validated ways of, for example, making a diagnosis of clinical depression. But we know very little in the advertising industry about how exactly they’re crunching data and deciding that you, Joe, have progressive diabetes or maybe a new diagnosis of diabetes. We don’t really know. All we know is that sometimes something shows up on a webpage you’re browsing or in your email inbox and you have to try to deduce why that happened.

Joe Elia: One of your experts commented on something that I circled because I thought, “Oh, this is so true.” And the quote is that “it would be very odd if someone followed you around making notes of everywhere you went and how much money you spent, and what you saw and who you interacted with. We would call it stalking, right? But in the digital world that’s accepted behavior.”

Dr. David Grande: Yeah, it’s commonplace and it is odd, right? You know there’s a level of surveillance in the marketplace now that I think no one ever would’ve dreamed was possible a decade or two ago. And I think if someone had told you that that’s what would be happening in the year 2020 — if we had this conversation in the year 2000 — they wouldn’t have believed you, probably, and secondly they would’ve said that’s outrageous we would never let that happen. But instead what we have seen happen is, because it’s so baked into all the things we use in the world it’s unavoidable now. Like, you would have to disconnect yourself from modern society at this point to really not be contributing data in this way. It’s not really a choice that consumers have anymore.

Joe Elia: Yeah, it’s Orwellian. It’s actually infra Orwellian or super pro-Orwellian.

Dr. David Grande: Yeah.

Joe Elia: We’ve gone beyond good old George.

And thinking back only seven years to the Snowden revelations in 2013 about how the government was, you know, it had access to all of our communications.

Actually, Snowden had a recent interview in which he said COVID-19 actually might be another way, another goad for the high tech companies to have more information about you. And in fact a couple of the high tech companies are getting together and saying “Yeah we’ll help contact tracing.”

Well, wait a minute. Okay, so when the pandemic goes away we’re going to have this method of tracing people’s contacts? And in a free society that’s not a problem. In a repressive society, and I’m thinking of, you know, facial-recognition technology in use in China, North Korea, other places, that can be dangerous. And so how is it that we can say okay, you can use it for this purpose, for contact tracing in a pandemic, but you can’t use it for contact tracing politically?

Dr. David Grande: Yeah, that’s a great question. I mean, I kind of look at this question and say that what we’re allowing to happen now in the private sector is complete Wild West. Like, the technology companies right now, at least in the American context, are largely unfettered — they can do almost anything. We really have no strong privacy regime around these questions in America. On the other hand we do apply a fairly, we apply a lot more scrutiny around these issues where government may be involved in some way that could actually have immense social benefits. So if we take COVID as an example I tend to believe that if we use these technologies responsibly we can make an enormous difference in the impact of the pandemic.

Now, the question is how do you avoid “mission creep,” which is I think what you’re really asking. And I think for that to happen you need to write some pretty ironclad regulations about how you’re going to use these technologies so that they do get turned off.

Now, again, you have to maintain the kind of political energy when push comes to shove to actually turn them off. But you know what Google and Apple have done, and we can speak separately about what their motives might be for doing this, but what they have done is they’ve put forward a model that involves far more privacy protections around these COVID uses than what would be routine with other uses of technology.

So specifically, they are just making changes to their operating system and they are saying that for this to work a public health entity has to layer an app on top of that technology.

So on the phone itself, there’s no app operating in the background of your iPhone that’s doing contact tracing. A public health agency has to put an app on top of that functionality in the operating system. But then they’re also saying we are not going to allow your phone to automatically transmit your personal information to the public health authorities; it stays locally on your phone. And actually, public health officials are not happy about that, because one of the ways that contact tracing works is that you actually notify the public health authorities and you share data.

So there’s almost an odd paradox here but the world hasn’t faced a health crisis like this in a very long time. We have these digital tools that are now very powerful and it’s really public health that we’ve decided to take a firm stand on privacy about as opposed to all these other commercial applications where we seem to be a-okay with the status quo. So it’s a fine line to balance because, again, you get back to the mission-creep issue and it’s hard to turn that stuff off once you turn it on, but there are probably ways to do it.
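The decentralized, privacy-preserving design Dr. Grande describes can be sketched in a few lines of code. This is a hypothetical simplification for illustration only, not the actual Apple–Google Exposure Notification protocol (which specifies its own cryptography for deriving rolling Bluetooth identifiers from on-device keys); the class and method names here are invented. The key property it demonstrates is that each phone broadcasts only unlinkable rotating identifiers, stores what it hears locally, and does the exposure matching on the device itself, so no list of contacts is ever uploaded.

```python
import hashlib
import os


class Phone:
    """Toy sketch of decentralized exposure notification (hypothetical API).

    Each phone holds a secret daily key that never leaves the device,
    broadcasts rotating identifiers derived from it, and records the
    identifiers it hears from nearby phones.
    """

    def __init__(self):
        self.daily_key = os.urandom(16)  # secret; stays on the device
        self.heard = set()               # rolling IDs observed nearby

    def broadcast_id(self, interval: int) -> bytes:
        # Rotating identifier for a given time interval. Without the
        # secret key, observers cannot link these IDs to one phone.
        data = self.daily_key + interval.to_bytes(4, "big")
        return hashlib.sha256(data).digest()[:16]

    def hear(self, rolling_id: bytes) -> None:
        # Record an identifier broadcast by a nearby phone.
        self.heard.add(rolling_id)

    def check_exposure(self, published_keys: list, intervals: range) -> bool:
        # A diagnosed user voluntarily publishes their daily keys.
        # Each phone re-derives the rolling IDs locally and checks for
        # a match; no contact list is ever transmitted anywhere.
        for key in published_keys:
            for i in intervals:
                rid = hashlib.sha256(key + i.to_bytes(4, "big")).digest()[:16]
                if rid in self.heard:
                    return True
        return False
```

In this sketch, notifying the public health authority is a separate, deliberate step the user would have to take, which is exactly the tension the conversation describes: the matching works without the authority ever seeing the data.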

Joe Elia: Yeah. Well, you would think that there would be legislative ways to do it, except that the legislators are subject to lobbying. And so if you have the companies that are at risk from this legislation writing or helping to write the regulations then there could be backdoors left open. I mean, I don’t want to sound totally paranoid, but a little paranoia is a good thing, I think.

Dr. David Grande: I do think though, back to that, I think the bigger issue of what’s going on is whether or not these technology companies are hopeful that by being seen as altruistic and responsible in some way that it will leave them in a better place in terms of what kinds of regulations they may face in the future. And that would be a mistake, like, we shouldn’t use a little bit of responsible behavior during the COVID pandemic to justify not taking actions at a later point.

Joe Elia: So what would you like to see happen, Dr. Grande, as a result of your raising these issues?

Dr. David Grande: Well, again, I’m actually not a legal expert by any means. But in talking to a lot of the folks that we did, I mean certainly the European Union has blazed a trail in this space. You know some of the people we interviewed who are really experts on international law and policy in this space talked about how the US really stands out in the world by taking this very sector-specific approach to privacy. So we have HIPAA, right, and we have GINA that protects genetic information. There is no place in the US federal government where anyone’s thinking about these issues across the economy.

You know, we’ve got people who think about health privacy at the Department of Health and Human Services, but it’s not the Department of Health and Human Services that can ultimately address this issue. And I think a lot of other countries around the world do specifically have privacy officers, agencies and whatnot to really come up with a more holistic way to think about these issues. Because we’ve moved far past, as I mentioned earlier, the idea that like health privacy can all be addressed through a health agency because it really is ubiquitous now.

And I think for many people, a lot of consumers — I don’t know this for sure — but I think if you asked them, “What do you worry more about: the privacy of the last blood pressure reading from your doctor’s visit, or your social media posts and whether they reveal something about your mood?” I think people would be more worried about their social media posts and their mood, and people using that information in ways that they would rather not. So we have a lot to learn from the EU. It’s relatively new, so we’re still seeing how it’s going to play out in terms of the behavior of these companies, but we certainly need to start taking a similar direction here in the US.

Joe Elia: And speaking of behaviors have you modified any of your own behaviors over time? Have you thrown away your GPS or your cell phone? Or have you stopped using Netflix or Amazon or going to Whole Foods?

Dr. David Grande: I’ve tried. But I go back to my earlier point, which is it’s become almost impossible to be a modern day consumer and not leave these footprints behind. Even if you think about the basics of using a smartphone these days, God forbid you turn off all this functionality. Your phone’s going to tell you pretty much every day that you know you need to turn this thing back on or it won’t work properly. You know it’s like do you want to allow it once? Do you want to allow it always? And then when you know you allow it once is it really once? Is the app still operating in the background? You really have to become a computer scientist now to even interpret what you’re saying yes or no to anymore. And I think it’s asking a lot of consumers, but I’ve tried. I certainly have tried, but it’s a frustrating endeavor.

Joe Elia: Well, I want to thank you, Dr. Grande, for talking about your work with me today.

Dr. David Grande: Oh, absolutely, my pleasure.

Joe Elia: That was our 270th podcast. They’re all available free at podcasts.jwatch.org. We come to you from the NEJM Group and our executive producer is Kristin Kelley. I’m Joe Elia. Thanks for listening.
