Privacy is now an emergency in digital health

In 2014, my coauthors and I were putting together the concepts and passages that became The Internet of Healthy Things, published in 2015. Chapter 1 of that book describes a lively digital coach/companion, called Sam, who follows me throughout the day and, based on data collected through the Internet of Things, offers me advice and steers me toward healthier behaviors. Just five years ago, Sam was a figment of my imagination, and much of the book was devoted to pointing out the gaps in innovation that needed to be closed to create tools like Sam. Fast forward to 2020, and all of the components necessary to make Sam successful are in the marketplace. In most cases, they are not yet knit together in a way that makes a virtual assistant like Sam so engaging and effective, but they exist. I recently learned of an Israeli company called Sweetch that comes as close as I've seen to achieving the vision of highly customized, automated, inspirational messaging driven by algorithms that are educated through the "digital dust" we leave behind every day.

So, what is the newsflash? For a moment, let's acknowledge just how gratifying it is to see these technologies come together in a way that will help us improve population wellness and get us on the highway to a one-to-many care delivery model. This shows tremendous promise. But as I look back on what we wrote in the book, a formidable new barrier has come into focus over the last two years, one we did not call out in sufficient detail when we wrote the book in 2014.

I’m talking about the gridlock in adoption that has been created by consumers’ fears of data privacy.

I can't say this is entirely new: it abruptly bubbled to the surface in March of 2018, when the world was made aware of the unholy alliance between Cambridge Analytica and Facebook. Ever since then, the fear of misuse of personal data has gripped not only the media but the general public. Several publications recently brought this into focus for me. A recent Boston Globe article displays a photo of a gentleman who tore out a monitoring device from his apartment because he feared his privacy was being invaded. The New York Times published an account of a bracelet consumers can wear that emits a frequency that makes it impossible for Alexa to "listen to them."


There are examples in the marketplace where consumers share personal data in exchange for a tangible financial reward. Possibly the largest is UnitedHealthcare's Motion program, where employees can earn up to $4 per day for achieving activity-related goals, as measured by an Apple Watch, Fitbit, or similar device. Other examples of trading data for a tangible asset include Walgreens' Balance Rewards program, as well as employee health programs from The Vitality Group and Virgin Pulse. I was unable to retrieve data on how any of them are doing or whether they've seen a downturn in engagement lately.

I'd be interested to hear from others on this phenomenon. From what I see in the market, we need to rise to the occasion and assure consumers/patients that their privacy will be respected, or we won't realize the full value of digital health as it relates to behavior change. What do you think it will take? A digital health version of the Hippocratic Oath? Is data privacy a potential competitive advantage in building a brand and an organization's reputation? How do we achieve transparency about how personal health data is collected and used? Do we put the ball in consumers' court, allowing them to opt in to or out of sharing their data?