Ethical Dilemmas of a Hypothetical Healthcare Startup

During our class on Ethics in Design, I had the opportunity to facilitate a discussion on the ethics of privacy. We had all read a collection of articles on current issues in the privacy landscape relevant to designers, and using those as a jumping-off point, my goal was to create a space for us to process the readings, synthesize those ideas together, and continue to hone our individual points of view about the implications of privacy laws and industry practices.

The Warm-Up

We started with a personal privacy inventory. I shared some brief scenarios of personal information sharing and asked people to reflect on how they felt about sharing this data and then summarize that feeling with a word or an emoji. Responses ranged from ¯\_(ツ)_/¯ to “HELL NO!!” I wanted to start our conversation by placing ourselves in the context of a user who has their own opinions, boundaries, and concerns about how corporations, governments, or strangers might interact with their personal data. We discussed the range of responses and considered common themes that emerged within and between participants. Do I consistently trust the government more with my data? Or do I trust corporations more? What about having my data publicly available? Remember phone books? Why did having your address and phone number available there feel less ‘creepy’ than having it public on your Facebook page might feel?

The Main Event

Next, I wanted us to shift our mindset from being a user to functioning as a designer. Borrowing from the tradition of the ‘murder mystery dinner party,’ I created a scenario for us to consider, synthesized from a couple of real-world examples of privacy dilemmas faced by companies that require access to personal data to provide their services or products. In particular, I was intrigued to discover that the HIPAA rules are enforced only on covered entities: health care providers, insurers, and research institutions. This creates an interesting ethical ‘gray area’ in which policymakers have clearly established best practices for managing patient health information, but for entities outside the narrow purview of the legislation, there is no enforcement. This loophole means that the many apps and third-party companies that have popped up in the healthcare space are not accountable to any rules about their handling of patient health information.

I imagined an app that attempts to reduce unnecessary costs borne by families who unintentionally sought health care at facilities that were not well-matched to their concern (i.e., visiting an emergency room rather than a primary care physician, or an out-of-network clinic rather than an in-network one). The app served as a link between people seeking health care, their insurer, and their providers, moving patient health information between all three stakeholders. I shared a hypothetical problem that this startup was facing regarding how it managed the patient data it used to provide this service. This kicked off a guided discussion that allowed us to grapple with the options available to our company and the implications of those choices.

We asked and answered many questions. Do we have an ethical responsibility to comply with HIPAA even though we aren’t legally compelled to comply? Are our users operating with a false assumption about our compliance with HIPAA standards? Is there a risk of being “outed” as non-compliant? What might the consequence of that be? Are we exposing our partners (insurers and providers) to liability by not being fully compliant? Are there ways of mitigating the risks associated with transferring this data between parties? What if there were a third-party company with an API that we could use to outsource some of that risk? Would it be ethical to transfer responsibility and accountability if it meant also ceding control?

Taking it up a notch

Then, recognizing that designers work at the intersection of many different stakeholders within their organizations, I challenged my classmates to shift their perspective from that of a designer to the perspective of another person within this hypothetical company. I gave each participant a card with a role printed on it (Investor, CEO, VP of Marketing, CTO, etc.) and some additional information about considerations specific to each of those roles. We continued the discussion with this new framing and attempted to answer the question, “What should we do next?” This generated ideas not just about the ethics, but about the concrete actions we could take in support of those values within real-world constraints.

I loved the way that people engaged with this premise. There were ways that people contributed that I had anticipated (such as proposing solutions that I had considered at the time I designed the activity). But they also took the hypothetical far beyond what I had originally considered, raising questions about additional dimensions of the ethical dilemma and supporting their thinking with examples from the background readings. The conversation was lively, and although I was facilitating the discussion, there was space for other people to express dissenting opinions or guide the group to explore aspects of the scenario beyond my prompts. Although we didn’t arrive at a specific decision one way or the other, there was some consensus that gently bending some of these rules was acceptable, especially given that it was in the interest of protecting consumers from unintentional overspending on health care. Our group felt more comfortable with not fully conforming to HIPAA if it was truly in service of patient needs. We also discussed the ‘slippery slope’ of setting precedents, either internally or externally, about use of user data. Do we run the risk of establishing norms that give people permission to bend the rules in similar ways even if their product is not as altruistic as we believed ours to be?

Learning and Take-Aways

In presenting this to my group, I realized that there were nuances of the problem that I had not sufficiently articulated in the written briefing I created, and I had to supplement it with a further explanation of exactly how this use case was in conflict with existing HIPAA regulations. I also realized that aspects of how the US healthcare system functions were not common knowledge to everyone in the group. Given that just last week we had been talking about inclusion in design, I felt bad that I had been operating on an unfair assumption that the processes involved in using your health insurance, seeking care, and medical billing would be familiar to everyone. A quick primer on those topics would have made this discussion more inclusive.

Finally, I designed my progression to start with a ‘warm-up’ meant to prime us to empathize with users, by putting the group in the mindset of a user, before asking them to take the perspective of a designer and then finally asking them to pivot to the perspective of another contributor within the company. Making my intentions behind each of those choices clearer, or being more explicit about the connection between the first activity and the second, might have been useful to the participants.