[MUSIC]
Welcome to Building AI Boston.
Our guest today is Sheila Pissil. She is a social change futurist with nearly two decades of experience shaping health care innovation. Her forthcoming book, Remembering How to Care, explores reimagining health care in the age of AI.
Sheila, welcome to the show. Thank you for having me. Happy to be here. You are right in the pocket when it comes to why we do the show. You've been a leader at Dana-Farber Cancer Institute and Boston Medical Center, just to name a few. Like, wow.
This is a great background. And two decades, I'm sorry, I didn't realize your age, but that's pretty phenomenal. But you've kind of been a leader since the beginning.
I want to ask you, like, why healthcare and why Boston?
So it was an interesting path that got me here.
I was initially pre-med in undergrad. My parents were very excited about me becoming a doctor. I only had three options anyways. [LAUGHTER] And it was actually an experience during my,
I think it might have been either my junior or senior year in college. I had a chance to go to Haiti as part of a volunteer medical mission. We went to Port-au-Prince. This was actually before the earthquake. Okay.
And we were doing rehab specialty care, providing families with children and loved ones with disabilities with, you know, highly specialized care with prosthetics and wheelchairs and other tools to kind of support them. And what was interesting in that experience was, after being there for a little while,
the families would begin returning the equipment that we were giving them. And being super baffled about what was going on, I started just talking to people because I spoke the language. And what I learned from them was,
it wasn't that they were not grateful,
which I think was the initial assumption.
It didn't work for them. And they described having to, you know, walk up the side of a mountain and cross a river to get home, and we had told them, you know, these things can't get wet, or, you know, to take good care of it.
And it actually made them more immobile, trying to figure out how to use crutches. And that's when a light bulb went off for me, because I was like, okay, so we're doing all this great fancy care for people. It costs a lot of money.
And we're doing it for free, but it's not actually what people need. And I was talking to a mentor of mine at the same time. And I was expressing my frustration to him. And I was like, you know what?
We would have been better off giving these people skateboards. But we had to actually have the conversation with people, and I was like, why is this a problem? And, you know, it seems like poverty is also a big barrier here. And that's when he pointed me in the direction of public health.
That's when he told me about Boston University. So I applied, I went to grad school there, and actually became an employee there as well. And the rest is history. And you and Cara have some overlap in healthcare.
Right, Cara? Yeah, Boston University. So I worked at the School of Public Health at Boston University for, I think, about eight years. And I do think it was my favorite job ever.
It's a really amazing place.
And even though I was on the communications team, I really became very interested in public health generally. And I think it's what led me to sort of having this focus on ethical AI, and how AI can really address these giant systemic social issues that are untouchable without it.
And one of my favorite stories about public health and why it's so important is from our dean at the time, Sandro Galea. He had a great analogy: he used to be an emergency medicine physician, and, you know, he switched to public health. And he said the reason was because he found himself at the end of the river.
Fishing people, you know, metaphorically. Fishing people out of the river and saving them from drowning over and over and over. And then he just realized, he's like, I want to walk up to the top of the river and find out why people are falling in in the first place. And that's public health, which I thought was just so exceptionally well put
and, you know, moving up to that source is what it's all about.
So that's what you're doing, Sheila, which is pretty cool.
And I like that you were informed in such a faraway place, and then had that awareness. Because right now you're the founder of Facilitate Change, an innovation studio helping startups, investors, and health systems implement ethical, patient-centered solutions. So your background was really the catalyst for that.
It sounds like.
Can you talk about your leadership role there?
Yeah, absolutely.
So Facilitate Change, as you mentioned, is really kind of the culmination of
that question that surfaced in Haiti, and then nearly two decades working inside of Boston health care systems. And doing my best to make sure that I was constantly aware of: are we talking to the right people? Am I asking the right questions?
So that I'm getting to the root of the problem. And one of the things that I learned as a student at the School of Public Health is a quote that forever changed my life: every system is perfectly designed to produce the results that it gets. So when we're seeing a result, we have to look at the system itself.
And systems are malleable, right? We can change them. We can redirect them. We can get them to give us different results if we design them differently. So I was constantly tinkering with these ideas throughout my career.
And I got to do very interesting work, all of it around technology. But I also got to see a lot of failure, including failure that I feel personally responsible for, that led to excluding low-income and marginalized communities from high-quality
and very urgent care. So Facilitate Change was really birthed from this idea that if we really want to redesign healthcare, we have to understand why we're making these same mistakes over and over again. And my career has taught me that a lot of times it's the way that we think about
innovation and healthcare that leads to these high failure rates and the disconnect from what's happening in people's lives. So our job really is to make sure that we're giving leaders and decision makers in systems of care, the tools that they need so that they're getting access to the right information and centering the lived experience of the most vulnerable
populations. And now we've actually gone further and focused our attention on AI. Because that's obviously where innovation is happening right now. And it's showing up at the front door and in the back door of healthcare systems, which I hope we talk a little bit more about that.
And we really need to be very aware of what we're introducing into already stressed, already broken health systems.
Does this play into how you developed the Seeds of Innovation framework?
So you've developed a framework to kind of bypass the trial and error that you've seen, so the systems can be predictive about what comes in. Yes, absolutely.
So the first inkling of Seeds of Innovation actually started when I was working as
director of innovation for the Health Equity Accelerator at Boston Medical Center. A little bit of history there: you know, COVID, George Floyd, Black Lives Matter. Our institution took a pause, and we were looking at our results, and despite having a history of centering the most vulnerable populations, we saw the same systemic issues, where Black and Brown patients' outcomes were just much worse.
And honestly, the way that I describe this is, you know, if every system is perfectly designed to produce the results it gets, it is definitely evidence that every system is perfectly designed to cut Black lives short. And we had to think about how we do this differently.
So my domain of focus was around social determinants of health, those upstream things like access to housing and food and employment. Those are the things that help people stay healthy. But we were serving patients in over 76 languages, patients who were dealing with housing instability, substance use, immigration issues, all the things.
And so I was really looking for technology that suited what our patients needed. So I started by talking to them, and listening to our providers and our teams that were seeing the barriers. And then we went out to source the technology. And when we implemented the technology, and I will say this,
we actually implemented the first AI based technology while I was at Boston Medical Center
in August of 2020. Yes. A company, it was called Nutrible when I met the founder. It has since changed its name to Thrive Link.
We were doing a survey at the time called Thrive. And we actually implemented it in our geriatric clinic. And it was a tool that essentially was using voice bots at the time, this was very early voice AI, to talk to patients and gather information for their applications and then submit it.
So they didn't have to, you know... have you ever filled out an application for, like, Medicaid or Medicare?
And it's hard, even if you speak English, it's hard. And it was such a great success. And what I found, what I learned from that process and also from the technologies that didn't work,
There is a specific lens to think through how you make decisions around the t...
And that's what births the seeds of innovation framework.
Yeah. And I love that, even with the technology, you still start by talking and listening. Yeah. So important.
And I think that's just something that, you know... obviously this show is about AI and we talk about AI.
But it's almost, in some way, not the point. And the theme that keeps coming up, and you really hit on it, is the importance of listening to each other and understanding the problem. So I just think that's really interesting. No, I think it's great that you started in geriatrics, because we all know... we've seen funny Saturday Night Live skits about, you know, older people not being able to remember the name of Alexa. And, you know, it's just interesting that you started there at a time when maybe it wasn't as sophisticated as 2023.
We've come a long way. And the fact that you said we had success, that's pretty impressive. Absolutely impressive. Yeah. Yeah.
It was also tactical to start with the geriatrics clinic, because if the technology will work for my grandmother, it will work for me. Right. Right. It's a very basic premise. And even when we think about user experience, user design: when you design around the most vulnerable populations, everyone benefits.
It's kind of like the wheelchair-ramp effect, right? When the Americans with Disabilities Act came along and you created those ramps for people in wheelchairs, it also helped the mom with the stroller or the passenger with the carry-on bag. You know what I mean? So we've seen this example all over our society: when we think about the most vulnerable, we end up making better decisions.
Yeah. I love that about Boston, because we've had women on from the Perkins School for the Blind, and I love talking about accessibility. And I love that you're looking at it in a holistic way. I think when we label each other and say this problem doesn't affect me... but yet I have a stroller. I mean, that's a fair point about Boston: I think that you do think holistically. And the temptation when you're in innovation is to silo and not care about anything but what you're doing,
but you're such an umbrella.
And Cara really hit on it. Can you talk a little bit about the Listen First platform?
Because I think that's truly your secret sauce.
Yes. So Listen First was my answer to this question of, like, okay, how do we help people who are making decisions around innovation, whether it be startups, health systems, even policy makers, even investors? How do we help them understand what the lived experience is? Because one of the big challenges that you have is, how do you collect lived-experience data? When I was director of innovation at Boston Medical Center,
we were running focus groups with often, like, 10 to 15 patients, and we did it in multiple languages, English, Spanish, and I had a team to help me do that. And it was difficult to do, especially if you wanted to make sure that you were reaching patients who were dealing with homelessness or substance use or other barriers, patients who were pregnant, patients who were elderly. So we paid them for their time. We covered transportation, right?
We provided food. We booked it at a time that was convenient for them. A lot of thought went into it to make sure that we were really engaging people. Most companies either don't have the interest or the time or the resources to do that effectively, right? So this is really the solution to that: how can we use voice AI and conversations with patients, using tools that can adapt to their literacy and adapt to their language, to understand their lived experience?
And not just looking at the quantitative data because data in itself, although it can help us understand kind of directions and trends and what's happening from that perspective, the questions that we ask of data are going to be filtered through our bias. So the answers that we get are also going to be biased, right? So the only real source of truth is what people are telling you their experience is, right?
And so our job then is to collect that information at scale, do the analysis on your behalf, using human experts and AI, and then present you with the insights,
so that you understand how you need to make the decision.
And I do think our timing is really incredible because we are moving from what I call a data economy to an insights economy.
We're no longer seeking to get a bunch of answers; we're just learning what the right answer is. Think about people switching from a Google search to a chat conversation, right? Yeah. Now when you ask a question, you're getting one answer, right? Not a series of answers that you can pick from.
And that's the direction that we're seeing innovation going in as well, where we've got to get better at asking the right questions, which we also support people with, and then get to the right answers, because we're sourcing it from the people who know the truth, because they're living it.
One of the things that's interesting to me is, you know,
in this country, we spend so much money, right, on health care,
and so many brilliant people, including the people at Boston Medical Center and others, are working on this problem, and yet our outcomes are still so poor. You know, so it's just one of the great mysteries.
You know, it's like, what is this barrier?
So I'm not saying, you know, one tool will solve it all, but maybe talk a little bit about how this insights-versus-data approach could help maybe close that gap a little bit for us. Yeah, you bring up an excellent point, Cara, because this is also something I've grappled with,
understanding that as a developed country, we're at the bottom when it comes to life expectancy, right? And we spend the most.
And when you, when you stratify the data a bit more,
what you realize is happening is you have folks who are privileged, generally white, upper class, versus the bottom of society, and there are drastic differences in life expectancy, right? So even in Boston, from Back Bay to Roxbury, which is a couple stops on the train,
the life expectancy changes by 25 years. Wow. Yeah, 25 years, right? Now, those kinds of data and statistics we've known all along. And of course, during COVID, it really surfaced when we were watching
the devastation in Black and Brown communities. Now, when you ask about the reason why, it is the structural systems that are in place that prevent people from living up to their highest health, right? Whether that's access to food and housing and transportation and employment.
But beyond that, once you arrive at the doorstep of health care institutions, then you're facing a whole other barrier, right? Where you have systemic bias and racism and other things that are in place as well. But even without that, even if we assume that everyone was getting the same care, the way that we innovate in health care, and the tools that we're using
to support our physicians and our nurses and our teams in providing care, are actually quite sub-par. I mean, think about it. We're still using beepers and fax machines and sending patients' radiology images on CDs, right?
That's our health care right now. That is hard.
That one I'll never fully understand.
That's right. Right, right? We're in, like, transformation triage. It just seems like, which problem do we solve first? But thankfully you're grappling with something so key. As you're talking about it,
I can't help but think about us as a whole. You know, if we can really tackle this and really just expose that bias, it's not about us and them. It's literally this whole system, and facilitating the change that needs to be done. So thankfully, you're a sought-after public speaker.
How often are you on the road just talking about this?
Are you really reaching people? Is that kind of the exciting thing that you get to do? Yes, obviously, like, I'm passionate about this and I love to talk about it. But I will say, I am getting to the point where I'm starting to feel a little bit like a broken record. And I'm also acknowledging that, you know,
given the political climate that we're in, people aren't as open to these conversations. But I'm also realizing that there is a very alarming and disturbing trend happening in healthcare right now that has me more terrified than I've ever been. I do think that the introduction of AI to our health systems, which is happening rapidly, is going to accelerate the downfall of our healthcare system.
So think about the banking crisis that we saw in 2008, but think about health systems failing. And this has been a trend that has been going on already: since COVID, we've seen an acceleration in the number of hospitals closing. The bill that was recently passed is going to devastate community health centers all over the country, right? So you no longer even have, like, the funding coverage to ensure access to care.
And rising unemployment: the CEO of Anthropic himself said he's expecting 50% unemployment in the next couple years. I don't think he's wrong. And I do think that number is credible, coming from the person with the smartest AI in the world, right? Yeah.
Thank you. I would like it, too.
Right. So when you really think about what that means: the Great Depression was 25% unemployment, right?
That is human devastation on a scale we don't understand. And for most of us, right, our health insurance benefits are tied to our jobs. So yes, you're unemployed, which means you aren't able to cover your basic necessities.
You're stressed out as a result, and you don't have access to healthcare beca...
There's not really a sustainable public market for this.
So we're going down that track, because that train has left the station, with health systems that are underfunded, with doctors that are burned out, with nurses that are burned out, with a pipeline of specialists that is shrinking. And we've known these problems for a very long time. And another alarming trend: people with money are buying doctors out of public institutions to become concierge physicians, to ensure that they have access to healthcare. And I don't blame them for doing that.
Who wants to wait a year to see a specialist, right?
And in Boston, we have some of the best medical institutions in the world, and there's one around the corner, right? But to get a mammogram right now, you're waiting six months; to get a sleep study, a year, right?
Literally, I saw someone post on LinkedIn: they needed to see a specialist because they have a special type of endocrine issue. A year's wait. A year.
So sad. Yes. And now we're introducing AI that's supposed to fix this, but AI has come through the same innovation engine that also produced digital health. Digital health companies fail at a 98% rate. So what do we think we're going to be introducing into healthcare? AI that's going to fail 98% of the time.
Wow, that's a very important conversation, yeah. It is. And so, okay, I've tried to figure out the best way to frame this. So everything you say is accurate, and I agree, and the whole unemployment thing is really terrifying.
But if we step back and try to look at it through a slightly different lens... and I honestly can't think of a better lens to look at it through than the lens of public health, right?
So I don't know if anyone in public health is thinking about this yet, but something like that would be a public health emergency, right? Because it would affect so many people's wellbeing. So we do have an opportunity right now, and this is a little Pollyanna, to reimagine our social compassion, right? So let's talk about that a little bit. And it doesn't mean, you know, it'll be easy; it'll be exceptionally difficult. But our usual pattern is, oh, you know, a couple rich guys get richer and everyone else gets nothing, and it's horrible when technology changes, right?
But we as individuals could actually insist on something different. So think about that. Like, what would you say, for people who don't know what public health is or how we think about things? Maybe help us think of: yes, we could take this path, if we wanted to reimagine our society and how it functions.
Do you see any glimmer of hope there, that we could use this massive technology change to actually rethink how individuals live in a society?
That is a lovely question, and I actually do have a lot of ideas about that. I'm writing a book called Remembering How to Care: Reimagining Health Care in the Age of AI, all about this topic and what it would look like for us to really redesign health care from the bottom up. I have several ideas, but the one that I think is the most potent is this concept of data sovereignty. Have you heard of it before? Well, yes, because I talked to you, but go for it.
It's for our audience, isn't it? Right. So right now, data in general, right? We live in a society where our data is consumed, and then it's analyzed, and then it's used to, you know, sell us things or create services or design interventions and all the rest, right? In health care, you have laws that protect privacy and access to data: HIPAA, right? And what it basically tells institutions that take the data is: you are allowed to move the data and do whatever you want with it, essentially, as long as it is de-identified, meaning you remove the personal health information, the person's name,
date of birth, zip codes, and other things, right? And what that has effectively done is created a hundred-billion-dollar industry of infrastructure around collecting, analyzing, and making sense of data, right?
But on the other side of that, you also have a hundred billion dollars in wasted innovation.
So what does that tell you? The way that we relate to, understand, and analyze data is fundamentally flawed. And I would actually go as far as calling it a Ponzi scheme. Let me explain what you're effectively doing with data. Let's say I tell you all, like, give me your pinky. I'm going to cut off everyone's pinky; we're donating it to research. Okay, great. But now you've given me your pinky, and I've decided to do a little bit of research, but now I'm going to grind up some of it and turn it into dog food, or turn it into an ornament, or, you know, sell it on the black market, whatever it might be, right?
You had no idea that that's what I was going to do with your data, ri...
But because no one can tell that it's your pinky versus someone else's, they're allowed to do whatever they want with it. And that whole cycle is what keeps producing garbage in, garbage out in terms of innovation. So in order to fix that problem, it's not better data, it's not more data, it's data sovereignty. My data needs to belong to me as an individual. It is an extension of my identity, of my sovereignty, of my selfhood. And I ought to control who has access to it, how they are using it, how long they have access to it.
And that's actually the engine behind Listen First.
We ensure data sovereignty, any information, narrative or clinical data that people share with us, they retain full ownership of it. We're just asking for permission to analyze that data to generate the insights. And most importantly, we're paying people for it because it is an asset.
So it's kind of like having money in the bank, right?
It's your money; you put it in Bank of America or any other bank, and if they decide to loan it out, you get interest. Same thing with your data. And until that changes... and remember, AI needs what to function? Data, right? Because data is now the new oil.
That is the first thing that we need to change in how we think about redesigning healthcare.
Because only then can we get to higher-quality insights and level the playing field, so that the people who are experiencing the pain can really drive the innovation. Wow, I can see why you are a social change futurist. I think that's just fantastic. I've said this before, but I will say it now: you know, my soul is not an algorithm. And I like this idea of data sovereignty.
Wow, Sheila, any final words? Because we've had a good time with you. And I think, Cara, we want to have her back, and we definitely want to talk to you when you do this book launch, because this is very exciting; it's gathering momentum. Yeah, it's a privilege to get your insights, and it's a privilege to see that you represent Boston in this way. I think it's a Boston frame of mind, and, like I said in the beginning,
I think you're right in the pocket of why we love to do this show, right, Cara?
Yeah, I think Boston's going to be the center for ethical AI. Yeah, that's my prediction. It's coming up. We've got a lot of future shows about ethical AI, but, you know, Sheila, thanks for framing a very practical way of looking at this.
I think when we talk about ethics, we forget the human first story.
The fact that you started out as a young person... you were 14 when you did your first nonprofit, yeah? Yes. Wow. Well, you're certainly inspirational to younger people, but I will say, for my age, I'm glad you're in charge of so many things. Any final words? I'll give you that now.
Yeah, I would say the final word is this: the concept of sovereignty is one that I think needs to permeate throughout society and all systems of care.
And probably by the time this podcast is released, I will have sent out an open letter to the CEO of Anthropic, where I do believe that the way that we're designing AI models currently is leading to a dramatic undermining of human sovereignty. And I do think that that is the crux of the issue, whether it's in how we build AI models or how we design health care. We have to hold on to this idea that as sovereign beings, we get to decide what health is, we get to decide what privacy is, we get to decide what safety is, and it's going to be different for every person.
And there's no wrong or right answer, right? It's allowing the individual that control. So, you know, I think perpetuating this idea and inserting it into the conversation that's happening around innovation, and how we tackle some of these issues as we're going through probably the most dramatic change we've experienced in generations... how do we protect human sovereignty? I hope we have that conversation as well. Cara, let's get the brother-sister duo of Anthropic on, and we'll have Sheila ask the questions.
I think that will be great. And we'll link to the letter in the show notes, because I'm sure people would love to see it. Thank you, audience, for listening. You know, this has been a very special episode. So Sheila, thanks for kicking us off right in the new year. Check out the bonus links; we have some special content coming out this month. Sheila, please come back and share your progress with us. Happy to. All right. Thanks, both of you. We'll see you next time. Yeah.
Thank you for joining us on Building AI Boston.
Stay tuned for more enlightening episodes that put you at the forefront of the conversations shaping our future.


