The following post is part of a year-long online campaign highlighting #WomenofPenn. The campaign, developed by FOCUS on Women’s Health and Leadership and Penn Medicine Communications, promotes the work being done by women at Penn Medicine and aims to inspire early-career women in academic medicine through the examples of successful women role models.
If you’ve ever wanted to throw your computer or smartphone across the room because of an irritating piece of software, then, whether you realize it or not, you probably appreciate the work of human factors scientists, who have shaped the design of other software programs that just seem to work without your having to think about them. You may not even know that human factors scientists exist, much less that some of them work in health care, helping to shape the way doctors, nurses, and other clinicians can best improve the health of patients.
Susan Harkness Regli, PhD, is one of these rare specialists at Penn Medicine. Readers of the Penn Medicine News Blog first encountered Regli and her work in a series of two posts last year. First, we highlighted Palliative Connect, a project fueled by predictive analytics that helps more patients with serious illness have important conversations about their care goals. We then looked deeper into what Penn Medicine data scientists are learning about how to use predictive analytics most effectively to improve care.
We recently sat down with Regli to learn more about her specific expertise as a human factors scientist and to better understand her relatively unusual career in health care and the path that brought her here.
Q: What does it mean to be a human factors scientist, and how are you using that in health care?
A: As a human factors scientist, I put a unique focus on the human: what are the inputs and outputs for the human being as part of a system? The example I usually give is a pilot in a cockpit. You have to think about how big the person is, what they can reach, what they are looking at, what information needs to be front and center for them to always be monitoring, versus what information needs an alert to notify them when something crosses a threshold. Aviation is really one of the places where human factors started to be seen as a field. Because if the engine fails, the plane goes down. But if the pilot fails, the plane can also go down. You don't want to just jump all the time to saying, “Well, that was human error.” Humans are always going to make errors because they're human. The system needs to be designed as well as possible to enhance human behavior and to limit the negative consequences of error.
The “system” the human being is part of also goes beyond just the pilot and the plane. The system is the airport the plane is operating in. The system is the colleagues the pilot is working with, the management that makes decisions about what thresholds dictate whether the pilot should fly or not, and what the consequences are for mistakes. Those are all part of it. When specialists like me look at the human in the system, we tend to focus primarily on the connection to the immediate system, but we also bring people in to collaborate with us and help with the interface to those larger systems. That could be an organizational behavior expert, a patient safety expert, or an architecture expert when designing a building, for example.
Q: What is it like working in this specialty in health care, after previously working at Lockheed Martin?
A: I always say that electronic health records are tremendously more difficult to design than systems for running an airplane. There are a lot of things you need to do, a lot of different people who need to ‘fly’ it, people who might say, “I’ve been ‘flying’ this way for 30 years and I’m not going to change now,” even though there are five different people metaphorically ‘flying’ the same plane. An EHR is a very challenging system to design. That's one of the things that attracted me to health care: it's such a fascinating and critical design space.
I don't think enough people in my field, especially in my specialty of human-computer interaction (HCI), are able to get behind the scenes to actually have an effect on EHRs. It's hard to get into a hospital system. To complicate things, it's usually not the hospital systems designing the EHR; it's the EHR vendors. But the way the EHR is implemented in the hospital changes the interaction, because individual hospitals implement EHRs their own way, to support their own best practices. From an HCI standpoint, that can change your success. It can improve it or it can degrade it, based on what you've done in your implementation. That means another level of usability testing is needed beyond the usability testing that vendors do themselves, and that second round of testing at the hospital usually is not built into the implementation process.
Q: So then is that your role? User testing of the electronic health records as they’re implemented here at Penn Medicine?
A: I wouldn't say I've cracked the nut on how to do that across the whole health system, but Penn has made some real inroads and innovations to improve the safety and usability of our EHR. One thing we did before PennChart went live across Penn Medicine was a set of interactive walkthroughs for what I call “wicked workflows.” These are workflows that require multiple roles to work with one another: doctors, nurses, a pharmacist, and maybe a bed management or admissions person. We got everybody in a room with each role actively engaged in the process, like a demo on steroids, as we walked through the steps: “OK, the doctor enters this, and then we're going to go and see what the nurse sees when they log in.” We’d proceed carefully through questions we’d ask at each step. This allowed us to test two things: one was identifying any break/fix issues in the system that needed to be addressed before we went live, and the other was increasing the staff members’ comfort level with the system before there was a patient in front of them. PennChart analysts then used similar processes when we went live with PennChart at Princeton.
About one-third of my work is directly on usability and workflows in PennChart. I am also a member of the Penn Value Improvement team. We work to spread improvements that are tried and tested in one area, entity, or service line, and extend them across the health system.
Probably the coolest part of my job is the work I do with our Data Science team. We are pushing the boundaries of human-computer interaction by trying to bridge from advanced technology and machine learning into the day-to-day reality of clinical workflows. A big part of what I do is setting up evaluation protocols to measure outcomes in such a way that you can see a significant difference even though human beings have such variation in what they do. It's sort of like behavioral psychology evaluation. I love doing that because it’s like a puzzle. I also work with clinicians to develop workflows that can use the information produced by data science to create better outcomes for patients and the health system.
Q: What are some other projects you’ve worked on with the data science team, other than Palliative Connect, which we’ve discussed before? Can you walk through what you brought to the table as a human factors scientist?
A: Lung Connect would be a good example. The team working on this project wanted to reduce inappropriate or unnecessary emergency department visits among patients with lung cancer. So the data science team created an algorithm to predict which patients were likely to end up in the emergency room within the next two weeks.
When we started the project, the silent pilot had the doctors going through the list and deciding whom to call and whom not to call based on what they knew about the patients. What we realized, from an evaluation protocol standpoint, was that this was kind of counteracting the algorithm. We were trying to use the algorithm to predict something about these patients that the doctors might not be able to predict, because they weren't able to process thousands of pieces of data all at one time and monitor patients constantly. We recommended calling everyone the algorithm flagged above a certain threshold for a short period of time, using a scripted workflow, while also collecting data to evaluate whether the algorithm was choosing people for whom we could make a difference.
I helped design a simple user interface, because the more data we capture about what happens with these patients, the better we can understand what is and isn't working in the program. Then we were able to look for outcomes: not just using a predictive algorithm to generate a prediction about patients returning to the ED, but evaluating how effectively we built an intervention around that prediction.
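To make that threshold-based outreach concrete, here is a minimal illustrative sketch, not Penn Medicine's actual code: the patient names, the risk scores, the 0.30 cutoff, and the function names are all hypothetical stand-ins for the workflow Regli describes, in which the model scores each patient's risk of an ED visit, everyone above a cutoff gets a scripted call, and each call is logged so the intervention can be evaluated.

```python
# Hypothetical illustration only: a threshold-based outreach list of the kind
# described above. Names, scores, and the 0.30 cutoff are made up.
from dataclasses import dataclass

CALL_THRESHOLD = 0.30  # assumed cutoff; a real program would set this clinically


@dataclass
class Patient:
    name: str
    ed_risk_score: float  # model-predicted probability of an ED visit within 14 days


def plan_outreach(patients):
    """Return the patients who should receive a scripted outreach call."""
    return [p for p in patients if p.ed_risk_score >= CALL_THRESHOLD]


def log_call(patient, reached, notes):
    """Capture what happened on each call so the intervention can be evaluated."""
    return {"patient": patient.name, "reached": reached, "notes": notes}


if __name__ == "__main__":
    cohort = [Patient("Patient A", 0.55), Patient("Patient B", 0.12), Patient("Patient C", 0.41)]
    for p in plan_outreach(cohort):
        print(f"Call {p.name} (risk {p.ed_risk_score:.2f}) using the scripted workflow")
```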
We learned that we need an intervention that can get data from people more often, something less intrusive than a phone call, so now they're considering a text-messaging intervention, among other alternatives. The very process of trying to articulate and test this uncovered what kinds of work actually needed to be done to keep those patients from needing to go to the ED.
Q: How did you get started in this field?
A: My background was in writing, English literature, and poetry. I started working in technical writing early on, in the days when computer software was first being built. My task might be to write about what a 2-D drawing program did so that users could use it, and I realized that I didn't really know what the users wanted to do with it. I got interested in the question of, how do we know how somebody is going to use a system? That then leads to, how do we design a system to support better work practices?
The path I’ve taken from writing to technical writing to studying human-computer interaction may sound disconnected, but it’s actually very intuitive. When you're doing technical writing, you're trying to understand a system in a way you can communicate to people in text. Now, as a human factors scientist who specializes in HCI, I'm trying to understand computer software in a way that I can communicate to people through the system itself.
I think about HCI as facilitating smoother conversations. Whatever is on a computer screen is trying to communicate to a person what should be done and what can be done, and whatever the person knows has to somehow get into that machine in order to accomplish something. If you have an electronic health record with a system that does clinical decision support, for example, the clinical decision support algorithm can't do anything without the person entering the data. It's that interaction: the person enters the data, then the system does what it can do best and presents that information to the person at the right time so that it can help them. I think of that as a conversation. The machine is trying to tell the person something, and vice versa.
Q: Does your background in literature and poetry shape the way you approach your work?
A: I would say yes. Poetry, the way I learned it, makes you really home in on the way you're communicating to people and pay attention to what words need to be in a poem and what should be eliminated. It's a precise yet creative process. When I'm doing a heuristic review of an interface, checking for adherence to basic design principles, for example, a lot of times I'm looking for what I can remove rather than what I can add, in order to refocus people on what's really important. I also use my creativity in learning how people do their work and imagining with them how there might be better ways to do it.
There's an emotional piece to the connection as well. I care a lot about health care. I was working at Lockheed Martin when my mom died in 2013, and my dad died six months later. That was a really emotional time for me. That summer was when I decided to make some changes. For ages I had believed I needed to work in health care, but I couldn't get out there and do it while I was in my job. I left my job and went on a quest to talk to health professionals all over the city about human factors and human-computer interaction. I eventually ended up in [Penn Medicine Chief Medical Information Officer] Bill Hanson's office, where I found a leader who understood and valued the perspective I would bring to clinical informatics. During that time, I also started writing a book of poems that I eventually self-published. It feels like there's a connection between that surge of emotion and energy and my entry into health care. That's why, when I have a frustrating day, I say, well, look, I'm doing something really important. Sometimes when I head to a meeting, I motivate myself with a little phrase: “Changing the world!”