Saving lives through human-centered health technology

In the rush to integrate technology into healthcare, it’s easy to focus solely on what the latest tools can do. But at the University of Michigan, faculty in the Human-Centered Computing (HCC) Laboratory in Computer Science and Engineering (CSE) are leading the charge to ensure that health technology is not just powerful but also human-focused, accessible, and capable of improving lives. Their work is reshaping medical training, patient care, and accessibility, all while asking: “Who is this for, and how can we help them most?”
Researchers at U-M are designing tools that don't just build on the latest technical advances but meaningfully improve the experience of medical professionals and their patients. These include tools that help teach doctors-in-training and directly assist in patient care, from symptom monitoring and diagnosis to improving treatment adherence.
Smarter training for tomorrow’s surgeons
Clinical environments are high-pressure and unpredictable, and the stakes are high—nowhere more so than in the operating room (OR). Assistant Professor Xu Wang blends computer science, medicine, and human-centered design to create new approaches for surgical training that better prepare clinicians for the realities of the OR.
Backed by a $1.2 million NSF grant, she and collaborators in Michigan Medicine are developing advanced systems that capture nuanced multimodal data during live surgeries—gaze using eye-tracking glasses, audio from real OR conversations, and video from laparoscopic cameras—and use these streams to analyze and improve training.
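To give a sense of how such multimodal streams might be combined for analysis, here is a minimal sketch of timestamp-based merging of gaze, audio, and video annotations into a single timeline. The event format and stream names are illustrative assumptions, not the lab's actual data schema.

```python
import heapq
from dataclasses import dataclass

# Hypothetical event records: the actual formats used in these
# systems are not described in the article, so fields here are
# illustrative only.
@dataclass
class Event:
    t: float        # seconds since the start of the procedure
    stream: str     # "gaze", "audio", or "video"
    payload: dict

def merge_streams(*streams):
    """Merge per-stream event lists (each sorted by time) into one
    timestamp-ordered timeline, so gaze fixations, OR conversation,
    and laparoscopic-video annotations can be reviewed side by side."""
    return list(heapq.merge(*streams, key=lambda e: e.t))

gaze = [Event(0.5, "gaze", {"x": 0.31, "y": 0.62}),
        Event(1.5, "gaze", {"x": 0.40, "y": 0.55})]
audio = [Event(1.0, "audio", {"speaker": "attending",
                              "text": "retract here"})]
timeline = merge_streams(gaze, audio)
```

Aligning streams on a shared clock like this is typically the first step before any cross-modal analysis, such as checking whether a trainee's gaze follows a trainer's verbal cue.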
Her award-winning CHI 2024 paper examined how visual alignment and feedback between trainers and residents can shape both teaching and patient safety. In parallel, her team's Surgment video-based feedback system segments and annotates surgical scenes, allowing for targeted review and actionable debriefs. Wang's group is also testing augmented reality visualizations to directly enhance coordination and learning in the OR itself.

“By leveraging advanced computational methods and close partnerships with clinicians, our goal is to make surgeries safer and learning more effective,” said Wang.
By blending immersive simulation, explainable machine learning, and real-world clinical data, Wang’s team is helping shape future generations of surgeons and better prepare them for both the routine and the unexpected, with the long-term goal of improving patient safety and surgical outcomes.
Optimizing teamwork in critical care
Outside of the operating room, in high-pressure emergencies like cardiac arrest, seconds are precious and team performance is critical. Associate Professor Alanson Sample, working with colleagues in Michigan Medicine, is developing VR-based team training to bridge the gap between simulation and real-world practice. The platform immerses healthcare trainees in realistic, high-stress environments where timing, coordination, and communication are put to the test.
But what happens after the simulation can be just as important. Recognizing the limitations of traditional, instructor-led debriefings—which can be inconsistent and subjective—the team’s NSF-funded project is creating a multimodal debriefing system that unobtrusively collects physiological, behavioral, and cognitive data streams (like decision-making patterns and communication signals) and then visualizes them for both trainees and instructors.
The goal is to generate objective, actionable, and personalized feedback that goes beyond generic advice, accelerating skills training and teamwork in critical care settings. “Providing teams with this kind of objective, personalized feedback lets them see what really happened and how they can improve, rather than relying solely on memory or individual observation,” explained Sample.
This human-centered framework not only stands to transform the way clinicians train for urgent, life-saving interventions, but also points the way toward safer, more effective teamwork in situations where quick thinking and collaboration can make all the difference.
Monitoring, mental health, and the road ahead
In addition to strengthening clinician training, innovations from the HCC Lab at U-M are transforming how we monitor patient health—not just in the hospital, but wherever care happens.
Sample’s group, for instance, has developed an intelligent eye drop bottle sleeve that tracks medication usage in real time and securely relays adherence data to healthcare providers. By wirelessly recording each use, measuring fluid levels, and even documenting technique, this technology helps clinicians provide truly personalized support to patients managing chronic conditions like glaucoma—potentially improving outcomes for some of the most at-risk populations.
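The adherence-tracking idea can be sketched in miniature: a log of dispensing events compared against a prescribed schedule. This is purely illustrative; the sleeve's real firmware, data schema, and clinician-facing reporting are not public, and every name below is an assumption.

```python
from datetime import datetime, timedelta

# Illustrative sketch only: field names and the adherence formula
# are assumptions, not the device's actual implementation.
class AdherenceLog:
    def __init__(self, doses_per_day):
        self.doses_per_day = doses_per_day
        self.events = []   # (timestamp, remaining fluid in mL)

    def record_use(self, when, fluid_level_ml):
        """Called when the sleeve detects a dispensing event."""
        self.events.append((when, fluid_level_ml))

    def adherence_rate(self, start, days):
        """Fraction of prescribed doses actually taken in a window,
        capped at 1.0 in case of extra uses."""
        end = start + timedelta(days=days)
        taken = sum(1 for t, _ in self.events if start <= t < end)
        prescribed = self.doses_per_day * days
        return min(taken / prescribed, 1.0)

log = AdherenceLog(doses_per_day=2)
day0 = datetime(2024, 1, 1)
log.record_use(day0 + timedelta(hours=8), 9.8)
log.record_use(day0 + timedelta(hours=20), 9.6)
log.record_use(day0 + timedelta(days=1, hours=8), 9.4)
rate = log.adherence_rate(day0, days=2)   # 3 of 4 prescribed doses
```

A summary statistic like this, relayed securely to a provider, is what lets clinicians spot missed doses for conditions like glaucoma, where adherence is otherwise invisible between visits.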

In another example, Professor Emily Mower Provost’s work extends these monitoring innovations into the realm of mental health. Her lab is pioneering AI-driven systems to track symptoms, detect mood changes, and better personalize treatment plans for individuals with mental illness. In her recent IEEE Transactions on Affective Computing paper, she and her colleagues developed and validated a new pipeline for passively measuring emotion through everyday speech, moving beyond survey-based self-reporting to more scalable, real-world symptom monitoring.
By using smartphone-based audio data from individuals with bipolar disorder, the team demonstrated that both passively collected and self-reported emotion measures can help estimate mood severity, opening up opportunities for more timely intervention and personalized care.
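As a rough illustration of the aggregation step such a pipeline might involve, the sketch below averages per-call emotion predictions into a daily summary and maps it to a mood-severity proxy. The numbers and the linear mapping are stand-ins, not the published models, which are trained on clinical ratings.

```python
from statistics import mean

# Illustrative only: in the real pipeline, per-call valence and
# activation values come from trained speech-emotion models, and
# any mapping to mood severity is fit to clinician assessments.
def daily_emotion_summary(call_predictions):
    """Aggregate per-call emotion predictions into daily means."""
    return {
        "valence": mean(p["valence"] for p in call_predictions),
        "activation": mean(p["activation"] for p in call_predictions),
    }

def mood_severity_proxy(summary, w_val=-1.0, w_act=0.5, bias=0.0):
    """Toy linear proxy: lower valence and higher activation push
    the score up. The weights here are arbitrary placeholders."""
    return bias + w_val * summary["valence"] + w_act * summary["activation"]

calls = [{"valence": 0.2, "activation": 0.7},
         {"valence": 0.4, "activation": 0.5}]
summary = daily_emotion_summary(calls)
score = mood_severity_proxy(summary)
```

The appeal of passive measurement is exactly this kind of continuous, low-burden signal: daily summaries accumulate between clinic visits rather than depending on periodic self-report surveys.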
This research, supported by the NIH and major philanthropic foundations, is laying a foundation for privacy-focused, robust, and highly adaptable digital health tools for mental health detection and monitoring. “The overarching goal of this project is to bring measurement out of the clinic and into the real world, supporting long-term, longitudinal tracking of health,” said Mower Provost.
Other CSE researchers are advancing digital tracking for conditions like dizziness and balance issues, remote telemedicine tools, and developmental monitoring in infants. As the department grows, incoming faculty member Daniel Adler will add new expertise in digital interventions for mental health, helping ensure U-M remains at the cutting edge of HCC for well-being.
Health technology that listens and learns
What unites the various efforts of HCC researchers at Michigan is a shared belief that technology should adapt to people, not the other way around. Across training, monitoring, and mobile access, CSE researchers are laying the foundation for the next era of health and medicine, where advances are measured not just by technical excellence but by meaningful improvements in clinical outcomes, patient experience, and health equity.
