Evidence-based Clinical Decision Support

Background Context

When we choose doctors, we prefer those with more knowledge and experience. Experience-based medicine has been the standard of practice for thousands of years. With increasing access to clinical trials and patient outcome data, more and more physicians are leaning towards evidence-based medicine, in which they use the best available evidence, along with their clinical expertise and the patient's values and preferences, to make decisions.

Company Context

Flatiron Health has always functioned as two separate organizations with two distinct target audiences. On one side, we create digital products for cancer clinics and oncologists; on the other, we produce packaged real-world evidence (RWE) for academic researchers and pharmaceutical teams to supplement their clinical research needs. This project was a serendipitous opportunity to bridge the gap between clinical research and patient care, and I considered myself extremely lucky to gain precious experience from both sides! It was an exploratory project sponsored by senior executives: our head of clinical science and the VP of Product both wanted to understand whether we could leverage our real-world evidence to create a unique value proposition for our doctor-facing products.

My Responsibilities

As a Design Director, I led both the research and design efforts of this project. From the methods of research investigation to the workshop with oncologists to the validation of the design prototype, I was responsible for the quality of the learnings and the creative process of the project.

First and foremost, we aligned on the initial hypothesis:

Real-world evidence integration can become a unique differentiator for Flatiron Assist to guide treatment decisions at the point of care.

During this two-phase process, I also partnered with many passionate colleagues from Data Science, Clinical Oncology, and Biostatistics (quantitative scientists):

  • Phase I: Physician workshop to assess the opportunity

Partners: VP of clinical care, pharmacy director, product manager, and product designer on my team.

  • Phase II: Prototype to validate hypotheses

Partners: a couple of quantitative scientists, clinicians, a senior back-end engineer, and agency consultants on data modeling.


Phase I - Physician Workshop Design

“We try to get away from experience-based medicine and towards evidence-based medicine, but when your evidence base is really shallow, just sometimes, someone who says ‘Oh yeah, I saw something like that and I gave them carboplatin and radiation and it worked’ and that can be helpful.” — Dr. Arrowsmith

Participants

We held the FAB workshop with 6 panelists to discuss complex clinical scenarios and the feasibility of real-world evidence for clinical decision support. Prior to the actual FAB meeting, we also ran a workshop dry run with 7 physicians from our internal clinical team. Across the 13 participants, the split between community practices and health systems was roughly 50/50; 2 were Radiation Oncologists and the remaining 11 were Medical Oncologists.


Phase II - Prototype Planning & Design Validation

Based on the Phase I learnings, another team from a different part of the organization decided to run a design experiment to help physicians compare the effectiveness of immunotherapy alone (IO treatment) vs. immunotherapy combined with chemotherapy (Chemo-IO treatment) at the point of care. I was the connective tissue between Phase I and Phase II, working closely with data scientists to plan and prototype a plug-in for the clinical decision support tool.

Hypotheses

  • Clinicians want to view the survival curve (S curve), hazard ratio, and numbers at risk in an easy-to-read table format (see the sketch after this list).

  • When there is strong real-world evidence, clinicians want to quickly understand which treatment works best for their patient.

  • When the input values do not surface enough real-world evidence, physicians need to know which treatment a similar patient would be assigned.

  • More data-driven users want to know the methodologies used for data collection, processing, and cohort analysis; we need to give them transparency to build trust.
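
To make the first hypothesis concrete, here is a minimal sketch of the survival analysis that would sit behind such a view, written with the open-source lifelines library. The cohort data, column names, and values are hypothetical illustrations, not Flatiron's actual pipeline.

```python
# Minimal sketch (not Flatiron's production code) of the analysis behind the
# proposed comparison view: Kaplan-Meier survival curves ("S curves"), a
# hazard ratio, and a numbers-at-risk table. All data here is hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical real-world cohort: months of follow-up, event flag
# (1 = event observed, 0 = censored), and treatment arm.
df = pd.DataFrame({
    "months":   [4, 9, 13, 20, 26, 31, 6, 11, 17, 22, 28, 35],
    "event":    [1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0],
    "chemo_io": [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],  # 0 = IO alone
})

# Kaplan-Meier survival curve ("S curve") per treatment arm.
for arm, label in [(0, "IO alone"), (1, "Chemo-IO")]:
    cohort = df[df["chemo_io"] == arm]
    kmf = KaplanMeierFitter()
    kmf.fit(cohort["months"], event_observed=cohort["event"], label=label)
    # kmf.event_table["at_risk"] holds the numbers at risk over time,
    # which the UI would render as the easy-to-scan table.
    print(label, "numbers at risk:\n", kmf.event_table["at_risk"].head())

# Hazard ratio for Chemo-IO vs. IO alone via a Cox proportional-hazards model.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
print("Hazard ratio (Chemo-IO vs. IO alone):", cph.hazard_ratios_["chemo_io"])
```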


Impact & Key Learnings

Overall, the demonstration project was well received. Physician users found the concept valuable: both therapy options were NCCN-approved, and identifying patients who could benefit from immunotherapy alone (without chemotherapy) was seen as a meaningful opportunity to reduce side effects and improve quality of life.

Key Learnings

1. Workflow usability vs. system integration
Physicians found the end-to-end workflow — from capturing disease characteristics to generating treatment recommendations — intuitive and easy to use. However, they strongly preferred a more seamless experience where patient data could be automatically pulled from the EHR, rather than requiring manual data entry. Reducing duplication was viewed as essential for real-world adoption.

2. Trust in decision support depends on evidence transparency
The combination of “patients like this” comparisons and clearly communicated statistical confidence emerged as the foundation of trust in evidence-based clinical decision support. Missing either element significantly reduced credibility.

To move forward, the solution would require robust real-world data to:

  • Represent clinically relevant patient cohorts that mirror those seen in practice

  • Establish high confidence in outcome differences between IO alone and IO + chemotherapy

Ultimately, the modeling team was unable to detect statistically significant differences across non-small cell lung cancer (NSCLC) cohorts. As a result, the team — in alignment with executive leadership — decided not to productize the concept.
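
For context on what that check involves: a standard approach is a log-rank test comparing survival between the two arms. The sketch below uses synthetic data to illustrate the kind of null result the team encountered; the cohorts, sample sizes, and 0.05 threshold are illustrative assumptions, not the actual study.

```python
# Hedged sketch of the kind of significance check a modeling team might run;
# the cohorts and threshold here are illustrative, not the actual analysis.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(7)

# Two hypothetical NSCLC cohorts with similar underlying survival --
# the situation in which no significant difference can be detected.
io_months    = rng.exponential(scale=18, size=200)
chemo_months = rng.exponential(scale=19, size=200)
io_event     = rng.integers(0, 2, size=200)   # 1 = event observed
chemo_event  = rng.integers(0, 2, size=200)

result = logrank_test(io_months, chemo_months,
                      event_observed_A=io_event,
                      event_observed_B=chemo_event)

print(f"log-rank p-value: {result.p_value:.3f}")
if result.p_value >= 0.05:  # conventional significance threshold
    print("No statistically significant difference -> "
          "insufficient evidence to recommend one arm over the other.")
```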

3. Digestibility is as critical as accuracy
Physicians emphasized that the way insights are presented is just as important as the analysis itself. Visualizing complex data must balance clarity with depth, allowing users to progressively explore evidence based on their intent.

Doctors described an ideal decision support tool as more than a recommendation engine — they envisioned a continuous learning environment that supports clinical reasoning, patient conversations, and ongoing education as new evidence emerges.
