"I think judgment, I've been honing in on that word more frequently recently because I feel like the judgment piece is the piece that feels particularly like human in this decision."
— Dr Graham Walker
Listen now on Apple, Spotify, YouTube or wherever you get your podcasts.
In this episode of Clinical Changemakers, Dr Graham Walker, an ER doctor and AI healthcare leader, discusses his role at Kaiser Permanente and the challenges and successes of integrating AI into healthcare. He emphasises the importance of communication, leadership, and alignment in large healthcare systems. Dr Walker shares insights on predictive AI tools, the operationalisation of AI in patient care, and the ethical considerations around AI explainability. He also highlights the rapid adoption of AI scribes across a system serving 4.6 million members, and the balance between AI's benefits and risks in clinical practice.
Key Takeaways
Leadership through servant mentality: Successful healthcare innovation requires leaders who ask "How can I help you?" rather than directing from above. Clear communication, removing blockers, and building relationships are essential for large-scale change.
The 85% capacity rule: Healthcare systems running at 100% capacity break down because patient complexity and demand are inherently variable. Queueing theory suggests keeping 10-15% of capacity in reserve to absorb the unpredictability of medical care (see the sketch after this list).
Predictive AI has staying power: Kaiser's Advanced Alert Monitoring system, live since 2018, predicts ICU transfers and rapid responses across all inpatients. The operational challenge isn't building the model; it's tuning the sensitivity/specificity trade-off and creating sustainable workflows (a second sketch after this list illustrates that trade-off).
AI scribes, healthcare's fastest adoption: The rapid rollout of AI scribes represents perhaps the fastest technology adoption in healthcare history, driven by ease of use (just install an app and hit record) and the immediate physician benefit of reducing "pyjama time" documentation.
Judgment remains human: While AI excels at pattern recognition and data synthesis, clinical judgment—weighing risks, understanding context, and making treatment decisions—remains distinctly human. The scraped knee doesn't need septic workup; that's judgment, not diagnosis.
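To make the capacity point concrete, here is a minimal sketch using the textbook M/M/1 queue, an assumption on our part since the episode doesn't name a specific model. Average waiting time grows nonlinearly as utilisation approaches 100%, which is why the last 10-15% of "capacity" is so costly to use.

```python
# Toy illustration of the 85% capacity rule using the standard M/M/1 queue
# (a simplifying assumption; the episode doesn't specify a queueing model).
# Average wait before service: Wq = rho / (mu - lambda), which explodes as
# utilisation rho = lambda / mu approaches 1.

def mm1_wait_minutes(arrivals_per_hour: float, service_rate_per_hour: float) -> float:
    """Average time a patient waits before being seen, in minutes (M/M/1)."""
    rho = arrivals_per_hour / service_rate_per_hour
    if rho >= 1:
        return float("inf")  # demand exceeds capacity: the queue grows without bound
    wait_hours = rho / (service_rate_per_hour - arrivals_per_hour)
    return wait_hours * 60

# A department that can see 10 patients per hour:
for utilisation in (0.70, 0.85, 0.95, 0.99):
    wait = mm1_wait_minutes(arrivals_per_hour=10 * utilisation, service_rate_per_hour=10)
    print(f"{utilisation:.0%} utilisation -> ~{wait:.0f} min average wait")
# 70% -> ~14 min, 85% -> ~34 min, 95% -> ~114 min, 99% -> ~594 min
```

Doubling utilisation from 70% to 99% doesn't double the wait; it multiplies it roughly forty-fold, which is the intuition behind leaving slack in the system.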
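And for the sensitivity/specificity point, a purely schematic sketch of alert-threshold tuning. This is not Kaiser's Advanced Alert Monitoring pipeline; the risk scores, labels, and threshold values below are invented for illustration.

```python
# Schematic only: how moving an alert threshold trades sensitivity (catching
# deteriorating patients) against specificity (avoiding false alarms).

def sensitivity_specificity(scores, labels, threshold):
    """Return (sensitivity, specificity) when alerts fire at score >= threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    sens = tp / (tp + fn) if (tp + fn) else 0.0
    spec = tn / (tn + fp) if (tn + fp) else 0.0
    return sens, spec

# Invented risk scores (0-1) and outcomes (1 = deteriorated within 12 hours).
scores = [0.05, 0.10, 0.20, 0.35, 0.40, 0.55, 0.70, 0.80, 0.90, 0.95]
labels = [0,    0,    0,    0,    1,    0,    1,    1,    0,    1]

for threshold in (0.3, 0.5, 0.7):
    sens, spec = sensitivity_specificity(scores, labels, threshold)
    print(f"threshold {threshold:.1f}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```

A lower threshold catches more deteriorations but pages clinicians more often; picking the operating point, and deciding who answers the page, is the workflow problem Dr Walker describes.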
Where to Find Our Guest
Dr Graham Walker (LinkedIn)
In This Episode
00:00 - Introduction: Graham Walker's role at Kaiser Permanente Northern California
04:36 - Innovation framework: Value-based care vs. fee-for-service and finding ideas with potential
07:08 - Commercialisation strategy: Building internal tools for external deployment
11:00 - Leadership lessons: Coalition building and relationship management in large institutions
16:34 - Queuing theory in healthcare: Why running at 100% capacity breaks systems
19:48 - Predictive AI success story: Advanced Alert Monitoring system in production
26:00 - The human element: Why unexplainable AI might force better clinical thinking
32:33 - AI scribes rollout: Fastest healthcare technology adoption in history
39:30 - Information fidelity challenges: Note bloat, omissions, and human nature
45:02 - The judgment frontier: What remains uniquely human in clinical decision-making
50:20 - Advice for healthcare leaders: Use AI tools daily to understand capabilities and limits
53:57 - Offcall introduction: Building transparency in physician work and workload data
Referenced
Dr Walker’s company website, Offcall
Dr Walker’s medical calculator website, MDCalc
Paper: Automated Identification of Adults at Risk for In-Hospital Clinical Deterioration (Link)
Paper: Ambient Artificial Intelligence Scribes to Alleviate the Burden of Clinical Documentation (Link)
Queueing Theory (Wikipedia)
Servant Leadership (Link)
Contact
If you have any feedback or questions, or if you'd just like to get in touch, reach out at jono@clinicalchangemakers.com
Music Attribution: Music by AudioCoffee from Pixabay.
Before you go! 🙏
If you enjoyed the podcast, please share the love by rating us and passing it on to a friend or colleague.