Session #20 
How to Drive Clinical Improvement Programs That Get Results 
Tom Burton 
And the Catalyst Academy Education Team
2 
What is a Clinical Program? 
• Organized around care delivery processes 
• Permanent integrated team of clinical and 
analytics staff 
• Creates an iterative, continuous learning 
environment 
• Focus is on sustained clinical outcome 
improvement (not revenue growth) 
• Not a Clinical Service Line (although you can 
leverage Service Lines as a good starting point)
3 
Organizational AGILE Teams 
• Permanent teams that meet weekly 
• Integrated clinical and technical members 
• Supports multiple care process families 
[Team diagram: the Women & Children’s Clinical Program Guidance Team (Guidance Team MD Lead; RN, Clin Ops Director) oversees three care process family teams: Pregnancy, Normal Newborn, and Gynecology. Each team pairs clinical subject matter experts (MD Lead, RN SME) with technical staff: a Knowledge Manager, an Application Administrator, and a Data Architect, covering Data Capture, Data Provisioning & Visualization, and Data Analysis.]
4 
Incorporating the most effective learning methods 
• Teach Others - 90% 
• Practice by Doing - 75% 
• Discussion Group - 50% 
• Demonstration - 30% 
• Audiovisual - 20% 
• Reading - 10% 
• Lecture - 5% 
Percentages represent the average information retained through each learning method. 
‒ Duke University
5 
Session Objective 
4 Learning Experiences 
Clinical Programs that Get Results Principles 
 Choose the right initiative 
 Understand variation 
 Improve data quality 
 Choose the right influencers
6 
Choose the right initiative
7 
Deal or No Deal Exercise
8 
DEAL or NO DEAL
9 
First Principle 
• Picking an improvement opportunity randomly 
is like playing traditional DEAL or NO DEAL 
• You might get lucky 
• Choosing the loudest physician, or choosing 
based on non-data-driven reasons, can 
disengage other MDs and spend scarce 
analytical resources on projects that may not 
be the best investment 
• It takes about as much effort to work on a 
large process as it does on a small process
10 
Pareto Example: Resources Consumed 
Key findings from the analytic system: 
• 50% of all in-patient resources are represented by 7 Care Process Families 
• 80% of all in-patient resources are represented by 21 Care Process Families 
[Pareto chart: x-axis = number of Care Process Families (e.g., ischemic heart disease, pregnancy, bowel disorders, spine, heart failure); y-axis = % of total resources consumed for each clinical work process, with a cumulative % line reaching 50% at 7 CPFs and 80% at 21 CPFs.]
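For readers who want to reproduce this kind of Pareto analysis, here is a minimal sketch in Python (pandas), assuming a hypothetical case-level table; the column names and numbers are illustrative and are not the analytic system's actual schema.

```python
import pandas as pd

# Hypothetical case-level extract: one row per in-patient case with its
# care process family (CPF) and the resources it consumed.
cases = pd.DataFrame({
    "care_process_family": ["ischemic heart disease", "pregnancy", "bowel disorders",
                            "spine", "heart failure", "pregnancy", "spine"],
    "resource_cost": [42000, 9000, 18000, 30000, 25000, 11000, 28000],
})

# Total resources consumed per CPF, largest first.
by_cpf = (cases.groupby("care_process_family")["resource_cost"]
               .sum()
               .sort_values(ascending=False))

# Cumulative share of total resources as CPFs are added one at a time.
cumulative_share = by_cpf.cumsum() / by_cpf.sum()

# How many CPFs does it take to cover 50% and 80% of all resources?
cpfs_for_50 = int((cumulative_share < 0.50).sum()) + 1
cpfs_for_80 = int((cumulative_share < 0.80).sum()) + 1
print(f"{cpfs_for_50} CPFs cover 50% of resources; {cpfs_for_80} CPFs cover 80%")
```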
11 
Cost Per Case, Vascular Procedures (from the analytic system) 
• Mean cost per case = $20,000 
• Dr. J: 15 cases at an average cost of $60,000 per case; $40,000 above the mean x 15 cases = $600,000 opportunity 
• Another physician: $35,000 above the mean x 25 cases = $875,000 opportunity 
• Running total opportunity as more physicians are included: $600,000; $1,475,000; $2,360,000; $3,960,000
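The opportunity math on this slide is excess cost above the mean multiplied by case volume. A minimal sketch, with hypothetical physician names and figures:

```python
import pandas as pd

# Hypothetical per-physician summary for one procedure family.
physicians = pd.DataFrame({
    "physician": ["Dr. J", "Dr. K", "Dr. L"],
    "cases": [15, 25, 40],
    "avg_cost_per_case": [60000, 55000, 21000],
})
mean_cost_per_case = 20000  # overall mean cost per case for the procedure family

# Opportunity = (average cost above the mean) x case volume, counting only
# physicians whose average cost exceeds the mean.
physicians["excess_per_case"] = (physicians["avg_cost_per_case"] - mean_cost_per_case).clip(lower=0)
physicians["opportunity"] = physicians["excess_per_case"] * physicians["cases"]

print(physicians[["physician", "opportunity"]])
print("Total opportunity:", physicians["opportunity"].sum())
```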
Improvement Approach - Prioritization 
12 
[2x2 diagram: four case-count distributions (quadrants 1-4), each plotted from Poor Outcomes to Excellent Outcomes, arranged by Variability (Low to High on the y-axis) and Resource Consumption (Low to High on the x-axis).]
Improvement Approach - Prioritization 
13 
[Same 2x2 prioritization diagram as the previous slide.]
14 
Internal Variation versus Resource Consumption 
[Bubble chart: x-axis = resources consumed; y-axis = internal variation in resources consumed; bubble size = resources consumed; bubble color = clinical domain; quadrants 1-4 correspond to the prioritization quadrants on the previous slides.]
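A minimal sketch of how the data behind such a bubble chart might be prepared, using the coefficient of variation as one plausible measure of internal variation; the table, columns, and scoring rule below are illustrative assumptions, not the deck's actual method.

```python
import pandas as pd

# Hypothetical case-level extract.
cases = pd.DataFrame({
    "care_process_family": ["pregnancy", "pregnancy", "spine", "spine",
                            "heart failure", "heart failure"],
    "clinical_domain": ["Women & Children", "Women & Children", "Surgery",
                        "Surgery", "Cardiovascular", "Cardiovascular"],
    "resource_cost": [8000, 12000, 25000, 60000, 18000, 22000],
})

# Per care process family: total resources consumed (x-axis / bubble size)
# and internal variation in cost (y-axis), here the coefficient of variation.
summary = cases.groupby(["care_process_family", "clinical_domain"])["resource_cost"].agg(
    total_consumed="sum",
    variation=lambda s: s.std(ddof=1) / s.mean(),
).reset_index()

# Rank candidates: high consumption plus high variation floats to the top.
summary["priority_score"] = summary["total_consumed"].rank() + summary["variation"].rank()
print(summary.sort_values("priority_score", ascending=False))
```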
15 
DEAL or BETTER DEAL 
16 
Understand Variation
17 
The Popsicle Bomb Exercise 
Timer 
When you’re finished note 
your time and enter it in the 
HAS app – Poll Question 1
18 
Variation in Results 
• Corp Analytics – shows results
19 
Less Effective Approach to Improvement: 
“Punish the Outliers” 
Current Condition 
• Significant volume 
• Significant variation 
Option 1: “Punish the Outliers” or 
“Cut Off the Tail” 
Strategy 
• Set a minimum standard of quality 
• Focus improvement effort on those 
not meeting the minimum standard 
[Two histograms of # of cases by outcome metric, from Poor Outcomes to Excellent Outcomes, with the mean and a minimum standard marked; improvement effort focuses on cases below the minimum standard. 1 box = 100 cases in a year.]
20 
Effective Approach to Improvement: 
Focus on “Better Care” 
Current Condition 
• Significant volume 
• Significant variation 
Option 2: Identify Best Practice 
“Narrow the curve and shift it to the right” 
Strategy 
• Identify an evidence-based “Shared Baseline” 
• Focus improvement effort on reducing 
variation by following the “Shared Baseline” 
• Often those performing the best make the 
greatest improvements 
[Two histograms of # of cases from Poor Outcomes to Excellent Outcomes, with the mean and the best-practice Care Process Model marked; the improved curve is narrower and shifted toward excellent outcomes. 1 box = 100 cases in a year.]
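To make "narrow the curve and shift it to the right" concrete: shifting right raises the mean outcome, and narrowing reduces the spread. A sketch with invented outcome scores (higher is better):

```python
import statistics

# Hypothetical outcome scores before and after a shared-baseline care process model.
before = [55, 60, 62, 70, 75, 80, 85, 90, 92, 95]
after = [78, 80, 82, 84, 85, 86, 88, 90, 92, 94]

for label, scores in (("before", before), ("after", after)):
    print(f"{label}: mean={statistics.mean(scores):.1f} "
          f"(higher = shifted right), sd={statistics.stdev(scores):.1f} (lower = narrower)")
```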
21 
Round 2 
Timer 
When you’re finished note 
your time and enter it in the 
HAS app – Poll Question 2
22 
Reduced Variation in Results 
• Corp Analytics – shows results
23 
Improve Data Quality
24 
The Water Stopper Exercise
25 
Information Management 
DATA CAPTURE (fix data quality issues here) 
• Acquire key data elements 
• Assure data quality 
• Integrate data capture into operational workflow 
• Role: Application Administrators (optimization of source systems) 
DATA PROVISIONING (not here) 
• Move data from transactional systems into the Data Warehouse 
• Build visualizations for use by clinicians 
• Generate external reports (e.g., CMS) 
• Role: Data Architects (infrastructure, visualization, analysis, reporting) 
DATA ANALYSIS (not here) 
• Interpret data 
• Discover new information in the data (data mining) 
• Evaluate data quality 
• Role: Knowledge Managers (data quality, data stewardship, and data interpretation) 
Fix data quality at the point of capture, not downstream in provisioning or analysis.
26 
Data Capture Quality Principles 
• Accuracy 
 Does the data match reality? 
 Example: Operating Room Time Stamps 
• Timeliness 
 What is the latency of the data capture? 
 Example: Billing data delay; end of shift catch-up 
• Completeness 
 How often is critical data missing? 
 Example: HF Ejection Fraction
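Completeness and timeliness lend themselves to simple automated checks; accuracy usually requires comparison against an external source of truth (e.g., observed operating room times). A minimal sketch against a hypothetical heart-failure extract; the table and column names are illustrative.

```python
import pandas as pd

# Hypothetical heart-failure encounter extract.
hf = pd.DataFrame({
    "encounter_id": [1, 2, 3, 4],
    "ejection_fraction": [55.0, None, 40.0, None],  # critical element that may be missing
    "event_time": pd.to_datetime(["2024-01-01 08:00", "2024-01-01 09:30",
                                  "2024-01-01 10:15", "2024-01-01 11:00"]),
    "documented_time": pd.to_datetime(["2024-01-01 08:05", "2024-01-02 09:30",
                                       "2024-01-01 18:00", "2024-01-01 11:10"]),
})

# Completeness: how often is critical data missing?
pct_missing_ef = hf["ejection_fraction"].isna().mean() * 100
print(f"Ejection fraction missing on {pct_missing_ef:.0f}% of encounters")

# Timeliness: latency between when something happened and when it was documented.
latency_hours = (hf["documented_time"] - hf["event_time"]).dt.total_seconds() / 3600
print(f"Median documentation latency: {latency_hours.median():.1f} hours")
```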
27 
Challenges with Data “Scrubbing” 
• Analyst time is spent re-working scrubbing routines 
• The root cause is never identified 
• Early binding vs. late binding: what you consider dirty data may actually be useful for others analyzing process failures 
• Using data to punish vs. data to learn: a punish strategy promotes hiding the problem so clinicians don’t look bad
28 
Choose the right influencers
29 
Paul Revere's Ride Exercise
30 
Revere vs. Dawes 
Paul Revere 
"Revere knew exactly which 
doors to pound on during his 
ride on Brown Beauty that April 
night. As a result, he awakened 
key individuals, who then 
rallied their neighbors to take 
up arms against the British." 
William Dawes 
"In comparison, Dawes did not 
know the territory as well as 
Revere. As he rode through 
rural Massachusetts on the 
night of April 18, he simply 
knocked on random doors. The 
occupants in most cases 
simply turned over and went 
back to sleep." 
Diffusion of Innovations (Free Press, 2003) by Everett M. Rogers
31 
Innovators: recruit innovators to re-design care delivery processes (like Revere). 
Early adopters: recruit early adopters to chair improvement teams and to lead implementation at each site (key individuals who can rally support). 
[Adoption curve: Innovators, Early Adopters, “The Chasm”, Early Majority, Late Majority, Laggards (never adopters).] 
N = number of individuals in the group; √N = number needed to influence the group (but they must be the right individuals). 
* Adapted from Rogers, E. Diffusion of Innovations. New York, NY: 1995.
32 
Guidance Team (Prioritizes Innovations) 
• Meets quarterly to prioritize the allocation of technical staff 
• Approves improvement AIMs 
• Reviews progress and removes roadblocks 
Small Teams (Designs Innovation), staffed with Innovators and Early Adopters 
• Meet weekly in an iteration planning meeting 
• Build DRAFT processes, metrics, and interventions 
• Present DRAFT work to the Broad Teams 
Broad Teams (Implements Innovation), with broad RN and MD representation across the system 
• Meet monthly to review, adjust, and approve DRAFTs 
• Lead rollout of new processes and measurement 
[Diagram: the W&N program’s OB, Newborn, and GYN care process families, with Innovators and Early Adopters highlighted on each team.]
33 
Organizational AGILE Teams 
• Permanent teams 
• Integrated clinical and technical members 
• Supports multiple care process families 
• Choose innovators and early adopters to lead 
[Team diagram, as on the earlier Organizational AGILE Teams slide: the Women & Children’s Clinical Program Guidance Team oversees the Pregnancy, Normal Newborn, and Gynecology teams, each pairing an MD Lead and RN SME (subject matter experts) with a Knowledge Manager, Application Administrator, and Data Architect; the Innovators and Early Adopters chosen to lead are highlighted.]
34 
How to identify innovators 
and early adopters 
• Ask 
 Innovators (inventors) 
- Who are the top three MDs in our group who are 
likely to invent a better way to deliver care? 
 Early Adopters (thought leaders) 
- When you have a tough case, who are the top three 
MDs you trust and would go to for a consult? 
• Fingerprinting selection process 
 Invite innovators to identify their top three 
MD choices from among the early adopters to lead the 
Clinical Program
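A minimal sketch of tallying the "ask" responses to surface early-adopter candidates; the survey structure and physician names are hypothetical.

```python
from collections import Counter

# Hypothetical survey responses: each respondent names the top three MDs they
# would trust for a tough consult (candidate early adopters / thought leaders).
consult_nominations = [
    ["Dr. A", "Dr. B", "Dr. C"],
    ["Dr. B", "Dr. A", "Dr. D"],
    ["Dr. B", "Dr. C", "Dr. A"],
]

# Count nominations and surface the most frequently named candidates.
tally = Counter(name for ballot in consult_nominations for name in ballot)
early_adopter_candidates = [name for name, _ in tally.most_common(3)]
print("Early adopter candidates:", early_adopter_candidates)
```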
35 
Conclusion – TEACH OTHERS
36 
Teach Others Exercise 
 Deal or No Deal 
- Choose the right initiative 
- Prioritize based on process size and variation 
 Popsicle Bomb 
- Understand variation 
- Measure variation and standardize processes 
 Water Stopper 
- Improve data quality 
- Fix the problem at the source 
 Paul Revere’s Ride 
- Choose the right influencers 
- Identify Innovators and Early adopters to 
accelerate diffusion of innovation 
Take 1 minute and describe the purpose of each 
exercise to your neighbor, then swap and let 
them teach you 
Timer 
37 
Exercise Effectiveness Q1 
Overall, how effective were the exercises in 
explaining the principles? 
1) Not effective 
2) Somewhat effective 
3) Moderately effective 
4) Very effective 
5) Extremely effective
38 
Exercise Effectiveness Q2 
How effective was the Deal or No Deal Exercise 
at teaching the principle of prioritizing based on 
process size and variation? 
1) Not effective 
2) Somewhat effective 
3) Moderately effective 
4) Very effective 
5) Extremely effective
39 
Exercise Effectiveness Q3 
How effective was the Popsicle Bomb Exercise 
at teaching the principle of understanding 
variation and standardizing processes? 
1) Not effective 
2) Somewhat effective 
3) Moderately effective 
4) Very effective 
5) Extremely effective
40 
Exercise Effectiveness Q4 
How effective was the Water Stopper Exercise 
at teaching the principle of fixing data quality 
issues at the source? 
1) Not effective 
2) Somewhat effective 
3) Moderately effective 
4) Very effective 
5) Extremely effective
41 
Exercise Effectiveness Q5 
How effective was the “Paul Revere Ride” 
exercise at teaching the principle of choosing 
the right influencers based on their capabilities 
as innovators and early adopters? 
1) Not effective 
2) Somewhat effective 
3) Moderately effective 
4) Very effective 
5) Extremely effective
42 
Exercise Effectiveness Q6 
Are you interested in running these same 
exercises in your organizations? 
a) Yes 
b) No
Analytic Insights 
Questions & Answers
Session Feedback Survey 
44 
1. On a scale of 1-5, how satisfied were you overall with this session? 
1) Not at all satisfied 
2) Somewhat satisfied 
3) Moderately satisfied 
4) Very satisfied 
5) Extremely satisfied 
2. What feedback or suggestions do you have? 
3. On a scale of 1-5, what level of interest would you have for 
additional, continued learning on this topic (articles, webinars, 
collaboration, training)? 
1) No interest 
2) Some interest 
3) Moderate interest 
4) Very interested 
5) Extremely interested


Editor's Notes 

  • #45 Follow-up group participation: Would you like to participate in a follow-up group on this topic that would meet 2-3 times next year to share progress, challenges, and best practices? (Yes, No)