UCSF Office of Graduate Medical Education
GME Evaluation Task Force
Report and Recommendations
July 1, 2008
UCSF School of Medicine
Graduate Medical Education
500 Parnassus Ave. MU 250E
San Francisco, CA 94143-0474
www.medschool.ucsf.edu/gme
June 2008
Preface
The ACGME Outcome Project was initiated in 2000 and is currently in Phase 3 of its implementation. The goal for Phase 3 is "full integration of the competencies and their assessment with learning and patient care." Programs are required to use a minimum of 2 methods for evaluating each of the 6 ACGME general competencies and to use resident performance data as the basis for program improvement.

Implementing robust assessment systems for the 6 competencies has proved difficult for most residency programs. Logistical challenges and a lack of centralized tools are primarily responsible. The ACGME has provided guidance in the form of a Toolbox, Think Tank Recommendations and, most recently, Implementation Guides. These materials are available at http://www.acgme.org/outcome/. However, these resources offer annotated lists of recommended assessment tools and do not propose a core set to be used across GME programs.
In mid-2007 the Associate Dean for Graduate Medical Education appointed a GME Evaluation Task Force to be chaired by the GME Director of Curricular Affairs. The Task Force was charged to "use its theoretical and practical expertise to review currently used tools and propose a centralized set that all GME programs will be encouraged to use. The set will include assessments of resident competency that are common across programs, evaluations of faculty teaching, and approaches for benchmarking program performance."
In making our recommendations the Task Force aims to supplement the ACGME resources by identifying best practices at UCSF and proposing a core set of tools for the assessment of competencies shared across all GME programs at UCSF: professionalism, interpersonal & communication skills, and elements of practice-based learning and improvement.
The Task Force met 1-2 times each month from September 2007 to June 2008 and searched for best practices among the tools used by residency and fellowship programs at UCSF. We applied our knowledge of the literature and evidence on the reliability and validity of assessment in medical education to help guide the recommendations. Theoretical considerations were especially helpful when evidence was lacking or incompletely developed. Dr. Lee Learman, Task Force Chair, wishes to thank its members for their many hours of hard work and excellent collaboration: Patricia O'Sullivan, Ed.D. and Arianne Teherani, Ph.D. (Office of Medical Education), Sumant Ranji, M.D. (Department of Medicine), and Gitanjali Kapur (GME). We also thank two consultants to the Task Force, Drs. Susan Promes (Emergency Medicine) and John Young (Psychiatry), and Laura Pliska, Ob/Gyn Residency Program Manager, who generously contributed workflow documents to the Task Force.
Table of Contents
General Considerations 4
Annotated Glossary of Evaluation Tools 5
Core Measures for UCSF GME 7
• Global Assessments 8 - 12
• Patient Care
   Mini-Clinical Evaluation Exercise (Mini-CEX) 13 - 16
   Focused (Checklist) Assessments 17 - 19
• Professionalism, Interpersonal & Communication Skills
   Health Care Team and Self Evaluations 20 - 26
   Patient Surveys 27 - 31
• Practice-based Learning and Improvement
   Critical Appraisal 32 - 34
   Clinical Teaching 35 - 44
Additional Recommendations 45
• Medical Knowledge 46 - 47
• Systems-based Practice 48 - 50
• Progress Report for Semi-Annual Review 51 - 52
• Closing the Loop: Annual Program Review 53
Confidential Resident Evaluation of Faculty Teaching 54 - 64
Program Evaluation by Faculty and Residents 65 - 66
Progress Report for Annual Program Review 67 - 68
APPENDICES
A. ACGME Common Program Requirements IVB: General Competencies 69 - 70
B. ACGME Common Program Requirements V: Evaluation 71 - 72
C. Examples of Focused Assessment Tools 73 - 81
General Considerations
The ACGME has defined 6 general competencies for residents that must be integrated into the curriculum and evaluated using objective assessments and multiple evaluators. Although many programs and institutions regard their trainees' clinical teaching skills as an important 7th competency, participating in the education of patients, families, students, residents and other health professionals is actually a component of practice-based learning and improvement. The ACGME General Requirements for curriculum and evaluation can be found in Appendices A and B. Like the ACGME, when we use the term 'resident' in our report we refer to both UCSF residents and fellows.
The ACGME requires each competency to be assessed using a minimum of 2 different tools. To assist programs in complying with the requirements and to improve uniformity in our assessments at UCSF, the Task Force proposes a set of core measures to be used across GME programs at UCSF. Training programs are encouraged to add items to these tools to address the specific objectives of their programs and to use additional tools when necessary.
The assessments we conduct are formative in that they are used primarily by residents and their mentors to provide feedback and develop future learning plans and goals. According to the ACGME, summative assessment occurs at completion of the training program, at which time the program director must verify that the resident has demonstrated sufficient competence to enter practice without direct supervision.
It is important to distinguish our Task Force's work from the implementation of electronic portfolios for learner assessment at UCSF School of Medicine. A portfolio is a purposeful and longitudinal collection of tangible evidence of learner-selected work that exhibits the learner's efforts, progress or achievement. The portfolio features the criteria for selection and judging merit, and includes evidence of learner reflection. In this context, evaluation tools provide 'tangible evidence' of learning. However, even the most robust assessment system should not be regarded as a portfolio without the key elements of learner-centered work and evidence of learner reflection.
This report includes a series of short guides with evaluation tools and workflow documents to help programs implement the assessments. We developed each guide to be a stand-alone module including only the most essential information for program directors and coordinators. The Task Force discourages programs from implementing an evaluation tool without first considering the recommendations included in the corresponding guide.
Our report also includes recommendations for the evaluation of medical knowledge and systems-based practice, a resident progress report for semi-annual review, and recommendations for conducting the annual program review, including assessment of faculty teaching and a program progress report.

The guides and tools are accessible via the GME website and the GME E*Value system.
Questions about the tools and guides should be directed to:
Gitanjali Kapur, GME Educational Technologies Analyst: kapurg@medsch.ucsf.edu
GME Evaluation Handbook: http://medschool.ucsf.edu/gme/curriculum/evaltools.html
Annotated Glossary of Evaluation Tools
The ACGME/ABMS Toolbox, ACGME Think Tank Recommendations, and Implementation Booklets are available at http://www.acgme.org/outcome and describe the range of potential assessment methods. The summaries below are excerpted and adapted from the ACGME materials and include only the types of tools being recommended by the Evaluation Task Force.
GLOBAL RATING FORMS: Usually completed by faculty supervisors at the end of a clinical rotation assignment, global forms ask judges to rate trainees on general categories of ability (e.g., patient care skills, communication skills, medical knowledge) rather than specific tasks, skills or behaviors. Global ratings are completed retrospectively and are based on general impressions collected over a period of time. As such, they are prone to subjectivity, recall bias, and halo effects in which positive or negative impressions of the trainee influence the specific ratings. Because they are relatively easy to collect, global rating forms are nearly ubiquitous in GME. Unfortunately, they do not yield sufficiently reliable or valid data for assessment of the competencies and must be supplemented with a second, better measure. Another kind of global rating form is used by learners to assess their clinical educators at the end of a learning experience (rotation, continuity clinic, etc.).
WRITTEN EXAMINATIONS: These are usually composed of multiple-choice questions (MCQs) selected to sample medical knowledge and understanding of a defined body of knowledge, not just factual or easily recalled information. Each question or test item contains an introductory statement followed by four or five options in outline format. The examinee selects one of the options as the presumed correct answer by marking the option on a coded answer sheet. Only one option is keyed as the correct response. The introductory statement often presents a patient case or clinical findings, or displays data graphically. The in-training examinations prepared by specialty societies and boards use MCQ-type test items. A typical half-day examination has 175 to 250 test questions. Comparing test scores on in-training examinations with national statistics can serve to identify strengths and limitations of individual residents to help them improve. Comparing test results aggregated for residents in each year of a program can help identify residency training experiences that might be improved.
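The two comparisons described above are simple arithmetic. As a minimal illustration (the function names and all score values are hypothetical, not part of any board's reporting system), a program could standardize an individual's score against published national statistics and aggregate raw scores by training year:

```python
from statistics import mean

def z_score(raw: float, national_mean: float, national_sd: float) -> float:
    """Standardize one resident's in-training exam score against national norms."""
    return (raw - national_mean) / national_sd

def program_year_means(scores_by_year: dict[int, list[float]]) -> dict[int, float]:
    """Aggregate raw scores by PGY year to compare cohorts across the program."""
    return {year: mean(scores) for year, scores in scores_by_year.items()}
```

A cohort mean that lags the national figure for a given year could then point to a rotation or curriculum element worth reviewing.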
360-DEGREE EVALUATION: 360-degree evaluations consist of measurement tools completed by multiple people in a person's sphere of influence. Evaluators completing rating forms in a 360-degree evaluation usually are superiors, peers, subordinates, and patients and families. Most 360-degree evaluation processes use a survey or questionnaire to gather information about an individual's performance on several topics. Data are then shared with the learner, who compares the perspectives of others with their self-assessment of the same qualities. 360-degree assessments are most useful for the assessment of professionalism and interpersonal & communication skills.
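The comparison step at the heart of a 360-degree evaluation can be sketched as follows. This is a hypothetical illustration only; the topic names, rating values, and function name are invented:

```python
from statistics import mean

def self_vs_others(self_ratings: dict[str, float],
                   others_ratings: dict[str, list[float]]) -> dict[str, float]:
    """Return, per topic, the self-rating minus the mean of others' ratings.
    A large positive gap suggests the learner may overestimate that skill."""
    return {topic: self_ratings[topic] - mean(ratings)
            for topic, ratings in others_ratings.items()}
```

For example, a learner who rates their own communication an 8 while peers and nurses average 6.5 would see a gap of +1.5 on that topic, a useful prompt for a feedback conversation.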
FOCUSED OBSERVATION: Checklist evaluations evaluate the essential or desired specific behaviors, activities, or steps that make up a more complex competency or competency component. Checklists can be tailored to assess detailed actions for a specific task. Typical response options on these forms are a check (✓) or "yes" to indicate that the behavior occurred, or options to indicate the completeness (complete, partial, or absent) or correctness (total, partial, or incorrect) of the action. Checklists are useful for evaluating any competency and competency component that can be broken down into specific behaviors or actions. Checklists can be used as reliable measures of patient care (e.g., surgical or other procedural skills), interpersonal and communication skills (e.g., informed consent, disclosure of adverse outcome), and practice-based learning (e.g., critical appraisal of evidence).
Another type of assessment based on direct observation is the Mini-Clinical Evaluation Exercise (Mini-CEX). The Mini-CEX is completed immediately after observing a patient encounter. However, instead of using skill checklists, the raters complete scales covering important domains of patient care including interviewing and physical examination skills, humanistic qualities, counseling skills and clinical judgment. If enough observations are performed, the Mini-CEX can be used as a reliable measure of these patient care skills.
EVALUATION OF THE ACGME COMPETENCIES:
CORE MEASURES FOR GME AT UCSF
UCSF GME - 7/08
GME Evaluation Task Force Recommendation
GLOBAL ASSESSMENTS
Despite their common usage and familiarity, global assessments are considered weak evidence of learning. If used along with an evaluation tool of greater validity, global assessments may be used to evaluate all 6 ACGME competencies.
Recommended Assessment Tool
Many UCSF programs use a short global assessment tool that includes the minimum-language version of the 6 competencies and a 9-point scale. This format is preferred to a longer assessment form listing the full-language version of each competency in multiple items. The 9-point scale is preferred to shorter scales because it allows raters to identify differences among residents who meet or exceed expectations. An additional item assessing overall performance is also recommended. A sample global assessment is provided below.
Reliability and Validity
Global assessments are subject to recall bias and halo effects, which threaten validity, and to imprecision, which threatens reliability. Given these limitations, the least burdensome possible tool should be used. Greater numbers of evaluators may improve inter-rater reliability but will not make up for the subjective nature of these assessments.
Administration
o Timing: after each clinical rotation; for continuity clinics, at least twice annually.
o Who Performs: only individuals who directly supervised the resident during the specific timeframe being assessed.
o Format: Each competency is represented by one item rated on a 9-point scale anchored with descriptive language on either end and divided into thirds (1-3: unsatisfactory / does not meet expectations; 4-6: satisfactory / meets expectations; 7-9: outstanding / exceeds expectations). An additional item assesses overall competency. Written comments are provided listing strengths and opportunities for improvement. Programs should review progress and share feedback with residents midway through and after each rotation. Mid-rotation feedback is especially important for residents who appear not to be meeting criteria for satisfactory performance.
o Scoring Criteria and Training: The distinction between satisfactory and unsatisfactory performance is an important one. If guidelines are not available for making this distinction, standard-setting can be used to improve accuracy. Standard-setting is especially important for global assessments because raters tend to fall into strict vs. lenient categories.
o Documentation: Score summaries are generated automatically by the evaluation management system and can be accessed by residents after each rotation. Overall progress should be reviewed in writing at least twice annually.
Uses of the Data
o Comparing global evaluation scores to the performance of peers or one's own prior performance can help identify trends.
o Summative Decisions: Global evaluation scores persistently falling below expectations (3 or less on the 9-point scale) could delay or prevent a resident from advancing or graduating. Generally, however, such decisions would be based on overall assessments of progress incorporating other measures.
o Remediation Threshold: Programs should communicate what performance on global assessments would trigger remediation. For example, residents and faculty would be informed in advance that scores < 4 are unsatisfactory and a score of 4 is marginal.
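As an illustration of the scale bands and the example remediation threshold described in this section, a program might encode the rules as follows. This is a hypothetical sketch; the function names are invented and are not part of the E*Value system:

```python
def score_category(score: int) -> str:
    """Map a 9-point global rating to its scale band (thirds of the scale)."""
    if not 1 <= score <= 9:
        raise ValueError("global ratings run from 1 to 9")
    if score <= 3:
        return "unsatisfactory / does not meet expectations"
    if score <= 6:
        return "satisfactory / meets expectations"
    return "outstanding / exceeds expectations"

def remediation_flag(score: int) -> str:
    """Apply the example threshold above: < 4 unsatisfactory, exactly 4 marginal."""
    if score < 4:
        return "unsatisfactory - triggers remediation"
    if score == 4:
        return "marginal - monitor closely"
    return "no action"
```

Making the threshold explicit in this way mirrors the recommendation that residents and faculty know in advance which scores generate action.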
University of California, San Francisco
Core Measure for UCSF GME: Global Assessment Form
Subject:
Evaluator:
Site:
Period:
Dates of Activity:
Activity:
Evaluation Type: Resident
Faculty Contributing to Evaluations (Question 1 of 12 - Mandatory)
Patient Care (Question 2 of 12 - Mandatory)
In evaluating this resident's performance, use as your standard the level of knowledge, skills and attitudes expected from a clearly satisfactory resident at this stage of training.
Scale: 0 = Insufficient contact to judge; 1-3 = Needs Improvement (ratings 1-3 require comment); 4-6 = Meets Expectations; 7-9 = Exceeds Expectations
Low anchor (1): Incomplete, inaccurate medical interviews, physical examinations, and review of other data; incompetent performance of essential procedures; fails to analyze clinical data and consider patient preferences when making decisions.
High anchor (9): Superb, accurate, comprehensive medical interviews, physical examinations, review of other data, and procedural skills; always makes diagnostic and therapeutic decisions based on available evidence, sound judgment, and patient preferences.
Medical Knowledge (Question 3 of 12 - Mandatory)
In evaluating this resident's performance, use as your standard the level of knowledge, skills and attitudes expected from a clearly satisfactory resident at this stage of training.
Scale: 0 = Insufficient contact to judge; 1-3 = Needs Improvement (ratings 1-3 require comment); 4-6 = Meets Expectations; 7-9 = Exceeds Expectations
Low anchor (1): Limited knowledge of basic and clinical sciences; minimal interest in learning; does not understand complex relations, mechanisms of disease.
High anchor (9): Exceptional knowledge of basic and clinical sciences; highly resourceful development of knowledge; comprehensive understanding of complex relationships, mechanisms of disease.
Practice-Based Learning and Improvement (Question 4 of 12 - Mandatory)
In evaluating this resident's performance, use as your standard the level of knowledge, skills and attitudes expected from a clearly satisfactory resident at this stage of training.
Scale: 0 = Insufficient contact to judge; 1-3 = Needs Improvement (ratings 1-3 require comment); 4-6 = Meets Expectations; 7-9 = Exceeds Expectations
Low anchor (1): Fails to perform self-evaluation; lacks insight, initiative; resists or ignores feedback; fails to use information technology to enhance patient care or pursue self-improvement.
High anchor (9): Constantly evaluates own performance, incorporates feedback into improvement activities; effectively uses technology to manage information for patient care and self-improvement.
Interpersonal & Communication Skills (Question 5 of 12 - Mandatory)
In evaluating this resident's performance, use as your standard the level of knowledge, skills and attitudes expected from a clearly satisfactory resident at this stage of training.
Scale: 0 = Insufficient contact to judge; 1-3 = Needs Improvement (ratings 1-3 require comment); 4-6 = Meets Expectations; 7-9 = Exceeds Expectations
Low anchor (1): Does not establish even minimally effective therapeutic relationships with patients and families; does not demonstrate ability to build relationships through listening, narrative or nonverbal skills; does not provide education or counseling to patients, families or colleagues.
High anchor (9): Establishes a highly effective therapeutic relationship with patients and families; demonstrates excellent relationship building through listening, narrative and nonverbal skills; excellent education and counseling of patients, families, and colleagues.
Professionalism (Question 6 of 12 - Mandatory)
In evaluating this resident's performance, use as your standard the level of knowledge, skills and attitudes expected from a clearly satisfactory resident at this stage of training.
Scale: 0 = Insufficient contact to judge; 1-3 = Needs Improvement (ratings 1-3 require comment); 4-6 = Meets Expectations; 7-9 = Exceeds Expectations
Low anchor (1): Lacks respect, compassion, integrity, honesty; disregards need for self-assessment; fails to acknowledge errors; does not consider needs of patients, families, colleagues; does not display responsible behavior.
High anchor (9): Always demonstrates respect, compassion, integrity, honesty; teaches/role models responsible behavior; total commitment to self-assessment; willingly acknowledges errors; always considers needs of patients, families, colleagues.
Systems-Based Practice (Question 7 of 12 - Mandatory)
In evaluating this resident's performance, use as your standard the level of knowledge, skills and attitudes expected from a clearly satisfactory resident at this stage of training.
Scale: 0 = Insufficient contact to judge; 1-3 = Needs Improvement (ratings 1-3 require comment); 4-6 = Meets Expectations; 7-9 = Exceeds Expectations
Low anchor (1): Unable to access/mobilize outside resources; actively resists efforts to improve systems of care; does not use systematic approaches to reduce error and improve patient care.
High anchor (9): Effectively accesses/utilizes resources; effectively uses systematic approaches to reduce errors and improve patient care; enthusiastically assists in developing systems improvement.
Educational Objectives (Question 8 of 12 - Mandatory)
Resident's Achievement of Educational Objectives on This Rotation
Scale: 0 = Insufficient contact to judge; 1-3 = Needs Improvement (ratings 1-3 require comment); 4-6 = Meets Expectations; 7-9 = Exceeds Expectations
Strengths (Question 9 of 12 - Mandatory)
Opportunities for Improvement (Question 10 of 12)
System Ease of Use (Question 11 of 12)
E*Value was easy to use.
Scale: 0 = NA; 1 = Strongly Disagree; 2 = Disagree; 3 = Neutral/Undecided; 4 = Agree; 5 = Strongly Agree
E*Value Comments (Question 12 of 12)
Comments entered here will be forwarded to E*Value technical support and will not be anonymous.
GME Evaluation Task Force Recommendation
PATIENT CARE
PATIENT CARE SKILLS (MINI-CEX)
Recommended Assessment Tool
The Mini-Clinical Evaluation Exercise (Mini-CEX) was developed and studied extensively by the American Board of Internal Medicine. It is a focused assessment of specific aspects of a patient interaction. As such, it assesses principles of patient care foremost; secondarily, it asks for ratings of professionalism and interpersonal and communication skills, as these are important components of every patient interaction.
Reliability and Validity
For use at the semi-annual review meeting, a minimum of 6 forms per year would provide satisfactory reliability, with 12 being optimal. These reliability and validity estimates are based on the research done with the Mini-CEX. The program must monitor that the forms are completed correctly, including signatures, to ensure that they measure with the same psychometric rigor achieved in the research studies.
Administration
o Timing: 6 (up to 12) assessments per year.
o Who Performs: Skill assessments should be done through observation of the actual performance. A faculty member could assess through video review of the performance, but the assessment reflects the skill of the resident on the performance date. The entire clinical encounter does not need to be observed; a shorter duration of observation may be more efficient.
o Format: A checklist is the most appropriate format for evaluating specific procedural or communication skills (see section on focused assessment of observed skills). Because the Mini-CEX is designed for generic use across different encounter types, it uses scales to assess important elements in any encounter (e.g., history-taking, the physical examination, humanistic qualities and clinical reasoning).
o Scoring Criteria and Training: The Mini-CEX research has been done on the 9-point scale for assessment. While other research may suggest that fewer points would yield the same decisions, we recommend maintaining the 9-point scale as designed. The form includes a glossary describing the skills being assessed, but no criteria are provided for distinguishing unsatisfactory vs. satisfactory vs. superior performance. Attending faculty review the form, which is considered self-explanatory. This is not ideal, but it is most expedient.
o Documentation: At minimum, twice annually as part of semi-annual review meetings.
13<br />
<strong>UCSF</strong> <strong>GME</strong> ‐ 7/08
Uses <strong>of</strong> the Data<br />
o Formative Feedback: Concurrent, written same-day feedback is recommended.
The Mini-CEX is an observational form and must be completed in real time. It sets
the expectation that the resident and faculty member will discuss the observation
and sign the form. More details may be found in an article by Holmboe and colleagues (1).
o Summative Decisions: Programs should inform residents that progress will be
judged on Mini-CEX performance averaged across all observations. The program
should also set a standard that would generate action, such as any single Mini-CEX
with an "unsatisfactory" rating. These decision criteria should be made explicit to
the residents.
o Remediation Threshold: Programs should communicate what performance on the
Mini-CEX would require remediation. The faculty must be willing to support such a
process, however, or they may inflate ratings to avoid being burdened with
remediation. Most programs would consider an average of 5 or below on the
Mini-CEX, though nominally satisfactory, worthy of a development plan for the
resident.
o Program Effectiveness: The Mini-CEX is so intertwined with the fundamentals of
patient care that its data are best used to assess resident performance and generate
plans as needed; they are less likely to be useful for judging program effectiveness.
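The summative logic described above can be sketched in code. This is an illustrative sketch only, not part of the Task Force recommendation: the function name is hypothetical, and the thresholds (any single unsatisfactory rating of 1-3, or an average of 5 or below) are the examples given in the text.

```python
def review_mini_cex(scores, remediation_avg=5.0, unsatisfactory_max=3):
    """Summarize Mini-CEX ratings (9-point scale) for a semi-annual review.

    Illustrative only: thresholds mirror the examples in the text --
    any single "unsatisfactory" rating (1-3) triggers action, and an
    average of 5 or below warrants a development plan.
    """
    if len(scores) < 6:
        note = "fewer than the 6 forms/year needed for satisfactory reliability"
    else:
        note = "adequate sampling (6-12 forms/year)"
    average = sum(scores) / len(scores)
    flags = []
    if any(s <= unsatisfactory_max for s in scores):
        flags.append("single unsatisfactory rating")
    if average <= remediation_avg:
        flags.append("average at or below remediation threshold")
    return {"average": round(average, 2), "note": note, "flags": flags}
```

For example, six ratings of [6, 7, 5, 6, 8, 7] would average 6.5 with no flags, whereas a single rating of 3 would be flagged even if the average stayed above the threshold.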
References<br />
1. Holmboe ES, Yepes M, Williams F, Huot SJ. Feedback and the Mini Clinical<br />
<strong>Evaluation</strong> Exercise. J Gen Intern Med 2004; 19(5 Pt 2): 558–561.<br />
<strong>GME</strong> <strong>Evaluation</strong> <strong>Task</strong> <strong>Force</strong> <strong>Recommendation</strong><br />
PATIENT CARE<br />
PATIENT CARE: FOCUSED ASSESSMENT OF OBSERVED SKILLS<br />
Evidence <strong>of</strong> Competency In:<br />
Surgical Skills, Procedural Skills, Specific Communication Skills<br />
Recommended Assessment Tool<br />
Unlike the shared general competencies, Patient Care objectives are defined by each<br />
RRC. Thus, no specific tool can be recommended across <strong>GME</strong> programs. Instead, we<br />
<strong>of</strong>fer principles to guide the development or selection <strong>of</strong> tools. Checklists define the<br />
discrete tasks that comprise the overall skill being assessed. Scales evaluate procedural<br />
skills that transcend individual tasks, such as knowledge <strong>of</strong> anatomy, instrument<br />
handling, flow of operation, etc. Richard Reznick and colleagues at the University of
Toronto developed scales for the Objective Structured Assessment of Technical Skills, or
O-SATS (1,2).
Examples <strong>of</strong> skill checklists are included in the Appendix.<br />
Overview<br />
Focused assessments for skills are used primarily to assess patient care by determining if<br />
a psychomotor skill has been acquired. It is possible to combine some communication<br />
items if this is the only time a resident interacts with patients, but it is not the primary<br />
use <strong>of</strong> this assessment. It is also possible to develop a focused assessment <strong>of</strong> specific<br />
communication skill tasks, such as an informed consent discussion or specific counseling<br />
following a practice guideline.<br />
Reliability and Validity<br />
Skill checklists primarily have content validity. The items for a specific checklist may be
drawn from the literature, where published use establishes their content validity.
Alternatively, if a checklist is designed within the program, review and approval by those
who are "expert" in the skill provides a level of validity. Reliability has several
dimensions: the ability of different assessors to reach the same decision (inter-rater
reliability) and the internal consistency of the checklist items (whether they "fit"
together). However, if checklists are used to identify when someone has mastered the
skill, internal consistency is not relevant (since everyone should eventually get 100%).
Therefore, the best reliability evidence is consensus among faculty that the resident is
competent in that skill.
Administration<br />
o Timing: Skill assessments should be performed until a resident can demonstrate<br />
competency. The sustained level <strong>of</strong> competency can be measured if a program is<br />
worried about “drift” from desired performance. This would require a recheck <strong>of</strong><br />
the skill at some systematic interval.<br />
o Who Performs: Skill assessments should be done through observation <strong>of</strong> the<br />
actual performance. It is possible that a faculty member could assess through<br />
video review <strong>of</strong> the performance, but the assessment reflects the skill <strong>of</strong> the<br />
resident on the performance date.<br />
o Format: A checklist is the most appropriate format, but the checklist may have<br />
some gradation reflecting the quality with which the specific step was performed,<br />
e.g.: not indicated (n/a), not performed but indicated, performed poorly, or<br />
performed well. General scales (such as for O‐SATS above) also exist and<br />
facilitate comparability across specific procedures. Written comments may be<br />
especially helpful for giving feedback.<br />
o Scoring Criteria and Training: There should be guidelines for the checklist
describing the environment for the assessment and a description of what is meant
by each step on the checklist. For example, if the checklist includes "washes
hands," is a resident running his/her hands under water without soap acceptable?
Is scrubbing required? For a minimum length of time? It is advisable to indicate to
learners and evaluators the acceptable standard for checklist items; training could
consist of reviewing the written guide. The checklist should also contain a written
standard by which the resident would know that the performance demonstrated
competency. Generally, this would mean achieving 100% of the checklist items
and/or an overall judgment of competency by the assessor.
o Documentation: At minimum, twice annually as part <strong>of</strong> semi‐annual review<br />
meetings.<br />
Workflow Procedures<br />
A systematic approach is recommended to maximize the use <strong>of</strong> the focused assessments<br />
and facilitate data management. A sample workflow document for focused assessment<br />
<strong>of</strong> surgical skills follows.<br />
References<br />
1. Winckel CP, Reznick RK, Cohen R, Taylor B. Reliability and construct validity of
a structured technical skills assessment form. Am J Surg 1994;167(4):423-7.
2. Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill
via an innovative "bench station" examination. Am J Surg 1997;173(3):226-30.
SAMPLE WORKFLOW FOR FOCUSED ASSESSMENT OF SURGICAL SKILLS

ANNUALLY
TASK | BY WHOM | WHEN
Review/update evaluation cards: determine which procedures for which years | Program Director, Manager | May
Order evaluation cards | Site Coordinator | May
Send instructions to each class of residents and supervising faculty in the OR on how to use the evaluation cards | Program Director | June
Distribute evaluation cards to residents for their two (2) respective index cases | Program Directors | End-of-year class meetings (May/June)

WITH EACH INDEXED CASE
TASK | BY WHOM | WHEN
Hand supervising faculty in the OR the card for the index case | Resident | Preferably immediately, afterward if necessary
Complete evaluation card, give card to resident and give resident verbal feedback | Faculty supervising resident in the OR | Immediately after the case
Turn in cards to MZ AAIII, SFGH coordinator, or UCSF coordinator | Resident | Within one week of the procedure
Enter data into a database | Site Coordinators | Monthly; must be current when semi-annual assessment meetings are scheduled

AS NEEDED
TASK | BY WHOM | WHEN
Review performance of residents | Program Directors, Advisors and Residents | Periodically and prior to semi-annual assessment meetings
Review process for improvement | Program Director, Manager, Site Coordinators | As needed
<strong>GME</strong> <strong>Evaluation</strong> <strong>Task</strong> <strong>Force</strong> <strong>Recommendation</strong>:<br />
PROFESSIONALISM<br />
PROFESSIONALISM, INTERPERSONAL AND COMMUNICATION SKILLS<br />
Overview<br />
360 evaluations include observations by a variety <strong>of</strong> individuals from the multiple<br />
contexts where pr<strong>of</strong>essionalism and communication skills are demonstrated. Ratings by<br />
self, peers, colleagues and faculty are generally collected using different tools than<br />
ratings by patients. The 360 method captures information on most <strong>of</strong> the competencies<br />
listed by the AC<strong>GME</strong> under “Pr<strong>of</strong>essionalism” and “Interpersonal & Communication<br />
Skills and is highly recommended for every program’ assessment system.<br />
Core Measure for <strong>UCSF</strong> <strong>GME</strong><br />
For Health Care Team and Self-Evaluation, we recommend the Professional Associate
Evaluation Form, developed by the CREOG (Ob/Gyn) Competency Task Force.
For Patient Surveys, we recommend the American Board of Internal Medicine (ABIM)
Patient Survey.
Health Care Team <strong>Evaluation</strong>s: PROFESSIONALISM<br />
Reliability and Validity:<br />
The optimal number and frequency of 360 assessments is uncertain. A feasible minimum
to achieve inter-rater reliability of .80 may be 5 non-clinical and 6 clinical raters, each on
2 occasions (1).
Preliminary evidence <strong>of</strong> construct validity shows modest growth in 360 scores<br />
comparing senior vs. junior residents with a magnitude similar to the growth in other<br />
competencies including critical self‐reflection skills (2).<br />
Content validity exists to the extent that the survey items actually assess the<br />
pr<strong>of</strong>essionalism and communication skills they are intended to measure. The 9‐item<br />
evaluation includes communication (patients/families, nursing/allied staff), respect<br />
(patients, nursing/allied staff), compassion, reliability, honesty/integrity, responsibility,<br />
and advocacy.<br />
Administration<br />
o Timing: More frequent observations tied to specific learning experiences provide<br />
more valid data than the minimum, i.e., global impressions provided twice<br />
annually.<br />
o Who Performs: Many individuals can legitimately contribute to 360 assessments.<br />
The list usually includes faculty, residents (supervisors, peers and juniors),<br />
nursing and other clinical staff, and consultants. Selection should be based upon<br />
the individual’s opportunities to directly observe the resident interacting with<br />
patients and the health care team.<br />
N.B. Faculty need to complete the 360 evaluation separately from their global<br />
ratings at the end <strong>of</strong> the rotation so the data can be summarized and reported<br />
accurately.<br />
o Format: 9 items sample aspects of professionalism and communication skills.
Each item is scored on a 9-point scale with 1-3 = unsatisfactory, 4-6 = meets
expectations, and 7-9 = excellent.
o Scoring Criteria and Training: Each item defines a specific trait in clear language.<br />
Standard‐setting would be helpful for consistently distinguishing the 3 levels <strong>of</strong><br />
unsatisfactory, the 3 levels <strong>of</strong> satisfactory and the 3 levels <strong>of</strong> excellent.<br />
o Documentation: Assessment can be documented on E‐value and learners can<br />
access the results confidentially.<br />
Uses <strong>of</strong> the Data<br />
o Summarizing the data: Score averages, ranges, and comparative data for the PGY<br />
year are provided as part <strong>of</strong> the data report residents review before their semi‐<br />
annual meetings. The other data sources for the 360 assessment include the<br />
relevant items (e.g., respect) from the residents’ clinical educator evaluations and<br />
patient surveys. These are also summarized as means and ranges compared with<br />
averages for the PGY year.<br />
o Formative Uses: Assessment results support development of professionalism and
communication skills by individual residents, identify trends in performance
across PGY groups, and spur possible improvements to the curriculum in these
two competencies.
o Summative Decisions and Remediation: Scores in the unsatisfactory range (1-3)
would trigger remediation, and a low satisfactory score (4) would trigger
suggestions for improvement.
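The data summary described above (score averages, ranges, and comparison with the PGY-year average) can be sketched as follows. The data layout and the function name are illustrative assumptions, not the E-Value report format.

```python
def summarize_360(resident_scores, pgy_cohort_scores):
    """Summarize one resident's 360 item scores (9-point scale) against
    the PGY-year cohort, as described for semi-annual data reports.
    Illustrative sketch; actual reports come from E-Value.
    """
    mean = sum(resident_scores) / len(resident_scores)
    cohort_mean = sum(pgy_cohort_scores) / len(pgy_cohort_scores)
    return {
        "mean": round(mean, 2),                      # resident's average score
        "range": (min(resident_scores), max(resident_scores)),
        "pgy_mean": round(cohort_mean, 2),           # PGY-year comparison value
        "vs_cohort": round(mean - cohort_mean, 2),   # above/below the cohort
    }
```

For instance, a resident scoring [7, 8, 6, 9, 7] against a cohort averaging 7.0 would see a mean of 7.4, a range of 6-9, and a +0.4 difference from the PGY-year average.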
Workflow Procedures<br />
A systematic approach is recommended to maximize the use <strong>of</strong> the assessments and<br />
facilitate data management. An example follows in the Appendix.<br />
References<br />
1. Murphy DJ, Bruce DA, Mercer SW, Eva KW. The reliability <strong>of</strong> workplace‐based assessment in<br />
postgraduate medical education and training: a national evaluation in general practice in the<br />
United Kingdom. Adv in Health Sci Educ 2008 DOI 10.1007/s10459‐008‐9104‐8.<br />
2. Learman LA, Autry AM, OʹSullivan P. Reliability and validity <strong>of</strong> reflection exercises for<br />
obstetrics and gynecology residents. Am J Obstet Gynecol 2008;198(4):461.e1‐8; discussion<br />
461.e8‐10.<br />
University <strong>of</strong> California, San Francisco<br />
Core Measure for <strong>UCSF</strong> <strong>GME</strong>: Pr<strong>of</strong>essionalism / Communication<br />
Subject:<br />
Evaluator:<br />
Site:<br />
Period:<br />
Dates <strong>of</strong> Activity:<br />
Activity: Health Care Team <strong>Evaluation</strong><br />
<strong>Evaluation</strong> Type: [Evaluator Identity]<br />
Clinical Setting (Question 1 <strong>of</strong> 13 - Mandatory)<br />
Please indicate the clinical setting where you have interacted with the resident:<br />
Inpatient Wards<br />
ICU<br />
ER<br />
Outpatient Clinic<br />
Continuity Clinic<br />
Other:<br />
Clinical Observations (Question 2 <strong>of</strong> 13 - Mandatory)<br />
On average how many clinical observations did you have <strong>of</strong> the resident?<br />
N/A<br />
< 4<br />
5-10<br />
11-20<br />
21 ><br />
Questions 4-12 are each rated on the scale: 0 = Unable to Assess, 1-3 = Unsatisfactory, 4-6 = Satisfactory, 7-9 = Excellent.

Communication: Patients and Families (Question 4 of 13 - Mandatory)
Communicates clearly; is willing to answer questions and provide explanations; willing to listen to patients and families

Communication: Nursing and Allied Health Staff (Question 5 of 13 - Mandatory)
Consistently demonstrates willingness to listen to nursing and allied health staff

Respectfulness: Patients (Question 6 of 13 - Mandatory)
Treats others with respect; does not demean or make others feel inferior; provides equitable care to patients; uses respectful language when discussing patients; is sensitive to cultural needs of patients

Respectfulness: Nursing and Allied Health Staff (Question 7 of 13 - Mandatory)
Consistently courteous and receptive to nursing and allied health staff; acknowledges and respects roles of other health care professionals in patient care

Compassion (Question 8 of 13 - Mandatory)
Is kind to patients and families; appreciates patients' and families' special needs and accepts inconvenience when necessary to meet the needs of the patient; consistently attentive to details of patient comfort

Reliability (Question 9 of 13 - Mandatory)
Completes and fulfills responsibilities; responds promptly when on call or when paged; assists and fills in for others when needed

Honesty/Integrity (Question 10 of 13 - Mandatory)
Knows limits of ability and asks for help when appropriate; is honest and trustworthy; does not falsify information; committed to ethical principles

Responsibility (Question 11 of 13 - Mandatory)
Accepts responsibility (does not blame others or the system); committed to self-assessment; responds to feedback; committed to excellence and self-learning

Advocate (Question 12 of 13 - Mandatory)
An advocate for patient needs; effectively accesses and coordinates medical system resources to optimize patient care; seeks to find and correct system causes of medical error

Comments (Question 13 of 13)
Please provide comments regarding the resident's strengths and/or areas of needed improvement:
SAMPLE WORKFLOW FOR HEALTHCARE TEAM EVALUATION

ANNUALLY
TASK | BY WHOM | WHEN
Review/update survey form | Program Director, Manager and E-Value Administrator | May/June
Review/update rotation list | Program Manager and E-Value Administrator | May/June
Review/update Professional Associates list by rotation and residents' peer list | E-Value Administrator | June/July

AT THE END OF EACH ROTATION
TASK | BY WHOM | WHEN
Schedule professional associate evaluations of residents on E-Value for intramural rotations | E-Value Administrator | By July 1st
Solicit input from other professional associates involved in resident's training on that rotation | Designated professional associate | Within one week of the completion of each rotation
Enter resident evaluations into E-Value | Designated professional associate and residents on same teams | Within two weeks of the completion of each rotation
Automatic reminders to faculty with outstanding evaluations | E-Value software | Sent every week for 4 weeks
Follow-up reminder for delinquent faculty and resident evaluations from previous rotation (cc: Director and Associate Director) | E-Value Administrator | End of the next rotation after the one the evaluation was assigned

AS NEEDED
TASK | BY WHOM | WHEN
Review performance of residents | Program Directors, Advisors and Residents | Periodically and prior to semi-annual assessment meetings
Remove and track suspended evaluations | E-Value Administrator | Quarterly
Update professional associates and their contact information | E-Value Administrator | As needed
Patient Surveys: PROFESSIONALISM<br />
Reliability and Validity:<br />
The optimal number <strong>of</strong> patient surveys is uncertain. As initially used in an ABIM<br />
continuing pr<strong>of</strong>essional development context, 20 surveys were recommended. A<br />
feasibility study with Canadian internal medicine residents showed 12 outpatient<br />
surveys to be associated with poor reliability (0.56) (1).<br />
Data using the Consultation and Relational Empathy (CARE) survey, which is used for<br />
physician accreditation in Scotland, suggest that more than 40 patients may be necessary<br />
for good inter‐rater reliability (.80) with 25 patients providing adequate reliability (.70)<br />
(2).<br />
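The relationship between the number of patient raters and inter-rater reliability follows the Spearman-Brown prophecy formula, R_n = n*r / (1 + (n-1)*r), where r is the single-rater reliability. As a rough back-of-the-envelope sketch (not a reanalysis of the cited data), solving for r from the reported .70 at 25 patients predicts roughly .79 at 40 patients, consistent with the figures above:

```python
def spearman_brown(r_single, n):
    """Predicted reliability of a mean over n raters (Spearman-Brown)."""
    return n * r_single / (1 + (n - 1) * r_single)

def single_rater_reliability(r_n, n):
    """Invert Spearman-Brown: single-rater reliability implied by an
    observed reliability r_n obtained with n raters."""
    return r_n / (n - (n - 1) * r_n)

# Implied reliability of one patient rating, from .70 with 25 patients
r1 = single_rater_reliability(0.70, 25)
```

With this r1, `spearman_brown(r1, 40)` falls just under .80, illustrating why "more than 40 patients may be necessary" to reach that level.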
Content validity exists to the extent that a measure actually assesses the communication<br />
and pr<strong>of</strong>essionalism skills it is intended to measure. The ABIM’s 10 items sample<br />
multiple aspects <strong>of</strong> doctor‐patient communication (greeting, listening, establishing<br />
rapport, explaining, inviting participation in decision‐making) and pr<strong>of</strong>essionalism<br />
(truthfulness, respect, sensitivity to linguistic barriers). The physician characteristics<br />
evaluated using the CARE survey are similar to those assessed using the ABIM survey.<br />
Although the CARE survey has some advantages in how it describes the characteristics<br />
being evaluated, we recommend the ABIM survey because it is so widely used and<br />
studied in the United States.<br />
Administration<br />
o Timing: Patient satisfaction surveys may be obtained regularly as a quality<br />
measure. Otherwise, administration twice annually is the minimum<br />
recommended for assessment <strong>of</strong> competency.<br />
o Who Performs: Patients under the direct care <strong>of</strong> the resident.<br />
o Format: The ABIM Patient Survey includes 10 complex items; the ABIM and CARE
surveys sample similar aspects of communication and professionalism skills.
o Scoring Criteria and Training: The ABIM survey uses scales ranging from 1‐5<br />
(poor, fair, good, very good, and excellent), uses simple language, and relies upon<br />
the subjective experience <strong>of</strong> individual patients. Patients receive a general<br />
orientation but no specific instructions regarding the scoring criteria. Although a<br />
9‐point scale is used for the 360’s Health Care Team and Self‐<strong>Evaluation</strong>s, the<br />
patient surveys use only 5 points because finer distinctions are challenging for<br />
patients. This 5‐ vs. 9‐point scale difference is important to note when scores are<br />
summarized and discussed with residents.<br />
o Documentation: Summaries with comparative data are made available for review<br />
at the semi‐annual meetings.<br />
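When the 5-point patient-survey scores are summarized alongside 9-point 360 scores, a simple linear rescaling can put them on a comparable footing. This is an illustrative convention only; the report itself simply cautions that the scales differ when scores are discussed with residents.

```python
def rescale_5_to_9(score_5pt):
    """Linearly map a 1-5 patient-survey score onto the 1-9 scale used by
    the Health Care Team and Self-Evaluations (1 -> 1, 3 -> 5, 5 -> 9).
    Hypothetical convention for side-by-side display, not prescribed here.
    """
    if not 1 <= score_5pt <= 5:
        raise ValueError("patient survey scores run from 1 to 5")
    return 2 * score_5pt - 1
```

Under this mapping a patient rating of "very good" (4) lands at 7 on the 9-point scale, in the "excellent" band of the team evaluations, which is itself a reminder that rescaled numbers should be interpreted cautiously.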
Uses <strong>of</strong> the Data<br />
o Summarizing the data: Score averages, ranges, and comparative data for the PGY<br />
year are provided as part <strong>of</strong> the data report residents review before their semi‐<br />
annual meetings. The other data sources for the 360 assessment include the<br />
relevant items (e.g., respect) from the residents’ clinical educator evaluations and<br />
patient surveys. These are also summarized as means and ranges compared with<br />
averages for the PGY year.<br />
o Formative Uses: There are many useful ways to use the assessment results to<br />
support development <strong>of</strong> pr<strong>of</strong>essionalism and communication skills by individual<br />
residents, identify trends in performance across PGY groups and spur possible<br />
improvements to the curriculum.<br />
o Summative Decisions and Remediation: It is important to set criteria that would
trigger a plan of improvement and to explain those criteria to residents before they
are assessed.
Workflow Procedures<br />
A systematic approach is recommended to maximize the use <strong>of</strong> the assessments and<br />
facilitate data management. An example follows in the Appendix.<br />
References<br />
1. Tamblyn R, Benaroya S, Snell L, McLeod P, Schnarch B, Abrahamowicz M. The<br />
feasibility and value <strong>of</strong> using patient satisfaction ratings to evaluate internal medicine<br />
residents. J Gen Intern Med 1994;9(3):146‐52.<br />
2. Murphy DJ, Bruce DA, Mercer SW, Eva KW. The reliability <strong>of</strong> workplace‐based assessment in<br />
postgraduate medical education and training: a national evaluation in general practice in the<br />
United Kingdom. Adv in Health Sci Educ 2008 DOI 10.1007/s10459‐008‐9104‐8.<br />
ABIM Patient Survey

Doctor's Name __________________________  Date _____________  Inpatient / Outpatient (circle one)

Rating Scale: 1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent, # = Unable to Evaluate

HOW IS THIS DOCTOR AT...
- Greeting you warmly; calling you by the name you prefer; being friendly, never crabby or rude
- Letting you tell your story while listening carefully; asking thoughtful questions; not interrupting you while you are talking
- Showing interest in you as a person; not acting bored or ignoring what you have to say
- Treating you like you're on the same level; never "talking down" to you or treating you like a child
- Informing you during the physical exam about what he/she is going to do and why; telling you what he/she finds
- Explaining what you need to know about your problems, how and why they occurred, and what to expect next
- Using words you can understand when explaining your problems and treatment; explaining any technical medical terms in plain language
- Discussing options with you and asking your opinion; offering choices and letting you help decide what to do; asking what you think before telling you what to do
- Encouraging you to ask questions; answering them clearly; never avoiding your questions or lecturing you
- Telling you everything; being truthful, upfront and frank; not keeping things from you that you should know
SAMPLE WORKFLOW FOR PATIENT SATISFACTION QUESTIONNAIRE

ANNUALLY
TASK | BY WHOM | WHEN
Review/update survey form | Program Director, Manager, Site Coordinators | May
Translate form into Spanish and Chinese and estimate numbers of each needed for each clinic | SFGH Coordinator; Directors, Manager | May
Order scannable forms | Program Director, Manager, Site Coordinators | May
Determine months for survey administration and inform clinic staff and residents | Program Director, Manager, Site Coordinators | June/July

EVALUATION MONTH
TASK | BY WHOM | WHEN
Send a reminder to the continuity clinic staff (SFGH & MZ), surgical coordinator at SFGH, and nurses in the gyn clinic at MZ that patient satisfaction surveys will be administered next month | Site Coordinators | One week prior to the beginning of the survey months (September and March)
Distribute forms to clinics | Site Coordinators | Within one week prior to the beginning of the survey months (September and March)
Present forms to all patients in continuity clinics, post-operative visits (SFGH) and gyn clinic (MZ) | Medical Assistants in continuity clinics; surgical coordinator at SFGH; and nurses in the MZ gyn clinic | During the two survey months (September and March)
Create and place box for surveys at checkout counters at continuity clinics, post-operative visits (SFGH) and gyn clinic (MZ) | Site Coordinators with nursing staff | First day of each survey month (September and March)
Collect forms from boxes | Site Coordinators | Weekly during the month, with follow-up for stragglers the following month
Scan or enter surveys into database | E-Value Administrator | By the 15th day of the month following the survey month

AS NEEDED
TASK | BY WHOM | WHEN
Review performance of residents | Program Directors, Advisors and Residents | Periodically and prior to semi-annual assessment meetings
Determine response rate and review process for improvement | Program Director, Manager, Site Coordinators | After data is entered
<strong>GME</strong> <strong>Evaluation</strong> <strong>Task</strong> <strong>Force</strong> <strong>Recommendation</strong><br />
PRACTICE BASED LEARNING AND IMPROVEMENT<br />
PBLI: CRITICAL APPRAISAL SKILLS<br />
Overview<br />
Critical appraisal skills are essential for finding, evaluating and translating evidence from<br />
the literature into clinical practice.<br />
Core Measure for <strong>UCSF</strong> <strong>GME</strong><br />
The Critical Appraisal Exercise is used to evaluate residents' ability to answer a focused
clinical question by searching the medical literature, summarizing their findings,
evaluating the limitations of the evidence, and discussing how their practice will change
based on the evidence gathered. It is intended for use in journal clubs, other critical
appraisal presentations, and after ad hoc searches of the literature to guide clinical
management.
Templates for critically appraising articles <strong>of</strong> specific types can be found at Oxford’s Centre<br />
for Evidence‐based Medicine: http://www.cebm.net/index.aspx?o=1157.<br />
However, these worksheets are directed toward evaluating the quality <strong>of</strong> a study and not<br />
the critical appraisal abilities <strong>of</strong> the learner.<br />
Reliability and Validity<br />
The number of evaluators and assessment opportunities needed to assure adequate reliability is
unknown. Content validity can be established to the extent that key components of critical
appraisal are included in the checklist rating form.
Administration<br />
o Timing: Depends on the importance the program places on developing critical appraisal skills.
Programs should assign a certain number of exercises to be completed satisfactorily within a
specified time frame.
o Who Performs: Designated experts in critical appraisal. Generally, evaluators<br />
would be faculty although experts in library science could evaluate the learners’<br />
search strategies. Learners can also assess their own performance and compare it<br />
to the faculty rating.<br />
o Format: an exercise with instructions completed by the resident, followed by a
checklist of 8 components completed by the faculty member
o Scoring Criteria and Training: No faculty training is required. The checklist<br />
components are judged as Yes/No.<br />
o Documentation: At minimum, twice annually as part <strong>of</strong> semi‐annual review<br />
meetings.<br />
Uses <strong>of</strong> the <strong>Evaluation</strong> Data<br />
o Formative Feedback: concurrent or same day written evaluation and debriefing<br />
o Tracking Resident Learning: individualized learning plans discussed with a<br />
mentor and skill development over time<br />
o Assess Program Effectiveness: aggregating data across all residents or by PGY<br />
group<br />
o Summative Uses: depending on how important critical appraisal skills are to the<br />
training program, poor performance could trigger remediation and affect<br />
promotion or progress decisions<br />
University <strong>of</strong> California, San Francisco<br />
Core Measure for <strong>UCSF</strong> <strong>GME</strong>: Critical Appraisal (PBLI)<br />
Critical Appraisal Exercise (PBLI)<br />
Name of Presenter _____________________________________________  Date _____________________________

Clinical Question(s)

What was learned

Search Strategy and Search Terms (MedLine, Cochrane, Textbook, on-line ref)

Level of Evidence*

Magnitude of Expected Effect (e.g. # needed to treat for benefit & harm)

Do you plan to change your practice? How?

*LEVEL OF EVIDENCE
GOOD  Large randomized trials with clear-cut results (and low risk of error)
FAIR  Small randomized trials with uncertain results (and moderate to high risk of error) or nonrandomized trials with concurrent or contemporaneous controls
POOR  Nonrandomized trials with historical controls or case series with no controls

Goal Met
1 - Refined question to be clinically focused and relevant  ☐ Yes  ☐ No
2 - Used logical, focused search strategy  ☐ Yes  ☐ No
3 - Summarized study design and findings into clinically relevant metric (e.g. NNT)  ☐ Yes  ☐ No
4 - Critically appraised the study(ies) and identified strengths and threats to validity  ☐ Yes  ☐ No
5 - Discussed applicability of study findings to patient population or context at hand  ☐ Yes  ☐ No
6 - Considered health policy implications of findings (e.g. feasibility, cost, harms)  ☐ Yes  ☐ No
7 - Discussed limitations of current evidence  ☐ Yes  ☐ No
8 - Discussed areas of future research  ☐ Yes  ☐ No

Evaluator Signature ____________________________________ _______________________________

Modified from the CREOG Competency Task Force
Please return this form to either the SFGH, UCSF or MZ Residency Program Coordinators
<strong>GME</strong> <strong>Evaluation</strong> <strong>Task</strong> <strong>Force</strong> <strong>Recommendation</strong><br />
PRACTICE BASED LEARNING AND IMPROVEMENT<br />
PBLI: TEACHING SKILLS<br />
Overview<br />
Per ACGME Common Program Requirements, teaching skills are a component of PBLI
(“participate in the education of patients, families, students, residents and other health
professionals”) and ICS (“communicate effectively with physicians, other health
professionals, and health related agencies”).
If respect for learners or patients is included, teaching evaluations also provide evidence of a
component of Professionalism: “compassion, integrity and respect for others.”
Core Measure for <strong>UCSF</strong> <strong>GME</strong><br />
Observation is the primary method by which clinical educators are rated. The SOM Clinical<br />
Educator Teaching <strong>Evaluation</strong> Form was developed at <strong>UCSF</strong> as a global assessment<br />
conducted at the end <strong>of</strong> a clinical rotation to assess the quality <strong>of</strong> medical student teaching<br />
by residents and faculty. This form consists of 19 items: 11 items on a 5-point Likert-type
scale, 4 narrative/open-ended items, and 4 items triggered only if low scores are
received on certain critical items on the form. However, experience and internal studies of
the form indicate it can be shortened without loss of reliability. We recommend this
shorter form as our core measure of clinical teaching.
Reliability and Validity<br />
Many instruments have been developed to measure clinical teaching effectiveness. Most of
these instruments measure a global teaching effectiveness score along with interpersonal
and clinical teaching/pedagogic practice dimensions, and have high internal consistency.
Each item in the SOM Clinical Educator form includes detailed anchors illustrating each
point on the 5-point scale (1=poor, 5=excellent). Because of the internal consistency of these
forms, a shorter item set has adequate reliability and content validity. Our proposed core
measure includes: conveyance of information, teaching enthusiasm, direction and feedback,
promotion of critical thinking, treat me with respect, treat others with respect, and overall
teaching effectiveness. Research has recommended that scales be tailored to the learner
(medical student vs. resident) and setting (e.g. outpatient vs. inpatient); hence, additional
items may be included but should be similar in format to the other items and include clear
anchors.
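As a sketch of how a program might verify that a shortened item set retains internal consistency, Cronbach’s alpha can be computed from per-evaluation item scores. This is an illustrative calculation, not the SOM’s actual analysis; the sample ratings below are invented:

```python
from statistics import pvariance

def cronbach_alpha(ratings: list[list[int]]) -> float:
    """ratings: one row per completed evaluation, one column per form item."""
    k = len(ratings[0])                              # number of items
    item_variances = [pvariance(item) for item in zip(*ratings)]
    total_variance = pvariance([sum(row) for row in ratings])
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

# Invented 4-item, 4-evaluation example on the 1-5 scale.
ratings = [
    [5, 4, 5, 4],
    [3, 3, 4, 3],
    [4, 4, 4, 4],
    [2, 3, 2, 3],
]
print(round(cronbach_alpha(ratings), 2))  # high alpha: items move together
```

Values around 0.7 or above are conventionally read as adequate internal consistency for a short scale.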
Current Procedures by the SOM<br />
o Administration: Practices and evaluation frequency for faculty/residents as clinical
teachers vary by (SOM) clerkship. Some rotations require a certain
number or type of interactions to occur before the form is assigned; others ask
learners to designate the faculty/residents they worked with, after which the form is
assigned.
o Dissemination: The forms are disseminated by each department; however, the
(SOM) Office of Medical Education centrally oversees policies surrounding the
use of a standardized form, user management, reporting, and procedural record
keeping.
o Reporting: Faculty and residents are able to view their own evaluation in real time.<br />
The SOM Office <strong>of</strong> Medical Education annually reports on aggregate faculty and<br />
resident teaching scores for each clerkship by site.<br />
Administration by <strong>GME</strong> programs<br />
o Frequency: It is recommended that teaching be evaluated after a designated<br />
number <strong>of</strong> interactions between teacher and learner. The number <strong>of</strong> interactions is<br />
dependent on the length <strong>of</strong> the rotation and should be designated accordingly.<br />
o Who Performs: The learners (students, more junior residents).<br />
o Scoring Criteria and Training: It is recommended that the form be publicly visible
and that evaluators see the scoring/rating criteria in advance so that they know what
they are rating about their instructors. There is no training associated with the use
of this evaluation.
o Documentation: Twice annually as part <strong>of</strong> semi‐annual review meetings.<br />
Use <strong>of</strong> Data<br />
How assessment results are used depends on the program. Timely feedback, both
written and oral, between teachers and program directors will help encourage good
teachers as well as remediate and improve teaching. It is recommended that certain
critical items on the form (e.g. teaching effectiveness, respect) create low-score triggers.
These triggers should prompt additional evaluation items, closed-ended or narrative, to
allow the evaluator to elaborate on the low scores. A low score on any of the items,
particularly critical items, should trigger remediation.
o Formative uses: Most important use as part <strong>of</strong> mentored review <strong>of</strong> progress,<br />
guiding individualized learning plans<br />
o Summative uses: Usually not unless low scores contribute to a pattern <strong>of</strong> difficulty<br />
in one or more competency areas<br />
o Program benchmarking: Yes ‐ as % <strong>of</strong> residents and faculty achieving a criterion<br />
goal or standard for direct teaching performance<br />
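The low-score trigger logic described above can be sketched as follows. The item names and threshold are illustrative only, not the SOM form’s actual field names:

```python
# Critical items whose low scores should prompt follow-up questions.
CRITICAL_ITEMS = {"overall_teaching_effectiveness", "treated_me_with_respect"}
TRIGGER_THRESHOLD = 2  # on the 5-point scale, scores of 1-2 prompt follow-up

def follow_up_items(scores: dict[str, int]) -> list[str]:
    """Return follow-up prompts for each critical item scored at or below threshold."""
    return [
        f"Please elaborate on the low score for '{item}'."
        for item in sorted(CRITICAL_ITEMS)
        if scores.get(item, 5) <= TRIGGER_THRESHOLD
    ]

# A rating with one low critical score yields one follow-up prompt.
print(follow_up_items({"overall_teaching_effectiveness": 4,
                       "treated_me_with_respect": 2}))
```

In an online evaluation system the same check would typically run when the form is submitted, revealing the extra closed-ended or narrative items only when triggered.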
Optional Items<br />
The <strong>GME</strong> <strong>Evaluation</strong> TF recommends that all <strong>GME</strong> programs use the SOM short form as a<br />
core set to facilitate benchmarking for individual programs and the <strong>School</strong>. Review <strong>of</strong> the<br />
other evaluation tools revealed potentially useful items that programs may choose to add to<br />
the basic form. We have included these as “Item Bank Recommendations.” Remember,
ultimately what matters most is overall teaching effectiveness and the comments.
Item Bank for Optional Use:<br />
1. During this time I personally interacted with or observed the resident and base this<br />
evaluation on (very concrete item based on hours <strong>of</strong> contact)<br />
2. Refers learner to pertinent references for further reading<br />
3. Reviews exam findings with learner<br />
4. Discusses differential diagnosis and work‐up with learner<br />
5. Reviews treatment options with learner<br />
6. Provides follow‐up to learners on interesting cases<br />
7. Takes time to stress proper surgical technique<br />
8. Discusses rationale for surgical judgment
9. Please rate how well this resident emphasized problem solving (i.e. thought process<br />
leading to decisions)<br />
Other Notes:<br />
o Neurological Surgery had each <strong>of</strong> their questions categorized by AC<strong>GME</strong><br />
competencies. This was nice – easy to track later.<br />
o The IM Cardiology UC Consult form was nicely tailored to the specialty and type<br />
<strong>of</strong> education.<br />
o The LEAH Fellowship form was nice and brief, although we would recommend a
five-point scale and spelling out teaching effectiveness.
Specific Assessment <strong>of</strong> Lecturing Skills<br />
Programs may choose to assess the development <strong>of</strong> residents’ didactic teaching skills, either<br />
alone or in conjunction with assessment <strong>of</strong> critical appraisal skills (e.g., after a lecture<br />
reviewing the evidence on a specific clinical question). The Teaching Observation Form<br />
developed by the Academy <strong>of</strong> Medical Educators is an excellent source <strong>of</strong> tailored and<br />
structured feedback by a trained peer or faculty evaluator. The resident giving the lecture<br />
would meet in advance with the trained observer and prioritize components <strong>of</strong> the lecture<br />
for feedback. Following the lecture, structured feedback would be shared, including future
plans for improvement.
Quantitative measures <strong>of</strong> lecture quality also exist. An example is provided that asks<br />
students to rate 8 attributes <strong>of</strong> the instructor on a 5‐level scale from strongly disagree to<br />
strongly agree. This sort <strong>of</strong> measure may be more appropriate for following the<br />
development <strong>of</strong> didactic teaching skills than the Clinical Educator Teaching <strong>Evaluation</strong><br />
Form, which addresses global teaching performance in the clinical context.<br />
University <strong>of</strong> California, San Francisco<br />
Core Measure for <strong>UCSF</strong> <strong>GME</strong>: Clinical Teaching (PBLI)<br />
N.B. This measure comprises a core subset <strong>of</strong> items included in the SOM Clinical Educator<br />
Teaching <strong>Evaluation</strong> Form used to assess medical student teaching by residents and faculty.<br />
<strong>Evaluation</strong> information entered here will be made available to the evaluated person in<br />
anonymous and aggregated form only.<br />
Please rate your instructor's ability to do the following:

Conveyance of Information
Convey information clearly.
   Insufficient contact to judge.
1   Poor communication skills, conveying information in unclear manner or consistently failing to communicate important points to students.
2-3   Good communication skills; usually conveys information in a clear, comprehensive manner.
4-5   Excellent communication skills; consistently conveys information in exceptionally clear, comprehensive manner.

Teaching Enthusiasm
Provide enthusiastic and stimulating teaching.
   Insufficient contact to judge.
1   Lack of enthusiasm for teaching students; does not stimulate students’ interest or curiosity in clinical setting.
2-3   Usually enthusiastic about teaching; maintains an interest in students’ learning.
4-5   Consistently enthusiastic about teaching; outstanding at stimulating students’ interest in learning.

Direction and Feedback
Provide direction and feedback.
   Insufficient contact to judge.
1   Does not define expectations; fails to provide student with direction or feedback about clinical performance; devotes little time or attention to helping students improve.
2-3   Discusses expectations; provides some direction and feedback about clinical performance; devotes adequate time and attention to helping students improve.
4-5   Provides clear guidelines about expectations; provides specific, useful feedback to student verbally about strengths and areas for improvement; exceptional level of time and attention devoted to helping students improve.

Promotion of Critical Thinking
Promote critical thinking.
   Insufficient contact to judge.
1   Does not discuss clinical reasoning and knowledge of underlying mechanisms of disease with students; does not encourage use of the literature to improve patient care or pursue self-directed learning.
2-3   Promotes critical thinking through clinical reasoning, emphasis on underlying mechanisms of disease, and use of the literature to improve patient care and encourage self-directed learning.
4-5   Exceptional ability to promote critical thinking through clinical reasoning, emphasis on the underlying mechanisms of disease, and use of the literature to improve patient care and encourage self-directed learning.
Treat me with Respect
I was treated with respect by this individual.
   Insufficient contact to judge.
1   This individual consistently failed to treat me with respect and generally displayed an unprofessional or abusive manner during all interactions.
2   This individual treated me with respect approximately half of the time; displayed an unprofessional or disrespectful manner during the remainder of the time.
3   This individual treated me with respect most of the time.
4   This individual treated me with respect almost always.
5   This attending consistently treated me with respect throughout the rotation.
Treat me with Respect - Reasons<br />
If you answered 2 or below on the previous question, please indicate in which way(s) you were not<br />
treated with respect by this educator or resident. (Mandatory for answers <strong>of</strong> 2 or below on the<br />
previous question.)<br />
Belittled or humiliated me<br />
Spoke sarcastically or insultingly to me<br />
Intentionally neglected or left me out <strong>of</strong> the communications<br />
Subjected me to offensive sexist remarks or names
Subjected me to racist or ethnically offensive remarks or names
Engaged in discomforting humor
Denied me training opportunities because of my gender
Required me to perform personal services (e.g. babysitting, shopping)
Threw instruments/bandages, equipment etc.<br />
Threatened me with physical harm (e.g. hit, slapped, kicked)<br />
Created a hostile environment for learning<br />
Other<br />
Treat me with Respect - Other<br />
If you chose other in the previous question, please explain in the comment section below.<br />
Treat Others with Respect
I observed others (students, residents, staff, patients) being treated with respect by this individual.
   Insufficient contact to judge.
1   This individual consistently failed to treat others with respect and generally displayed an unprofessional or abusive manner during all interactions.
2   This individual treated others with respect approximately half of the time; displayed an unprofessional or disrespectful manner during the remainder of the time.
3   This individual treated others with respect most of the time.
4   This individual treated others with respect almost always.
5   This attending consistently treated others with respect throughout the rotation.

Treat Others with Respect - Reasons
If you answered 2 or below on the previous question, please indicate in which way(s) patients or health professionals were not treated with respect by this educator or resident. (Mandatory for answers of 2 or below on the previous question.)
Patients - Discussed confidential information in an inappropriate setting (e.g. cafeteria, elevator)<br />
Patients - Made derogatory or disrespectful comments about a patient or family<br />
Patients - Treated patients differently because <strong>of</strong> their financial status, ethnic background, religious<br />
preferences or sexual orientation<br />
Patients - Threw instruments/bandages, equipment etc.<br />
Patients - Created a hostile environment for patient care and/or learning<br />
Health Professionals - Made derogatory or disrespectful comments about some health professionals
Health Professionals - Treated health professionals differently because of their financial status, ethnic background, religious preferences or sexual orientation
Health Professionals - Made offensive sexist, racist, or ethnically insensitive remarks/names about some health professionals
Other<br />
Treat Others with Respect - Other<br />
If you chose other in the previous question, please explain in the comment section below.<br />
Teaching Skills, Overall
Overall teaching effectiveness.
   Insufficient contact to judge.
1   This attending was an overall poor teacher, either due to inadequate time spent teaching medical students, ineffective style, or unprofessional manner.
2-3   This attending was an overall good teacher through dedication of adequate time to teaching and a generally effective style.
4-5   This attending was an overall excellent teacher through dedication of time to teaching and a highly effective style, enabling significant skill development throughout the rotation.
Instructor Strengths<br />
What are the strengths <strong>of</strong> this instructor? (These comments will be viewed by the instructor, but will be<br />
anonymous and aggregated. For comments to be effective feedback, please be direct, specific, and<br />
constructive. General comments such as “good instructor” are too non-specific to be <strong>of</strong> value.)<br />
Instructor Improvements<br />
How could this instructor improve? (These comments will be viewed by the instructor, but will be<br />
anonymous and aggregated. For comments to be effective feedback please be direct, specific, and<br />
constructive. General comments such as “bad instructor” are too non-specific to be <strong>of</strong> value.)<br />
Confidential Comments, Educator<br />
This area is for giving constructive or corrective feedback that you don't feel comfortable giving directly.<br />
These comments are CONFIDENTIAL and will NOT go directly to the educator. They will be forwarded<br />
ANONYMOUSLY to the program director(s). Please be thoughtful, pr<strong>of</strong>essional, and constructive in your<br />
feedback.<br />
OPTIONAL CONFIDENTIAL COMMENT<br />
If you are willing to be contacted by the clerkship director to address a particularly concerning issue, please
include your name and contact information below. This will go only to the clerkship director and/or the site
director, with the goal of appropriately addressing the raised concerns.
<strong>UCSF</strong> Academy <strong>of</strong> Medical Educators<br />
TOP Observation Form Lecture<br />
NAME: _____________________________________ OBSERVER: ___________________________<br />
TOPIC: ______________________________________________________________________________<br />
FOCUS OF OBSERVATION (discuss w/ faculty in advance):<br />
INTRODUCTION   OBSERVATIONS
1. Introduced topic, stated objectives, offered preview.
2. Gained attention and motivated learning.
3. Established climate for learning and for participation.
BODY OF LECTURE OBSERVATIONS<br />
4. Presented 3 – 5 main points in clear and<br />
organized fashion.<br />
5. Provided supporting materials, examples, and<br />
summaries.<br />
6. Content level<br />
7. Effectively used visuals, handouts, and/or<br />
demonstrations. Include AV problems (if any),<br />
effective use <strong>of</strong> slides (set stage for each slide,<br />
focused audience on important parts <strong>of</strong> slides),<br />
use <strong>of</strong> pointer.<br />
8. Varied presentations (Used blackboard, slides,<br />
visuals).<br />
9. Transitions between topics.<br />
CONCLUSION OBSERVATIONS<br />
10. Summarized major principles, key points<br />
without introducing new materials.<br />
11. Provided closure or stimulated further thought.
TEACHER DYNAMICS OBSERVATIONS<br />
12. Exhibited enthusiasm and stimulated interest in<br />
content.<br />
13. Used appropriate voice, gestures, movement, and eye contact. Avoided unconscious use of repeated words (e.g. “um”, “ok”).
14. Encouraged active participation.
15. Used questions to stimulate thought and<br />
discussion. Response to questions (repeated or<br />
rephrased question, concise answer).<br />
DEBRIEF<br />
1. ELICIT SELF-ASSESSMENT BY MENTEE FIRST.<br />
2. SUMMARIZE YOUR ASSESSMENT OF MENTEE’S STRENGTHS AND YOUR RECOMMENDATIONS<br />
(KEEP IN MIND AREAS OF FOCUS).<br />
STRENGTHS   RECOMMENDATIONS
1.
2.
3.
3. ACTION PLAN (RESIDENT TO COMPLETE)
Instructor Student Rating Form<br />
Instructions: Please fill in the bubble that best describes your rating using the following scale:<br />
1 = Strongly Disagree<br />
2 = Disagree<br />
3 = Unsure<br />
4 = Agree<br />
5 = Strongly Agree<br />
Core Items                                                                 1 2 3 4 5
1. Organization: Instructor presented material systematically and sequentially.   O O O O O
2. Clarity: Instructor communicated effectively, presented content clearly, and gave understandable responses to questions.   O O O O O
3. Enthusiasm: Instructor effectively stimulated learner interest.   O O O O O
4. Contributions: Instructor discussed recent developments in the field, directed students to current reference materials, and provided additional materials to cover current topics.   O O O O O
5. Rapport: Instructor listened attentively, was interested in students’ progress, and provided constructive feedback.   O O O O O
6. Professional Characteristics: Instructor demonstrated respect for students, cultural awareness, respect for health professions, and other aspects of professionalism.   O O O O O
7. Attitude: Instructor was concerned about students learning the material, encouraged class participation, and respected differing views.   O O O O O
8. Overall: I would rate this instructor as excellent.   O O O O O

Revised 6/10/05
ADDITIONAL RECOMMENDATIONS<br />
OF THE<br />
EVALUATION TASK FORCE<br />
<strong>GME</strong> <strong>Evaluation</strong> <strong>Task</strong> <strong>Force</strong> <strong>Recommendation</strong><br />
MEDICAL KNOWLEDGE<br />
Overview<br />
Multiple-choice question (MCQ) examinations of medical knowledge are available from specialty
boards, professional societies and program director organizations. In many cases they are
required as in-service examinations to track residents’ progress and set the stage for written
board certification examinations. Although MCQs provide valid assessments of medical
knowledge, oral examinations and presentations of clinical cases are considered better
measures of clinical reasoning.
Reliability and Validity<br />
Generally, MCQs are developed to achieve high internal-consistency reliability for their
subscales. Content validity is maximized by assuring that the examination items sample the
full range of core knowledge. Construct validity is demonstrated when higher-level
residents achieve a greater percentage correct than lower-level residents. Although performance on one
MCQ examination tends to predict performance on subsequent ones, the evidence correlating MCQ
performance with other aspects of medical knowledge, such as clinical reasoning, is mixed.
Administration<br />
o Timing: usually once per year<br />
o Who Performs: generally a secure examination administered by staff according to<br />
guidelines <strong>of</strong> the in‐service examination<br />
o Format: Each item contains an introductory statement or ‘stem’ followed by four or five
response options, only one of which is correct. The stem usually presents a patient case, clinical
findings, or graphically displayed data. A typical half-day examination has 175 to 250 test
questions.
o Scoring Criteria and Training: Completed exams are generally returned to the<br />
organization that provides the test for scoring. Score reports can include raw % correct,<br />
scores standardized for PGY level, and subscores in key content areas.<br />
o Documentation: Assessment of the medical knowledge competency must be documented at least
twice a year at the semi-annual review meetings. MCQ performance can inform one of the
meetings, and other knowledge assessments (global evaluations, assessments of clinical
reasoning, progress toward reading goals from last year’s MCQ) can inform both meetings.
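As an illustration of “scores standardized for PGY level,” a resident’s raw % correct can be expressed as a z-score against peers at the same PGY level. This is a hypothetical sketch; the data, names, and function are invented, not an actual score report format:

```python
from statistics import mean, stdev

def standardize_by_pgy(results: list[tuple[str, int, float]]) -> dict[str, float]:
    """results: (resident, pgy_level, pct_correct) -> z-score vs. same-PGY peers."""
    by_level: dict[int, list[float]] = {}
    for _, level, pct in results:
        by_level.setdefault(level, []).append(pct)
    z_scores = {}
    for name, level, pct in results:
        peers = by_level[level]
        z_scores[name] = (pct - mean(peers)) / stdev(peers)
    return z_scores

# Invented in-service results: three PGY-1 and three PGY-2 residents.
results = [("A", 1, 62.0), ("B", 1, 70.0), ("C", 1, 66.0),
           ("D", 2, 74.0), ("E", 2, 80.0), ("F", 2, 71.0)]
print(standardize_by_pgy(results))
```

National score reports typically perform the same normalization against the national cohort at each PGY level rather than program peers.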
Uses <strong>of</strong> the Data<br />
o Comparing the test scores on in‐training examinations with national statistics can serve to<br />
identify strengths and limitations <strong>of</strong> individual residents to help them improve.<br />
o Summative Decisions: MCQ performance falling short <strong>of</strong> a minimum passing threshold<br />
could delay or prevent a resident from advancing or graduating. Generally, however, such<br />
decisions should be based on overall assessments <strong>of</strong> medical knowledge including clinical<br />
reasoning.<br />
o Remediation Threshold: Programs should communicate in advance what level of MCQ performance
would require remediation. The threshold for remediation may be determined by a national
or local standard for passing performance, or by a score that portends difficulty passing the
board certification examination. Generally, a specific program of study would be established
to close gaps in knowledge, and progress would be assessed in the short term using written or oral
examinations. [N.B. Because in-service examinations are often administered only once
annually, programs may need to rely on other measures of progress.]
Comparing test results aggregated for residents in each year of a program can help
identify residency training experiences that might be improved.
<strong>GME</strong> <strong>Evaluation</strong> <strong>Task</strong> <strong>Force</strong> <strong>Recommendation</strong><br />
SYSTEMS‐BASED PRACTICE<br />
The evaluation of systems-based practice poses unique challenges because of the variation in
learning and assessment opportunities among GME programs. We recommend that
programs first identify where residents are routinely called upon to demonstrate systems-based
practice skills and then select methods for evaluating and documenting those skills.
We list examples of these activities and how they might be assessed.
Morbidity and Mortality Conference Presentation
These conferences occur regularly in many residency programs and include resident
presentations of clinical cases with an assessment of how care could be improved. If this
assessment routinely includes an analysis of the system's role in preventing medical errors
and harms, the resident's analysis can be evaluated using a form or checklist.
Quality Improvement Meeting
Resident involvement in quality improvement activities is an ACGME requirement. If
residents routinely participate in a clinical quality improvement meeting during a particular
PGY year or rotation, their participation can be assessed by peers or supervisors on the
committee.
Cost-Effective Care Practices
These skills are demonstrated when residents make decisions about screening tests,
diagnostic tests, treatments, and site of care. If residents routinely present their care
plans to faculty in rounds, morning report, conferences (pre-op, tumor board), or other
venues, the faculty can evaluate the degree to which the resident demonstrates an
awareness of cost-effectiveness.
Quality Improvement Project
There are many venues where individual residents or groups of residents can identify a
process or outcome they wish to improve, develop a plan that includes consideration of
system issues, implement the plan, and measure its success. The full project or any of its
components could be assessed by experts in quality improvement. The Quality Improvement
Project Assessment Tool (QIPAT-7) is provided below as an example of how a resident
project can be assessed. The QIPAT-7 was developed based upon the input of national QI
experts, and its reliability has been demonstrated in an evaluation of 45 resident QI
proposals (Leenstra 2007).
The HealthCare Matrix from Vanderbilt uses the quality aims from the Institute of Medicine
together with the ACGME competencies to assess and improve care. The Matrix is described
further in the following abstract from the ACGME eBulletin of December 2006, pages 10-11
(Bingham 2005).
Using a Healthcare Matrix to Assess Care in Terms of the IOM Aims and the ACGME
Competencies
Doris Quinn PhD, John Bingham MHA, Vanderbilt University Medical Center
The study assessed how residents and faculty are using the HealthCare Matrix to assess and
improve care. Whether care is safe, timely, effective, efficient, equitable, or patient-centered
is juxtaposed against the ACGME competencies. When care is assessed in this manner,
learning the competencies becomes very relevant to the outcomes of care. The abstract
presented the work of internal medicine residents who, on their Ambulatory Rotation:
1) utilized the Matrix to assess the care of their patients; 2) demonstrated use of QI tools to
improve care; and 3) improved publicly reported metrics for AMI and CHF by focusing in
particular on systems-based practice and practice-based learning and improvement.
Residents first utilize the Matrix to assess the care of one of their patients. Then, as a group,
they choose a publicly reported metric and complete matrices for a panel of patients. The
data from the matrices inform residents as to where more information or improvement is
needed. This becomes the basis for an improvement project, which is ultimately presented
to senior leaders. To date, residents have improved the care of patients with pneumonia,
coronary artery disease, and diabetes, and processes including obtaining consults, the VA
phone Rx system, and others. Public metrics of quality from CMS, JCAHO, and Leapfrog are
utilized in the assessment. When the ACGME competencies are combined with the IOM
aims and used to assess and improve care of patients in "real time", developing the
competencies becomes "the way residents learn" and not a burden or "add-on". This process
allows residents, who are the most knowledgeable about workarounds and flaws in the
system, to use their experience to improve care. Residents, faculty, the institution, and, most
importantly, the patients benefit.
References
Bingham JW, Quinn DC, Richardson MG, Miles PV, Gabbe SG. Using a healthcare matrix to
assess patient care in terms of aims for improvement and ACGME core competencies. Joint
Commission Journal on Quality and Patient Safety 2005;32(2):98-105.
Leenstra JL, Beckman TJ, Reed DA, Mundell WC, Thomas KG, Krajicek BJ, Cha SS,
Kolars JC, McDonald FS. Validation of a method for assessing resident physicians' quality
improvement proposals. J Gen Intern Med. 2007 Sep;22(9):1330-4. Epub 2007 Jun 30.
GME Evaluation Task Force Recommendation
PROGRESS REPORT FOR SEMI-ANNUAL MEETING
The ACGME requires a semi-annual performance review meeting to discuss each resident's
progress toward goals and objectives for the year. As programs implement more resident
evaluations, it will be essential to summarize the data on key outcomes and compare
performance to the resident's goals or benchmarks. The Task Force recommends a progress
report following the template illustrated below. The progress report should be prepared in
advance of the semi-annual meeting, ideally by having evaluation data electronically
populate the Performance Measures for each resident. Residents would review the report
and come prepared to discuss their progress with the program director or designee. If a
portfolio-based system is used, the resident would also provide evidence of learning they
have selected for showcasing.
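The electronic population step might be sketched as follows; the row structure, sample values, and the Met/Not Met rule (value at or above goal) are illustrative assumptions, not a prescribed implementation:

```python
# Sketch: populate the "Performance Relative to Standard" column of a
# progress report from evaluation data. Field names and sample values
# are hypothetical; a real system would pull them from the program's
# evaluation database.

def performance_relative_to_standard(value: float, goal: float) -> str:
    """'Met' when the measured value is at or above the standard (goal)."""
    return "Met" if value >= goal else "Not Met"

rows = [
    # (performance measure, mean or count, standard/goal)
    ("Global Evaluation: Medical Knowledge", 6.8, 5.0),
    ("MCQ Overall Score", 220, 200),
    ("Case Logs: Procedure C", 23, 30),
]

for measure, value, goal in rows:
    verdict = performance_relative_to_standard(value, goal)
    print(f"{measure}: {value} vs {goal} -> {verdict}")
```

Generating the report this way, rather than by hand, makes it practical to prepare one for every resident in advance of each semi-annual meeting.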
PROGRESS REPORT FOR SEMI-ANNUAL MEETING

Evidence of Resident Learning, including ACGME Competencies

| Area / Evaluation Instrument | Definition | Mean (SD) or Percent | Standard (Goal) | Performance Relative to Standard |
| --- | --- | --- | --- | --- |
| Global Evaluations | 9-option items completed after each rotation | | | |
| - Medical Knowledge | | 6.8 (1.1) | 5 | Met |
| - Patient Care | | 7.1 (0.9) | 5 | Met |
| - Professionalism | | 7.6 (0.8) | 5 | Met |
| - Interpersonal & CS | | 8.3 (0.6) | 5 | Met |
| - Practice-Based Learning | | 7.0 (1.0) | 5 | Met |
| - Systems-Based Practice | | 6.2 (0.5) | 5 | Met |
| Medical Knowledge: MCQ | Standardized score where 200 is the mean and 20 is the SD, based on national norms for each PGY | | | |
| - Overall Score | | 220 | 200 | Met |
| - Subscore A | | 201 | 200 | Met |
| - Subscore B | | 224 | 200 | Met |
| - Subscore C | | 215 | 200 | Met |
| Patient Care: Mini-CEX | Average of 7 9-option items | 7.7 (1.2) | 5 | Met |
| Patient Care: Focused Assessment of Skills | Percent of checklist items performed correctly | | | |
| - Procedures | | 82% | 70% | Met |
| - Communication | | 90% | 90% | Met |
| Prof / ICS: Team | Average of 9 9-option items | 8.2 (0.7) | 5 | Met |
| Prof / ICS: Self | Average of 9 9-option items | 8.2 (0.7) | 5 | Met |
| Prof / ICS: Patients | Average of 10 5-option items | 4.5 (0.2) | 4.0 | Met |
| PBLI - Student teaching | Average of 6 5-option items | 4.2 (0.5) | 4.0 | Met |
| PBLI - Critical appraisal | % checklist items met | 90% | 80% | Met |
| SBP - M&M evaluation | Average of 8 5-option items | 3.8 (1.2) | 4.0 | Met |
| Case logs | Cumulative number of cases since residency start | | | |
| - Procedure A | | 42 | 30 | Met |
| - Procedure B | | 105 | 60 | Met |
| - Procedure C | | 23 | 30 | Not Met |
| Scholarly Project Progress | PGY-specific goals met? | Yes | Yes | Met |
| Learning Plan Progress | Progress adequate since last review? | Yes | Yes | Met |

Progress report form adapted from Knight DA, Vannatta PM, O'Sullivan PS. A Process to Meet the
Challenge of Program Evaluation and Program Improvement. ACGME Bulletin 2006 (Sept): 5-8.
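The conversion behind the standardized MCQ scale above (mean 200, SD 20, referenced to national norms for each PGY level) is a simple z-score rescaling. A minimal sketch; the raw national mean and SD below are fabricated for illustration, not real in-training examination norms:

```python
# Sketch: convert a raw in-training exam result to a standardized score
# on a scale with mean 200 and SD 20, relative to national PGY-level
# norms. NATIONAL_MEAN and NATIONAL_SD are fabricated illustrative
# values.

NATIONAL_MEAN = 62.0  # hypothetical national mean percent-correct
NATIONAL_SD = 8.0     # hypothetical national SD

def standardized_score(raw_percent_correct: float) -> float:
    """Map a raw percent-correct onto the mean-200 / SD-20 scale."""
    z = (raw_percent_correct - NATIONAL_MEAN) / NATIONAL_SD
    return 200 + 20 * z

print(standardized_score(70.0))  # one SD above the national mean -> 220.0
```

A resident scoring one national SD above the mean thus reports as 220, the kind of overall score shown in the sample report.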
GME Evaluation Task Force Recommendation
CLOSING THE LOOP: ANNUAL PROGRAM REVIEW
The ACGME requires programs to conduct a formal and structured evaluation of the
curriculum at least annually. Details of this review are outlined in the ACGME Common
Program Requirements.
The annual review must include consideration of whether the program's self-determined
benchmarks have been met in the following areas: resident performance; faculty
development; graduate performance, including performance of program graduates on the
certification examination; and program quality. Action plans must be created to address
areas in which goals have not been achieved, and the faculty at large must approve the
plans of action. Goals that emerge from the annual program review in one year become
benchmarks for the review the next year, closing the loop and facilitating longitudinal,
continuous improvement of program quality.
The Task Force recommends that sufficient time be set aside for a full discussion of the
program's progress and for determination of the goals and action plan for the coming year.
Programs should not underestimate the time and effort required to conduct a systematic
review. Although the ACGME requirements can be met by convening an annual meeting
of faculty and resident representatives, many programs prefer a dedicated half-day retreat
for this comprehensive review.
Conducting a robust annual program review poses several challenges, chief among them
assembling the necessary data. Implementing the Evaluation Task Force recommendations
for ACGME competency assessment will yield resident performance data for each of the
evaluation tools. Other required components include evaluations of faculty teaching and
confidential evaluations of the program by residents and faculty. These data can be
summarized in a progress report along with other program performance measures. The
report can compare the program's outcomes to goals set in the prior year.
In this section we provide recommendations on assessment of faculty teaching, program
evaluations by residents and faculty, and methods for selecting and reporting the data
essential to include in the annual program review.
GME Evaluation Task Force Recommendation:
CONFIDENTIAL RESIDENT EVALUATION OF FACULTY TEACHING
Recommended Assessment Tool
Observation is the primary method by which clinical educators are rated. The SOM Clinical
Educator Teaching Evaluation Form was developed at UCSF as a global assessment
conducted at the end of a clinical rotation to assess the quality of medical student teaching
by residents and faculty. The form consists of 19 items: 11 rated on a 5-point Likert-type
scale, 4 narrative/open-ended items, and 4 items triggered only if low scores are received
on certain critical items. However, experience and internal studies of the form indicate it
can be shortened without losing reliability. We recommend this shorter form as our core
measure of clinical teaching.
Reliability and Validity
Many instruments have been developed to measure clinical teaching effectiveness. Most
measure a global teaching-effectiveness score along with interpersonal and clinical
teaching/pedagogic dimensions, and most have high internal consistency.
Each item in the SOM Clinical Educator form includes detailed anchors illustrating each
point on the 5-point scale (1 = poor, 5 = excellent). Because of the internal consistency of
these forms, a shorter item set has adequate reliability and content validity. Our proposed
core measure includes: conveyance of information, teaching enthusiasm, direction and
feedback, promotion of critical thinking, treating the resident with respect, treating others
with respect, and overall teaching effectiveness. Research recommends that scales be
tailored to learner level (medical student or resident) and setting (e.g., outpatient vs.
inpatient); hence, additional items may be included but should be similar in format to the
other items and include clear anchors.
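Internal consistency of an item set is conventionally quantified with Cronbach's alpha, which is one way a program could verify that a shortened form retains adequate reliability. A minimal sketch of the computation, using fabricated ratings data:

```python
# Sketch: Cronbach's alpha for a set of evaluation-form items, the usual
# statistic behind "internal consistency." The ratings matrix below is
# fabricated for illustration (4 respondents x 3 items on a 1-5 scale).

def cronbach_alpha(ratings):
    """ratings: list of respondents, each a list of item scores."""
    k = len(ratings[0])   # number of items
    def var(xs):          # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([r[i] for r in ratings]) for i in range(k)]
    total_var = var([sum(r) for r in ratings])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

ratings = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2]]
alpha = cronbach_alpha(ratings)
print(round(alpha, 2))  # high alpha for this fabricated, internally consistent data
```

In practice, an alpha in roughly the 0.7-0.9 range is usually considered adequate for this kind of scale; a shortened form that stays in that range supports the claim that items can be dropped without losing reliability.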
Administration by GME Programs
o Frequency: It is recommended that faculty teaching be evaluated after a designated
number of interactions between faculty and residents; the number of interactions
depends on the length of the rotation and should be designated accordingly.
Evaluating immediately after a clinical rotation ends would be optimal, and the
minimum frequency would be twice annually.
o Who Performs: Residents.
o Scoring Criteria and Training: It is recommended that the form be publicly visible
and that evaluators know the scoring criteria in advance so that they know what they
are rating about their instructors. Faculty should also be aware of the criteria by
which their teaching will be judged. There is no training associated with the use of
this evaluation.
o Documentation: Annually as part of the program review meeting, and otherwise as
indicated for faculty mentorship meetings, promotion and advancement meetings, etc.
Use of Data
Each department and division will establish its own practices for reviewing faculty teaching
evaluations. However, the program director is responsible for deciding which faculty
members will teach the residents in the program. The ACGME requires an annual program
review meeting at which the aggregate data on faculty teaching are reviewed and compared
with the program's internally defined standards. Action plans are required if standards are
not met.
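As a sketch of how this aggregate review might be automated, the following illustrates per-faculty aggregation with a minimum-count rule to protect resident anonymity; the threshold of 5 evaluations, the internal standard of 4.0 on a 1-5 scale, and the names are all hypothetical assumptions, not UCSF policy:

```python
# Sketch: aggregate confidential teaching evaluations per faculty member
# for the annual program review. MIN_EVALUATIONS (anonymity suppression)
# and PROGRAM_STANDARD are illustrative assumptions.

from statistics import mean

MIN_EVALUATIONS = 5     # hypothetical anonymity threshold
PROGRAM_STANDARD = 4.0  # hypothetical internally defined standard (1-5 scale)

def summarize(evals):
    """evals: mapping of faculty name -> list of overall-effectiveness scores."""
    report = {}
    for faculty, scores in evals.items():
        if len(scores) < MIN_EVALUATIONS:
            # Too few evaluations to release without risking identification.
            report[faculty] = ("suppressed (too few evaluations)", None)
        else:
            avg = round(mean(scores), 2)
            verdict = ("meets standard" if avg >= PROGRAM_STANDARD
                       else "action plan required")
            report[faculty] = (avg, verdict)
    return report

report = summarize({"Dr. X": [5, 4, 4, 5, 4, 3], "Dr. Y": [3, 4]})
print(report)
```

The suppression rule reflects the report's requirement that evaluations reach faculty only in anonymous, aggregated form; the exact minimum count is a local policy decision.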
Optional Items
The GME Evaluation Task Force recommends that all GME programs use the SOM form as a
core set to facilitate benchmarking for individual programs and the School. Review of other
evaluation tools revealed potentially useful items that programs may choose to add to the
basic form; we have included these as "Item Bank Recommendations." Ultimately, the
overall teaching-effectiveness rating and the narrative comments matter most.
Item Bank Recommendations:
1. During this time I personally interacted with or observed the faculty and base this
evaluation on (a very concrete item based on hours of contact)
2. Refers resident to pertinent references for further reading
3. Reviews exam findings with resident
4. Discusses differential diagnosis and work-up with resident
5. Reviews treatment options with resident
6. Provides follow-up to the residents on interesting cases
7. Takes time to stress proper surgical technique
8. Discusses rationale for surgical judgment
9. Please rate how well this attending emphasized problem solving (i.e., the thought
process leading to decisions)
10. Monitored my stress level (this requires a grammatically different scale than the other
items; not a must, but an interesting way to monitor burnout)
Other Notes:
o Neurological Surgery categorized each of its questions by ACGME competencies,
which makes later tracking easy.
o The IM Cardiology UC Consult form was nicely tailored to the specialty and type of
education.
o The LEAH Fellowship form was appealingly brief, although we would recommend a
five-point scale and spelling out teaching effectiveness.
Faculty Development: The UCSF Academy of Medical Educators uses Teaching
Observation (TOP) Forms to provide structured feedback from a trained peer who has
observed a faculty member delivering a lecture or facilitating a small group. The forms
include no scales and are not scored; their principal use is to facilitate and tailor feedback.
The number of faculty receiving structured feedback on their teaching can be used as a
measure of faculty development for the Annual Program Review.
University of California, San Francisco
Confidential Resident Evaluation of Faculty Teaching
N.B. This measure comprises a core subset of items included in the SOM Clinical Educator
Teaching Evaluation Form.
Evaluation information entered here will be made available to the evaluated person in
anonymous and aggregated form only.
Please rate your instructor's ability to do the following. Each item offers an "Insufficient
contact to judge" option and a 5-point scale; descriptive anchors are shown for the scale
points that carry them.

Conveyance of Information
Convey information clearly.
1 = Poor communication skills, conveying information in an unclear manner or consistently
failing to communicate important points to residents.
3 = Good communication skills; usually conveys information in a clear, comprehensive
manner.
5 = Excellent communication skills; consistently conveys information in an exceptionally
clear, comprehensive manner.

Teaching Enthusiasm
Provide enthusiastic and stimulating teaching.
1 = Lacks enthusiasm for teaching residents; does not stimulate residents' interest or
curiosity in the clinical setting.
3 = Usually enthusiastic about teaching; maintains an interest in residents' learning.
5 = Consistently enthusiastic about teaching; outstanding at stimulating residents' interest
in learning.

Direction and Feedback
Provide direction and feedback.
1 = Does not define expectations; fails to provide the resident with direction or feedback
about clinical performance; devotes little time or attention to helping residents improve.
3 = Discusses expectations; provides some direction and feedback about clinical
performance; devotes adequate time and attention to helping residents improve.
5 = Provides clear guidelines about expectations; gives the resident specific, useful verbal
feedback about strengths and areas for improvement; devotes an exceptional level of time
and attention to helping residents improve.

Promotion of Critical Thinking
Promote critical thinking.
1 = Does not discuss clinical reasoning and knowledge of underlying mechanisms of
disease with residents; does not encourage use of the literature to improve patient care or
pursue self-directed learning.
3 = Promotes critical thinking through clinical reasoning, emphasis on underlying
mechanisms of disease, and use of the literature to improve patient care and encourage
self-directed learning.
5 = Exceptional ability to promote critical thinking through clinical reasoning, emphasis on
the underlying mechanisms of disease, and use of the literature to improve patient care and
encourage self-directed learning.

Treat me with Respect
I was treated with respect by this individual.
1 = This individual consistently failed to treat me with respect and generally displayed an
unprofessional or abusive manner during all interactions.
2 = This individual treated me with respect approximately half of the time; displayed an
unprofessional or disrespectful manner during the remainder of the time.
3 = This individual treated me with respect most of the time.
4 = This individual treated me with respect almost always.
5 = This attending consistently treated me with respect throughout the rotation.

Treat me with Respect - Reasons
If you answered 2 or below on the previous question, please indicate in which way(s) you
were not treated with respect by this educator or resident. (Mandatory for answers of 2 or
below on the previous question.)
Belittled or humiliated me
Spoke sarcastically or insultingly to me
Intentionally neglected or left me out of the communications
Subjected me to offensive sexist remarks or names
Subjected me to racist or ethnically offensive remarks or names
Engaged in discomforting humor
Denied me training opportunities because of my gender
Required me to perform personal services (i.e., babysitting, shopping)
Threw instruments/bandages, equipment, etc.
Threatened me with physical harm (e.g., hit, slapped, kicked)
Created a hostile environment for learning
Other

Treat me with Respect - Other
If you chose Other in the previous question, please explain in the comment section below.
Treat Others with Respect
I observed others (residents, staff, patients) being treated with respect by this individual.
(Raters may also mark "Insufficient contact to judge.")
1 = This individual consistently failed to treat others with respect and generally displayed
an unprofessional or abusive manner during all interactions.
2 = This individual treated others with respect approximately half of the time; displayed an
unprofessional or disrespectful manner during the remainder of the time.
3 = This individual treated others with respect most of the time.
4 = This individual treated others with respect almost always.
5 = This attending consistently treated others with respect throughout the rotation.

Treat Others with Respect - Reasons
If you answered 2 or below on the previous question, please indicate in which way(s)
patients or health professionals were not treated with respect by this educator or resident.
(Mandatory for answers of 2 or below on the previous question.)
Patients - Discussed confidential information in an inappropriate setting (e.g., cafeteria,
elevator)
Patients - Made derogatory or disrespectful comments about a patient or family
Patients - Treated patients differently because of their financial status, ethnic background,
religious preferences, or sexual orientation
Patients - Threw instruments/bandages, equipment, etc.
Patients - Created a hostile environment for patient care and/or learning
Health Professionals - Made derogatory or disrespectful comments about some health
professionals
Health Professionals - Treated health professionals differently because of their financial
status, ethnic background, religious preferences, or sexual orientation
Health Professionals - Made offensive sexist, racist, or ethnically insensitive remarks/names
about some health professionals
Other

Treat Others with Respect - Other
If you chose Other in the previous question, please explain in the comment section below.
Teaching Skills, Overall
Overall teaching effectiveness. (Raters may also mark "Insufficient contact to judge.")
1 = This attending was an overall poor teacher, whether due to inadequate time spent
teaching medical residents, ineffective style, or unprofessional manner.
3 = This attending was an overall good teacher through dedication of adequate time to
teaching and a generally effective style.
5 = This attending was an overall excellent teacher through dedication of time to teaching
and a highly effective style, enabling significant skill development throughout the rotation.
Instructor Strengths<br />
What are the strengths <strong>of</strong> this instructor? (These comments will be viewed by the instructor, but will<br />
be anonymous and aggregated. For comments to be effective feedback, please be direct, specific, and<br />
constructive. General comments such as “good instructor” are too non-specific to be <strong>of</strong> value.)<br />
Instructor Improvements<br />
How could this instructor improve? (These comments will be viewed by the instructor, but will be<br />
anonymous and aggregated. For comments to be effective feedback please be direct, specific, and<br />
constructive. General comments such as “bad instructor” are too non-specific to be <strong>of</strong> value.)<br />
Confidential Comments, Educator<br />
This area is for giving constructive or corrective feedback that you don't feel comfortable giving<br />
directly. These comments are CONFIDENTIAL and will NOT go directly to the educator. They will<br />
be forwarded ANONYMOUSLY to the program director(s). Please be thoughtful, pr<strong>of</strong>essional, and<br />
constructive in your feedback.<br />
OPTIONAL CONFIDENTIAL COMMENT
If you are willing to be contacted by the program director to address a particularly concerning
issue, please include your name and contact information below. This will go only to the
program director and/or the site director, with the goal of appropriately addressing the raised
concerns.
UCSF Academy of Medical Educators
TOP Observation Form: Lecture
NAME: _____________________________________ OBSERVER: ___________________________
TOPIC: ______________________________________________________________________________
FOCUS OF OBSERVATION (discuss w/ faculty in advance):
INTRODUCTION OBSERVATIONS
16. Introduced topic, stated objectives, offered preview.
17. Gained attention and motivated learning.
18. Established climate for learning and for participation.
BODY OF LECTURE OBSERVATIONS
19. Presented 3-5 main points in clear and organized fashion.
20. Provided supporting materials, examples, and summaries.
21. Content level.
22. Effectively used visuals, handouts, and/or demonstrations. Include AV problems (if
any), effective use of slides (set stage for each slide, focused audience on important parts
of slides), use of pointer.
23. Varied presentations (used blackboard, slides, visuals).
24. Transitions between topics.
CONCLUSION OBSERVATIONS
25. Summarized major principles and key points without introducing new materials.
26. Provided closure or stimulated further thought.
TEACHER DYNAMICS OBSERVATIONS
27. Exhibited enthusiasm and stimulated interest in content.
28. Used appropriate voice, gestures, movement, and eye contact; avoided unconscious
use of repeated words (e.g., "um", "ok").
29. Encouraged active participation.
30. Used questions to stimulate thought and discussion; responded to questions well
(repeated or rephrased the question, gave a concise answer).
DEBRIEF
4. ELICIT SELF-ASSESSMENT BY MENTEE FIRST.
5. SUMMARIZE YOUR ASSESSMENT OF MENTEE'S STRENGTHS AND YOUR RECOMMENDATIONS
(KEEP IN MIND AREAS OF FOCUS).
6. ACTION PLAN (RESIDENT TO COMPLETE)
STRENGTHS                RECOMMENDATIONS
1.
2.
3.
UCSF Academy of Medical Educators
TOP Observation Form: Small Group
FACULTY NAME:_________________________________________DATE:_____________<br />
GROUP SESSION:____________________________________________________________<br />
Describe specific observations for each element of the discussion.
PROVIDED PREVIEW NOTES
1. Introduced self and topic, offered rationale for learning content, and made connection
to larger course clear.
2. Stated objectives and provided preview of session content and process.
3. Established positive learning climate and expectations for participation.
4. Initiated discussion and captured attention.
INVOLVED GROUP MEMBERS NOTES<br />
5. Encouraged active and balanced<br />
participation through in-class<br />
assignments, sub-grouping, or other<br />
teaching techniques.<br />
6. Used questions and silences or posed<br />
problems to stimulate thought and<br />
discussion.<br />
7. Exhibited enthusiasm and stimulated<br />
interest in content.<br />
8. Managed group process issues.<br />
COVERED CONTENT NOTES<br />
9. Progressed through content and<br />
focused discussion on main points<br />
10. Directed and paced discussion;<br />
managed time for each section<br />
11. Used teaching strategies to<br />
stimulate thinking and clarify ideas<br />
(e.g., provided analogies, examples or<br />
supporting data; rephrased and<br />
simplified complex statements;<br />
modeled reasoning process).<br />
12. Used visuals to capture main ideas.<br />
13. Summarized periodically and<br />
bridged to next topic.<br />
PROVIDED SUMMARY NOTES<br />
14. Summarized key points (or asked<br />
others) and provided closure.<br />
15. Bridged to larger course or next<br />
small group session.<br />
16. Reviewed learning issues and made<br />
assignments.<br />
17. Elicited feedback on session.<br />
STRENGTHS<br />
RECOMMENDATIONS
<strong>GME</strong> <strong>Evaluation</strong> <strong>Task</strong> <strong>Force</strong> <strong>Recommendation</strong><br />
PROGRAM EVALUATION BY FACULTY AND RESIDENTS<br />
The AC<strong>GME</strong> requires that residents and faculty evaluate the training program at least<br />
annually and that the evaluation results be used at the Annual Program Review meeting to<br />
assess the program’s performance and set future goals. Rather than recommend a specific<br />
evaluation form to be used by all <strong>UCSF</strong> <strong>GME</strong> programs, the <strong>Evaluation</strong> <strong>Task</strong> <strong>Force</strong> <strong>of</strong>fers<br />
these recommendations to help programs design their own forms.<br />
1. Align the program evaluation with the overall objectives of the program. For example,<br />
if the program aims to “create the next generation of academic physicians in . . .”, it<br />
would be important to address the career intentions of the residents and the processes<br />
the program uses to encourage or support academic careers.<br />
2. Consider which program objectives and outcomes would be valid to assess using a<br />
survey <strong>of</strong> residents and faculty. The development <strong>of</strong> values, career priorities and<br />
intentions, perceived competency, and preparation for practice are examples <strong>of</strong><br />
outcomes that can be surveyed.<br />
3. Consider which methods and processes your program uses to achieve its objectives<br />
and survey faculty and residents on the perceived quality <strong>of</strong> these activities. Examples<br />
include the perceived quality <strong>of</strong> didactic experiences, clinical rotations, training sites<br />
and their resources; engagement <strong>of</strong> the faculty in general and at different training sites;<br />
support for research, pathways and individualized programs <strong>of</strong> study; access to the<br />
program director and other leaders; assessment <strong>of</strong> the learning environment.<br />
4. Allow open‐ended responses to questions about the program’s strengths, opportunities<br />
for improvement, and specific suggestions for improvement.<br />
5. Emphasize the confidential nature <strong>of</strong> the survey and that the results will be shared<br />
only in composite form at the Annual Review Meeting.<br />
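Recommendation 5's composite reporting can be sketched in a few lines. This is an illustrative fragment, not a UCSF tool: the question labels and scores are invented, and only aggregate statistics are printed so that no individual response is exposed.

```python
from statistics import mean, stdev

# Invented, de-identified responses to 5-option (1-5) survey items.
responses = {
    "Quality of didactic experiences": [4, 5, 3, 4, 4],
    "Engagement of faculty at training sites": [5, 4, 4, 5, 3],
    "Access to the program director": [3, 4, 2, 4, 3],
}

# Share only composite statistics at the Annual Review Meeting.
for question, scores in responses.items():
    print(f"{question}: mean {mean(scores):.2f} "
          f"(SD {stdev(scores):.2f}, n={len(scores)})")
```

Reporting only the mean, SD, and response count per item preserves the confidentiality the recommendation calls for.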
<strong>GME</strong> <strong>Evaluation</strong> <strong>Task</strong> <strong>Force</strong><br />
POTENTIAL MEASURES USED IN <strong>GME</strong> PROGRAM EVALUATION<br />
The Curriculum:<br />
• Action plan from last Annual Program Review<br />
• ACGME survey items<br />
• Confidential program evaluation by faculty and residents<br />
• Clinical activity evaluations<br />
• Didactic and skill laboratory evaluations<br />
• In‐training exam results, including content area scores<br />
• Patient or procedure log<br />
• Scholarly project(s)<br />
• Portfolio including reflections and products<br />
• Individual Learning Plan<br />
The Faculty:<br />
• Action plan from last Annual Program Review<br />
• ACGME survey items<br />
• Confidential program evaluation by faculty and residents<br />
• Confidential evaluations of faculty teaching<br />
• Faculty development activities: number and type<br />
• Faculty mentorship and advising<br />
• Faculty scholarly activity and involvement of residents<br />
• Clinical teaching assessments<br />
• Protected time for faculty teaching<br />
Evidence of Resident Learning, ACGME Competencies:<br />
• Action plan from last Annual Program Review<br />
• Global assessments by faculty<br />
• MK: In‐training exam results<br />
• Patient Care: Mini‐CEX, Focused Assessments<br />
• Professionalism / ICS: 360 degree evaluations<br />
• PBLI: Critical appraisal assessments<br />
• SBP: M&M presentation, QI project, or other activity<br />
• Performance on board certifying examination<br />
Program Resources, Achievement of Overall Objectives:<br />
• Action plan from last Annual Program Review<br />
• ACGME survey items<br />
• Confidential program evaluation by faculty and residents<br />
• Duty hours, stress and fatigue<br />
• Post‐residency career trajectory<br />
• Quality of rotation sites: clinical volume, resources, support<br />
• Funding to support PD, PC, faculty leaders and program administrators<br />
• Resident salaries and benefits, including travel support and protected time for electives / Pathways<br />
• Resident recruitment and retention<br />
• Alumni survey<br />
GME Evaluation Task Force Recommendation<br />
PROGRESS REPORT FOR ANNUAL PROGRAM REVIEW<br />
Each entry lists: Area / Evaluation Instrument (Definition): Mean (SD) or Percent; Standard (Goal); Performance Relative to Standard.<br />
The Curriculum (completed by residents)<br />
Clinical activity evals (3 5‐option items rated at end of each rotation):<br />
• UCSF/MZ: 4.04 (0.50); goal 4.0; Met<br />
• SFGH: 3.96 (0.48); goal 4.0; Not Met<br />
• VAMC: 4.10 (0.35); goal 4.0; Met<br />
• Other: 3.60 (0.88); goal 4.0; Not Met<br />
Didactic evaluations (4 5‐option items rated after each session):<br />
• Seminars: 3.88 (0.42); goal 4.0; Not Met<br />
• Skill labs: 4.50 (0.38); goal 4.0; Met<br />
In‐training Exam (standardized score where 200 is the mean and 20 is the SD, based on national norms for each PGY):<br />
• Total Score: 205 (20); goal 200; Met<br />
• Content Area A: 180 (25); goal 200; Not Met<br />
• Content Area B: 208 (18); goal 200; Met<br />
• Content Area C: 202 (22); goal 200; Met<br />
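The in‐training exam rows above use a standardized scale where 200 is the national mean and 20 is the SD. As a worked illustration (the raw score and national norms below are invented; only the 200/20 scaling comes from the report), the conversion is a linear transform of the z‐score:

```python
def standardize(raw, national_mean, national_sd, scale_mean=200, scale_sd=20):
    """Map a raw exam score onto the 200/20 standardized scale,
    using national norms for the resident's PGY level."""
    z = (raw - national_mean) / national_sd
    return scale_mean + scale_sd * z

# Invented example: raw score 62 against a national norm of 58 (SD 8)
# is 0.5 SD above the mean, i.e., a standardized score of 210.0.
print(standardize(62, national_mean=58, national_sd=8))
```

On this scale, a content-area score of 180 sits one SD below the national mean, which is why it fails a goal of 200.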
The Faculty (completed by residents unless otherwise noted)<br />
Confidential evals (average of 10 5‐option items):<br />
• UCSF/MZ: 4.04 (0.20); goal 4.0; Met<br />
• SFGH: 4.08 (0.18); goal 4.0; Met<br />
• VAMC: 4.10 (0.24); goal 4.0; Met<br />
• Volunteer: 4.00 (0.33); goal 4.0; Met<br />
ACGME survey items re: faculty involvement (average % Yes, range of 6 items): 92% (88‐96); goal 90%; Met<br />
Peer feedback sessions (percent of faculty observed at least once during past year): 28%; goal 50%; Not Met<br />
Evidence of Resident Learning, including ACGME Competencies<br />
Global Evaluations (average of 6 9‐option items completed after each rotation):<br />
• PGY 1: 6.8 (1.1); goal 5; Met<br />
• PGY 2: 7.1 (0.9); goal 5; Met<br />
• PGY 3: 7.6 (0.8); goal 5; Met<br />
• PGY 4: 8.3 (0.6); goal 5; Met<br />
Medical Knowledge: MCQ, PGY 1, etc. (standardized score where 200 is the mean and 20 is the SD, based on national norms for each PGY): 220 (10); goal 200; Met<br />
Patient Care: Mini‐CEX, PGY 1, etc. (average of 7 9‐option items completed by faculty): 7.7 (1.2); goal 5; Met<br />
Patient Care: Focused Assessment of Skills (percent of checklist items performed correctly):<br />
• PGY 1&2: 71.5%; goal 70%; Met<br />
• PGY 3&4: 83.2%; goal 90%; Not Met<br />
Prof / ICS: Team, Self, PGY 1, etc. (average of 9 9‐option items): 8.2 (0.7); goal 5; Met<br />
Prof / ICS: Patient Survey, PGY 1, etc. (average of 10 5‐option items): 4.5 (0.2); goal 4.0; Met<br />
PBLI – Student teaching, PGY 1, etc. (average of 6 5‐option items): 4.2 (0.5); goal 4.0; Met<br />
PBLI – Critical appraisal, PGY 1, etc. (percent of checklist items met): 90%; goal 80%; Met<br />
SBP – M&M evaluation, PGY 1, etc. (average of 8 5‐option items): 3.8 (1.2); goal 4.0; Met<br />
Case logs (% residents in PGY year meeting targets based on national percentiles):<br />
• Procedure A, PGY 1, etc.: 100%; goal 100%; Met<br />
• Procedure B, PGY 1, etc.: 100%; goal 100%; Met<br />
• Procedure C, PGY 1, etc.: 88%; goal 100%; Met<br />
Scholarly Project:<br />
• Oral Presentation Score (total points out of 20): 18.2; goal 16; Met<br />
• Meeting Presentations (% residents completing by PGY4): 100%; goal 90%; Met<br />
• Manuscripts Submitted (% residents completing by PGY4): 67%; goal 80%; Not Met<br />
• Favorable Review (% favorable, if submitted): 33%; goal 80%; Not Met<br />
Program Resources, Achievement of Overall Objectives<br />
Duty hours – ACGME survey items (average % residents compliant): 90%; goal 90%; Not Met<br />
Duty hours – GME reporting (average % residents compliant): 86%; goal 90%; Not Met<br />
ACGME survey items re: program (average % Yes): 92%; goal 90%; Met<br />
Program Eval by Residents (average of 8 5‐option answers): 4.3; goal 4.0; Met<br />
Program Eval by Faculty (average of 8 5‐option answers): 4.0; goal 4.0; Met<br />
Program Eval by Alumni (average of 8 5‐option answers): 4.6; goal 4.0; Met<br />
Written Board Exam Pass Rate on First Attempt (percent grads passing last year): 100%; goal 100%; Met<br />
Oral Board Exam Pass Rate on First Attempt (percent grads passing last year): 88%; goal 100%; Not Met<br />
Career Trajectory:<br />
• Fellowship Applicants (% applying to fellowships): 67%; goal 50%; Met<br />
• Fellowship Acceptance (% matched of those applying): 100%; goal 100%; Met<br />
Case Volume: Procedure A, targeted at last review (case volume available for resident training based on OR logs):<br />
• Total Available: 240; goal 250; Not Met<br />
• UCSF/MZ: 80; goal 100; Not Met<br />
• SFGH: 60; goal 50; Met<br />
• VAMC: 46; goal 50; Not Met<br />
• Extramural: 54; goal 50; Met<br />
Adapted from Knight DA, Vannatta PM, O’Sullivan PS. A Process to Meet the Challenge of Program Evaluation and Program Improvement. ACGME Bulletin 2006 (Sept): 5‐8.<br />
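The "Performance Relative to Standard" column is, at its core, a threshold comparison of each observed value against its goal. A minimal sketch of that logic (measure names and values are taken from the sample table; a real program may apply stricter or additional criteria, as a few rows in the report appear to do):

```python
# Each entry: (measure, observed value, goal); values from the sample table.
measures = [
    ("In-training Exam, Content Area A", 180, 200),
    ("Manuscripts Submitted by PGY4 (%)", 67, 80),
    ("Program Eval by Residents", 4.3, 4.0),
]

for name, observed, goal in measures:
    # Simple rule: the goal is met when the observed value reaches it.
    status = "Met" if observed >= goal else "Not Met"
    print(f"{name}: {observed} vs goal {goal} -> {status}")
```

Keeping the comparison explicit like this makes it easy to flag the "Not Met" rows that should feed the written action plan.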
APPENDIX A<br />
AC<strong>GME</strong> Common Program Requirements – Section IVB<br />
AC<strong>GME</strong> Competencies<br />
The program must integrate the following AC<strong>GME</strong> competencies into the curriculum:<br />
• Patient Care<br />
Residents must be able to provide patient care that is compassionate, appropriate, and<br />
effective for the treatment <strong>of</strong> health problems and the promotion <strong>of</strong> health. Residents<br />
are expected to:<br />
[as further specified by the RRC]<br />
• Medical Knowledge<br />
Residents must demonstrate knowledge <strong>of</strong> established and evolving biomedical, clinical,<br />
epidemiological and social‐behavioral sciences, as well as the application <strong>of</strong> this<br />
knowledge to patient care. Residents are expected to:<br />
[as further specified by the RRC]<br />
• Practice‐based Learning and Improvement<br />
Residents must demonstrate the ability to investigate and evaluate their care <strong>of</strong> patients,<br />
to appraise and assimilate scientific evidence, and to continuously improve patient care<br />
based on constant self‐evaluation and life‐long learning. Residents are expected to<br />
develop skills and habits to be able to meet the following goals:<br />
o identify strengths, deficiencies, and limits in one’s knowledge and expertise;<br />
o set learning and improvement goals;<br />
o identify and perform appropriate learning activities;<br />
o systematically analyze practice using quality improvement methods, and<br />
implement changes with the goal <strong>of</strong> practice improvement;<br />
o incorporate formative evaluation feedback into daily practice;<br />
o locate, appraise, and assimilate evidence from scientific studies related to their<br />
patients’ health problems;<br />
o use information technology to optimize learning; and,<br />
o participate in the education <strong>of</strong> patients, families, students, residents and other<br />
health pr<strong>of</strong>essionals.<br />
• Interpersonal and Communication Skills<br />
Residents must demonstrate interpersonal and communication skills that result in the<br />
effective exchange <strong>of</strong> information and collaboration with patients, their families, and<br />
health pr<strong>of</strong>essionals. Residents are expected to:<br />
o communicate effectively with patients, families, and the public, as appropriate,<br />
across a broad range <strong>of</strong> socioeconomic and cultural backgrounds;<br />
o communicate effectively with physicians, other health pr<strong>of</strong>essionals, and health<br />
related agencies;<br />
o work effectively as a member or leader <strong>of</strong> a health care team or other pr<strong>of</strong>essional<br />
group;<br />
o act in a consultative role to other physicians and health pr<strong>of</strong>essionals; and,<br />
o maintain comprehensive, timely, and legible medical records, if applicable.<br />
• Pr<strong>of</strong>essionalism<br />
Residents must demonstrate a commitment to carrying out pr<strong>of</strong>essional responsibilities<br />
and an adherence to ethical principles. Residents are expected to demonstrate:<br />
o compassion, integrity, and respect for others;<br />
o responsiveness to patient needs that supersedes self‐interest;<br />
o respect for patient privacy and autonomy;<br />
o accountability to patients, society and the pr<strong>of</strong>ession; and,<br />
o sensitivity and responsiveness to a diverse patient population, including but not<br />
limited to diversity in gender, age, culture, race, religion, disabilities, and<br />
sexual orientation.<br />
• Systems‐based Practice<br />
Residents must demonstrate an awareness <strong>of</strong> and responsiveness to the larger context<br />
and system <strong>of</strong> health care, as well as the ability to call effectively on other resources in<br />
the system to provide optimal health care. Residents are expected to:<br />
o work effectively in various health care delivery settings and systems relevant to<br />
their clinical specialty;<br />
o coordinate patient care within the health care system relevant to their clinical<br />
specialty;<br />
o incorporate considerations <strong>of</strong> cost awareness and risk‐benefit analysis in patient<br />
and/or population‐based care as appropriate;<br />
o advocate for quality patient care and optimal patient care systems;<br />
o work in interpr<strong>of</strong>essional teams to enhance patient safety and improve patient<br />
care quality; and<br />
o participate in identifying system errors and implementing potential systems<br />
solutions.<br />
APPENDIX B<br />
ACGME Common Program Requirements – Section V<br />
Resident Evaluation<br />
1. Formative Evaluation<br />
The faculty must evaluate resident performance in a timely manner during each rotation or<br />
similar educational assignment, and document this evaluation at completion of the<br />
assignment.<br />
The program must:<br />
(1) provide objective assessments <strong>of</strong> competence in patient care, medical knowledge,<br />
practice‐based learning and improvement, interpersonal and communication skills,<br />
pr<strong>of</strong>essionalism, and systems‐based practice;<br />
(2) use multiple evaluators (e.g., faculty, peers, patients, self, and other pr<strong>of</strong>essional staff);<br />
(3) document progressive resident performance improvement appropriate to educational<br />
level; and,<br />
(4) provide each resident with documented semiannual evaluation <strong>of</strong> performance with<br />
feedback.<br />
The evaluations <strong>of</strong> resident performance must be accessible for review by the resident, in<br />
accordance with institutional policy.<br />
2. Summative <strong>Evaluation</strong><br />
The program director must provide a summative evaluation for each resident upon<br />
completion <strong>of</strong> the program. This evaluation must become part <strong>of</strong> the resident’s permanent<br />
record maintained by the institution, and must be accessible for review by the resident in<br />
accordance with institutional policy. This evaluation must document the resident’s<br />
performance during the final period <strong>of</strong> education, and verify that the resident has<br />
demonstrated sufficient competence to enter practice without direct supervision.<br />
______________________________________<br />
Faculty <strong>Evaluation</strong><br />
1. At least annually, the program must evaluate faculty performance as it relates to the<br />
educational program.<br />
2. These evaluations should include a review <strong>of</strong> the faculty’s clinical teaching abilities,<br />
commitment to the educational program, clinical knowledge, pr<strong>of</strong>essionalism, and scholarly<br />
activities.<br />
3. This evaluation must include at least annual written confidential evaluations by the<br />
residents.<br />
______________________________________<br />
Program <strong>Evaluation</strong> and Improvement<br />
1. The program must document formal, systematic evaluation <strong>of</strong> the curriculum at least<br />
annually. The program must monitor and track each <strong>of</strong> the following areas: resident<br />
performance; faculty development; graduate performance, including performance <strong>of</strong><br />
program graduates on the certification examination; and, program quality. Specifically:<br />
(1) Residents and faculty must have the opportunity to evaluate the program confidentially<br />
and in writing at least annually, and<br />
(2) The program must use the results <strong>of</strong> residents’ assessments <strong>of</strong> the program together with<br />
other program evaluation results to improve the program.<br />
2. If deficiencies are found, the program should prepare a written plan <strong>of</strong> action to<br />
document initiatives to improve performance. The action plan should be reviewed and<br />
approved by the teaching faculty and documented in meeting minutes.<br />
APPENDIX C<br />
EXAMPLES OF FOCUSED ASSESSMENT TOOLS<br />
Surgical Competency<br />
Ob/Gyn - UCSF<br />
UCSF Ob/Gyn Surgical Skill Checklist<br />
Evaluator: Date:<br />
Resident: PGY: 1 2 3 4<br />
Rotation:<br />
Diagnosis:<br />
Procedure: (R3) Total Abdominal Hysterectomy<br />
Operative Checklist<br />
Rating Key: X = not seen or indicated; 0 = not performed but indicated; 1 = performed but poorly; 2 = performed correctly<br />
1. Discuss the indications for hysterectomy X 0 1 2<br />
2. Discuss the indications for oophorectomy in conjunction with abdominal hysterectomy X 0 1 2<br />
3. Discuss the post-operative management of a patient status post TAH X 0 1 2<br />
4. Choice of abdominal incision X 0 1 2<br />
5. Ligation of round ligament X 0 1 2<br />
6. Anterior & posterior leaf of broad ligament opened X 0 1 2<br />
7. Creation of broad ligament window X 0 1 2<br />
8. Identification of the ureter X 0 1 2<br />
9. Ligation of uteroovarian ligament vs infundibulopelvic ligament (+/-BSO) X 0 1 2<br />
10. Double ligation of pedicles X 0 1 2<br />
11. Sharp dissection of bladder flap X 0 1 2<br />
12. Skeletonize uterine vessels X 0 1 2<br />
13. Cardinal ligament ligation X 0 1 2<br />
14. Uterosacral ligament ligation X 0 1 2<br />
15. Vaginal cuff closure X 0 1 2<br />
16. Vaginal cuff suspension X 0 1 2<br />
17. Evaluation for hemostasis X 0 1 2<br />
Surgical Skills Assessment<br />
Rating Key: 0 = poorly or never; 1 = sometimes or marginal; 2 = usually or average; 3 = majority or well; 4 = always or excellent<br />
Knew patient history / surgical indication<br />
Necessary lines in place (intravenous, foley)<br />
Patient positioned correctly on table<br />
Proper stirrups/retractor used for exposure<br />
Lights positioned<br />
Observed sterile technique<br />
Knew names of instruments<br />
Knowledge of anatomy<br />
Instrument handling<br />
Respected tissue<br />
Moves not wasted<br />
Kept flow of operation / thought ahead<br />
Used assistants well<br />
Worked well with personnel<br />
Worked well as primary surgeon<br />
TOTAL =<br />
STRENGTHS:<br />
AREAS FOR IMPROVEMENT:<br />
Attending Signature:<br />
Resident Signature:<br />
For administrative use: Entered by: _________ Date: ______________<br />
Modified from: AJOS 1997; 173:226-230; AJOS 1994; 167:423-427<br />
EMERGENCY MEDICINE RESIDENCY PROGRAM<br />
RESUSCITATION COMPETENCY FORM<br />
Resident: ____________________________ Date:___________________________<br />
Attending Physician: ____________________________ Location: ________________________<br />
According to the ACGME, a major resuscitation is patient care in which prolonged physician attention is needed and interventions such<br />
as defibrillation, cardiac pacing, treatment of shock, intravenous use of drugs (e.g., thrombolytics, vasopressors, neuromuscular<br />
blocking agents), or invasive procedures (e.g., cut-downs, central line insertion, tube thoracostomy, endotracheal intubation) are<br />
necessary for stabilization and treatment.<br />
I. CLINICAL (Patient Care/Medical Knowledge)<br />
Primary Survey: Y N NA<br />
Airway assessed initially Y N NA<br />
Breathing then assessed Y N NA<br />
Oxygen started for respiratory distress<br />
Circulation assessed Y N NA<br />
Initial interventions Y N NA<br />
Protocol or treatment guideline followed Y N NA<br />
Patient reassessed frequently Y N NA<br />
Secondary Survey (head to toe exam): Y N NA<br />
Procedures performed competently Y N NA<br />
Comments:<br />
II. ORGANIZATION (Communication/Pr<strong>of</strong>essionalism/Systems-Based Practice)<br />
Comments:<br />
Assigned roles Y N NA<br />
Communicated effectively Y N NA<br />
Asked for help when needed Y N NA<br />
Maintained situational awareness Y N NA<br />
Appropriate handoff (SBAR) Y N NA<br />
□ COMPETENT □ NEEDS IMPROVEMENT
EMERGENCY MEDICINE RESIDENCY PROGRAM<br />
AIRWAY MANAGEMENT COMPETENCY FORM<br />
Resident: ____________________________ Date:___________________________<br />
Attending Physician: ____________________________ Location: ________________________<br />
I. PREPARATION:<br />
Personally assembled and tested all necessary equipment (e.g., blades, ET tubes, oral/nasal airways, suction, BVM, etc.) Y N<br />
Properly positioned himself/herself at the head of the bed and all necessary equipment within arm’s reach Y N<br />
Verbalized an appropriate “Plan B” should initial attempts at airway management fail (e.g., use of a different type blade, gum elastic bougie, cric., etc.) Y N NA<br />
Comments:<br />
II. MEDICATION MANAGEMENT:<br />
Ordered an appropriate induction and paralytic drug, demonstrating understanding of the particular indications/contraindications for this drug Y N NA<br />
Ordered appropriate post-intubation sedation medication, demonstrating understanding of the particular indications/contraindications for this drug Y N NA<br />
Comments:<br />
III. AIRWAY TECHNIQUE:<br />
Properly positioned patient/head Y N NA<br />
Effectively performed bag-valve-mask ventilation Y N NA<br />
Maintained a patent airway (with good positioning, oral/nasal trumpets, etc.) prior to intubation Y N NA<br />
Properly applied cricoid pressure Y N<br />
Demonstrated proper use of a laryngoscope and proper ET tube placement Y N<br />
Confirmed proper tube placement with: Auscultation Y N NA; End-tidal CO2 Y N NA; CXR Y N NA<br />
Applied necessary alternate rescue airway technique(s) Y N NA<br />
Comments:<br />
IV. VENTILATOR MANAGEMENT:<br />
Ordered appropriate initial ventilator settings Y N NA<br />
Comments:<br />
V. DOCUMENTATION:<br />
Medications ordered on order sheet Y N<br />
Procedure documented in chart Y N<br />
Comments:<br />
□ COMPETENT □ NEEDS IMPROVEMENT<br />
Structured Clinical Observation: Resident Interview<br />
LPH&C Medication Management Clinic<br />
John Q. Young, MD, MPP, <strong>UCSF</strong> Department <strong>of</strong> Psychiatry, v. 2.5.08<br />
May be used or adapted outside <strong>UCSF</strong> only with permission <strong>of</strong> the author jqyoung@lppi.ucsf.edu<br />
Resident Name:_________________________ Attending Name:___________________________<br />
Date:<br />
Instructions: 1. Each resident is observed every 4 weeks. 2. The attending checks one box for each row and writes comments at the<br />
bottom. 3. The attending reviews the form with the resident and then places it in John Young’s box in LP‐281; a copy will be given to the<br />
resident.<br />
Pharmacotherapy <strong>Task</strong><br />
Reviews chart<br />
Greets patient with respect & warmth<br />
Begins on time<br />
Maintains frame<br />
Establishes rapport<br />
Initial open ended question<br />
Obtains interval history with focus on target<br />
symptoms, medical or medication changes,<br />
intercurrent psychosocial stressors, progress in<br />
therapy.<br />
Assesses treatment response<br />
Encourages ventilation <strong>of</strong> feelings related to illness.<br />
Inquires about other treatments/treaters<br />
Assesses substance use/abuse<br />
Assesses adherence, including number <strong>of</strong> doses<br />
missed in past week and barriers.<br />
Monitors for adverse effects (Sg/Sx, Labs, AIMS, Wt.,<br />
BP), specifically for those associated with prescribed<br />
medications.<br />
MSE appropriately focused<br />
Assesses risk for violence to self and others<br />
If response less than expected, systematic approach to<br />
DDx<br />
Updates treatment plan based on diagnosis, phase <strong>of</strong><br />
illness, efficacy and response, adverse effects, & risk<br />
assessment<br />
Modifies treatment plan for less than expected<br />
responders<br />
Develops plan to address adherence if needed<br />
Develops plan to manage adverse effects, if applicable<br />
Rating Scale: 0 = NA; 1 = Not Done; 2 = Done with suggestions for improvement; 3 = Done well (meets expectations); 4 = Done extraordinarily well – inspires me to do the same!<br />
Educates patient about diagnosis, prognosis,<br />
treatment, and/or adverse effects<br />
Provides patient with simple advice on what they can<br />
do to help themselves (e.g., exercise, sleep hygiene).<br />
Solicits and addresses patient’s questions<br />
Conveys hope and optimism and provides<br />
reassurance<br />
Appropriate follow up, incl. labs/tests, consults, next visit<br />
Documentation sufficient<br />
Informs other tx team members <strong>of</strong> plan, esp. therapists.<br />
Structured Clinical Observation: Resident Interview<br />
LPH&C Medication Management Clinic<br />
John Q. Young, MD, MPP, <strong>UCSF</strong> Department <strong>of</strong> Psychiatry, v. 8.1.07<br />
May be used or adapted outside UCSF only with permission of the author jqyoung@lppi.ucsf.edu<br />
Key feedback points, including what done well and at least one task to work on:<br />
DDx for less than Expected<br />
Response<br />
Modify treatment plan for less<br />
than Expected Response<br />
• Incorrect primary diagnosis?<br />
• Correct primary diagnosis, but insufficient treatment?<br />
• Poor adherence?<br />
• Under‐ or un‐treated comorbidity (e.g., substance abuse, axis I, axis II, etc.)?<br />
• Intervening stressor?<br />
• Adverse effects <strong>of</strong> treatment?<br />
• Alliance ruptured?<br />
A. Pharmacologic Interventions<br />
• Address adherence<br />
• Reassess dose and duration<br />
• Consider a switch to an alternative treatment<br />
• Augment with evidence based second and third line<br />
pharmacologic treatments<br />
• Treat comorbidities<br />
B. Nonpharmacologic Interventions<br />
• Provide further education<br />
• Provide opportunity to “ventilate” with active listening<br />
• Provide reassurance<br />
• Provide specific psychotherapy<br />
• Refer for psychotherapy<br />
• Behavioral intervention (e.g., sleep hygiene)<br />
• Improve alliance<br />
• Improve treatment <strong>of</strong> comorbidities such as substance abuse<br />
• Involve family members<br />
Communication competency-<br />
UCD OBG<br />
UC Davis Ob/Gyn Informed Consent Checklist<br />
Evaluator: Date:<br />
Resident: PGY: 1 2 3 4<br />
Rotation:<br />
Diagnosis:<br />
Procedure: INFORMED CONSENT<br />
Rating Key: X = not seen or indicated; 0 = not performed but indicated; 1 = performed but poorly; 2 = performed correctly<br />
1. Knows proper indications for procedure X 0 1 2<br />
2. Knows alternatives X 0 1 2<br />
3. Establishes rapport with patient X 0 1 2<br />
4. Properly describes procedure in understandable terms X 0 1 2<br />
5. Realistically explains risks of procedure X 0 1 2<br />
6. Discusses benefits of procedure X 0 1 2<br />
7. Discusses alternatives to procedure X 0 1 2<br />
8. Checks for patient understanding often X 0 1 2<br />
9. Explains preop procedure X 0 1 2<br />
10. Explains hospital procedure X 0 1 2<br />
11. Explains follow up X 0 1 2<br />
12. Assesses patient questions X 0 1 2<br />
TOTAL =<br />
Communication Skills List<br />
Rating Key: 0 = poorly or never; 1 = sometimes or marginal; 2 = usually or average; 3 = majority or well; 4 = always or excellent<br />
Communicates clearly<br />
Listens willingly and attentively<br />
Answers questions and provides explanations<br />
Respects patient, does not demean<br />
Uses respectful language<br />
Compassionate and kind to patient and family<br />
Attentive to details of patient comfort<br />
Works well with personnel<br />
Nonverbal: shows interest<br />
STRENGTHS:<br />
AREAS FOR IMPROVEMENT:<br />
Attending Signature:<br />
Resident Signature:<br />
Long Form<br />
UNIVERSITY OF NORTH CAROLINA<br />
GRIEVING COMPETENCY INSTRUMENT<br />
Directions: Please indicate whether the physician completed the stated actions, with<br />
Y = completed (Yes) or N = did not complete (No)<br />
The Physician…<br />
G—Gather<br />
1. Ensured that all important survivors were present prior to delivery of the death notification.<br />
R—Resources<br />
2. Inquired about supportive resources.<br />
3. Facilitated access to supportive resources.<br />
I—Identify<br />
4. Clearly stated the name of the patient.<br />
5. Clearly introduced herself/himself.<br />
6. Clearly stated his/her role in the care of the patient.<br />
Check for Understanding<br />
7. Determined the level of knowledge the survivors possessed prior to their arrival in the<br />
waiting room.<br />
8. Provided an appropriate opening statement (i.e., avoided bluntly stating the death of the patient).<br />
9. Used preparatory phrases to forecast the news of death.<br />
E—Educate<br />
10. Clearly indicated the chronology of events leading up to the death of the patient.<br />
11. Clearly indicated the cause of death in an understandable manner.<br />
12. Used language appropriate for the survivor’s culture and educational level.<br />
13. Provided a summary of important information to ensure understanding.<br />
V—Verify<br />
14. Used the phrase “dead” or “died.”<br />
15. Avoided using euphemisms.<br />
16. Avoided medical terminology/jargon or clearly explained such terms when used.<br />
Space<br />
17. Was attentive and not rushed in his/her interaction with the survivor.<br />
18. Paused to allow the family to assimilate the information before discussing details.<br />
I—Inquire<br />
19. Allowed the survivor to react to the information and ask questions or express concerns.<br />
20. Encouraged the survivor to summarize important information to check for understanding.<br />
21. Immediately but appropriately corrected any misconceptions of the survivor.<br />
N—“Nuts and bolts”<br />
Explained and addressed the following details of the patient’s post‐mortem care adequately:<br />
22a. Organ donation<br />
22b. Need for an autopsy<br />
22c. Funeral arrangements<br />
22d. Personal effects<br />
G—Give<br />
25. Established personal availability to answer questions for the survivor at a later date.<br />
26. Provided the survivor appropriate information to contact the physician at a later time.<br />
27. Provided the survivor appropriate information to contact resuscitation or post‐mortem<br />
care providers.<br />