Original Article

Guided Mentorship in Evidence-Based Medicine for Psychiatry: A Pilot Cohort Study Supporting a Promising Method of Real-Time Clinical Instruction

Anthony Joseph Mascola, M.D.

Objective: Evidence-based medicine has been promoted to enhance clinical decision making and outcomes in psychiatry. Residency training programs do not routinely provide instruction in evidence-based medicine. Where instruction exists, it tends to occur in classroom settings divorced from the clinical decision-making process and is focused narrowly on appraisal of evidence quality. The goal of this pilot study was to develop and evaluate the promise of a method of "hands-on" instruction in evidence-based medicine done in real clinical time.

Methods: A modularized curriculum to promote decision-making strategies using evidence-based medicine during the course of actual patient care was delivered by an attending physician mentoring a small team on the inpatient and consultation-liaison psychiatry services at Stanford. A staggered cohort of 24 consecutive trainees was followed between August 2006 and January 2007. Measures of trainees' skills in evidence-based medicine were assessed before and after mentoring. A blinded grader scored each inventory according to an explicit, predefined rubric. Demonstrated proficiency in each of the core skills of evidence-based medicine was assessed as a secondary outcome measure via the attending physician's unblinded subjective evaluation of trainee performance. Subjective descriptions of the experience were obtained via review of trainees' evaluations.

Results: Postmeasures of knowledge and skills in evidence-based medicine increased significantly relative to baseline. The Cohen's d effect size was large and clinically meaningful. The majority of trainees were able to demonstrate adequate proficiency of skills by the attending's subjective evaluation. Trainees' subjective experiences overall were positive.

Conclusion: Guided mentoring in evidence-based medicine appears promising for further study.

Academic Psychiatry 2008; 32:475-483

Received March 20, 2007; revised July 17, 2007; accepted July 25, 2007. Dr. Mascola is affiliated with the Department of Psychiatry and Behavioral Sciences at Stanford University in Stanford, Calif. Address correspondence to Anthony Joseph Mascola, M.D., Stanford University, Psychiatry and Behavioral Sciences, 401 Quarry Rd., Mail Code 5722, Stanford, CA 94305-5722; amascola@stanford.edu (e-mail). Copyright © 2008 Academic Psychiatry

Evidence-based medicine (EBM) has been defined as "the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients" (1). EBM promotes integrating critically appraised research evidence, clinical expertise, and the patient's unique values and circumstances in a collaborative, shared decision-making model that aims to optimize outcomes (2). The EBM model has been operationalized into a series of five steps sometimes referred to as the 5As (2): asking focused, answerable questions useful in efficiently querying electronic repositories of published literature; acquiring high-quality evidence rapidly to answer such questions; appraising the evidence found for its relevance, validity, and importance; applying the findings appropriately by considering the strength of the evidence, the clinician's judgment of its generalizability to the individual clinical situation, and the patient's values and preferences; and assessing the outcome of the implemented decision as well as the process used in reaching it. Detailed procedures for implementing each of the five steps have been well described (1-10). Several authors have advocated for increased adoption of the EBM model in mental health care settings in an effort to improve the quality of care given to patients (3, 4, 6-14).


Proponents of EBM believe that more optimal medical decision making can be achieved by increasing clinician awareness of potential biases that can influence judgment and by encouraging practices to minimize the most harmful of these biases. These proponents cite a number of examples in which usual care practices are observed to vary substantially from the practices most strongly supported in the research literature, possibly resulting in patients not receiving interventions that have the potential to provide substantial benefit (6, 12, 13). EBM encourages objective clinical decision making informed by findings from recent, well-designed randomized controlled trials, systematic reviews, and meta-analyses whenever available. EBM also aims to encourage greater appreciation of patients' values and preferences in decision making (15). The model is critically dependent upon expert clinical judgment and does not advocate "cookbook style" medicine, because it recognizes that the majority of clinical decisions are nuanced and complex, especially in psychiatry.

The EBM model attempts to make the clinician aware of the many factors that may influence decision making, and the potential for bias that these factors may introduce, and to allow this awareness to become part of expert clinical judgment. For example, industry marketing pressures are great in psychiatry (16), and several authors have voiced concern about the influences that these forces may have (17). The EBM process has arisen in part to offset influences such as these and to encourage decision making that is less vulnerable to bias from such influences. EBM attempts to integrate sound clinical judgment, unbiased research evidence, and patient values into a method of decision making that has the potential to result in more optimal outcomes for patients. The topic is one which engenders debate, and various stakeholder preferences may influence the degree to which EBM is embraced (5).

The Accreditation Council for Graduate Medical Education (ACGME) has included demonstrating proficiency in EBM as one of the six core general competencies under practice-based learning. Academic institutions often do not have formal methods of instruction in EBM skills specified in their curricula. One recent estimate in psychiatry concluded that only 36% of residency programs have specific instruction in EBM and that only half of these programs have a reasonably comprehensive course of instruction in place (7, 8). There have been trials in internal medicine assessing the effect of EBM training upon trainee knowledge and skills (18); however, no such trials have been conducted in psychiatry to date. Several recent articles have highlighted the difficulty in evaluating the effectiveness of teaching methods of EBM (19-21). Validated measures of EBM skills in psychiatry have yet to be introduced.

Where instruction in EBM exists, it tends to be focused on the third step of the process, appraisal of evidence quality, to the exclusion of other skills that may be much more practical and important in a clinical setting (19-22). Such instruction is often delivered in a classroom setting by a nonclinician researcher or biostatistician and may be divorced from the clinical pressures that make dissemination of EBM into practice quite challenging. Other attempts at promoting EBM have focused on journal clubs, where appraisal of the literature is again the focus and the setting is also far removed from clinical decision making. Green (22), in a systematic review of EBM training, called for "curricula which use residents' actual clinical experiences and teach EBM skills in real time in existing clinical and educational venues." The ACGME (23) strongly reaffirms this call. The educational research examining the implementation of EBM curricula suggests the following: (1) there is little evidence to support the conclusion that learning EBM as a "content area" through didactics alone (or even through journal clubs) encourages residents to use EBM in their practices; EBM must be integrated into clinical practice on the wards and in the clinics; and (2) faculty members need both to embrace the EBM approach to teaching medicine and to model its use in their own practice.

The goal of this unfunded pilot study was to develop a "real time" EBM clinical mentoring curriculum consistent with the above recommendations and to crudely assess its promise for further study. Specific goals were to estimate the direction, magnitude, and precision of the change in EBM skills after exposure to such training. A modularized curriculum was developed and used with the small teams of trainees assigned to an attending physician experienced in EBM. Evidence-based decision-making strategies were introduced at the point of clinical care and mentored throughout the rotation. Real-time workarounds to the many barriers that exist in incorporating EBM into an actual practice setting were demonstrated by the mentor. Trainees were expected by the end of the rotation to be able to independently demonstrate an ability to apply such skills in caring for their patients. Their ability to do this was measured in several ways. Discussion of the practicality and usability of the various EBM methods was encouraged throughout the process.

Curriculum

The curriculum was derived from the usual "5A" EBM model (2, 7, 10). The process of delivery and an overview of the content areas are described briefly below. More detail is available from the author upon request.


The rotation began with an introductory Socratic discussion intended to elicit a definition of evidence-based medicine (EBM) and the trainees' initial perceptions of the pros and cons of practicing in this manner. Emphasis was placed on exploring the prior knowledge, skills, and attitudes that trainees had formed about clinical decision making in a style adapted from the Motivational Interviewing (24) model. Motivational Interviewing is a psychosocial intervention model that has shown limited but favorable evidence of benefit in a variety of target behavioral domains, ranging from substance abuse to diet and exercise behaviors (24-27). It places importance on first empathically exploring and resolving an individual's ambivalence about a behavior change prior to engaging in planning any action. The EBM model was thus initially outlined and explored from the experiences of the group. The pros and cons of EBM were explored in contrast to other modes of practice. Each trainee was asked to consider the upcoming rotation as an opportunity to become further acquainted with the EBM style of decision making and to see whether EBM offered any advantages to them in their future practice. Humor was used to generate open discussion of the limitations of the extremes of various practice styles, including EBM (28, 29).

Each trainee was e-mailed a syllabus that specified the EBM objectives to be learned, practiced, and assessed during the rotation. The e-mail described in detail the manner of appraisal of these skills. Trainees were notified that they would receive a written pre- and posttest and would also have their ability to apply EBM decision-making strategies assessed by the attending physician according to criteria detailed in their syllabus. This syllabus and other training materials are available from the author upon request.

During the rotation, trainees were encouraged to use their own clinical judgment critically while making decisions and not to blindly adopt any intervention that they thought might be consistent with EBM for the sake of pleasing the attending. They were expected to know what the EBM model might propose at a given point in clinical care and to gain some experience in practicing these methods; however, they were encouraged to think and decide for themselves what they might do at any given time as much as possible. The goal was to generate experience with, and discussion of, the practicality and usefulness of EBM strategies in making decisions in an applied clinical setting.

Six brief, 10-15 minute discussion modules were organized with focused learning objectives specified. The modules were intended to convey how to apply each of the key elements of the EBM model in a busy clinical setting. A small number of PowerPoint slides were prepared for each module, but informal methods such as bedside discussions or whiteboard talks in the team rooms to address the specific learning objectives for each module were encouraged as alternatives to the slides, given the real-time nature of delivery of the instruction. The information was tailored to the trainees' prior knowledge. All information was presented in an interactive, Socratic fashion using the Elicit, Provide, Elicit framework specified by Rollnick et al. (30) and adapted from the Motivational Interviewing intervention for health behavior change. A module might be discussed wherever relevant, cued by an actual patient care scenario. The teaching occurred in the workroom or hallway, at the bedside, or wherever time or space permitted. The goal of the curriculum was to ensure that the learning objectives on the syllabus for each module were addressed in Socratic form during the rotation. More information on the content and delivery of the six modules is available from the author.

Trainee decision-making autonomy was encouraged as much as possible throughout the rotation to enhance intrinsic motivation in the learning process, consistent with the Self-Determination Model of motivation proposed by Deci, Ryan, and Williams (31, 32). If discrepancies arose between the trainee's decision and the attending's preference, the trainee's concerns were elicited and carefully considered. Treatment decisions were negotiated consistently with the same shared decision-making model (3) that EBM encourages in discussing treatment options with patients. The discrepancies among various styles of decision making and the pros and cons of each approach were elicited from the trainee. Where the attending physician felt a strong need to intervene despite the trainees' preferences, such interventions were to be delivered in a fashion consistent with the Self-Determination Model (31, 32).

Behavior Change Theory

The overall intervention was developed consistent with the Social Learning Theory framework proposed by Bandura (33, 34), which emphasizes the importance of observing and modeling the behaviors, attitudes, and emotional reactions of others in learning. Trainees' outcome expectancies (i.e., what they believe might happen if they were to acquire and use EBM) and their self-efficacy (i.e., their confidence in being able to learn and apply the EBM model in an actual practice setting despite the barriers) are important constructs in Social Learning Theory.


These constructs were explored and targeted for intervention utilizing methods adapted from Motivational Interviewing (24), because this model can be used consistently with this theory with little difficulty. The importance of supporting the trainees' autonomy in decision making throughout the process was anchored in Self-Determination Theory, developed by Deci and Ryan (31), as was the importance of supporting the patient's participation and autonomy in the medical decision-making process (35-39). Self-Determination Theory is an empirically supported model of motivation which proposes that behaviors intrinsically motivated out of volitional choice, consistent with deeply held values, are much more durable than behaviors encouraged as a result of extrinsically applied pressures.

Ambivalence toward practicing in a manner consistent with the EBM model is assumed to be a natural part of the training process and worthy of respect and careful consideration in the Motivational Interviewing model. There was no assumption made that the EBM model was the "correct" one to apply or that "resistance" to adopting EBM practices implied bad intent or inferiority of thought. Residents were asked to try the EBM techniques, to judge for themselves whether they held sufficient value to continue using beyond the rotation, and to discuss their feelings and reactions to the process throughout the rotation.

The difficulty in practicing in a manner consistent with the EBM model, even for those who genuinely felt it to be valuable, was introduced in a framework consistent with Ainslie's Intertemporal Bargaining theory (40), which proposes a mechanism by which smaller, more immediate rewards can often seductively overshadow larger benefits available at a delay, resulting in choices that appear irrational and counterproductive. When asked, many clinicians might be likely to report that they would prefer to engage in practices which would result in the long run in less biased decision making and more optimal patient outcomes, yet few academic centers or clinicians appear to utilize strategies like this as a routine part of their practices (7, 8). The few minutes of time it takes to look up a systematic review on a given topic are not insignificant when other tasks are demanding attention.

Practical strategies to enhance self-efficacy in overcoming barriers to applied EBM practice were modeled by the attending mentor, a young faculty member and recent graduate of the program, with the hope of enhancing each trainee's self-efficacy via vicarious learning consistent with the Social Learning Theory framework and Intertemporal Bargaining Theory, both of which posit self-efficacy as a critical variable in maintaining motivation for delayed pursuit of reward in the setting of more immediately tempting choices. The goal was to begin with modeling and to shift over time to allow the trainee his or her own successful mastery experiences (33). Deliberate attempts were made to elicit detailed implementation intentions from each trainee, consistent with the work of Gollwitzer et al. (41), which supports the idea that detailed planning to overcome barriers to implementation of desired behaviors is an effective way to encourage durable behavior change in the setting of obstacles to such change.
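The passage above describes Ainslie's account only verbally. As a purely illustrative aside (not part of the original article), the sketch below uses the hyperbolic discount function V = A / (1 + kD), commonly used to formalize such preference reversals, with arbitrary numbers to show how a smaller immediate reward can overshadow a larger delayed one even though the larger reward is preferred when both are far off.

# Numeric illustration of the mechanism described above, using the standard
# hyperbolic discount function V = A / (1 + k * D) commonly associated with
# Ainslie's account. Reward sizes, delays, and k are arbitrary examples.
def discounted_value(amount, delay_days, k=0.5):
    """Present value of a reward of size `amount` available after `delay_days`."""
    return amount / (1 + k * delay_days)

small, large = 10, 20   # a smaller-sooner vs. a larger-later reward (arbitrary units)
gap = 10                # the larger reward arrives 10 days after the smaller one

for days_until_small in (20, 1):
    v_small = discounted_value(small, days_until_small)
    v_large = discounted_value(large, days_until_small + gap)
    print(days_until_small, round(v_small, 2), round(v_large, 2))

# When both rewards are distant (20 days) the larger one is preferred (0.91 < 1.25),
# but once the smaller reward is imminent (1 day) it overshadows the larger
# (6.67 > 3.08): the preference reversal that makes the immediate option seductive.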

Methods

The study occurred between August 2006 and February 2007 at Stanford University Medical Center and used a prospective cohort design. The mentoring program was delivered to trainees from each of the inpatient psychiatry services, including the voluntary patient unit, the acute involuntary patient unit, the geropsychiatry service, and the consultation-liaison psychiatry service. All medical students and residents consecutively assigned by the residency administration to the EBM attending mentor's team during the study period were followed. Such assignment was not random; however, it was unlikely to be influenced by either trainee or attending preference. Trainees were unlikely to be familiar with the supervision style of the EBM attending mentor because this was the first year the mentor was assigned to the supervision of residents and medical students on the wards. Also, scheduling assignments for both medical student and resident rotations are made centrally by the psychiatry residency program administration and the medical school before the start of the academic year, in a process independent of attending assignment, which is administered by a separate, independent body. The EBM attending had no role in selecting students or residents who would or would not be assigned to his service at any time before or during the study period.

Background and Training of EBM Attending Mentor

The attending who mentored the evidence-based medicine (EBM) curriculum was a first-year assistant clinical professor at Stanford in the Department of Psychiatry and Behavioral Sciences. He had recently completed his internship, residency, and fellowship at the same institution. The training he received in EBM came from several sources: 3 years of weekly attendance in a Methodology in Research in the Behavioral Sciences Workshop, conducted by biostatistician Helena Kraemer during the years 2003 to 2006; recent completion of a 3-year, National Institute of Mental Health-sponsored research fellowship program, in which the attending served as the principal investigator on a randomized controlled trial (42); participation in the Dartmouth Summer Institute in Evidence-Based Psychiatry in the summer of 2006; and participation in the Society of Behavioral Medicine Evidence-Based Behavioral Medicine workshop in the spring of 2006, as well as self-study from a variety of EBM resources (3-10).

In addition to the above academic training in EBM, the attending had several years of applied clinical experience in implementing these methods. He applied the EBM model to his clinical practice during internship, residency, and fellowship, and had 2 years of experience after residency working as an inpatient psychiatry attending, managing between 20 and 30 patients each day using the EBM model. The attending also had a considerable amount of experience working with outpatients using evidence-based psychosocial interventions, including motivational interviewing and several cognitive behavior therapies. This attending received 7 years of weekly supervision in a wide range of cognitive behavior intervention models and received extensive training in the motivational interviewing model employed in this study. The attending has been a trainer in the Motivational Interviewing Network of Trainers (http://motivationalinterview.org/training/mint.htm) for 3 years and has previously conducted a randomized controlled trial utilizing an adaptation of motivational interviewing and cognitive behavior therapy (42). He has had 7 years of experience in weekly patient care using evidence-based psychosocial intervention models and continues to maintain at least 10 hours of weekly outpatient contacts using evidence-based psychosocial interventions and their combination with evidence-based psychopharmacologic interventions in the Behavioral Medicine Clinic at Stanford. He has trained Stanford medical students, residents, and psychology graduate students in the practice of evidence-based psychosocial interventions and pharmacotherapies in classroom didactics as well as applied trainings.

Measures of EBM skills were assessed before and after exposure to the training program via blinded scoring of the Dartmouth Evidence-Based Psychiatry Inventory (2006 unpublished manuscript of Merrens et al.). The Inventory is a 9-item measure assessing the ability of a trainee to ask structured questions useful in efficiently querying electronic databases, acquire high-quality evidence rapidly, appraise the validity of information found, and apply information utilizing a shared decision-making model.

Open-ended questions with paragraph-length responses are required for each item. There are no multiple-choice or true-false items. Questions are hand scored using a standardized grading rubric. This questionnaire has very good face validity for each of the EBM domains assessed above; however, psychometric properties are not yet available. Unfortunately, there are not yet any validated measures available to assess psychiatry trainees' EBM skills. A limited number of instruments with available psychometric data are beginning to appear in other areas of medicine (mostly internal medicine) (21); however, the bulk of these measures are designed to assess critical appraisal skills and do not assess other essential EBM skills that may reflect a practitioner's ability to apply the key components of the EBM model in an actual practice setting.

The Dartmouth instrument incorporates assessment of a broad palette of EBM skills and is quite similar to an instrument previously validated in a large family practice residency program at the University of California, San Francisco, called the Fresno test (43). The Dartmouth instrument, like the Fresno test, consists of a brief clinical vignette followed by open-ended questions. The first seven questions are adaptations of those from the Fresno test, but the vignette illustrates a psychiatric rather than ambulatory medicine patient encounter. In addition, an item on the Dartmouth inventory assesses the trainee's ability to determine how to apply information found by considering its strength, its relevance to the given clinical scenario, and the patient's values and preferences. Another item assesses the trainee's ability to involve the patient in decision making consistent with the shared decision-making goals of EBM. These skills are not assessed on the Fresno test and are important in psychiatry. There are no items on diagnostic appraisal on the Dartmouth instrument. Similar to the Fresno test, there is a predefined scoring rubric for grading each question.

Unfortunately, no instruments were available with psychometric data that would be meaningfully applicable to this study setting, which remains a major weakness for this type of research. The Dartmouth instrument was selected as a result of its promising face validity and breadth with respect to the goals of measuring the applied EBM skills targeted by the curriculum.


Because of the lack of validation data, demonstration of adequate real-time performance of EBM skills was also assessed as a secondary measure by subjective appraisal from the attending physician according to the detailed rubric specified in the syllabus, which is available upon request. This instrument is similar to an objective structured clinical exam (OSCE), except that it is conducted by observing the trainee's ability to apply EBM skills in managing actual patient scenarios rather than using standardized patients. Feedback was provided after each case presentation with the goal that by the end of the rotation, the trainee would independently demonstrate an ability to perform each of the key EBM tasks in an actual clinical setting. This measure is obviously potentially biased because the attending was also the principal investigator for the study. It was impossible to blind the attending to the identities of the participants or to their pre- or postexposure status to the mentoring program. Despite the limitations, this evaluation was used as a secondary outcome measure because no reasonable alternative external standard of performance was available. It was beyond the resources of the pilot study to include measures of patient outcomes.

Trainees' subjective descriptions of the experience were obtained via review of the trainees' teaching evaluations of the attending as well as their e-mailed feedback obtained after completion of the rotation. These descriptions should not be considered anonymous and were crudely categorized as being either globally positive or negative in nature. They are also potentially prone to bias, but are included to give at least some, albeit limited, estimate of the experience of the trainee during the program.

The sample size for this unfunded pilot study was determined by the maximum number of consecutive trainees who were assigned to the attending during the 6-month pilot development period. For a two-tailed t test for dependent means at the 0.05 level of significance, with an expected Cohen's d effect size of 0.5, 20 subjects would result in a power of 0.59. It was not feasible or practical to conduct an adequately powered trial at this stage of development of the curriculum. The goal of the pilot study was to develop a training program and crudely estimate the effect size and confidence interval of the primary outcome measure to lay the groundwork for a future adequate trial.

For the primary outcome measure, a two-tailed t test for dependent means was performed using SPSS 15.0 (SPSS for Windows, 2007, Chicago). Cohen's d effect size and 95% confidence interval (95% CI) were determined using ESCI-delta (2001, Bundoora VIC, Australia).
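As a rough illustration of the power figure quoted above (d = 0.5, n = 20, alpha = 0.05 for a two-tailed dependent-means t test), the following sketch recomputes post hoc power from the noncentral t distribution. It assumes Python with NumPy and SciPy, which the study did not use (the analyses were run in SPSS and ESCI), and is meant only to make the calculation concrete.

# Post hoc power for a two-tailed t test for dependent means:
# assumed effect size d = 0.5, n = 20 pairs, alpha = 0.05.
# Illustrative only; the study itself used SPSS and ESCI, not Python.
import numpy as np
from scipy import stats

d, n, alpha = 0.5, 20, 0.05
df = n - 1                               # degrees of freedom for a paired t test
ncp = d * np.sqrt(n)                     # noncentrality parameter under the alternative
t_crit = stats.t.ppf(1 - alpha / 2, df)  # two-tailed critical value

# Power = P(|T| > t_crit) when T follows a noncentral t distribution.
power = (1 - stats.nct.cdf(t_crit, df, ncp)) + stats.nct.cdf(-t_crit, df, ncp)
print(round(power, 2))  # about 0.56 with this exact formula; other common
                        # approximations give values closer to the 0.59 quoted above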


Scoring of the attending's appraisal of the EBM performance of the trainee was performed according to the predefined scoring rubric distributed with the syllabus, resulting in a binary rating indicating that the trainee had either sufficiently or insufficiently demonstrated competency in each of the 5A EBM skill domains. The raw number of globally positive and negative summary feedback reports received was tallied to describe trainee subjective experience during the rotation.

Results

Twenty-four consecutive trainees were assigned to the EBM attending during the study period. All 24 were exposed to the training methods described in this trial, and each participated in the assessment. There was no loss to follow-up. Results from all trainees are included in the analysis. Fourteen trainees were men and 10 were women; 13 were medical students and 11 were residents. The mean number of years of graduate and postgraduate education was 5, with a range of 3 to 7.

Postmeasures of knowledge and EBM skills on the Dartmouth Inventory increased significantly relative to baseline, by 4.9 points out of 17 possible points on the exam (p<0.001; 95% CI 3.7-6.1). The Cohen's d effect size was large (1.7; 95% CI 1.1-2.3) and clinically meaningful. The majority of trainees (22/24) were able to demonstrate adequate proficiency of skills by subjective evaluation from the attending using the OSCE scoring rubric included in the syllabus. Trainees' subjective experiences overall were quite positive, with 17/24 providing strong, globally positive feedback, 5/24 not providing any feedback, and 2/24 providing constructive negative feedback mixed with positive feedback.

Stratifying the trainees by median split into most and least experienced groups revealed no significant difference in performance on pretest measures of EBM skills via the Dartmouth instrument. The correlation (Spearman's rho) between years of medical education and EBM skill at baseline via the Dartmouth instrument was not significant.
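The following sketch shows how quantities of the kind reported above (the paired t test, the 95% CI of the mean gain, Cohen's d for dependent means, and the baseline Spearman correlation) could be computed. Because the individual trainee scores are not published, the data below are simulated placeholders, and Python with NumPy/SciPy is assumed even though the study itself used SPSS 15.0 and ESCI-delta; treat it as a format illustration, not a reanalysis.

# Sketch of the analyses reported above, on simulated placeholder scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 24
pre = rng.normal(7.0, 3.0, n).clip(0, 17)            # placeholder pre-rotation scores (0-17 scale)
post = (pre + rng.normal(4.9, 2.9, n)).clip(0, 17)   # placeholder post scores, roughly a 4.9-point gain

diff = post - pre
t_stat, p_val = stats.ttest_rel(post, pre)            # two-tailed t test for dependent means

se = diff.std(ddof=1) / np.sqrt(n)                    # standard error of the mean difference
ci_low, ci_high = stats.t.interval(0.95, n - 1, loc=diff.mean(), scale=se)

d = diff.mean() / diff.std(ddof=1)                    # Cohen's d for dependent means
                                                      # (one of several common conventions)

years = rng.integers(3, 8, n)                         # placeholder years of graduate education (3-7)
rho, p_rho = stats.spearmanr(years, pre)              # baseline association with EBM skill

print(f"gain={diff.mean():.1f}, p={p_val:.4f}, CI=({ci_low:.1f}, {ci_high:.1f}), d={d:.1f}, rho={rho:.2f}")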

Discussion

A method of instruction using guided mentoring of EBM skills during actual patient care appears promising for further study despite the limitations inherent in the pilot data presented. Few residency programs appear to be implementing EBM training, despite evidence suggesting that clinical practices may vary significantly from those most strongly supported in the research literature.


Where training programs do exist, instruction tends to emphasize literature appraisal in a classroom setting. Such training does not adequately address the complexities and challenges of delivering evidence-based care to actual patients in working environments that are often quite hostile to such practices. Baseline measures of EBM skills among trainees in this study were not correlated with years of previous training or clinical experience. They increased significantly after brief exposure to hands-on training. The above curriculum may represent a promising method of instruction to facilitate learning important clinical skills in a manner practical enough for application in a complex applied environment.

There are many barriers to practicing EBM in "the real world." Measures of psychopathology are usually validated among patient populations with single diagnoses, and access to research assistants with large amounts of time and other resources simply does not exist in clinical settings. Finding clinically practical assessment strategies that are efficient, reliable, and valid is extremely challenging. Finding valid estimates of disease prevalence and likelihood ratios to narrow the differential diagnosis remains mostly a task for the future. Rapidly acquiring information that is least susceptible to bias is a very different task for a journal club presenter than for a clinician on call alone at midnight in the emergency department with a pager going off every few minutes and several acute patients who need to be seen and treated quickly. Where information is available, it is often limited. The interventions appearing in randomized trials often target populations selected specifically to have single diagnoses and a severity of illness that allows for high rates of study retention and follow-up. Such studies often exclude persons who may be much more representative of the patients seen in actual clinical practice settings, making extrapolation of findings difficult. Being able to apply information in a reasonable manner, given the many scenarios that arise in the clinical setting far removed from the questions and populations included in research trials, requires a significant degree of educated judgment as well as consideration of patient preferences. Mentorship in such strategies may prove to be critically important in enabling clinicians to perform these tasks confidently in busy, complex clinical settings.

This study is obviously limited by several factors. There are no validated measures of EBM skills available to assess psychiatry trainees' proficiency. The Dartmouth instrument has face validity; however, its psychometric properties have not been established rigorously. This should be a major focus for future efforts in medical education.

Including the attending's appraisal of trainee performance is subject to considerable potential bias. For a future randomized controlled trial to be feasible, better validated measures will need to be developed. Unfortunately, no feasible mechanisms were available to assess the quality of care delivered with respect to concordance with evidence-based guidelines, either before and after the program or in comparison with other trainees' management strategies. Whether the instructional program actually changed the quality of care delivered has not been convincingly proven. Attempts were made to ensure that every patient seen on the service was assessed using a careful consideration of the pretest probability of various diagnoses and reference to explicit DSM-IV-TR criteria, ideally through the use of a structured assessment instrument with known psychometric properties. Every effort was made to ensure that the treatment plan referenced a high-quality systematic review or meta-analysis where available, in accord with the principles of EBM. Eliciting the preferences and values of the patient was considered of paramount importance in deciding upon a course of treatment consistent with the shared decision-making model. However, no practical mechanism was available to assess and document this in an unbiased fashion.

The study was not randomized. Trainees' scores may have improved over time for reasons other than the intervention. This seems unlikely, because stratifying the trainees by median split into most and least experienced groups revealed no significant difference in performance on pretest measures of EBM skills using the Dartmouth instrument. There was no significant correlation between years of education and EBM skill at baseline on this measure. My experience in assessing each of the trainees is that very few were aware of or able to apply EBM skills to actual patient scenarios at the start of the rotation. EBM skills do not appear to be regularly taught in a manner in which trainees are able to apply them, at least at this institution in this examiner's estimation. Mentorship appeared to affect this. Such observations are quite obviously prone to significant bias, no matter how objective the author might try to be. Causality cannot be determined in a trial of this design.

The subjective experiences of trainees reported here are potentially subject to considerable bias because they were not obtained in blinded fashion. Obviously, unblinded feedback to a supervisor responsible for grading performance can be subject to considerable bias. These ratings were included to give some preliminary estimate of the trainees' qualitative experience and obviously should be viewed with considerable caution.


Future research should aim first at developing validated assessment methods specifically directed toward assessing the applied practice skills of EBM trainees in mental health settings. A project is being organized to develop such a measure here. Such an instrument is necessary prior to conducting a larger randomized trial. Ideally, a future controlled trial could be conducted in which trainees are randomly assigned to the experimental EBM training program versus some other method of instruction, and their skills, knowledge, patient management strategies, and patient outcomes measured in a valid and reliable way. These measures could then be compared with those obtained from trainees randomized to other methods of instruction. It would be extremely valuable to include objective measures of the degree of concordance between the treatment strategies implemented by each group and those intervention strategies most strongly supported by the EBM model. It would also be desirable to assess for differences in measures of actual patient outcomes. Operationalizing all of this, given the degree of patient preference and clinician judgment involved, is no small task. An additional challenge is that brief, clinically practical outcome measures for the broad spectrum of patient problems seen in clinical settings have not been adequately developed for routine clinical application. Many significant gaps exist between the efficacy literature and the effectiveness literature, which will require significant additional work and funding. The goal of the unfunded pilot study presented here was to begin the process of developing a training regimen to address several important clinical problem areas. This goal was achieved.

Conclusion

Guided mentorship in evidence-based psychiatry appears to be sufficiently promising to warrant further investigation. Previous training programs have been conducted largely in classroom settings and have had questionable influence upon the decision making and outcomes occurring during actual patient care. The program above was done in real time in a complex clinical setting. Strong attempts were made to assess each patient during the study period in a reliable, valid manner, to acquire information from high-quality systematic reviews and meta-analyses where available in treatment decision making, and to negotiate a treatment plan incorporating patient preferences, research findings, and clinical experience.


An attending mentored each trainee in real time in how to use EBM skills in a manner efficient enough to be employed in an extremely busy quaternary care academic medical center, with patients who had multiple complex, refractory problems. The within-group pre- and posttest comparison using the Dartmouth Inventory suggests that the training methods were highly effective in enhancing trainees' skills in the EBM model but does not prove this. Future randomized controlled trials using more sophisticated measures of outcomes, including comparisons between groups assigned to usual training methods and assessments of the concordance of management strategies with those most strongly supported by clinical research, will be quite valuable to conduct. Longitudinal comparison of the durability of skills acquired in the experimental training may be useful to determine whether such training is robust enough to weather the barriers that impede dissemination of the EBM model. A number of barriers to implementation of EBM practice are unlikely to be overcome by training in residency, no matter how good the instruction. Overall, the findings of this pilot study suggest that further investigation of this method of instruction is warranted despite the limitations.

Special thanks to Robert Drake, Matthew Merrens, Mary Turco, and Cindy Stewart from the Dartmouth Summer Institute in Evidence-Based Psychiatry for their ideas and feedback and their permission to use the Dartmouth Evidence-Based Psychiatry Inventory for this study. Thanks also to Marilyn Tinsley, our reference librarian, who generously donated her time and made our informatics rounds practical and enjoyable.

References

1. Sackett DL, Rosenberg WM, Gray JA, et al: Evidence-based medicine: what it is and what it isn't. BMJ 1996; 312:71-72
2. Straus SE, Richardson WS, Glasziou P: Evidence-Based Medicine: How to Practice and Teach EBM, 3rd ed. Edinburgh; New York, Elsevier/Churchill Livingstone, 2005
3. Drake RE, Rosenberg SD, Teague GB, et al: Fundamental principles of evidence-based medicine applied to mental health care. Psychiatr Clin North Am 2003; 26:811-820, vii
4. Drake RE, Torrey WC, McHugo GJ: Strategies for implementing evidence-based practices in routine mental health settings. Evid Based Ment Health 2003; 6:6-7
5. Essock SM, Goldman HH, Van Tosh L, et al: Evidence-based practices: setting the context and responding to concerns. Psychiatr Clin North Am 2003; 26:919-938, ix
6. Torrey WC, Drake RE, Dixon L, et al: Implementing evidence-based practices for persons with severe mental illnesses. Psychiatr Serv 2001; 52:45-50
7. Gray GE: Concise Guide to Evidence-Based Psychiatry. Arlington, Va, American Psychiatric Publishing, 2004
8. Gray GE: Evidence-based medicine: an introduction for psychiatrists. J Psychiatr Pract 2002; 8:5-13
9. Gray GE, Pinson LA: Evidence-based medicine and psychiatric practice. Psychiatr Q 2003; 74:387-399
10. Hatcher S, Butler R, Oakley-Brown M: Evidence-Based Mental Health Care. Edinburgh, Elsevier Churchill Livingstone, 2005
11. March JS, Chrisman A, Breland-Noble A, et al: Using and teaching evidence-based medicine: the Duke University child and adolescent psychiatry model. Child Adolesc Psychiatr Clin N Am 2005; 14:273-296, viii-ix
12. Dixon L: The need for implementing evidence-based practices. Psychiatr Serv 2004; 55:1160-1161
13. Lehman AF, Steinwachs DM: Patterns of usual care for schizophrenia: initial results from the Schizophrenia Patient Outcomes Research Team (PORT) client survey. Schizophr Bull 1998; 24:11-20; discussion 20-32
14. Geddes J, Reynolds S, Streiner D, et al: Evidence-based practice in mental health. BMJ 1997; 315:1483-1484
15. Deegan PE, Drake RE: Shared decision making and medication management in the recovery process. Psychiatr Serv 2006; 57:1636-1639
16. Harris G: Psychiatrists Top List in Drug Maker Gifts. New York Times, June 27, 2007. Available at http://www.nytimes.com/2007/06/27/health/psychology/27doctors.html
17. Wazana A: Physicians and the pharmaceutical industry: is a gift ever just a gift? JAMA 2000; 283:373-380
18. Smith CA, Ganschow PS, Reilly BM, et al: Teaching residents evidence-based medicine skills: a controlled trial of effectiveness and assessment of durability. J Gen Intern Med 2000; 15:710-715
19. Hatala R, Guyatt G: Evaluating the teaching of evidence-based medicine. JAMA 2002; 288:1110-1112
20. Straus SE, Green ML, Bell DS, et al: Evaluating the teaching of evidence-based medicine: conceptual framework. BMJ 2004; 329:1029-1032
21. Shaneyfelt T, Baum KD, Bell D, et al: Instruments for evaluating education in evidence-based practice: a systematic review. JAMA 2006; 296:1116-1127
22. Green ML: Graduate medical education training in clinical epidemiology, critical appraisal, and evidence-based medicine: a critical review of curricula. Acad Med 1999; 74:686-694
23. Accreditation Council for Graduate Medical Education: ACGME Outcome Project: Instruction: Tips for Using Experience-Based, Integrative Evidence-Based Medicine (EBM). Chicago, ACGME, 2007
24. Miller WR, Rollnick S: Motivational Interviewing: Preparing People for Change, 2nd ed. New York, Guilford, 2002
25. Knight KM, McGowan L, Dickens C, et al: A systematic review of motivational interviewing in physical health care settings. Br J Health Psychol 2006; 11:319-332
26. Hettema J, Steele J, Miller WR: Motivational interviewing. Annu Rev Clin Psychol 2005; 1:91-111
27. Burke BL, Arkowitz H, Menchola M: The efficacy of motivational interviewing: a meta-analysis of controlled clinical trials. J Consult Clin Psychol 2003; 71:843-861
28. Smith GCS, Pell JP: Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomized controlled trials. BMJ 2003; 327:1459-1461
29. Isaacs D, Fitzgerald D: Seven alternatives to evidence-based medicine. BMJ 1999; 319:1618
30. Rollnick S, Mason P, Butler C: Health Behavior Change: A Guide for Practitioners. Edinburgh, Churchill Livingstone, 1999
31. Williams GC, Deci EL: The importance of supporting autonomy in medical education. Ann Intern Med 1998; 129:303-308
32. Deci EL, Flaste R: Why We Do What We Do: Understanding Self-Motivation. New York, Penguin Books, 1995, p 149
33. Bandura A: Social Learning Theory. Englewood Cliffs, NJ, Prentice Hall, 1977
34. Bandura A: Self-Efficacy: The Exercise of Control. New York, WH Freeman, 1997
35. Kasser VG, Ryan RM: The relation of psychological needs for autonomy and relatedness to vitality, well-being, and mortality in a nursing home. J Appl Soc Psychol 1999; 29:935-954
36. Williams GC, Grow VM, Freedman ZR, et al: Motivational predictors of weight loss and weight-loss maintenance. J Pers Soc Psychol 1996; 70:115-126
37. Williams GC, McGregor HA, Zeldman A, et al: Testing a self-determination theory process model for promoting glycemic control through diabetes self-management. Health Psychol 2004; 23:58-66
38. Williams GC, Minicucci DS, Kouides RW, et al: Self-determination, smoking, diet and health. Health Educ Res 2002; 17:512-521
39. Williams GC, Rodin GC, Ryan RM, et al: Autonomous regulation and adherence to long-term medical regimens in adult outpatients. Health Psychol 1998; 17:269-276
40. Ainslie G: Breakdown of Will. Cambridge; New York, Cambridge University Press, 2001
41. Gollwitzer PM: Implementation intentions: strong effects of simple plans. Am Psychol 1999; 54:493-503
42. Mascola A, Crane L, Simms R, et al: Encouraging physical activity among overweight patients as the main outcome of treatment: a pilot RCT of an adaptation of motivational interviewing, in Society of Behavioral Medicine, 27th Annual Meeting & Scientific Sessions. San Francisco, March 25, 2006
43. Ramos KD, Schafer S, Tracz SM: Validation of the Fresno test of competence in evidence-based medicine. BMJ 2003; 326:319-321
