Evidence-based medicine


Evidence-based medicine is defined as "the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients."[1] Alternative definitions are "the process of systematically finding, appraising, and using contemporaneous research findings as the basis for clinical decisions"[2] and "evidence-based medicine (EBM) requires the integration of the best research evidence with our clinical expertise and our patient's unique values and circumstances."[3] Better known as EBM, evidence-based medicine emerged in the early 1990s.

This article discusses the field of evidence-based medicine, with background in the history of medicine, clinical epidemiology, and the ethics of medical experimentation. Evidence-based practice is not restricted to medicine among the health sciences; dentistry, nursing, and other allied health sciences are adopting "evidence-based medicine" as well. Evidence-based health care, or evidence-based practice, extends the concept of EBM to all health professions, including purchasing and management.

Why do we need evidence-based medicine?

Why should such an approach to clinical medicine merit its own name, let alone another acronym in the medical literature? Don't physicians ordinarily conscientiously and judiciously use scientific evidence in treating patients? Isn't that simply routine medical care?

In fact, most of the specific practices of physicians and surgeons are based on traditional techniques learned from their seniors in the care of patients during training, modified by personal clinical experience and by information gleaned from the medical literature and continuing education courses. Although these practices almost always have a rational basis in biology, the actual efficacy of treatments is rarely explicitly proven by experimental trials in people. Further, even after the results of experimental trials or other evidence have been reported, there is a lag between accepting changes in procedures, treatments, and tests at the centers where such research is carried out or reviewed, and establishing them as routine practice in general clinical care.

Evidence-based medicine seeks to promote practices that have been shown, through the scientific method, to be valid by empirical proof. As such, it currently encompasses only a very small minority of actual practices in clinical medicine and surgery. More often, recommendations are made on the basis of the best available evidence and are reasonable, but not proven. Evidence-based medicine is also a philosophy, however, one that seeks to validate practices by finding proof.

Steps in evidence-based medicine

Ask

"Ask" - Formulate a well-structured clinical question.

Acquire

The ability to "acquire" evidence in a timely manner may improve healthcare.[4] Unfortunately, doctors may be led astray when acquiring information as often as they find correct answers.[5]

Appraise

To "appraise" the quality of the answer found is very important, as one third of the results of even the most visible medical research are eventually either attenuated or refuted.[6] There are many reasons for this[7]; two of the most important are publication bias[8] and conflict of interest[9].

These two problems interact as conflict of interest is predictive of publication bias.[10][8]

Publication bias

Publication bias is the tendency for negative studies not to be published; the most common reason may be authors choosing not to submit their results for publication.[8][11] Publication bias may be more prevalent in industry-sponsored research.[12]

In performing a meta-analysis, a file drawer analysis[13] or a funnel plot analysis[14][15] may help detect underlying publication bias among the studies in the meta-analysis.
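
To illustrate how a funnel plot analysis can be turned into a formal test, the sketch below applies a simple, unweighted version of Egger's regression test to hypothetical trial data. The effect sizes, standard errors, and the use of Python with SciPy are assumptions made only for this example; they are not drawn from any study cited here.

  # Simple Egger regression test for funnel-plot asymmetry (hypothetical data).
  from scipy import stats

  # Hypothetical log odds ratios and standard errors from eight trials.
  effects = [-0.45, -0.30, -0.52, -0.10, -0.60, -0.25, -0.70, -0.05]
  ses     = [ 0.10,  0.15,  0.20,  0.25,  0.30,  0.35,  0.40,  0.45]

  precision    = [1 / se for se in ses]                    # x-axis of the regression
  standardized = [e / se for e, se in zip(effects, ses)]   # y-axis of the regression

  fit = stats.linregress(precision, standardized)

  # Egger's test looks at the intercept, not the slope: an intercept far from
  # zero suggests funnel-plot asymmetry, which is consistent with (but does
  # not prove) publication bias.
  t_stat  = fit.intercept / fit.intercept_stderr
  p_value = 2 * stats.t.sf(abs(t_stat), df=len(effects) - 2)
  print(f"Egger intercept = {fit.intercept:.2f}, p = {p_value:.3f}")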

Conflict of interest

Regarding the design of randomized controlled trials, industry-sponsored studies may be more likely to select an inappropriate comparator group that favors finding benefit in the experimental group.[12]

Regarding the reporting of data in randomized controlled trials, industry-sponsored studies may be more likely to omit intention-to-treat analyses.[10]

Regarding the conclusions reached in randomized controlled trials, one study did not find evidence of overstatement[16]; however, a later study[17] found that industry-sponsored studies are more likely to recommend the experimental drug as the treatment of choice, even after adjusting for the treatment effect. Similarly, industry-sponsored studies may be more likely to conclude that drugs are safe, even when the drugs have increased adverse effects.[18]

Unfortunately, the presence of authors with conflicts of interest is not reliably indicated in journal articles.[19] In addition, when studied in the late 1990s, approximately 10% of some types of articles used 'ghost writers'.[20] Ghost writing means that the credited author in the byline may not be the real author, and the real author may have a conflict of interest.

Apply

It is important to "apply" the answers found correctly. Common problems in applying evidence are 1) difficulties with numeracy and 2) recognizing the correct population to which the evidence applies.

Difficulties with health numeracy

Both patients and healthcare professionals have difficulties with health numeracy and probabilistic reasoning.[21]
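
A common numeracy pitfall is reading a test's sensitivity as the probability of disease after a positive result, which ignores the base rate. The short sketch below works through that calculation with natural frequencies; the prevalence, sensitivity, and specificity are hypothetical values chosen only for illustration.

  # Why a sensitive test can still yield mostly false positives when the
  # disease is rare (hypothetical screening numbers).
  prevalence  = 0.01    # 1% of the screened population has the disease
  sensitivity = 0.90    # P(positive test | disease)
  specificity = 0.91    # P(negative test | no disease)

  n = 100_000                            # natural frequencies per 100,000 screened
  with_disease    = n * prevalence       # 1,000 people
  without_disease = n - with_disease     # 99,000 people

  true_positives  = with_disease * sensitivity            # 900
  false_positives = without_disease * (1 - specificity)   # 8,910

  ppv = true_positives / (true_positives + false_positives)
  print(f"P(disease | positive test) = {ppv:.1%}")        # about 9%, not 90%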

Difficulties in applying evidence to the correct patient population

Studies document that extrapolating study results to the wrong patient populations (over-generalization)[22][23][24] and not applying study results to the correct population (under-utilization)[25] can both increase adverse outcomes.

Over-generalization
The problem of over-generalization of study results may be more common among specialist physicians.[26] Two studies found that specialists were more likely to adopt COX-2 drugs before the drugs were recalled by the FDA.[27][28] One of the studies concluded that, "using COX-2s as a model for physician adoption of new therapeutic agents, specialists were more likely to use these new medications for patients likely to benefit but were also significantly more likely to use them for patients without a clear indication".[28] Similarly, orthopedists may provide more intensive care for back pain, but without benefit from the increased care.[29] Specialists may also be less discriminating in their choice of journal reading.[30]

Under-utilization
The problem of under-utilizing study results may be more common when physicians practice outside their area of expertise. For example, specialist physicians are less likely to under-utilize specialty care,[31][32] while primary care physicians are less likely to under-utilize preventive care.[33][34]

Assess

"Assess" - Evaluate one's own performance.

Classification

Two types of evidence-based medicine have been proposed.[35]

Evidence-based guidelines

Evidence-based guidelines (EBG) refers to the practice of evidence-based medicine at the organizational or institutional level. This includes the production of guidelines, policies, and regulations.

Evidence-based individual decision making

Evidence-based individual decision making (EBID) is evidence-based medicine as practiced by the individual health care provider with an individual patient. There is concern that current evidence-based medicine focuses excessively on EBID.[35]

Evidence-based individual decision making can be further divided into three modes, "doer", "user", and "replicator", according to the intensity of the work performed by the individual.[36]

This categorization somewhat parallels the theory of diffusion of innovations, in which adopters of innovation are categorized as innovators (2.5%), early adopters (13%), early majority (33%), late majority (33%), and laggards (16%), but without that theory's pejorative terms.[37] The categorization of doctors is supported by a preliminary empirical study by Green et al. that grouped doctors into Seekers, Receptives, Traditionalists, and Pragmatists.[38] The study by Green et al. has not been externally validated.

The group an individual doctor resembles may vary depending on how much time is available to seek evidence during clinical care.[39] Medical residents early in training tend to prefer being taught the practitioner model, whereas residents later in training tend to prefer the user model.[40]

Doer

The "doer"[36] or "practitioner"[41] of evidence-based medicine performs at least the first four steps of evidence-based medicine (above) to obtain "self-acquired"[39] knowledge.

If the Doers are the same as the "Seekers" in the study of Green, then this group may be 3% of physicians.[38]

This group may also be the similarly small group of doctors who use formal Bayesian calculations[42] or MEDLINE searches[43].

User

For the "user" of evidence-based medicine, "[literature] searches are restricted to evidence sources that have already undergone critical appraisal by others, such as evidence-based guidelines or evidence summaries".[36] More recently, the 5S search strategy,[44] which starts with a search of "summaries" (evidence-based textbooks), offers a quicker approach.[45]

If the Users are the same as the "Receptives" in the study of Green, then this group may be 57% of physicians.[38]

Replicator

For the "replicator", "decisions of respected opinion leaders are followed"[36]. This has been called "'borrowed' expertise".[39]

If the Replicators are the same as the "Traditionalists" and "Pragmatists" combined in the study of Green, then this group may be 40% of physicians.[38] This is a very broad group of doctors. The lowest end of this group may be equivalent to the laggards of Rogers; this much smaller group of doctors, those with a "severely diminished capacity for self-improvement", may be at increased risk of disciplinary action by medical boards.[46]

Metrics used in evidence-based medicine

Diagnosis

  • Sensitivity and specificity
  • Likelihood ratios
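
All of these diagnostic metrics are derived from a 2x2 table of test results against a reference standard. The sketch below uses hypothetical counts purely to show the arithmetic.

  # Diagnostic test metrics from a hypothetical 2x2 table.
  tp, fn = 90, 10     # patients with the disease: positive / negative tests
  fp, tn = 30, 870    # patients without the disease: positive / negative tests

  sensitivity = tp / (tp + fn)                     # 0.90
  specificity = tn / (tn + fp)                     # about 0.97

  # Likelihood ratios: how much a positive or negative result shifts the
  # odds of disease.
  lr_positive = sensitivity / (1 - specificity)    # 27
  lr_negative = (1 - sensitivity) / specificity    # about 0.10

  print(sensitivity, specificity, lr_positive, lr_negative)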

Interventions

Relative measures

  • Relative risk ratio
  • Relative risk reduction

Absolute measures

  • Absolute risk reduction
  • Number needed to treat
  • Number needed to screen
  • Number needed to harm
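
The relative and absolute measures above all follow from the event rates in the two arms of a trial; the number needed to treat is the reciprocal of the absolute risk reduction. A minimal sketch with hypothetical event rates:

  # Intervention metrics from hypothetical trial event rates.
  control_event_rate      = 0.10   # 10% of control patients have the outcome
  experimental_event_rate = 0.07   # 7% of treated patients have the outcome

  relative_risk           = experimental_event_rate / control_event_rate  # 0.70
  relative_risk_reduction = 1 - relative_risk                             # 0.30
  absolute_risk_reduction = control_event_rate - experimental_event_rate  # 0.03

  number_needed_to_treat = 1 / absolute_risk_reduction                    # about 33

  print(relative_risk, relative_risk_reduction,
        absolute_risk_reduction, number_needed_to_treat)

The number needed to harm is calculated the same way from the absolute increase in an adverse outcome, and the number needed to screen from the absolute risk reduction achieved by a screening programme.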

Health policy

  • Cost per year of life saved[47]
  • Years (or months or days) of life saved. "A gain in life expectancy of a month from a preventive intervention targeted at populations at average risk and a gain of a year from a preventive intervention targeted at populations at elevated risk can both be considered large."[48]
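
Cost per year of life saved is an incremental ratio: the additional cost of an intervention divided by the additional life expectancy it produces. A small sketch with hypothetical figures (real analyses typically also discount future costs and life-years):

  # Hypothetical cost-effectiveness calculation for a preventive intervention.
  cost_with_intervention    = 12_000.0   # average cost per patient
  cost_without_intervention =  9_000.0
  life_years_with    = 14.25             # average remaining life expectancy
  life_years_without = 14.00

  incremental_cost       = cost_with_intervention - cost_without_intervention  # 3,000
  incremental_life_years = life_years_with - life_years_without                # 0.25 years (about 3 months)

  cost_per_life_year_saved = incremental_cost / incremental_life_years
  print(f"${cost_per_life_year_saved:,.0f} per year of life saved")            # $12,000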

Experimental trials: producing the evidence

For more information, see: Randomized controlled trial.


Evidence synthesis: summarizing the evidence

Systematic review

For more information, see: Systematic review.


Meta-analysis

Systematic reviews that quantitatively pool results from research studies are called meta-analyses.
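
The quantitative pooling can be as simple as an inverse-variance weighted average of the individual study estimates. The fixed-effect sketch below uses hypothetical log odds ratios; a real meta-analysis would also assess heterogeneity and often use a random-effects model.

  import math

  # Hypothetical log odds ratios and standard errors from four trials.
  effects = [-0.40, -0.25, -0.55, -0.10]
  ses     = [ 0.20,  0.15,  0.30,  0.25]

  # Fixed-effect (inverse-variance) pooling.
  weights   = [1 / se**2 for se in ses]
  pooled    = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
  pooled_se = math.sqrt(1 / sum(weights))

  # 95% confidence interval, reported on the odds ratio scale.
  low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
  print(f"Pooled OR = {math.exp(pooled):.2f} "
        f"(95% CI {math.exp(low):.2f} to {math.exp(high):.2f})")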

Clinical practice guidelines

For more information, see: Clinical practice guideline.


Incorporating evidence into clinical care

Practicing clinicians usually cite a lack of time for reading newer textbooks or journals. However, the emergence of new types of evidence can change the way doctors treat patients. Unfortunately, recent scientific evidence gathered through well-controlled clinical trials usually does not reach busy clinicians in real time. Another potential problem is that there may be numerous trials of similar interventions and outcomes that have not been systematically reviewed or meta-analyzed.

Medical informatics

An essential adjunct to the practice of evidence-based medicine (EBM) is medical informatics (MI), which focuses on creating tools to access and apply the best evidence for making decisions about patient care.[3]

To support the practice of EBM, informaticians (or informationists) must be familiar with medical journals, literature databases, medical textbooks, practice guidelines, and the growing number of other dedicated evidence-based resources, such as the Cochrane Database of Systematic Reviews and Clinical Evidence.[49]

Similarly, to practice medical informatics properly, it is essential to have an understanding of EBM, including the ability to phrase an answerable question, locate and retrieve the best evidence, and critically appraise and apply it.[50][51]

Studies of the effectiveness of teaching evidence-based medicine

A systematic review of the effectiveness of teaching concluded that "standalone teaching improved knowledge but not skills, attitudes, or behaviour. Clinically integrated teaching improved knowledge, skills, attitudes, and behaviour."[52]

Two systematic reviews provide the framework below for measuring outcomes.[53][54]

Information retrieval

Increasing use of information

A randomized controlled trial of volunteer senior medical students found that access to an information portal on a handheld computer increased self-reported use of information.[55] The information portal contained multiple pre-appraised resources, including a textbook and a drug resource, and would best resemble the "user" mode. The study was not able to isolate which resources in the portal accounted for the increased use; it is possible that the benefit was due solely to the textbook or the drug resource.

A randomized controlled trial of teaching and encouraging the use of MEDLINE by medical resident physicians showed increased searching for evidence during 6-8 weeks of observation.[56] Based on the median number of searches and hours spent searching, each search averaged 22 minutes, which may not be sustainable over the long term.

Improving clinical care

Teaching the "user" mode using only syntheses and synopses, without summaries, has not shown benefit in two studies. A controlled trial of teaching the "user" mode (see above) was negative.[57] However, this study encouraged the use of syntheses and synopses and did not encourage the more practical "summaries" (evidence-based textbooks) of the "5S" search strategy.[44] A quasi-randomized controlled trial of teaching medical students the use of studies, syntheses, and synopses with an automated search engine was also negative.[58]

Information awareness

Increasing use of information

A cluster randomized trial of the McMaster Premium LiteratUre Service (PLUS) "increased the utilization of evidence-based information from a digital library by practicing physicians."[59]

Improving clinical care

No controlled studies have addressed improving clinical care by use of information awareness strategies.

Clinical reasoning

Improving clinical care

A controlled trial found that teaching Bayesian principles (probabilistic reasoning) "improves the efficiency of test ordering."[60]

Criticisms

Evidence-based medicine has been criticized as an attempt to define knowledge in medicine in the same way that was done unsuccessfully by the logical positivists in epistemology, "trying to establish a secure foundation for scientific knowledge based only on observed facts".[61]

Complexity theory

Complexity theory has been proposed as a further explanation of the nature of medical knowledge.[62][63]

References

  1. Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS (1996). "Evidence based medicine: what it is and what it isn't". BMJ 312 (7023): 71–2. PMID 8555924.
  2. Evidence-Based Medicine Working Group (1992). "Evidence-based medicine. A new approach to teaching the practice of medicine. Evidence-Based Medicine Working Group". JAMA 268 (17): 2420–5. PMID 1404801.
  3. Glasziou, Paul; Strauss, Sharon Y. (2005). Evidence-based medicine: how to practice and teach EBM. Elsevier/Churchill Livingstone. ISBN 0-443-07444-5.
  4. Banks DE, Shi R, Timm DF, et al (2007). "Decreased hospital length of stay associated with presentation of cases at morning report with librarian support". Journal of the Medical Library Association : JMLA 95 (4): 381–7. DOI:10.3163/1536-5050.95.4.381. PMID 17971885.
  5. McKibbon KA, Fridsma DB (2006). "Effectiveness of clinician-selected electronic information resources for answering primary care physicians' information needs". Journal of the American Medical Informatics Association : JAMIA 13 (6): 653–9. DOI:10.1197/jamia.M2087. PMID 16929042.
  6. Ioannidis JP (2005). "Contradicted and initially stronger effects in highly cited clinical research". JAMA 294 (2): 218–28. DOI:10.1001/jama.294.2.218. PMID 16014596.
  7. Ioannidis JP, Cappelleri JC, Lau J (1998). "Issues in comparisons between meta-analyses and large trials". JAMA 279 (14): 1089–93. PMID 9546568.
  8. Dickersin K, Min YI, Meinert CL (1992). "Factors influencing publication of research results. Follow-up of applications submitted to two institutional review boards". JAMA 267 (3): 374–8. PMID 1727960.
  9. Smith R (2005). "Medical journals are an extension of the marketing arm of pharmaceutical companies". PLoS Med. 2 (5): e138. DOI:10.1371/journal.pmed.0020138. PMID 15916457.
  10. Melander H, Ahlqvist-Rastad J, Meijer G, Beermann B (2003). "Evidence b(i)ased medicine--selective reporting from studies sponsored by pharmaceutical industry: review of studies in new drug applications". BMJ 326 (7400): 1171–3. DOI:10.1136/bmj.326.7400.1171. PMID 12775615.
  11. Krzyzanowska MK, Pintilie M, Tannock IF (2003). "Factors associated with failure to publish large randomized trials presented at an oncology meeting". JAMA 290 (4): 495–501. DOI:10.1001/jama.290.4.495. PMID 12876092.
  12. Lexchin J, Bero LA, Djulbegovic B, Clark O (2003). "Pharmaceutical industry sponsorship and research outcome and quality: systematic review". BMJ 326 (7400): 1167–70. DOI:10.1136/bmj.326.7400.1167. PMID 12775614.
  13. Pham B, Platt R, McAuley L, Klassen TP, Moher D (2001). "Is there a "best" way to detect and minimize publication bias? An empirical evaluation". Evaluation & the health professions 24 (2): 109–25. PMID 11523382.
  14. Egger M, Davey Smith G, Schneider M, Minder C (1997). "Bias in meta-analysis detected by a simple, graphical test". BMJ 315 (7109): 629–34. PMID 9310563.
  15. Terrin N, Schmid CH, Lau J (2005). "In an empirical evaluation of the funnel plot, researchers could not visually identify publication bias". Journal of clinical epidemiology 58 (9): 894–901. DOI:10.1016/j.jclinepi.2005.01.006. PMID 16085192.
  16. Friedberg M, Saffran B, Stinson TJ, Nelson W, Bennett CL (1999). "Evaluation of conflict of interest in economic analyses of new drugs used in oncology". JAMA 282 (15): 1453–7. PMID 10535436.
  17. Als-Nielsen B, Chen W, Gluud C, Kjaergard LL (2003). "Association of funding and conclusions in randomized drug trials: a reflection of treatment effect or adverse events?". JAMA 290 (7): 921–8. DOI:10.1001/jama.290.7.921. PMID 12928469.
  18. Nieto A, Mazon A, Pamies R, et al (2007). "Adverse effects of inhaled corticosteroids in funded and nonfunded studies". Arch. Intern. Med. 167 (19): 2047–53. DOI:10.1001/archinte.167.19.2047. PMID 17954797.
  19. Papanikolaou GN, Baltogianni MS, Contopoulos-Ioannidis DG, Haidich AB, Giannakakis IA, Ioannidis JP (2001). "Reporting of conflicts of interest in guidelines of preventive and therapeutic interventions". BMC medical research methodology 1: 3. PMID 11405896.
  20. Laine C, Mulrow CD (2005). "Exorcising ghosts and unwelcome guests". Ann. Intern. Med. 143 (8): 611–2. PMID 16230729.
  21. Ancker JS, Kaufman D (2007). "Rethinking Health Numeracy: A Multidisciplinary Literature Review". DOI:10.1197/jamia.M2464. PMID 17712082.
  22. Gross CP, Steiner CA, Bass EB, Powe NR (2000). "Relation between prepublication release of clinical trial results and the practice of carotid endarterectomy". JAMA 284 (22): 2886–93. PMID 11147985.
  23. Juurlink DN, Mamdani MM, Lee DS, et al (2004). "Rates of hyperkalemia after publication of the Randomized Aldactone Evaluation Study". N. Engl. J. Med. 351 (6): 543–51. DOI:10.1056/NEJMoa040135. PMID 15295047.
  24. Beohar N, Davidson CJ, Kip KE, et al (2007). "Outcomes and complications associated with off-label and untested use of drug-eluting stents". JAMA 297 (18): 1992–2000. DOI:10.1001/jama.297.18.1992. PMID 17488964.
  25. Soumerai SB, McLaughlin TJ, Spiegelman D, Hertzmark E, Thibault G, Goldman L (1997). "Adverse outcomes of underuse of beta-blockers in elderly survivors of acute myocardial infarction". JAMA 277 (2): 115–21. PMID 8990335.
  26. Turner BJ, Laine C (2001). "Differences between generalists and specialists: knowledge, realism, or primum non nocere?". Journal of general internal medicine 16 (6): 422–4. DOI:10.1046/j.1525-1497.2001.016006422.x. PMID 11422641.
  27. Rawson N, Nourjah P, Grosser S, Graham D (2005). "Factors associated with celecoxib and rofecoxib utilization". Ann Pharmacother 39 (4): 597–602. PMID 15755796.
  28. De Smet BD, Fendrick AM, Stevenson JG, Bernstein SJ (2006). "Over and under-utilization of cyclooxygenase-2 selective inhibitors by primary care physicians and specialists: the tortoise and the hare revisited". Journal of general internal medicine 21 (7): 694–7. DOI:10.1111/j.1525-1497.2006.00463.x. PMID 16808768.
  29. Carey T, Garrett J, Jackman A, McLaughlin C, Fryer J, Smucker D (1995). "The outcomes and costs of care for acute low back pain among patients seen by primary care practitioners, chiropractors, and orthopedic surgeons. The North Carolina Back Pain Project". N Engl J Med 333 (14): 913–7. PMID 7666878.
  30. McKibbon KA, Haynes RB, McKinlay RJ, Lokker C (2007). "Which journals do primary care physicians and specialists access from an online service?". Journal of the Medical Library Association : JMLA 95 (3): 246–54. DOI:10.3163/1536-5050.95.3.246. PMID 17641754.
  31. Majumdar S, Inui T, Gurwitz J, Gillman M, McLaughlin T, Soumerai S (2001). "Influence of physician specialty on adoption and relinquishment of calcium channel blockers and other treatments for myocardial infarction". J Gen Intern Med 16 (6): 351–9. PMID 11422631.
  32. Fendrick A, Hirth R, Chernew M (1996). "Differences between generalist and specialist physicians regarding Helicobacter pylori and peptic ulcer disease". Am J Gastroenterol 91 (8): 1544–8. PMID 8759658.
  33. Lewis C, Clancy C, Leake B, Schwartz J (1991). "The counseling practices of internists". Ann Intern Med 114 (1): 54–8. PMID 1983933.
  34. Turner B, Amsel Z, Lustbader E, Schwartz J, Balshem A, Grisso J. "Breast cancer screening: effect of physician specialty, practice setting, year of medical school graduation, and sex". Am J Prev Med 8 (2): 78–85. PMID 1599724.
  35. Eddy DM (2005). "Evidence-based medicine: a unified approach". Health affairs (Project Hope) 24 (1): 9–17. DOI:10.1377/hlthaff.24.1.9. PMID 15647211.
  36. Straus SE, McAlister FA (2000). "Evidence-based medicine: a commentary on common criticisms". CMAJ 163 (7): 837–41. PMID 11033714.
  37. Berwick DM (2003). "Disseminating innovations in health care". JAMA 289 (15): 1969–75. DOI:10.1001/jama.289.15.1969. PMID 12697800.
  38. Green LA, Gorenflo DW, Wyszewianski L (2002). "Validating an instrument for selecting interventions to change physician practice patterns: a Michigan Consortium for Family Practice Research study". The Journal of family practice 51 (11): 938–42. PMID 12485547.
  39. Montori VM, Tabini CC, Ebbert JO (2002). "A qualitative assessment of 1st-year internal medicine residents' perceptions of evidence-based clinical decision making". Teaching and learning in medicine 14 (2): 114–8. PMID 12058546.
  40. Akl EA, Maroun N, Neagoe G, Guyatt G, Schünemann HJ (2006). "EBM user and practitioner models for graduate medical education: what do residents prefer?". Medical teacher 28 (2): 192–4. DOI:10.1080/01421590500314207. PMID 16707306.
  41. Guyatt GH, Meade MO, Jaeschke RZ, Cook DJ, Haynes RB (2000). "Practitioners of evidence based care. Not all clinicians need to appraise evidence from scratch but all need some skills". BMJ 320 (7240): 954–5. PMID 10753130.
  42. Reid MC, Lane DA, Feinstein AR (1998). "Academic calculations versus clinical judgments: practicing physicians' use of quantitative measures of test accuracy". Am. J. Med. 104 (4): 374–80. PMID 9576412.
  43. Ely JW, Osheroff JA, Ebell MH, et al (1999). "Analysis of questions asked by family doctors regarding patient care". BMJ 319 (7206): 358–61. PMID 10435959.
  44. Haynes RB (2006). "Of studies, syntheses, synopses, summaries, and systems: the "5S" evolution of information services for evidence-based health care decisions". ACP J. Club 145 (3): A8. PMID 17080967.
  45. Patel MR, Schardt CM, Sanders LL, Keitz SA (2006). "Randomized trial for answers to clinical questions: evaluating a pre-appraised versus a MEDLINE search protocol". Journal of the Medical Library Association : JMLA 94 (4): 382–7. PMID 17082828.
  46. Papadakis MA, Teherani A, Banach MA, et al (2005). "Disciplinary action by medical boards and prior behavior in medical school". N. Engl. J. Med. 353 (25): 2673–82. DOI:10.1056/NEJMsa052596. PMID 16371633.
  47. Tengs TO, Adams ME, Pliskin JS, et al (1995). "Five-hundred life-saving interventions and their cost-effectiveness". Risk Anal. 15 (3): 369–90. PMID 7604170.
  48. Wright JC, Weinstein MC (1998). "Gains in life expectancy from medical interventions--standardizing data on outcomes". N. Engl. J. Med. 339 (6): 380–6. PMID 9691106.
  49. Mendelson D, Carino TV (2005). "Evidence-based medicine in the United States--de rigueur or dream deferred?". Health affairs (Project Hope) 24 (1): 133–6. DOI:10.1377/hlthaff.24.1.133. PMID 15647224.
  50. Hersh W (2002). "Medical informatics education: an alternative pathway for training informationists". Journal of the Medical Library Association : JMLA 90 (1): 76–9. PMID 11838463.
  51. Shearer BS, Seymour A, Capitani C (2002). "Bringing the best of medical librarianship to the patient team". Journal of the Medical Library Association : JMLA 90 (1): 22–31. PMID 11838456.
  52. Coomarasamy A, Khan KS (2004). "What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review". BMJ 329 (7473): 1017. DOI:10.1136/bmj.329.7473.1017. PMID 15514348.
  53. Shaneyfelt T, Baum KD, Bell D, et al (2006). "Instruments for evaluating education in evidence-based practice: a systematic review". JAMA 296 (9): 1116–27. DOI:10.1001/jama.296.9.1116. PMID 16954491.
  54. Straus SE, Green ML, Bell DS, et al (2004). "Evaluating the teaching of evidence based medicine: conceptual framework". BMJ 329 (7473): 1029–32. DOI:10.1136/bmj.329.7473.1029. PMID 15514352.
  55. Leung GM, Johnston JM, Tin KY, et al (2003). "Randomised controlled trial of clinical decision support tools to improve learning of evidence based medicine in medical students". BMJ 327 (7423): 1090. DOI:10.1136/bmj.327.7423.1090. PMID 14604933.
  56. Cabell CH, Schardt C, Sanders L, Corey GR, Keitz SA (2001). "Resident utilization of information technology". Journal of general internal medicine 16 (12): 838–44. PMID 11903763.
  57. Shuval K, Berkovits E, Netzer D, et al (2007). "Evaluating the impact of an evidence-based medicine educational intervention on primary care doctors' attitudes, knowledge and clinical behaviour: a controlled trial and before and after study". Journal of evaluation in clinical practice 13 (4): 581–98. DOI:10.1111/j.1365-2753.2007.00859.x. PMID 17683300.
  58. Badgett RG, Paukert JL, Levy LS (2001). "Teaching clinical informatics to third-year medical students: negative results from two controlled trials". BMC medical education 1: 3. PMID 11532204.
  59. Haynes RB, Holland J, Cotoi C, et al (2006). "McMaster PLUS: a cluster randomized clinical trial of an intervention to accelerate clinical use of evidence-based information from digital libraries". Journal of the American Medical Informatics Association : JAMIA 13 (6): 593–600. DOI:10.1197/jamia.M2158. PMID 16929034.
  60. Davidoff F, Goodspeed R, Clive J (1989). "Changing test ordering behavior. A randomized controlled trial comparing probabilistic reasoning with cost-containment education". Medical care 27 (1): 45–58. PMID 2492066.
  61. Goodman SN (2002). "The mammography dilemma: a crisis for evidence-based medicine?". Ann. Intern. Med. 137 (5 Part 1): 363–5. PMID 12204023.
  62. Sweeney, Kieran (2006). Complexity in Primary Care: Understanding Its Value. Abingdon: Radcliffe Medical Press. ISBN 1-85775-724-6.
  63. Holt, Tim A (2004). Complexity for Clinicians. Abingdon: Radcliffe Medical Press. ISBN 1-85775-855-2.
