Magnitude of change in outcomes following entry-level evidence-based practice training: a systematic review

Objectives: The aim of this systematic review was to determine the magnitude of change (effect size) in outcomes (knowledge, attitudes, behaviours, skills and confidence) following evidence-based practice training in entry-level health professional students. Methods: Six electronic databases were searched for primary studies that investigated the effectiveness of evidence-based practice intervention(s) and reported on, or included data to allow the calculation of, effect size. Data were extracted regarding the effect size, or enabling calculation of effect size (mean, standard deviation, standard error, sample size). Results: Eight studies met the inclusion criteria. Effect sizes for evidence-based practice knowledge and skills ranged from small (0.33) to huge (5.42). Four studies exploring attitudes found negligible (0.075) to medium (0.57) effect sizes. Three studies assessing behaviours showed effect sizes ranging from negligible (0.031) to very large (1.34). Large (0.89) and huge (3.03) effect sizes were reported for confidence with evidence-based practice. Conclusions: Few studies reported the effect size for outcomes and many did not report sufficient data to enable its calculation. Considerable and varied improvements were found in students' evidence-based practice knowledge, skills and confidence after training.


Introduction
To practise in an evidence-based way, health professionals must have the necessary skills to seek, appraise and integrate new knowledge throughout their career.4 A previous systematic review exploring the changes in EBP outcomes (knowledge, skills, attitudes and behaviours) in postgraduate healthcare workers reported improvements in all outcomes following EBP training.5 For qualified professionals to become proficient in EBP, students must be taught the necessary EBP knowledge and skills during their entry-level education.4,6-13 However, little is known about the size of the effect of EBP training on outcomes in entry-level programs. In contrast to testing statistical significance, effect size (ES) calculation allows quantification of the difference between two or more groups in relation to the spread or variation of scores in the groups.14 Although ES has been available for over sixty years, is routinely used in meta-analyses, and is valuable in reporting and interpreting effectiveness, there appears to have been a reluctance to use it in educational research.14,15 The aim of this systematic review was to determine the magnitude of change in EBP outcomes (such as knowledge, attitudes and behaviours) following EBP training in entry-level health professional programs.

Correspondence: Lucy K. Lewis, University of South Australia, School of Health Sciences, GPO Box 2471, Adelaide SA 5001, Australia. Email: lucy.lewis@unisa.edu.au

Participants
The participants of interest were entry-level health professional students. 'Entry-level' was defined as undergraduate and graduate-entry programs that prepare students to enter their professions as beginning practitioners.16 For the purposes of this review, the term 'health professional' incorporated all health disciplines listed, or planned for listing, under the Australian Health Practitioner Regulation Agency (AHPRA) at the time of the review (chiropractic, dental, medical, nursing and midwifery, optometry, osteopathy, pharmacy, physiotherapy, podiatry, psychology, medical radiation and occupational therapy).17

Interventions
Studies needed to include at least one EBP educational intervention involving entry-level health professional students. The educational interventions had to include one or more of the five steps of EBP as outlined and defined in the Sicily Statement.4 There were no restrictions placed upon the mode of delivery (e.g. lectures, tutorials, online or workshops) or the type of EBP educational intervention (e.g. formal or informal, stand-alone or integrated training).

Comparison
All studies were included irrespective of the presence or absence of control or comparator groups.

Outcomes
For inclusion, studies had to evaluate EBP outcomes pre and post the educational intervention.19-21 In this review, 'actual knowledge' referred to EBP knowledge of participants measured objectively, whereas 'self-reported' outcomes were based on participants' perceptions. These outcomes were chosen as they were common to many studies investigating the impact of EBP training.19,20,22

Study design
Only primary studies reporting original data on outcomes evaluating an EBP educational intervention were included. Included studies were required to report ES for study outcomes, or to report pre- and post-intervention data in sufficient detail to allow ES calculation. Study designs included randomised controlled trials, controlled trials and cohort studies (pre-post and longitudinal studies). Case studies, cross-sectional studies, editorials and narrative reviews were excluded. Secondary studies, such as systematic reviews of primary studies (with original data on EBP outcomes evaluating EBP educational interventions), were not included; however, the reference lists of these studies were screened for further relevant studies.

Search strategy
Six electronic databases were searched in December 2011: Academic Search Premier, Cumulative Index to Nursing and Allied Health Literature (CINAHL), Excerpta Medica Database (EMBASE), Education Resources Information Center (ERIC), Medical Literature Analysis and Retrieval System Online (MEDLINE), and Scopus. The Medical Subject Headings (MeSH) database was used to formulate relevant search terms. The search strategy for one database (MEDLINE) is presented in Appendix I. There were no restrictions placed on publication year; however, where possible, all searches were limited to English and studies involving humans. The reference lists of all studies meeting the inclusion criteria were screened to identify additional relevant studies. An expert in the field of EBP and entry-level education was contacted and invited to provide details of additional relevant studies.

Study selection
Two rounds of eligibility screening were completed. Firstly, the titles and abstracts were screened for inclusion of an EBP educational intervention (CW). The full texts of studies were retrieved when (a) studies were identified as potentially relevant from the title and abstract, (b) the relevance of studies was uncertain from the title and abstract, or (c) the abstract was unavailable. In the second round of screening, all full-text studies were reviewed against the eligibility criteria (CW). Where there was uncertainty, discussion was held with the research team to reach consensus.

Data collection process
Data were independently extracted from one randomly chosen study by all members of the research team using a purpose-designed data extraction sheet and instructions. Results were compared and discussed, with refinements made to the data extraction tool. Independent data extraction from a further five randomly chosen articles using the modified data extraction tool was undertaken, with results compared between the principal investigator (CW) and another member of the research team (LKL). Data from all remaining studies were then extracted (CW).

Data items
The following information was extracted verbatim from each included study: (1) research design; (2) health professional discipline; (3) sample size; (4) EBP intervention (type, duration and EBP steps addressed); (5) instrument(s) used; (6) outcomes measured; and (7) pre and post EBP intervention data for the identified outcomes (mean, standard deviation, standard error, number of students before and after intervention, effect size, p-value, t-value).

Risk of bias in individual studies
All included studies were evaluated for risk of methodological bias using 'The Critical Appraisal Checklist for an Article on an Educational Intervention Tool'.23 The tool contains 13 items evaluating the research question, study presentation and design, outcome measures, and results of each included study. Eleven items (items 1-9, 12 and 13) have three possible gradings: 'Yes', 'Can't Tell' and 'No'. The remaining two items (items 10 and 11) require a narrative response. For the purpose of this study, a score of one point was allocated for items graded 'Yes', and zero points for 'Can't Tell', 'No' or 'Not applicable'. This applied to all 11 graded items except item 13, where 'No' indicates a positive response and was therefore allocated one point. The appraisal was completed by two independent reviewers (CW and LKL). Disagreements were resolved by a third independent reviewer (MM).
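As a concrete illustration, the scoring rule described above can be sketched in a few lines of Python (a minimal sketch; the helper name and the item gradings shown are hypothetical, not part of the review's materials):

```python
# Items carrying a grading on the appraisal checklist (items 10 and 11 are narrative).
GRADED_ITEMS = [1, 2, 3, 4, 5, 6, 7, 8, 9, 12, 13]

def appraisal_score(gradings):
    """gradings: dict mapping item number -> 'Yes', 'No', "Can't Tell" or 'N/A'."""
    score = 0
    for item in GRADED_ITEMS:
        grade = gradings.get(item, 'N/A')
        if item == 13:
            # Item 13 is reverse-scored: 'No' indicates a positive response.
            score += 1 if grade == 'No' else 0
        else:
            # 'Yes' scores one point; 'Can't Tell', 'No' and 'N/A' score zero.
            score += 1 if grade == 'Yes' else 0
    return score

# Hypothetical example: 'Yes' on items 1-9 and 12, 'No' on item 13.
example = {i: 'Yes' for i in [1, 2, 3, 4, 5, 6, 7, 8, 9, 12]}
example[13] = 'No'
print(appraisal_score(example))  # 11 (the maximum possible score)
```

Under this rule the maximum score is 11, which is consistent with the review's bands of 9-10 for low risk and 7-8 for moderate risk of methodological bias.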

Synthesis of results
Effect sizes were calculated for the outcomes using an ES calculator in Microsoft Excel.24 Data required for ES calculation included sample size, mean, and standard deviation or standard error; or sample size and t-value. The ES was classified as negligible (≥-0.15 and <0.15), small (≥0.15 and <0.40), medium (≥0.40 and <0.75), large (≥0.75 and <1.10), very large (≥1.10 and <1.45) or huge (≥1.45).24
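The computation can be sketched as follows. This is a minimal illustration using one common standardised-mean-difference (Cohen's d) formulation with a pooled standard deviation, together with the classification bands stated above; the exact formula implemented in the Excel calculator is not reported in the review, so the pooled-SD formulation and the example scores are assumptions:

```python
import math

def cohens_d(mean_pre, sd_pre, n_pre, mean_post, sd_post, n_post):
    # Standardised mean difference with a pooled SD -- one common formulation;
    # the exact formula used by the review's Excel calculator is an assumption.
    pooled_sd = math.sqrt(((n_pre - 1) * sd_pre ** 2 + (n_post - 1) * sd_post ** 2)
                          / (n_pre + n_post - 2))
    return (mean_post - mean_pre) / pooled_sd

def classify_es(es):
    # Magnitude bands as defined in the review (citation 24), applied to |ES|.
    magnitude = abs(es)
    for upper, label in [(0.15, 'negligible'), (0.40, 'small'), (0.75, 'medium'),
                         (1.10, 'large'), (1.45, 'very large')]:
        if magnitude < upper:
            return label
    return 'huge'

# Hypothetical pre/post test scores, for illustration only.
d = cohens_d(mean_pre=10.0, sd_pre=3.0, n_pre=30, mean_post=14.0, sd_post=3.5, n_post=30)
print(round(d, 2), classify_es(d))  # 1.23 very large
```

The classification step simply walks the band boundaries in order, so, for example, the review's smallest knowledge ES (0.33) falls in the 'small' band and its largest skills ES (5.42) in the 'huge' band.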

Study selection
Following the initial search (n=3335 studies) and removal of duplicates (n=987), the titles and abstracts of 2348 articles were screened for possible inclusion. After excluding 2035 studies, the full texts of 313 potentially relevant articles were retrieved and reviewed against the eligibility criteria. Eight studies fulfilled the final criteria and were included in the review. No additional studies were identified through screening the reference lists of included studies or systematic reviews, or by the EBP expert (Figure 1).

Study characteristics
The included studies were published between 2003 and 2011 (Table 1). Of the eight studies, two were non-randomised controlled trials,25,26 five were pre-post (uncontrolled) studies27-31 and one was a longitudinal study with four test occasions.32 Both controlled studies compared the intervention group to a control group with no intervention. The majority of studies sampled medical students (n=5), with the remaining studies in nursing (n=1), physiotherapy (n=1) and combined physiotherapy/occupational therapy students (n=1). Entry-level students only were sampled in all studies, with the exception of one study which included both postgraduate physiotherapy and undergraduate occupational therapy students.31 The sample sizes of the included studies ranged from 17 to 293 students.
The type of EBP interventions varied considerably, with most studies employing a mix of lecture-based and clinically integrated EBP training. Similarly, the duration of training ranged widely, from four days to one and a half years. Different steps of EBP were covered in the interventions; the first three steps of the EBP model (Ask, Acquire, Appraise) were included in all interventions except Johnston et al.,27 where information on the course material was not provided, preventing clear determination of the steps covered. Three studies contained interventions addressing all five steps of EBP.26,31,32 All but two studies28,31 reported using valid and reliable instruments. One study25 used the English version of the Berlin Questionnaire,33 two studies29,30 used the Fresno test,34 two studies26,27 used the Knowledge, Attitudes and Behaviour (KAB) questionnaire, and one study32 used both the Knowledge of Research Evidence Competencies (K-REC) instrument36 and the Evidence-Based Practice Profile (EBP2) Questionnaire.21 Bennett et al.31 used an adapted questionnaire that had not undergone validity testing, whereas insufficient information was provided by Taheri et al.28 to determine the validity and reliability of the instruments used in the study (multiple choice exam, students' search strategy). Knowledge, attitudes and skills were the outcomes most commonly explored among the included studies (Table 1).

Risk of methodological bias
All included studies were appraised for risk of methodological bias by two independent reviewers (Table 2). Item 8 (explanation of unanticipated outcomes) was not applicable for one study that did not report any unanticipated outcomes.28 Item 9 (report of the relationship between self-reported behavioural changes and objective changes) was irrelevant for the studies25,29,30 that used only objective outcome measures (n=3). Five studies25,26,29,31,32 had low risk of methodological bias (score of 9 or 10) and the remaining three studies had moderate risk of methodological bias (score of 7-8).

Synthesis of results
Of the eight included studies, two reported ES27,32 while six provided sufficient information for calculation of ES.25,26,28-31 The ES from the included studies were in the EBP domains of self-reported knowledge (n=4), actual knowledge (n=4), attitudes (n=4), behaviours (n=3), confidence (n=2) and skills (n=2) (Figure 2). Large (1.07) to huge (2.94) ES were reported for changes in students' self-reported knowledge. For actual knowledge, the ES ranged from medium (0.46) to huge (1.67). There was little change in students' attitudes toward EBP following training. Negligible ES were reported in two studies,27,31 while small to medium ES (0.30 and 0.49) were reported by Long et al.32 Kim et al.26 found a medium ES (0.57) in students' attitudes towards future use of EBP, although their perception of the role of, and need for, EBP remained unchanged (ES 0.10). Effect sizes for the changes in students' EBP behaviours differed considerably. Two studies which used the KAB questionnaire yielded very different results, with one study reporting a negligible effect (ES 0.031)27 and the other a medium effect (ES 0.57).26 Using the EBP2 Questionnaire, Long et al.32 reported a very large ES (1.34) in physiotherapy students' self-reported practice of EBP following training. Evidence-based practice skills were evaluated by two studies. Taheri et al.28 found a huge improvement (ES 5.42) in students' search skills after a four-day workshop by assessing their printed copies of search strategies, whereas Aronoff et al.30 reported small to medium ES (0.16-0.49) in individual items of the Fresno test. Two studies31,32 that explored the confidence of students after EBP training reported large (0.89) and huge (3.03) improvements respectively. Due to the heterogeneity of interventions used and outcomes measured across studies, meta-analysis was not performed.

Discussion
Given the relatively short duration of the EBP interventions investigated among studies (four days to 13 × 2-hour weekly sessions, excluding one study intervention conducted over one and a half years), it is less likely that large changes in students' attitudes toward EBP would be observed. Dawes et al.4 coined the phrase 'attitudes are caught, not taught', emphasising the importance of one's mindset as a reflection of attitudes. The attitudes of students toward EBP may require longer to develop than knowledge, and could mature with greater opportunities to incorporate knowledge and practical skills in real clinical settings. This is supported by the medium ES for change in attitudes in Long et al.32 following two EBP courses conducted in two successive years, with the second course involving clinical application.
The results of this review are not consistent with those of a systematic review which reported small improvements in all EBP outcomes (knowledge, skills, attitudes and behaviours) following training in postgraduate health professional students and practitioners.5 The differences may be explained by the different student populations sampled (entry-level versus postgraduate/practitioner). Students at the postgraduate level are more likely to have been exposed to the principles of EBP during undergraduate programs. The attitudes, knowledge and skills learnt, and the opportunities for practice (behaviours), during this time could have carried over to their postgraduate education, resulting in higher baseline measures and leaving less scope for improvement. In contrast, entry-level students could be expected to have little or no prior exposure to EBP before training and therefore greater potential for improvement.
During the screening process, a large number of studies were identified which investigated the impact of EBP interventions in entry-level health professional programs. Among 27 studies that reported the magnitude of change in EBP outcomes after training, the majority (n=12) presented the sample size and mean scores for outcomes but did not report standard deviation or standard error, preventing ES calculation. Only eight studies reported data relating to, or allowing, calculation of ES, with two of these studies reporting the actual ES.27,32 Given the well-documented importance and benefits of ES, it is surprising that so few studies used ES to report outcomes. While statistical information is now reported more comprehensively in experimental studies (e.g. confidence intervals), there continues to be a lack of studies reporting ES in educational research.14 While there are current reporting guidelines for study designs such as systematic reviews (PRISMA)37 and randomised controlled trials (CONSORT),38 to the best of our knowledge there are no guidelines for the reporting of educational interventions in primary studies. Given the inconsistent reporting of results observed in the current review, there appears to be a need for such reporting guidelines in educational research.
There are several limitations to this review. All of the outcomes investigated related to student learning of EBP (e.g. student knowledge, attitudes and behaviours) rather than clinical outcomes for patients. The translation of EBP knowledge and skills on students' transition from entry-level training into clinical practice is unclear. While every effort was made to ensure a thorough systematic search of the literature, it is possible that some studies were missed. The inclusion of only those health professions currently listed under AHPRA limits the generalisability of the results to those disciplines. However, this is the first systematic review to assess the impact of EBP training (effectiveness and size of the effect) in entry-level health professional students.
The findings of this review offer several implications for future practice and research. Firstly, future EBP interventions may benefit from strategies that more directly inspire positive changes in students' attitudes toward EBP. A positive attitude may motivate students to participate more actively in educational programs and therefore increase the likelihood of success.5 Secondly, future studies investigating the change in EBP outcomes following training should calculate ES to enable meaningful comparisons between studies using a standardised measure.

Conclusion
The results of this review showed small to huge improvements in knowledge and skills, small changes in attitudes, and variable changes in behaviours following entry-level EBP training. Future studies investigating the effectiveness of EBP training should calculate ES to enable more direct comparison of findings across studies.

Figure 1 .
Figure 1. PRISMA chart showing the flow of studies through the review

Studies excluded at full-text review (Figure 1): not entry-level students (n=55); health professionals not listed under AHPRA (n=7); not EBP, e.g. development of an EBP tool (n=20); not an EBP intervention (n=22); specific pre-post intervention data not reported, e.g. mean difference or p-value only (n=63); outcomes measured unclear (n=1); inappropriate study design, i.e. narrative, letters, editorials, thesis, qualitative, cross-sectional or secondary studies (n=118); inadequate data for effect size calculation

Figure 2 .
Figure 2. Modified forest plot of effect sizes of EBP outcomes. Boxes represent the ES for each study (studies were not weighted by size) and horizontal bars represent 95% confidence intervals. Confidence intervals were not available for all studies.

Table 1 .
Summary of included studies. *Sample size is presented as a range when there were different sample sizes for different outcomes or test occasions. †EBP steps: step 1 ASK, step 2 ACQUIRE, step 3 APPRAISE, step 4 APPLY, step 5 ASSESS

Table 2 .
Critical appraisal results of included studies (items 1-9, 12 and 13). *'No' in item 13 was given one point as it indicated a positive response. †yes, ‡no, **not applicable.
Critical appraisal questions:
Q1. Is there a clearly focussed question?
Q2. Was there a clear learning need that the intervention addressed?
Q3. Was there a clear description of the educational context for the intervention?
Q4. Was the precise nature of the intervention clear?
Q5. Was the study design chosen able to address the aims of the study?
Q6. Were the outcomes chosen to evaluate the intervention appropriate?
Q7. Were any other explanations of the results explored by the authors?
Q8. Were any unanticipated outcomes explained?
Q9. Were any reported behavioural changes after the intervention linked to measurement of other, more objective measures, e.g. changes in referral rates?
Q12. Was the setting sufficiently similar to your own and/or representative of real life?
Q13. Does it require additional resources to adopt the intervention?