<?xml version="1.0" encoding="UTF-8"?><!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" article-type="review-article" dtd-version="1.0" xml:lang="en">
  <front>
    <journal-meta>
      <journal-id journal-id-type="publisher-id">IJME</journal-id>
      <journal-id journal-id-type="nlm-ta">Int J Med Educ</journal-id>
      <journal-title-group>
        <journal-title>International Journal of Medical Education</journal-title>
        <abbrev-journal-title abbrev-type="pubmed">Int J Med Educ</abbrev-journal-title>
      </journal-title-group>
      <issn pub-type="epub">2042-6372</issn>
      <publisher>
        <publisher-name>IJME</publisher-name>
      </publisher>
    </journal-meta>
    <article-meta>
      <article-id pub-id-type="publisher-id">11-118</article-id>
      <article-id pub-id-type="doi">10.5116/ijme.5e01.eb1a</article-id>
      <article-categories>
        <subj-group subj-group-type="heading">
          <subject>Review literature</subject>
          <subj-group>
            <subject>Augmented reality</subject>
          </subj-group>
        </subj-group>
      </article-categories>
      <title-group>
        <article-title>Augmented reality and mixed reality for healthcare education beyond surgery: an integrative review</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author" corresp="yes">
          <name>
            <surname>Gerup</surname>
            <given-names>Jaris</given-names>
          </name>
          <xref ref-type="aff" rid="aff1">
            <sup>1</sup>
          </xref>
        </contrib>
        <contrib contrib-type="author">
          <name>
            <surname>Soerensen</surname>
            <given-names>Camilla B.</given-names>
          </name>
          <xref ref-type="aff" rid="aff2">
            <sup>2</sup>
          </xref>
        </contrib>
        <contrib contrib-type="author">
          <name>
            <surname>Dieckmann</surname>
            <given-names>Peter</given-names>
          </name>
          <xref ref-type="aff" rid="aff3">
            <sup>3</sup>
          </xref>
        </contrib>
        <aff id="aff1"><label>1</label>School of Medical Sciences, University of Copenhagen, Denmark</aff>
        <aff id="aff2"><label>2</label>Department of Pediatrics, Herlev and Gentofte Hospital, Denmark</aff>
        <aff id="aff3"><label>3</label>Copenhagen Academy of Medical Education and Simulation (CAMES), Center for Human Resources, Herlev and Gentofte Hospital, Denmark</aff>
      </contrib-group>
      <author-notes>
        <corresp id="cor1">Correspondence: Jaris Gerup, School of Medical Sciences, University of Copenhagen, Denmark. Email: <email xlink:href="jaris.gerup@gmail.com">jaris.gerup@gmail.com</email></corresp>
      </author-notes>
      <pub-date pub-type="epub">
        <day>18</day>
        <month>01</month>
        <year>2020</year>
      </pub-date>
      <pub-date pub-type="collection">
        <year>2020</year>
      </pub-date>
      <volume>11</volume>
      <fpage>1</fpage>
      <lpage>18</lpage>
      <history>
        <date date-type="accepted">
          <day>24</day>
          <month>12</month>
          <year>2019</year>
        </date>
        <date date-type="received">
          <day>27</day>
          <month>04</month>
          <year>2019</year>
        </date>
      </history>
      <permissions>
        <copyright-statement>Copyright: &#xA9; 2020 Jaris Gerup et al.</copyright-statement>
        <copyright-year>2020</copyright-year>
        <license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/3.0">
          <license-p>This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use of the work provided the original work is properly cited. <ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/3.0/">http://creativecommons.org/licenses/by/3.0/</ext-link></license-p>
        </license>
      </permissions>
      
<abstract>
		  
<sec><title>Objectives</title>
<p>This study aimed to review
and synthesize the current research and state of augmented reality (AR), mixed
reality (MR) and the applications developed for healthcare education beyond
surgery.</p></sec>

<sec><title>Methods</title>
<p>An integrative review was
conducted on all relevant material, drawing on different data sources,
including the databases of PubMed, PsycINFO, and ERIC, from January 2013 to
September 2018. Inductive content analysis and qualitative synthesis were
performed. Additionally, the quality of the studies was assessed with different
structured tools.</p></sec>
		  
<sec><title>Results</title>
<p>Twenty-six studies were
included. Across studies of both AR and MR, established applications were
involved in 27% of all cases (n=6); the rest were prototypes. The most frequently studied
subjects were related to anatomy and anesthesia (n=13). All studies showed
several healthcare educational benefits of AR and MR, significantly
outperforming traditional learning approaches in 11 studies examining various
outcomes. Studies had a low-to-medium quality overall with a MERSQI mean of
12.26 (SD=2.63), while the single qualitative study had high quality.</p></sec>

<sec><title>Conclusions</title>
<p>This review suggests progress in learning
approaches based on AR and MR for various medical subjects, with the
research base moving away from feasibility studies on prototypes. Yet the limited
validity of study conclusions, the heterogeneity of research designs, and widely
varied reporting challenge the transferability of the findings of the studies
included in the review. Future studies should examine suitable research designs
and instructional objectives achievable by AR and MR-based applications to
strengthen the evidence base, making it relevant for medical educators and
institutions to apply the technologies.</p></sec>
</abstract>
      <kwd-group kwd-group-type="author">
        <kwd>Augmented reality</kwd>
        <kwd>mixed reality</kwd>
        <kwd>healthcare education</kwd>
        <kwd>medicine</kwd>
        <kwd>integrative review</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec sec-type="intro"><title>Introduction</title>
<p>The integration of digital strategies has brought healthcare education to a paradigm shift, now reflected in many educational curricula.<xref ref-type="bibr" rid="r1"><sup>1</sup></xref> Modern teaching curricula aim to educate trainees efficiently in safe environments to establish transferability into the clinical context. Augmented reality (AR) and mixed reality (MR) have long been expected to be disruptive technologies, with potential uses in medical education, training, surgical planning and the guidance of complex procedures.<xref ref-type="bibr" rid="r2"><sup>2</sup></xref> While virtual reality (VR) has mainly led the way for the implementation of the display technologies, it has been criticized for several limitations.<xref ref-type="bibr" rid="r3"><sup>3</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r4"><sup>4</sup></xref> The term display technologies will hereafter be used to refer to AR and MR, although in principle it also covers VR. The latter, however, is beyond the scope of this review.</p>
<p>AR describes display-based systems that combine real and virtual imagery, are interactive in real-time and register the real-world environment to be augmented by virtual imagery.<xref ref-type="bibr" rid="r5"><sup>5</sup></xref> The visual display technology augments the physical environment in two principal manifestations: the see-through (transparent) head-mounted display and non-immersive monitor-based video (window on the world).<xref ref-type="bibr" rid="r6"><sup>6</sup></xref> AR systems are based on the combination of the physical and the virtual environment; in VR systems, by contrast, the participant is totally immersed in a completely virtual one.</p>
<p>MR is defined as the merging of real and virtual worlds and can be seen as a larger class of technologies covering the display environments of AR and augmented virtuality (AV).<xref ref-type="bibr" rid="r7"><sup>7</sup></xref> Where virtual information augments the real view in AR, real-world information augments the virtual scene in AV. The external inputs providing real-world context are also seen in VR but were classified as MR in this review. The term MR was included to embrace new technology labeled as MR, which attempts to draw a clear distinction between AR and MR, even if there is none.<xref ref-type="bibr" rid="r8"><sup>8</sup></xref></p>
<p>The ability to provide situated and authentic experiences connected with the real environment and to enhance interaction between physical and virtual content, while preserving a feeling of presence, explains the growing expectation that AR and MR may be suitable for healthcare education in various contexts.<xref ref-type="bibr" rid="r9"><sup>9</sup></xref></p>
<p>Concerning healthcare education, the process of teaching, learning and training with an ongoing integration of knowledge, experience, skills and responsibility qualifies an individual to practice medicine.<xref ref-type="bibr" rid="r10"><sup>10</sup></xref> Looking into medical education, several authors call for eliminating outdated, inefficient, and passive learning approaches and embracing these newer methodologies of learning.<xref ref-type="bibr" rid="r11"><sup>11</sup></xref> Surgeons have historically been quick to adopt new technology, developing new treatment and learning methodologies, while physicians have been considerably slower.<xref ref-type="bibr" rid="r12"><sup>12</sup></xref> Today most studies on display technologies stem from surgery. In an integrative review on AR in healthcare education from 2014, surgical studies accounted for 64% (n=16) of the studies included.<xref ref-type="bibr" rid="r13"><sup>13</sup></xref> A recent systematic review on AR for the surgeon highlights the current lack of systematic reviews for physicians and ultimately educators within the field of medicine.<xref ref-type="bibr" rid="r14"><sup>14</sup></xref> Many internists and other medical specialists no longer diagnose and treat illnesses using only their knowledge of pathophysiology and pharmacology.<xref ref-type="bibr" rid="r15"><sup>15</sup></xref> Today, many physicians have taken up procedures and surgical treatment initiatives by operation or manipulation, defined as the use of hands to produce a desired movement or therapeutic effect in part of the body.<xref ref-type="bibr" rid="r16"><sup>16</sup></xref> Nevertheless, medicine consists essentially of non-surgical treatment, procedures and other approaches to diagnostics and prevention of disease that need to be taught, learned and trained with ongoing evaluation of adaptations. AR and MR may help medical educators achieve such instructional objectives for medical education, just as the technologies are being used for surgical training.</p>
<p>According to the review by Zhu and colleagues, publications in the field of AR increased significantly in 2008.<xref ref-type="bibr" rid="r13"><sup>13</sup></xref> Now, ten years after that surge in publications, a new review is warranted. To the best of our knowledge, current reviews on AR and MR have not specifically studied applications for medical subjects in healthcare education. Most papers predominantly include surgical studies, and only a few have focused on AR in either otolaryngology or medical training.<xref ref-type="bibr" rid="r1"><sup>1</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r3"><sup>3</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r4"><sup>4</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r9"><sup>9</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r13"><sup>13</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r17"><sup>17</sup></xref> Currently, no adequate reviews are available that uncover the educational profile of both AR and MR-based applications across different medical specialties, subjects and target groups.</p>
<p>The aim of this integrative review was to investigate the current research and state of AR and MR-based applications for healthcare education beyond surgery, providing an overview of the findings, strengths and weaknesses of the reported studies.</p>
</sec>
    <sec sec-type="methods"><title>Methods</title>
<p>We chose to conduct an integrative review, given that previous reviews showed only a few studies relevant for the current scope.<xref ref-type="bibr" rid="r3"><sup>3</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r4"><sup>4</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r13"><sup>13</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r17"><sup>17</sup></xref> This is considered the broadest type of review, as it allows the inclusion of various research designs and information sources.<xref ref-type="bibr" rid="r18"><sup>18</sup></xref> The method also integrates a process of quality assessment of the included studies, which may qualify the integrative review for recommending practice and answering complex search questions.<xref ref-type="bibr" rid="r19"><sup>19</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r20"><sup>20</sup></xref> The digital databases PubMed, PsycINFO and ERIC were searched. The journal Medical Teacher was hand-searched. TED Talks and podcasts on the iTunes Podcast app were included, acknowledging the increasing importance of &#x201C;new media&#x201D;.<xref ref-type="bibr" rid="r21"><sup>21</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r22"><sup>22</sup></xref> Studies published between January 2013 and September 2018 were included. Relevant word groups, combinations and open-ended terms used for the search were: &#x201C;Augmented reality OR mixed reality&#x201D; AND &#x201C;medicine OR medical OR healthcare&#x201D; AND &#x201C;educat* OR simulat* OR train* OR learn*&#x201D;. We did not add a filter of &#x2018;NOT virtual reality OR surgery&#x2019; to our search string, to avoid missing relevant studies that examined non-surgical elements despite being termed surgical studies.</p>
		
<sec><title>Eligibility criteria</title>
<p>The selection process was done according to three overall criteria regarding research, focus on technology and content. According to the research criterion, studies were included if they described 1) a goal or research question, 2) an appropriate study design, 3) data collection and analysis methods and 4) a discussion of results. Research articles were excluded if they 1) described neither a goal nor a research question, 2) were review papers or 3) focused on system descriptions without evaluation or other data. <xref ref-type="table" rid="t1">Table 1</xref> provides the inclusion and exclusion criteria for the study.</p>
</sec>
	
<sec><title>Study selection</title>
<p>All abstracts were read by JG, who assessed whether they met the inclusion criteria. In case of doubt, JG discussed the inclusion of studies with the other authors. All duplicates were removed.</p>
</sec><sec><title>Data extraction and synthesis</title>
<p>Study characteristics and information of all articles were extracted and described by JG. Characteristics were authors, study aim, subject of healthcare education, design, participants, outcome measures, results, application/technologies, training time and display system. Content analysis was used to describe the study designs and to inductively identify the strengths and weaknesses of AR and MR as described by the studies included.</p>

	
<table-wrap id="t1" position="float"><label>Table 1</label><caption><title>Inclusion and exclusion criteria</title></caption>
<table width="100%">
<thead><tr style="border-top: 1pt solid; border-bottom: 1pt solid;">
<th align="left" style="border-top: 1pt solid; border-bottom: 1pt solid; width: 19%;">Criterion</th>
<th align="left" style="border-top: 1pt solid; border-bottom: 1pt solid; width: 38%;">Inclusion criteria</th>
<th align="left" style="border-top: 1pt solid; border-bottom: 1pt solid; width: 41%;">Exclusion criteria</th>
</tr></thead>
<tbody>
<tr>
<td rowspan="4" valign="top" align="left" style="width: 19%;">
Research
&#xA0;
</td>
<td valign="top" align="left" style="width: 38%;">Goal or research question described</td>
<td valign="top" align="left" style="width: 41%;">Neither goal nor research question described</td>
</tr>
<tr>
<td valign="top" align="left" style="width: 38%;">Study design described and appropriate</td>
<td valign="top" align="left" style="width: 41%;">Review papers</td>
</tr>
<tr>
<td valign="top" align="left" style="width: 38%;">Data collection and analysis methods were described</td>
<td valign="top" align="left" style="width: 41%;">System description without data evaluation</td>
</tr>
<tr>
<td valign="top" align="left" style="width: 38%;">Results were described and discussed</td>
<td valign="top" align="left" style="width: 41%;">&#xA0;</td>
</tr>
<tr>
<td rowspan="3" valign="top" align="left" style="width: 19%;">
Focus on technology
&#xA0;
</td>
<td valign="top" align="left" style="width: 38%;">Combination of real and virtual environments</td>
<td valign="top" align="left" style="width: 41%;">Used augmented or mixed reality in name but investigated only virtual reality</td>
</tr>
<tr>
<td valign="top" align="left" style="width: 38%;">Interactive in real-time</td>
<td valign="top" align="left" style="width: 41%;">&#xA0;</td>
</tr>
<tr>
<td valign="top" align="left" style="width: 38%;">Real or perceived registration in 2D or 3D</td>
<td valign="top" align="left" style="width: 41%;">&#xA0;</td>
</tr>
<tr style="border-bottom: 1pt solid;">
<td rowspan="4" valign="top" align="left" style="border-bottom: 1pt solid; width: 19%;">
Content
&#xA0;
</td>
<td valign="top" align="left" style="border-bottom: 1pt solid; width: 38%;">Healthcare education</td>
<td valign="top" align="left" style="border-bottom: 1pt solid; width: 41%;">Education without medicine or only surgical focus</td>
</tr>
<tr>
<td valign="top" align="left" style="width: 38%;">Medical education</td>
<td valign="top" align="left" style="width: 41%;">Medicine without education or only treatment or rehabilitation focus</td>
</tr>
<tr>
<td valign="top" align="left" style="width: 38%;">&#xA0;</td>
<td valign="top" align="left" style="width: 41%;">Patient education related to treatment</td>
</tr>
<tr style="border-bottom: 1pt solid;">
<td valign="top" align="left" style="border-bottom: 1pt solid; width: 38%;">&#xA0;</td>
<td valign="top" align="left" style="border-bottom: 1pt solid; width: 41%;">Dentistry, veterinary medicine or other fields of education</td>
</tr>
</tbody>
</table></table-wrap></sec>
		
<sec><title>Quality assessment</title>
<p>The methodological quality of quantitative and mixed methods studies was evaluated with the Medical Education Research Study Quality Instrument (MERSQI).<xref ref-type="bibr" rid="r23"><sup>23</sup></xref> This 10-item instrument has been thoroughly assessed and evaluated for its correlation with other assessment tools for research quality.<xref ref-type="bibr" rid="r24"><sup>24</sup></xref> MERSQI covers six domains: Study design, sampling, type of data, validity of the evaluation instrument, data analysis and outcome. Each domain assigns 0-3 points, summing to a final score between 0 and 18, with a higher score indicating better study quality. The score will be presented as mean, standard deviation (SD) and range in parentheses. Each study was scored at the highest possible level. If a study reported more than one outcome, the rating for the highest outcome score was recorded, without differentiating between primary and secondary outcomes.</p>
<p>The quality assessment of all studies was done by JG. In addition, to assess the quality of JG&#x2019;s evaluation, approximately 20% of the studies were randomly selected for assessment by co-authors and independently evaluated by at least two authors. We computed the intraclass correlation coefficient (ICC) to calculate the inter-rater reliability (IRR) between all authors.</p>
<p>The methodological quality of qualitative studies was evaluated with a 12-item grid for Appraising Qualitative Research Articles in Medical Education, which was converted into a quality assessment tool (AQRAME) by the authors of this review.<xref ref-type="bibr" rid="r25"><sup>25</sup></xref> The instrument covers five domains: Introduction, methods, results, discussion and conclusion. The methods domain assigns 0-5 points and the conclusion domain 0-1 point, while the three remaining domains assign 0-2 points each. This gives a score range between 0 and 12 points, with a higher number indicating better study quality. A score of 0.5 was given when the answer was neither clearly yes nor no. The score will be presented as mean, SD and range in parentheses.</p>
<p>An overall quality assessment tool was developed for rating all included studies regardless of their methodological design, assigning a score from 1 to 7, with a higher number indicating better study quality. This was introduced to challenge the relative judgements of the MERSQI and AQRAME, acknowledging that different research questions inherently require different study designs. The appraisal was based on the need to be explicit about the role and assessment of the researcher in qualitative research.<xref ref-type="bibr" rid="r26"><sup>26</sup></xref> For studies with mixed-methods designs, we applied the MERSQI tool only, rating the quantitative parts of the study.</p>
</sec></sec>
    <sec sec-type="results"><title>Results</title>
<p>Out of the 315 papers initially identified, four duplicates were removed, three articles in Chinese were excluded, and one article could not be retrieved. No reporting of research was found in 14 TED Talks and iTunes podcasts. Three hundred seven publications were screened and 281 excluded as they did not meet the inclusion criteria. Study subjects related to nasogastric tube insertion, facet joint injection, catheterization or needle guidance were interpreted as clinically related to medicine as a practice of diagnosis, and these studies were therefore classified as fulfilling the inclusion criteria. One study focusing on resection planning was included and categorized as preoperative visualization.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref> However, needle insertion itself was interpreted as not producing a desired movement or therapeutic effect in part of the body and was therefore not classified as a surgical procedure. This resulted in a total of 26 studies being included in the integrative review. The flow chart of publications selected for inclusion in this integrative review is displayed in <xref ref-type="fig" rid="f1">Figure 1</xref>.</p>
<sec><title>Study characteristics</title>
<p>The studies applied AR and MR primarily by integrating the display technologies into knowledge platforms and guidance systems for simulator practice. Some studies offered feedback on the practice of a skill or a field of knowledge, while others provided immersion into scenarios and remote assessment training for telemedicine. The display technologies showed the ability to stimulate the learning process and support the learner across several competencies:</p>


<fig id="f1" position="float" fig-type="figure"><label>Figure 1</label><caption><p>Selection process of studies</p></caption>
<graphic xlink:href="4df725747ffa82d24a6885cd2ea1d47c.jpg"/></fig>

<p>To understand spatial relationships and construct mental 3D models of anatomy, with or without the help of 2D imaging; to acquire cognitive-psychomotor abilities, prolong learning retention, experience student-centered motivation and gain the flexibility to learn anytime and anywhere at their own pace and in their own style. Furthermore, the studies suggested that AR and MR could complement practice in safe simulation environments, contributing to patient safety and a higher degree of confidence (See <xref ref-type="supplementary-material" rid="S1">Appendix 1</xref> &#x2013; &#x201C;Summary of results&#x201D;).</p>
</sec><sec><title>Technical specifications</title>
<p>The majority of studies (n=22) examined an actual application of AR.<xref ref-type="bibr" rid="r28"><sup>28</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r49"><sup>49</sup></xref> The rest (n=4) investigated an application based on MR.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Six applications developed by companies were reported in 10 studies.<xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r31"><sup>31</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r37"><sup>37</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r47"><sup>47</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r48"><sup>48</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref> The remaining 16 studies involved self-developed applications, created primarily at universities and hospitals.</p>
<p>Mobile device-based (tablet and smartphone) applications were used in nine studies.<xref ref-type="bibr" rid="r33"><sup>33</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r35"><sup>35</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r37"><sup>37</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r47"><sup>47</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r49"><sup>49</sup></xref> Of these, two-thirds (n=6) involved camera- and marker-based recognition, and three studies provided no further details on the applications developed.<xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r47"><sup>47</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r48"><sup>48</sup></xref> Eight studies implemented head-mounted displays.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r28"><sup>28</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref> Two studies utilized the same head-mounted display.<xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref> The head-mounted display-integrated applications had marker-based recognition in four of the studies.<xref ref-type="bibr" rid="r28"><sup>28</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref> One study recognized the hands and gestures of a mentor, projecting these into the trainee&#x2019;s display.<xref ref-type="bibr" rid="r46"><sup>46</sup></xref> Two studies implemented a foot pedal to interact with the application.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r38"><sup>38</sup></xref> For one study, this included toggling between AR and MR modes.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref> Computers were used in 11 studies.<xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r31"><sup>31</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r34"><sup>34</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r36"><sup>36</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> These delivered the computing power for head-mounted display-based applications in four studies.<xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref> One computer-based application had marker-based recognition.<xref ref-type="bibr" rid="r36"><sup>36</sup></xref> Seven studies were sensor-based.<xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r31"><sup>31</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r34"><sup>34</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Two studies recognized landmarks of the user&#x2019;s body.<xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r31"><sup>31</sup></xref> Four studies recognized a virtual model registered with a phantom, characterized as MR.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Eleven studies reported using external cameras and tracking devices.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r28"><sup>28</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r31"><sup>31</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r32"><sup>32</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r34"><sup>34</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r36"><sup>36</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Two studies used applications based on projectors, one recognizing markers on a phantom, and one projecting images directly onto a phantom without using a tracking device.<xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref></p>
</sec><sec><title>Methodological quality</title>
<p>Of the 26 included studies, nine were solely quantitative, 16 used mixed research methods and one was qualitative. Based on rating comparisons of the approximately 20% (n=5) randomly selected papers, the authors agreed to use the ratings by JG for MERSQI, AQRAME and the overall score for the remaining papers. The average total MERSQI score of the 25 quantitative and mixed methods studies was 12.26, SD=2.63 (range 7-15.5). The ICC between all raters was computed as IRR=.50 for the MERSQI overall score, which corresponds to moderate reliability.<xref ref-type="bibr" rid="r53"><sup>53</sup></xref> Nearly one-third of all studies (n=8) either had no evaluation tool or did not report any validity of the instrument used.<xref ref-type="bibr" rid="r28"><sup>28</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r35"><sup>35</sup></xref></p>
<p>The qualitative study involved semi-structured face-to-face interviews that explored the needs and challenges of applying AR for healthcare education. The study demonstrated detailed clarity and rigor according to the individual AQRAME scores of all three authors: 12 (JG), 11.5 (CBS), and 12 (PD). As there was only one qualitative study, we did not report any IRR for the AQRAME overall score.</p>
<p>The mean overall quality score of all studies was 4.08, SD=1.65 (range 1-7), with an adjusted ICC of IRR=.429, also corresponding to moderate reliability.<xref ref-type="bibr" rid="r53"><sup>53</sup></xref> The scores of the individual studies and the study characteristics are reported in <xref ref-type="supplementary-material" rid="S1">Appendix 1</xref>.</p>
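The two-way ICC values reported above can, in principle, be reproduced from a papers-by-raters score matrix. As an illustrative sketch only (the review does not state which ICC model or software was used, and the score matrices are not published here), ICC(2,1) for k raters scoring n papers may be computed from the two-way ANOVA mean squares as follows:

```python
import numpy as np

def icc2_1(ratings):
    """Two-way random-effects, absolute-agreement, single-rater ICC(2,1).

    ratings: n_subjects x k_raters array of quality scores.
    """
    r = np.asarray(ratings, dtype=float)
    n, k = r.shape
    grand = r.mean()
    # Sums of squares for a two-way layout (subjects x raters, one score per cell)
    ss_rows = k * ((r.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((r.mean(axis=0) - grand) ** 2).sum()   # between raters
    ss_err = ((r - grand) ** 2).sum() - ss_rows - ss_cols # residual
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical example: 5 papers scored by 2 raters (not the review's actual data)
scores = [[7, 9], [5, 6], [8, 8], [4, 5], [9, 10]]
print(round(icc2_1(scores), 3))
```

Perfect agreement between raters yields an ICC of 1.0, while rater disagreement relative to between-paper variance pulls the value toward 0, which is the sense in which values near .50 are read as moderate reliability.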
</sec><sec><title>Strengths and weaknesses of AR and MR</title>
<p>Three themes were inductively identified indicating the strengths and weaknesses of AR and MR in healthcare education beyond surgery.</p>
</sec><sec><title>Strengths</title>
<sec><title>Implemented across various subjects for learner types of all levels spanning different sectors</title>
<p>The most frequently studied subjects of healthcare education were anatomy (n=6) and anesthesia (n=7), the latter including four studies focusing on central venous catheterization.<xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Study participants fell into 12 different categories: pre-medical, medical, nursing, and health science students; novices, residents, fellows and established clinicians of different specialties; technicians, non-clinicians, non-specified participants and managers. The mean number of participants was 77.1, SD=170.6 (range 1-880); the sample size was set to one for a study that did not report or specify its participants.<xref ref-type="bibr" rid="r33"><sup>33</sup></xref> The distribution of studies across subjects of healthcare education in relation to the number of participants enrolled is described in <xref ref-type="supplementary-material" rid="S2">Appendix 2</xref>.</p>
</sec><sec><title>The rich diversity of research and outcome focus</title>
<p>A total of six proof-of-concept, pilot or user studies sought to introduce an application or assess initial validity.<xref ref-type="bibr" rid="r28"><sup>28</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r33"><sup>33</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r35"><sup>35</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r47"><sup>47</sup></xref> Eight studies focused on evaluating training with an application to strengthen construct validity.<xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r37"><sup>37</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref> The remaining studies (n=12) focused on the application-based assessment of a specific skill or procedure, in some cases correlating performance with other outcomes such as cognitive load.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r31"><sup>31</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r36"><sup>36</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r48"><sup>48</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r49"><sup>49</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Technical test outcomes were reported in 17 studies and primarily concerned
needle insertion in terms of accuracy and precision (n=11).<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r31"><sup>31</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r33"><sup>33</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> The second most frequently reported technical test outcome was procedure time (n=9).<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Nineteen studies investigated learning experience and user acceptance, based primarily on Likert scales.<xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r32"><sup>32</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r34"><sup>34</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r49"><sup>49</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Other questionnaire-based outcomes were cognitive load, stress response, adverse health effects and ergonomics.<xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref> Knowledge tests were examined in combination with questionnaire-based outcomes in six studies.<xref ref-type="bibr" rid="r36"><sup>36</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r37"><sup>37</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r49"><sup>49</sup></xref> One study included an observational method to determine learning behavior.<xref ref-type="bibr" rid="r49"><sup>49</sup></xref></p>
</sec><sec><title>Growing evidence for improving learning</title>
<p>In 11 studies, AR and MR were reported to significantly improve the learning process or associated part-tasks in all or most outcome measures.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r36"><sup>36</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r37"><sup>37</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r48"><sup>48</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Four of six studies examining the acquisition of anatomy knowledge reported significantly improved learning.<xref ref-type="bibr" rid="r36"><sup>36</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r37"><sup>37</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r49"><sup>49</sup></xref> Significant positive findings, favoring both students and established clinicians, were reported in six of 11 studies concerning skill training of needle insertion.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Procedure time was significantly reduced in three of nine studies.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Examining different questionnaire-based aspects of the learning experience and
user acceptance, four of 19 studies demonstrated significant positive findings supporting the usability of the display technologies.<xref ref-type="bibr" rid="r36"><sup>36</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r37"><sup>37</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r48"><sup>48</sup></xref> Fifteen studies found no significant positive results, but all suggested that the AR and MR-based applications may outperform traditional learning approaches within the involved subjects of healthcare education.<xref ref-type="bibr" rid="r28"><sup>28</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r35"><sup>35</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r47"><sup>47</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref> Other promising learning factors facilitated by the display technologies were related to visualization, directing attention, intrinsic benefits of motivation, physical interaction activating kinesthetic schemes, patient safety, skill retention, simulation confidence related to transferability, mobile learning and using oneself as a learning object.<xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r45"><sup>45</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r49"><sup>49</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref></p>
</sec></sec><sec><title>Weaknesses</title>
<sec><title>Reporting of prototypes, technological limitations and poor ergonomics</title>
<p>Sixteen studies presented a prototype, typically in preliminary feasibility studies that failed to report adequately on the educational impact of the prototype tested.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r32"><sup>32</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r36"><sup>36</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r49"><sup>49</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Ten studies were conducted on one of six established applications.<xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r31"><sup>31</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r37"><sup>37</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r39"><sup>39</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r47"><sup>47</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r48"><sup>48</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref> The studies of head-mounted display-based applications (n=8) addressed technological limitations related to limited computing power and occlusion of the user&#x2019;s field of view, as well as poor ergonomics due to head-mounted displays being tethered to workstations and discomfort when wearing glasses underneath.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref></p>
</sec><sec><title>Shortcomings of the study designs for transferability</title>
<p>Four studies were designed as single-group user studies only, making strong conclusions difficult.<xref ref-type="bibr" rid="r31"><sup>31</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r33"><sup>33</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r35"><sup>35</sup></xref> Twenty-two studies used a group design or comparison, most of which (n=17) compared two groups.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r34"><sup>34</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r36"><sup>36</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r40"><sup>40</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r45"><sup>45</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r47"><sup>47</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref> Only two studies did not compare AR or MR with another medium, the comparison media in the remaining studies including lectures, books, video, virtual reality, mobile devices, conventional training platforms, and a full telemedical setup.<xref ref-type="bibr" rid="r28"><sup>28</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r34"><sup>34</sup></xref> Two studies compared mobile devices after having provided AR content to one of the groups.<xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref> Five studies encompassed three groups.<xref ref-type="bibr" rid="r37"><sup>37</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r43"><sup>43</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r52"><sup>52</sup></xref> Two of the two-group studies used a
cross-over design.<xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r30"><sup>30</sup></xref> No study involved patients in an authentic context, but two studies included patient data.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r32"><sup>32</sup></xref></p>
</sec><sec><title>Lacking evidence for improving learning</title>
<p>Eight studies reported descriptive frequencies of self-reported evaluations and measures without any statistical analysis of significance.<xref ref-type="bibr" rid="r28"><sup>28</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r30"><sup>30</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r35"><sup>35</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r47"><sup>47</sup></xref> Seven studies reported that the display technologies offered no significant improvement of learning in all or most outcome measures.<xref ref-type="bibr" rid="r38"><sup>38</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r44"><sup>44</sup></xref><sup>-</sup><xref ref-type="bibr" rid="r46"><sup>46</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref> The two studies that compared AR within the same medium of mobile devices found no significant difference in any of the outcome measures.<xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r42"><sup>42</sup></xref> Only a single study presented a significant negative finding, of prolonged completion time of an ultrasound examination in the AR group.<xref ref-type="bibr" rid="r46"><sup>46</sup></xref> Potentially confounding factors were addressed, including visual misperception, motivation driven by media or technology enthusiasm, neglect of patient discomfort in relation to patient safety, and lack of transfer of performance from simulation to the clinical setting.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r41"><sup>41</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r51"><sup>51</sup></xref></p></sec></sec></sec>
    <sec sec-type="discussion"><title>Discussion</title>
<p>Virtual augmentation and guidance through AR and MR are increasingly used in applications for medical subjects of healthcare education. The quality of the existing studies and applications, including the educational benefits of the display technologies, remains unclear.</p>
<p>We reviewed the current research and state of AR and MR-based applications for healthcare education in medical disciplines beyond surgery. Our integrative review identified 26 original studies examining various applications of both display technologies. The applications were found to measure numerous outcomes related to the learning process, acquisition of knowledge and skill training while providing feedback on patient care-related outcomes such as complication rates, insertion time and needle path related to tissue damage. This differs greatly from the findings of a systematic review by Barsom and colleagues on applications for medical training of professionals, in which none were developed to measure the prevention of errors in the interest of patient safety.<xref ref-type="bibr" rid="r4"><sup>4</sup></xref></p>
<p>Our work revealed an increased emergence of established applications, corresponding to 27% (n=6) investigated in 10 studies, as against 16 prototypes. A prior review by Zhu and colleagues found only one established application, for laparoscopic colorectal surgery.<xref ref-type="bibr" rid="r13"><sup>13</sup></xref> In the same review, the authors found that the application designs lacked guidance from learning theories, resting only on traditional learning strategies. We observed that applications of AR and MR still have not exploited the integration of learning theories and strategies into their design. Still, the increased number of established applications is a step towards turning the research base away from feasibility studies examining prototypes.</p>
<p>We conclude that the studies overall were of low-to-medium quality. This is consistent with the low-to-modest strength of evidence reported in previous systematic reviews.<xref ref-type="bibr" rid="r4"><sup>4</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r17"><sup>17</sup></xref> The single qualitative study was found to be of high quality in terms of clarity and rigor, while its overall quality was judged low-to-medium. The greatest limitation, noted in nearly one-third of all studies (n=8), was either the complete absence or the poor reporting of validity evidence for the evaluation instruments that indirectly provide the evidence base for the study findings. Additionally, the statistical analyses reported incomplete results or were unclearly interpreted. Shortcomings of the reviewed studies further included heterogeneity of research designs, unstandardized outcome measures and wide variation in the detail given. Widespread heterogeneity among studies is held to be one of the greatest challenges of quantitatively synthesizing research evidence.<xref ref-type="bibr" rid="r54"><sup>54</sup></xref> At the same time, a prominent concern argues that media-comparative studies of learning are virtually useless and not valid for comparison.<xref ref-type="bibr" rid="r55"><sup>55</sup></xref> From this perspective, the studies failed to determine which media or technologies were best for healthcare education but rather informed practice with the specific application. These limitations are general for much education research but may be especially pronounced for research at the nexus of learning and technology.<xref ref-type="bibr" rid="r56"><sup>56</sup></xref> Nevertheless, we did not exclude studies based on their quality, as our aim was to provide an overview of the strengths and weaknesses of all relevant research in AR and MR for healthcare education beyond surgery during the past half-decade.</p>
<sec><title>Limitations and recommendations for future studies</title>
<p>To our knowledge, this is the first integrative review of AR and MR solely focusing on medical subjects of healthcare education. Three articles in Chinese were not included, meaning that we possibly excluded relevant knowledge. Moreover, we may have missed relevant research published, or not published, in technical journals, as our main focus was on databases for healthcare and education. Our finding that all included studies suggested or reported significant positive findings should be interpreted with caution since publication bias cannot be excluded. We tried to minimize the loss of relevant material by including unpublished work from new online sources such as TED Talks and the podcast media of iTunes; however, the designs and presentations of these sources varied too extensively to enhance the quality and usefulness of the review. Our study abstained from addressing the educational profile of AV compared with AR, both being encompassed by MR. This could not be done due to the low number of studies measuring AV-based learning, possibly related to the limited technological and conceptual understanding of MR across the research field and industry. The quality of the included studies was assessed with the MERSQI scale, which revealed inconsistencies across a few domains in the process of rating, mainly due to missing information in the reviewed studies and a lack of clarity in the MERSQI guidelines. Though moderate reliability was found between all raters for the MERSQI and the overall quality assessment tool, one could argue that a rating sample of approximately 20% (n=5) of the studies hinders or even precludes reliable calculations beyond descriptive analysis. Finally, the self-developed assessment tool AQRAME has not been validated for quality scoring of qualitative research, despite relying on a known 12-item grid for quality appraisal. This tool was introduced because we were not aware of any validated instruments for quality assessment of qualitative research in healthcare education.</p>
<p>A variety of applications for subjects of healthcare education beyond surgery have been developed, and their benefits were supported by this integrative review. We expect that more research will be done in the field as more institutions explore and apply applications based on AR and MR. Randomized controlled trials should continue to be organized to evaluate clinical performance and patient care-related outcomes. Specifically, the actual effects on real patients and physician behaviors towards patients in a real context are yet to be elucidated. We recommend that future studies justify and validate metrics and report the reliability of measures to enable higher-quality evaluations. Established guidelines and recommendations for high-quality research, formulating joint standards, could promote the adoption of the display technologies and facilitate exchange among researchers, educators and developers with widely different experiences and approaches.<xref ref-type="bibr" rid="r57"><sup>57</sup></xref></p>
<p>Echoing the words of David A. Cook, professor of medicine and medical education, we suggest placing more emphasis on &#x2018;How&#x2019; and &#x2018;When&#x2019; to use AR and MR-based learning and less on &#x2018;Whether&#x2019;.<xref ref-type="bibr" rid="r55"><sup>55</sup></xref> In answering these questions, researchers, educators and developers should share and evaluate instructional design and learning theory-based methods while looking into effective use of simulation and integration of the display technologies within and between institutions. Eventually, this could also provide an understanding of the learning concepts revealed in the included studies, involving intrinsic benefits of motivation, physical interaction activating kinesthetic schemes, skill retention, transferability of simulation confidence, mobile learning and using oneself as a learning object. By defining instructional objectives beforehand, the display technologies should be used only where they can refine or even replace training programs and curricula.</p>
<p>That being said, partially immersive environments such as AR and MR may offer unique qualities specifically for assessment and for training procedural strategies that integrate real patient data without breaching patient safety. By using non-invasive sensors for imaging, the display technologies could complement the established imaging technologies of MRI, CT scan and ultrasound for monitoring technical performance with an objective-comparative function, as observed in our review.<xref ref-type="bibr" rid="r27"><sup>27</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r29"><sup>29</sup></xref><sup>,</sup><xref ref-type="bibr" rid="r50"><sup>50</sup></xref> To tap the full potential of the display technologies, study and application design must be based on a thorough investigation of the educational context, learner types and learning objectives, whether the latter are cognitive, technical, or non-technical such as situational awareness, communication, or stress coping.</p>
</sec></sec>
    <sec sec-type="conclusions"><title>Conclusions</title>
<p>This review reports the current state of AR and MR-based applications for healthcare education beyond surgery. Studies of both display technologies across various specialties and subjects indicate an increased number of established applications, moving the research base away from feasibility studies on prototypes. All included studies suggested various healthcare educational benefits of the display technologies, which significantly outperformed traditional learning approaches in 11 studies, specifically regarding the acquisition of anatomy knowledge and needle insertion skills. Yet, this review identifies multiple shortcomings of the studies. Study quality was low-to-medium, especially due to lacking validity evidence for the evaluation instruments, heterogeneity of research designs and widely varied reporting. Future studies are thus needed for researchers, educators and developers to build an evidence base defining suitable research designs and instructional objectives achievable by AR and MR-based applications, so that these may complement conventional learning and curricula and drive a transformation in healthcare education.</p>
<sec><title>Acknowledgements</title>
<p>We would like to thank the Copenhagen Academy of Medical Education and Simulation (CAMES) for financial support from its institutional funds, and the employees of the academy for valuable feedback.</p>
</sec><sec><title>Conflict of Interest</title>
<p>PD holds a professorship with the University of Stavanger, Norway that is supported unconditionally by a grant from the Laerdal Foundation in Norway.</p>
</sec></sec>
  </body>
  <back>
    <sec sec-type="supplementary-material"><title>Supplementary material</title>
<supplementary-material xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="S1.pdf" id="S1" mimetype="application/pdf"><label>Supplementary file 1</label><caption><p>Appendix 1. Study characteristics including quality scores</p></caption></supplementary-material><supplementary-material xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="S2.pdf" id="S2" mimetype="application/pdf"><label>Supplementary file 2</label><caption><p>Appendix 2. Distribution of studies across medical specialty or health science, number of studies, and participants enrolled  according to number of studies</p></caption></supplementary-material></sec>
	<ref-list><title>References</title>
<ref id="r1"><label>1</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Kamphuis</surname><given-names>C</given-names></name><name><surname>Barsom</surname><given-names>E</given-names></name><name><surname>Schijven</surname><given-names>M</given-names></name><name><surname>Christoph</surname><given-names>N</given-names></name></person-group><article-title>Augmented reality in medical education?</article-title><source>Perspect Med Educ</source><year>2014</year><volume>3</volume><fpage>300</fpage><lpage>311</lpage><pub-id pub-id-type="doi">10.1007/s40037-013-0107-7</pub-id><pub-id pub-id-type="pmid">24464832</pub-id></element-citation></ref><ref id="r2"><label>2</label><mixed-citation publication-type="other">Chen L, Day TW, Tang W, John NW. Recent developments and future challenges in medical mixed reality. Proceedings of 16th IEEE International Symposium on Mixed and Augmented Reality (ISMAR); 9-13 October 2017. Nantes, France: IEEE; 2017. 
</mixed-citation></ref><ref id="r3"><label>3</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Silva</surname><given-names>JNA</given-names></name><name><surname>Southworth</surname><given-names>M</given-names></name><name><surname>Raptis</surname><given-names>C</given-names></name><name><surname>Silva</surname><given-names>J</given-names></name></person-group><article-title>Emerging applications of virtual reality&amp;#x00A0;in cardiovascular medicine.</article-title><source>JACC Basic Transl Sci</source><year>2018</year><volume>3</volume><fpage>420</fpage><lpage>430</lpage><pub-id pub-id-type="doi">10.1016/j.jacbts.2017.11.009</pub-id><pub-id pub-id-type="pmid">30062228</pub-id></element-citation></ref><ref id="r4"><label>4</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Barsom</surname><given-names>EZ</given-names></name><name><surname>Graafland</surname><given-names>M</given-names></name><name><surname>Schijven</surname><given-names>MP</given-names></name></person-group><article-title>Systematic review on the effectiveness of augmented reality applications in medical training.</article-title><source>Surg Endosc</source><year>2016</year><volume>30</volume><fpage>4174</fpage><lpage>4183</lpage><pub-id pub-id-type="doi">10.1007/s00464-016-4800-6</pub-id><pub-id pub-id-type="pmid">26905573</pub-id></element-citation></ref><ref id="r5"><label>5</label><element-citation publication-type="journal"><person-group 
person-group-type="author"><name><surname>Azuma</surname><given-names>R</given-names></name><name><surname>Baillot</surname><given-names>Y</given-names></name><name><surname>Behringer</surname><given-names>R</given-names></name><name><surname>Feiner</surname><given-names>S</given-names></name><name><surname>Julier</surname><given-names>S</given-names></name><name><surname>MacIntyre</surname><given-names>B</given-names></name></person-group><article-title>Recent advances in augmented reality.</article-title><source>IEEE Comput Grap Appl</source><year>2001</year><volume>21</volume><fpage>34</fpage><lpage>47</lpage><pub-id pub-id-type="doi">10.1109/38.963459</pub-id></element-citation></ref><ref id="r6"><label>6</label><mixed-citation publication-type="other">Milgram P, Takemura H, Utsumi A, Kishino F. Augmented reality: a class of displays on the reality-virtuality continuum. Telemanipulator and Telepresence Technologies. 1995;2351:282-92.
</mixed-citation></ref><ref id="r7"><label>7</label><mixed-citation publication-type="other">Milgram P, Kishino F. A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems. 1994;E77-D(12):1321-1329.
</mixed-citation></ref><ref id="r8"><label>8</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Brigham</surname><given-names>TJ</given-names></name></person-group><article-title>Reality check: basics of augmented, virtual, and mixed reality.</article-title><source>Med Ref Serv Q</source><year>2017</year><volume>36</volume><fpage>171</fpage><lpage>178</lpage><pub-id pub-id-type="doi">10.1080/02763869.2017.1293987</pub-id><pub-id pub-id-type="pmid">28453428</pub-id></element-citation></ref><ref id="r9"><label>9</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Zhu</surname><given-names>E</given-names></name><name><surname>Lilienthal</surname><given-names>A</given-names></name><name><surname>Shluzas</surname><given-names>LA</given-names></name><name><surname>Masiello</surname><given-names>I</given-names></name><name><surname>Zary</surname><given-names>N</given-names></name></person-group><article-title>Design of mobile augmented reality in health care education: a theory-driven framework.</article-title><source>JMIR Med Educ</source><year>2015</year><volume>1</volume><fpage>e10</fpage><pub-id pub-id-type="doi">10.2196/mededu.4443</pub-id><pub-id pub-id-type="pmid">27731839</pub-id></element-citation></ref><ref id="r10"><label>10</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Wojtczak</surname><given-names>A</given-names></name></person-group><article-title>Glossary of medical education terms: Part 4.</article-title><source>Med Teach</source><year>2002</year><volume>24</volume><fpage>567</fpage><lpage>568</lpage><pub-id pub-id-type="doi">10.1080/0142159021000012667</pub-id><pub-id pub-id-type="pmid">12450484</pub-id></element-citation></ref><ref id="r11"><label>11</label><element-citation publication-type="journal"><person-group 
person-group-type="author"><name><surname>Smith</surname><given-names>ML</given-names></name><name><surname>Foley</surname><given-names>MR</given-names></name></person-group><article-title>Transforming clinical education in obstetrics and gynecology: gone is the day of the sage on the stage.</article-title><source>Obstet Gynecol</source><year>2016</year><volume>127</volume><fpage>763</fpage><lpage>767</lpage><pub-id pub-id-type="doi">10.1097/aog.0000000000001356</pub-id><pub-id pub-id-type="pmid">26959215</pub-id></element-citation></ref><ref id="r12"><label>12</label><mixed-citation publication-type="other">Ellis H, Abdalla S. A history of surgery. Boca Raton: CRC Press; 2018. 
</mixed-citation></ref><ref id="r13"><label>13</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Zhu</surname><given-names>E</given-names></name><name><surname>Hadadgar</surname><given-names>A</given-names></name><name><surname>Masiello</surname><given-names>I</given-names></name><name><surname>Zary</surname><given-names>N</given-names></name></person-group><article-title>Augmented reality in healthcare education: an integrative review.</article-title><source>PeerJ</source><year>2014</year><volume>2</volume><fpage>e469</fpage><pub-id pub-id-type="doi">10.7717/peerj.469</pub-id><pub-id pub-id-type="pmid">25071992</pub-id></element-citation></ref><ref id="r14"><label>14</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Yoon</surname><given-names>JW</given-names></name><name><surname>Chen</surname><given-names>RE</given-names></name><name><surname>Kim</surname><given-names>EJ</given-names></name><name><surname>Akinduro</surname><given-names>OO</given-names></name><name><surname>Kerezoudis</surname><given-names>P</given-names></name><name><surname>Han</surname><given-names>PK</given-names></name><name><surname>Si</surname><given-names>P</given-names></name><name><surname>Freeman</surname><given-names>WD</given-names></name><name><surname>Diaz</surname><given-names>RJ</given-names></name><name><surname>Komotar</surname><given-names>RJ</given-names></name><name><surname>Pirris</surname><given-names>SM</given-names></name><name><surname>Brown</surname><given-names>BL</given-names></name><name><surname>Bydon</surname><given-names>M</given-names></name><name><surname>Wang</surname><given-names>MY</given-names></name><name><surname>Wharen</surname><given-names>RE</given-names></name><name><surname>Quinones-Hinojosa</surname><given-names>A</given-names></name></person-group><article-title>Augmented reality for the surgeon: systematic 
review.</article-title><source>Int J Med Robot</source><year>2018</year><volume>14</volume><fpage>1914</fpage><pub-id pub-id-type="doi">10.1002/rcs.1914</pub-id><pub-id pub-id-type="pmid">29708640</pub-id></element-citation></ref>
		
<ref id="r15"><label>15</label><element-citation publication-type="journal">
<person-group person-group-type="author">
<name><surname>Aggarwal</surname><given-names>A</given-names></name>
</person-group>
<article-title>The evolving relationship between surgery and medicine.</article-title>
<source>Virtual Mentor</source>
<year>2010</year><volume>12</volume><fpage>119</fpage><lpage>123</lpage>
<pub-id pub-id-type="doi">10.1001/virtualmentor.2010.12.2.mhst1-1002</pub-id>
<pub-id pub-id-type="pmid">23140820</pub-id></element-citation></ref>
		
<ref id="r16"><label>16</label><mixed-citation publication-type="other">Martin E. Concise medical dictionary. Oxford: Oxford University Press; 2015.</mixed-citation></ref>
		
<ref id="r17"><label>17</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Wong</surname><given-names>K</given-names></name><name><surname>Yee</surname><given-names>HM</given-names></name><name><surname>Xavier</surname><given-names>BA</given-names></name><name><surname>Grillone</surname><given-names>GA</given-names></name></person-group><article-title>Applications of augmented reality in otolaryngology: a systematic review.</article-title><source>Otolaryngol Head Neck Surg</source><year>2018</year><volume>159</volume><fpage>956</fpage><lpage>967</lpage><pub-id pub-id-type="doi">10.1177/0194599818796476</pub-id><pub-id pub-id-type="pmid">30126323</pub-id></element-citation></ref><ref id="r18"><label>18</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Whittemore</surname><given-names>R</given-names></name></person-group><article-title>Combining evidence in nursing research: methods and implications</article-title><source>Nurs Res</source><year>2005</year><volume>54</volume><fpage>56</fpage><lpage>62</lpage><pub-id pub-id-type="doi">10.1097/00006199-200501000-00008</pub-id><pub-id pub-id-type="pmid">15695940</pub-id></element-citation></ref>

<ref id="r19"><label>19</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Whittemore</surname><given-names>R</given-names></name><name><surname>Knafl</surname><given-names>K</given-names></name></person-group><article-title>The integrative review: updated methodology.</article-title><source>J Adv Nurs</source><year>2005</year><volume>52</volume><fpage>546</fpage><lpage>553</lpage><pub-id pub-id-type="doi">10.1111/j.1365-2648.2005.03621.x</pub-id><pub-id pub-id-type="pmid">16268861</pub-id></element-citation></ref><ref id="r20"><label>20</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Grant</surname><given-names>MJ</given-names></name><name><surname>Booth</surname><given-names>A</given-names></name></person-group><article-title>A typology of reviews: an analysis of 14 review types and associated methodologies.</article-title><source>Health Info Libr J</source><year>2009</year><volume>26</volume><fpage>91</fpage><lpage>108</lpage><pub-id pub-id-type="doi">10.1111/j.1471-1842.2009.00848.x</pub-id><pub-id pub-id-type="pmid">19490148</pub-id></element-citation></ref><ref id="r21"><label>21</label><mixed-citation publication-type="other">TED. TED: Ideas worth spreading. [Cited 1 September 2018]; Available from: https://www.ted.com.
</mixed-citation></ref><ref id="r22"><label>22</label><mixed-citation publication-type="other">iTunes Podcast app. Podcasts Downloads on iTunes. [Cited 1 September 2018]; Available from: https://itunes.apple.com/gb/genre/podcasts/id26.
</mixed-citation></ref><ref id="r23"><label>23</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Reed</surname><given-names>DA</given-names></name><name><surname>Cook</surname><given-names>DA</given-names></name><name><surname>Beckman</surname><given-names>TJ</given-names></name><name><surname>Levine</surname><given-names>RB</given-names></name><name><surname>Kern</surname><given-names>DE</given-names></name><name><surname>Wright</surname><given-names>SM</given-names></name></person-group><article-title>Association between funding and quality of published medical education research.</article-title><source>JAMA</source><year>2007</year><volume>298</volume><fpage>1002</fpage><pub-id pub-id-type="doi">10.1001/jama.298.9.1002</pub-id><pub-id pub-id-type="pmid">17785645</pub-id></element-citation></ref><ref id="r24"><label>24</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Cook</surname><given-names>DA</given-names></name><name><surname>Reed</surname><given-names>DA</given-names></name></person-group><article-title>Appraising the quality of medical education research methods: the Medical Education Research Study Quality Instrument and the Newcastle-Ottawa Scale-Education.</article-title><source>Acad Med</source><year>2015</year><volume>90</volume><fpage>1067</fpage><lpage>1076</lpage><pub-id pub-id-type="doi">10.1097/ACM.0000000000000786</pub-id><pub-id pub-id-type="pmid">26107881</pub-id></element-citation></ref><ref id="r25"><label>25</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>C&#xF4;t&#xE9;</surname><given-names>L</given-names></name><name><surname>Turgeon</surname><given-names>J</given-names></name></person-group><article-title>Appraising qualitative research articles in medicine and medical education.</article-title><source>Med 
Teach</source><year>2005</year><volume>27</volume><fpage>71</fpage><lpage>75</lpage><pub-id pub-id-type="doi">10.1080/01421590400016308</pub-id><pub-id pub-id-type="pmid">16147774</pub-id></element-citation></ref>
	
<ref id="r26"><label>26</label><element-citation publication-type="journal">
<person-group person-group-type="author">
<name><surname>Stacy</surname><given-names>R</given-names></name>
<name><surname>Spencer</surname><given-names>J</given-names></name>
</person-group>
<article-title>Assessing the evidence in qualitative medical education research.</article-title><source>Med Educ</source><year>2000</year><volume>34</volume><fpage>498</fpage><lpage>500</lpage><pub-id pub-id-type="doi">10.1046/j.1365-2923.2000.00710.x</pub-id><pub-id pub-id-type="pmid">10886625</pub-id></element-citation></ref>
	
<ref id="r27"><label>27</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Abhari</surname><given-names>K</given-names></name><name><surname>Baxter</surname><given-names>JSH</given-names></name><name><surname>Chen</surname><given-names>ECS</given-names></name><name><surname>Khan</surname><given-names>AR</given-names></name><name><surname>Peters</surname><given-names>TM</given-names></name><name><surname>de Ribaupierre</surname><given-names>S</given-names></name><name><surname>Eagleson</surname><given-names>R</given-names></name></person-group><article-title>Training for planning tumour resection: augmented reality and human factors.</article-title><source>IEEE Trans Biomed Eng</source><year>2015</year><volume>62</volume><fpage>1466</fpage><lpage>1477</lpage><pub-id pub-id-type="doi">10.1109/tbme.2014.2385874</pub-id><pub-id pub-id-type="pmid">25546854</pub-id></element-citation></ref><ref id="r28"><label>28</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Bifulco</surname><given-names>P</given-names></name><name><surname>Narducci</surname><given-names>F</given-names></name><name><surname>Vertucci</surname><given-names>R</given-names></name><name><surname>Ambruosi</surname><given-names>P</given-names></name><name><surname>Cesarelli</surname><given-names>M</given-names></name><name><surname>Romano</surname><given-names>M</given-names></name></person-group><article-title>Telemedicine supported by Augmented Reality: an interactive guide for untrained people in performing an ECG test.</article-title><source>Biomed Eng Online</source><year>2014</year><volume>13</volume><fpage>153</fpage><pub-id pub-id-type="doi">10.1186/1475-925x-13-153</pub-id><pub-id pub-id-type="pmid">25413448</pub-id></element-citation></ref><ref id="r29"><label>29</label><element-citation publication-type="journal"><person-group 
person-group-type="author"><name><surname>Jeon</surname><given-names>Y</given-names></name><name><surname>Choi</surname><given-names>S</given-names></name><name><surname>Kim</surname><given-names>H</given-names></name></person-group><article-title>Evaluation of a simplified augmented reality device for ultrasound-guided vascular access in a vascular phantom.</article-title><source>J Clin Anesth</source><year>2014</year><volume>26</volume><fpage>485</fpage><lpage>489</lpage><pub-id pub-id-type="doi">10.1016/j.jclinane.2014.02.010</pub-id><pub-id pub-id-type="pmid">25204510</pub-id></element-citation></ref><ref id="r30"><label>30</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Kugelmann</surname><given-names>D</given-names></name><name><surname>Stratmann</surname><given-names>L</given-names></name><name><surname>N&#xFC;hlen</surname><given-names>N</given-names></name><name><surname>Bork</surname><given-names>F</given-names></name><name><surname>Hoffmann</surname><given-names>S</given-names></name><name><surname>Samarbarksh</surname><given-names>G</given-names></name><name><surname>Pferschy</surname><given-names>A</given-names></name><name><surname>von der Heide</surname><given-names>AM</given-names></name><name><surname>Eimannsberger</surname><given-names>A</given-names></name><name><surname>Fallavollita</surname><given-names>P</given-names></name><name><surname>Navab</surname><given-names>N</given-names></name><name><surname>Waschke</surname><given-names>J</given-names></name></person-group><article-title>An Augmented Reality magic mirror as additive teaching device for gross anatomy.</article-title><source>Ann Anat</source><year>2018</year><volume>215</volume><fpage>71</fpage><lpage>77</lpage><pub-id pub-id-type="doi">10.1016/j.aanat.2017.09.011</pub-id><pub-id pub-id-type="pmid">29017852</pub-id></element-citation></ref><ref id="r31"><label>31</label><element-citation 
publication-type="journal"><person-group person-group-type="author"><name><surname>Ma</surname><given-names>M</given-names></name><name><surname>Fallavollita</surname><given-names>P</given-names></name><name><surname>Seelbach</surname><given-names>I</given-names></name><name><surname>Von Der Heide</surname><given-names>AM</given-names></name><name><surname>Euler</surname><given-names>E</given-names></name><name><surname>Waschke</surname><given-names>J</given-names></name><name><surname>Navab</surname><given-names>N</given-names></name></person-group><article-title>Personalized augmented reality for anatomy education.</article-title><source>Clin Anat</source><year>2016</year><volume>29</volume><fpage>446</fpage><lpage>453</lpage><pub-id pub-id-type="doi">10.1002/ca.22675</pub-id><pub-id pub-id-type="pmid">26646315</pub-id></element-citation></ref><ref id="r32"><label>32</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Mewes</surname><given-names>A</given-names></name><name><surname>Heinrich</surname><given-names>F</given-names></name><name><surname>K&#xE4;gebein</surname><given-names>U</given-names></name><name><surname>Hensen</surname><given-names>B</given-names></name><name><surname>Wacker</surname><given-names>F</given-names></name><name><surname>Hansen</surname><given-names>C</given-names></name></person-group><article-title>Projector-based augmented reality system for interventional visualization inside MRI scanners.</article-title><source>Int J Med Robot</source><year>2019</year><volume>15</volume><fpage>1950</fpage><pub-id pub-id-type="doi">10.1002/rcs.1950</pub-id><pub-id pub-id-type="pmid">30168639</pub-id></element-citation></ref><ref id="r33"><label>33</label><element-citation publication-type="journal"><person-group 
person-group-type="author"><name><surname>Solbiati</surname><given-names>M</given-names></name><name><surname>Passera</surname><given-names>KM</given-names></name><name><surname>Rotilio</surname><given-names>A</given-names></name><name><surname>Oliva</surname><given-names>F</given-names></name><name><surname>Marre</surname><given-names>I</given-names></name><name><surname>Goldberg</surname><given-names>SN</given-names></name><name><surname>Ierace</surname><given-names>T</given-names></name><name><surname>Solbiati</surname><given-names>L</given-names></name></person-group><article-title>Augmented reality for interventional oncology: proof-of-concept study of a novel high-end guidance system platform.</article-title><source>Eur Radiol Exp</source><year>2018</year><volume>2</volume><fpage>18</fpage><pub-id pub-id-type="doi">10.1186/s41747-018-0054-5</pub-id><pub-id pub-id-type="pmid">30148251</pub-id></element-citation></ref><ref id="r34"><label>34</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Sutherland</surname><given-names>C</given-names></name><name><surname>Hashtrudi-Zaad</surname><given-names>K</given-names></name><name><surname>Sellens</surname><given-names>R</given-names></name><name><surname>Abolmaesumi</surname><given-names>P</given-names></name><name><surname>Mousavi</surname><given-names>P</given-names></name></person-group><article-title>An augmented reality haptic training simulator for spinal needle procedures.</article-title><source>IEEE Trans Biomed Eng</source><year>2013</year><volume>60</volume><fpage>3009</fpage><lpage>3018</lpage><pub-id pub-id-type="doi">10.1109/tbme.2012.2236091</pub-id><pub-id pub-id-type="pmid">23269747</pub-id></element-citation></ref><ref id="r35"><label>35</label><element-citation publication-type="journal"><person-group 
person-group-type="author"><name><surname>Wang</surname><given-names>LL</given-names></name><name><surname>Wu</surname><given-names>HH</given-names></name><name><surname>Bilici</surname><given-names>N</given-names></name><name><surname>Tenney-Soeiro</surname><given-names>R</given-names></name></person-group><article-title>Gunner goggles: implementing augmented reality into medical education.</article-title><source>Stud Health Technol Inform</source><year>2016</year><volume>220</volume><fpage>446</fpage><lpage>449</lpage><pub-id pub-id-type="pmid">27046620</pub-id></element-citation></ref><ref id="r36"><label>36</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Ferrer-Torregrosa</surname><given-names>J</given-names></name><name><surname>Torralba</surname><given-names>J</given-names></name><name><surname>Jimenez</surname><given-names>MA</given-names></name><name><surname>Garc&#xED;a</surname><given-names>S</given-names></name><name><surname>Barcia</surname><given-names>JM</given-names></name></person-group><article-title>ARBOOK: development and assessment of a tool based on augmented reality for anatomy.</article-title><source>J Sci Educ Technol</source><year>2015</year><volume>24</volume><fpage>119</fpage><lpage>124</lpage><pub-id pub-id-type="doi">10.1007/s10956-014-9526-4</pub-id></element-citation></ref><ref id="r37"><label>37</label><element-citation publication-type="journal"><person-group 
person-group-type="author"><name><surname>Ferrer-Torregrosa</surname><given-names>J</given-names></name><name><surname>Jim&#xE9;nez-Rodr&#xED;guez</surname><given-names>M</given-names></name><name><surname>Torralba-Estelles</surname><given-names>J</given-names></name><name><surname>Garz&#xF3;n-Farin&#xF3;s</surname><given-names>F</given-names></name><name><surname>P&#xE9;rez-Bermejo</surname><given-names>M</given-names></name><name><surname>Fern&#xE1;ndez-Ehrling</surname><given-names>N</given-names></name></person-group><article-title>Distance learning ects and flipped classroom in the anatomy learning: comparative study of the use of augmented reality, video and notes.</article-title><source>BMC Med Educ</source><year>2016</year><volume>16</volume><fpage>230</fpage><pub-id pub-id-type="doi">10.1186/s12909-016-0757-3</pub-id><pub-id pub-id-type="pmid">27581521</pub-id></element-citation></ref>
	
<ref id="r38"><label>38</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Huang</surname><given-names>CY</given-names></name><name><surname>Thomas</surname><given-names>JB</given-names></name><name><surname>Alismail</surname><given-names>A</given-names></name><name><surname>Cohen</surname><given-names>A</given-names></name><name><surname>Almutairi</surname><given-names>W</given-names></name><name><surname>Daher</surname><given-names>NS</given-names></name><name><surname>Terry</surname><given-names>MH</given-names></name><name><surname>Tan</surname><given-names>LD</given-names></name></person-group><article-title>The use of augmented reality glasses in central line simulation: &quot;see one, simulate many, do one competently, and teach everyone&quot;.</article-title><source>Adv Med Educ Pract</source><year>2018</year><volume>9</volume><fpage>357</fpage><lpage>363</lpage><pub-id pub-id-type="doi">10.2147/amep.s160704</pub-id><pub-id pub-id-type="pmid">29785148</pub-id></element-citation></ref>
	
<ref id="r39"><label>39</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>K&#xFC;&#xE7;&#xFC;k</surname><given-names>S</given-names></name><name><surname>Kapakin</surname><given-names>S</given-names></name><name><surname>G&#xF6;kta&#x15F;</surname><given-names>Y</given-names></name></person-group><article-title>Learning anatomy via mobile augmented reality: effects on achievement and cognitive load.</article-title><source>Anat Sci Educ</source><year>2016</year><volume>9</volume><fpage>411</fpage><lpage>421</lpage><pub-id pub-id-type="doi">10.1002/ase.1603</pub-id><pub-id pub-id-type="pmid">26950521</pub-id></element-citation></ref><ref id="r40"><label>40</label><mixed-citation publication-type="other">Leitritz MA, Ziemssen F, Suesskind D, Partsch M, Voykov B, Bartz-Schmidt KU, et al. Critical evaluation of the usability of augmented reality ophthalmoscopy for the training of inexperienced examiners. Retina. 2014;34(4):785&#x2013;91.
</mixed-citation></ref><ref id="r41"><label>41</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Moro</surname><given-names>C</given-names></name><name><surname>&#x160;tromberga</surname><given-names>Z</given-names></name><name><surname>Raikos</surname><given-names>A</given-names></name><name><surname>Stirling</surname><given-names>A</given-names></name></person-group><article-title>The effectiveness of virtual and augmented reality in health sciences and medical anatomy.</article-title><source>Anat Sci Educ</source><year>2017</year><volume>10</volume><fpage>549</fpage><lpage>559</lpage><pub-id pub-id-type="doi">10.1002/ase.1696</pub-id><pub-id pub-id-type="pmid">28419750</pub-id></element-citation></ref><ref id="r42"><label>42</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Noll</surname><given-names>C</given-names></name><name><surname>von Jan</surname><given-names>U</given-names></name><name><surname>Raap</surname><given-names>U</given-names></name><name><surname>Albrecht</surname><given-names>UV</given-names></name></person-group><article-title>Mobile augmented reality as a feature for self-oriented, blended learning in medicine: randomized controlled trial.</article-title><source>JMIR Mhealth Uhealth</source><year>2017</year><volume>5</volume><fpage>e139</fpage><pub-id pub-id-type="doi">10.2196/mhealth.7943</pub-id><pub-id pub-id-type="pmid">28912113</pub-id></element-citation></ref><ref id="r43"><label>43</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Rai</surname><given-names>AS</given-names></name><name><surname>Rai</surname><given-names>AS</given-names></name><name><surname>Mavrikakis</surname><given-names>E</given-names></name><name><surname>Lam</surname><given-names>WC</given-names></name></person-group><article-title>Teaching binocular indirect ophthalmoscopy to novice 
residents using an augmented reality simulator.</article-title><source>Can J Ophthalmol</source><year>2017</year><volume>52</volume><fpage>430</fpage><lpage>434</lpage><pub-id pub-id-type="doi">10.1016/j.jcjo.2017.02.015</pub-id><pub-id pub-id-type="pmid">28985799</pub-id></element-citation></ref><ref id="r44"><label>44</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Rochlen</surname><given-names>LR</given-names></name><name><surname>Levine</surname><given-names>R</given-names></name><name><surname>Tait</surname><given-names>AR</given-names></name></person-group><article-title>First-person point-of-view-augmented reality for central line insertion training: a usability and feasibility study.</article-title><source>Simul Healthc</source><year>2017</year><volume>12</volume><fpage>57</fpage><lpage>62</lpage><pub-id pub-id-type="doi">10.1097/SIH.0000000000000185</pub-id><pub-id pub-id-type="pmid">27930431</pub-id></element-citation></ref><ref id="r45"><label>45</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Siebert</surname><given-names>JN</given-names></name><name><surname>Ehrler</surname><given-names>F</given-names></name><name><surname>Gervaix</surname><given-names>A</given-names></name><name><surname>Haddad</surname><given-names>K</given-names></name><name><surname>Lacroix</surname><given-names>L</given-names></name><name><surname>Schrurs</surname><given-names>P</given-names></name><name><surname>Sahin</surname><given-names>A</given-names></name><name><surname>Lovis</surname><given-names>C</given-names></name><name><surname>Manzano</surname><given-names>S</given-names></name></person-group><article-title>Adherence to AHA guidelines when adapted for augmented reality glasses for assisted pediatric cardiopulmonary resuscitation: a randomized controlled trial.</article-title><source>J Med Internet 
Res</source><year>2017</year><volume>19</volume><fpage>e183</fpage><pub-id pub-id-type="doi">10.2196/jmir.7379</pub-id><pub-id pub-id-type="pmid">28554878</pub-id></element-citation></ref><ref id="r46"><label>46</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname><given-names>S</given-names></name><name><surname>Parsons</surname><given-names>M</given-names></name><name><surname>Stone-McLean</surname><given-names>J</given-names></name><name><surname>Rogers</surname><given-names>P</given-names></name><name><surname>Boyd</surname><given-names>S</given-names></name><name><surname>Hoover</surname><given-names>K</given-names></name><name><surname>Meruvia-Pastor</surname><given-names>O</given-names></name><name><surname>Gong</surname><given-names>M</given-names></name><name><surname>Smith</surname><given-names>A</given-names></name></person-group><article-title>Augmented reality as a telemedicine platform for remote procedural training.</article-title><source>Sensors (Basel)</source><year>2017</year><volume>17</volume><pub-id pub-id-type="doi">10.3390/s17102294</pub-id><pub-id pub-id-type="pmid">28994720</pub-id></element-citation></ref><ref id="r47"><label>47</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Zhu</surname><given-names>E</given-names></name><name><surname>Fors</surname><given-names>U</given-names></name><name><surname>Smedberg</surname><given-names/></name></person-group><article-title>Exploring the needs and possibilities of physicians&#x2019; continuing professional development - An explorative qualitative study in a Chinese primary care context.</article-title><source>PLoS ONE</source><year>2018</year><volume>13</volume><fpage>e0202635</fpage><pub-id pub-id-type="doi">10.1371/journal.pone.0202635</pub-id><pub-id pub-id-type="pmid">30114295</pub-id></element-citation></ref><ref id="r48"><label>48</label><element-citation 
publication-type="journal"><person-group person-group-type="author"><name><surname>Aebersold</surname><given-names>M</given-names></name><name><surname>Voepel-Lewis</surname><given-names>T</given-names></name><name><surname>Cherara</surname><given-names>L</given-names></name><name><surname>Weber</surname><given-names>M</given-names></name><name><surname>Khouri</surname><given-names>C</given-names></name><name><surname>Levine</surname><given-names>R</given-names></name><name><surname>Tait</surname><given-names>AR</given-names></name></person-group><article-title>Interactive anatomy-augmented virtual simulation training.</article-title><source>Clin Simul Nurs</source><year>2018</year><volume>15</volume><fpage>34</fpage><lpage>41</lpage><pub-id pub-id-type="doi">10.1016/j.ecns.2017.09.008</pub-id><pub-id pub-id-type="pmid">29861797</pub-id></element-citation></ref><ref id="r49"><label>49</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Albrecht</surname><given-names>UV</given-names></name><name><surname>Folta-Schoofs</surname><given-names>K</given-names></name><name><surname>Behrends</surname><given-names>M</given-names></name><name><surname>von Jan</surname><given-names>U</given-names></name></person-group><article-title>Effects of mobile augmented reality learning compared to textbook learning on medical students: randomized controlled pilot study.</article-title><source>J Med Internet Res</source><year>2013</year><volume>15</volume><fpage>e182</fpage><pub-id pub-id-type="doi">10.2196/jmir.2497</pub-id><pub-id pub-id-type="pmid">23963306</pub-id></element-citation></ref><ref id="r50"><label>50</label><element-citation publication-type="journal"><person-group 
person-group-type="author"><name><surname>Keri</surname><given-names>Z</given-names></name><name><surname>Sydor</surname><given-names>D</given-names></name><name><surname>Ungi</surname><given-names>T</given-names></name><name><surname>Holden</surname><given-names>MS</given-names></name><name><surname>McGraw</surname><given-names>R</given-names></name><name><surname>Mousavi</surname><given-names>P</given-names></name><name><surname>Borschneck</surname><given-names>DP</given-names></name><name><surname>Fichtinger</surname><given-names>G</given-names></name><name><surname>Jaeger</surname><given-names>M</given-names></name></person-group><article-title>Computerized training system for ultrasound-guided lumbar puncture on abnormal spine models: a randomized controlled trial.</article-title><source>Can J Anaesth</source><year>2015</year><volume>62</volume><fpage>777</fpage><lpage>784</lpage><pub-id pub-id-type="doi">10.1007/s12630-015-0367-2</pub-id><pub-id pub-id-type="pmid">25804431</pub-id></element-citation></ref><ref id="r51"><label>51</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Moult</surname><given-names>E</given-names></name><name><surname>Ungi</surname><given-names>T</given-names></name><name><surname>Welch</surname><given-names>M</given-names></name><name><surname>Lu</surname><given-names>J</given-names></name><name><surname>McGraw</surname><given-names>RC</given-names></name><name><surname>Fichtinger</surname><given-names>G</given-names></name></person-group><article-title>Ultrasound-guided facet joint injection training using Perk Tutor.</article-title><source>Int J Comput Assist Radiol Surg</source><year>2013</year><volume>8</volume><fpage>831</fpage><lpage>836</lpage><pub-id pub-id-type="doi">10.1007/s11548-012-0811-5</pub-id><pub-id pub-id-type="pmid">23329279</pub-id></element-citation></ref>
<ref id="r52"><label>52</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Robinson</surname><given-names>AR</given-names></name><name><surname>Gravenstein</surname><given-names>N</given-names></name><name><surname>Cooper</surname><given-names>LA</given-names></name><name><surname>Lizdas</surname><given-names>D</given-names></name><name><surname>Luria</surname><given-names>I</given-names></name><name><surname>Lampotang</surname><given-names>S</given-names></name></person-group><article-title>A mixed-reality part-task trainer for subclavian venous access.</article-title><source>Simul Healthc</source><year>2014</year><volume>9</volume><fpage>56</fpage><lpage>64</lpage><pub-id pub-id-type="doi">10.1097/sih.0b013e31829b3fb3</pub-id><pub-id pub-id-type="pmid">24310163</pub-id></element-citation></ref>
<ref id="r53"><label>53</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Landis</surname><given-names>JR</given-names></name><name><surname>Koch</surname><given-names>GG</given-names></name></person-group><article-title>The measurement of observer agreement for categorical data.</article-title><source>Biometrics</source><year>1977</year><volume>33</volume><fpage>159</fpage><lpage>174</lpage><pub-id pub-id-type="pmid">843571</pub-id></element-citation></ref>
<ref id="r54"><label>54</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Barry Issenberg</surname><given-names>S</given-names></name><name><surname>McGaghie</surname><given-names>WC</given-names></name><name><surname>Petrusa</surname><given-names>ER</given-names></name><name><surname>Lee Gordon</surname><given-names>D</given-names></name><name><surname>Scalese</surname><given-names>RJ</given-names></name></person-group><article-title>Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review.</article-title><source>Med Teach</source><year>2005</year><volume>27</volume><fpage>10</fpage><lpage>28</lpage><pub-id pub-id-type="doi">10.1080/01421590500046924</pub-id><pub-id pub-id-type="pmid">16147767</pub-id></element-citation></ref>
<ref id="r55"><label>55</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Cook</surname><given-names>DA</given-names></name></person-group><article-title>Where are we with Web-based learning in medical education?</article-title><source>Med Teach</source><year>2006</year><volume>28</volume><fpage>594</fpage><lpage>598</lpage><pub-id pub-id-type="doi">10.1080/01421590601028854</pub-id><pub-id pub-id-type="pmid">17594549</pub-id></element-citation></ref>	
<ref id="r56"><label>56</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Jensen</surname><given-names>L</given-names></name><name><surname>Konradsen</surname><given-names>F</given-names></name></person-group><article-title>A review of the use of virtual reality head-mounted displays in education and training.</article-title><source>Educ Inf Technol</source><year>2018</year><volume>23</volume><fpage>1515</fpage><lpage>1529</lpage><pub-id pub-id-type="doi">10.1007/s10639-017-9676-0</pub-id></element-citation></ref>
<ref id="r57"><label>57</label><element-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Cheng</surname><given-names>A</given-names></name><name><surname>Kessler</surname><given-names>D</given-names></name><name><surname>Mackinnon</surname><given-names>R</given-names></name><name><surname>Chang</surname><given-names>TP</given-names></name><name><surname>Nadkarni</surname><given-names>VM</given-names></name><name><surname>Hunt</surname><given-names>EA</given-names></name><name><surname>Duval-Arnould</surname><given-names>J</given-names></name><name><surname>Lin</surname><given-names>Y</given-names></name><name><surname>Cook</surname><given-names>DA</given-names></name><name><surname>Pusic</surname><given-names>M</given-names></name><name><surname>Hui</surname><given-names>J</given-names></name><name><surname>Moher</surname><given-names>D</given-names></name><name><surname>Egger</surname><given-names>M</given-names></name><name><surname>Auerbach</surname><given-names>M</given-names></name></person-group><article-title>Reporting guidelines for health care simulation research: extensions to the CONSORT and STROBE statements.</article-title><source>Simul Healthc</source><year>2016</year><volume>11</volume><fpage>238</fpage><lpage>248</lpage><pub-id pub-id-type="doi">10.1097/SIH.0000000000000150</pub-id><pub-id pub-id-type="pmid">27465839</pub-id></element-citation></ref></ref-list>
  </back>
</article>
