ORIGINAL RESEARCH

Can a smartphone app improve medical trainees’ knowledge of antibiotics?

Michael Fralick1, Reem Haj2, Dhruvin Hirpara3, Karen Wong3, Matthew Muller4, Larissa Matukas5, John Bartlett6, Elizabeth Leung2 and Linda Taggart4

1Department of Medicine, Division of General Internal Medicine, University of Toronto, Toronto, Canada

2Department of Pharmacy, St Michael's Hospital, Toronto, Canada

3Faculty of Medicine, University of Toronto, Toronto, Canada

4Division of Infectious Diseases, Department of Medicine, University of Toronto, Toronto, Canada

5Department of Laboratory Medicine and Pathobiology, University of Toronto

6App Developer

Submitted: 16/06/2017; Accepted: 19/11/2017; Published: 30/11/2017

Int J Med Educ. 2017; 8:416-420; doi: 10.5116/ijme.5a11.8422

© 2017 Michael Fralick et al. This is an Open Access article distributed under the terms of the Creative Commons Attribution License which permits unrestricted use of work provided the original work is properly cited. http://creativecommons.org/licenses/by/3.0

Objectives: To determine whether a smartphone app, containing local bacterial resistance patterns (antibiogram) and treatment guidelines, improved knowledge of prescribing antimicrobials among medical trainees.

Methods: We conducted a prospective, controlled, pre-post study of medical trainees with access to a smartphone app (app group) containing our hospital’s antibiogram and treatment guidelines compared to those without access (control group). Participants completed a survey which included a knowledge assessment test (score range, 0 [lowest possible score] to 12 [highest possible score]) at the start of the study and four weeks later. The primary outcome was change in mean knowledge assessment test scores between week 0 and week 4. The change in knowledge assessment test scores in the app group was compared to the change in scores in the control group using multivariable linear regression.

Results: Sixty-two residents and senior medical students participated in the study. In a multivariable analysis controlling for sex and prior knowledge, app use was associated with a 1.1 point (95% CI: 0.10, 2.1) [β = 1.08, t(1) = 2.08, p = 0.04] higher change in knowledge score compared to the change in knowledge scores in the control group. Among those in the app group, 88% found it easy to navigate, 85% found it useful, and about one-quarter used it daily.

Conclusions: An antibiogram and treatment algorithm app increased knowledge of prescribing antimicrobials in the context of local antibiotic resistance patterns. These findings reinforce the notion that smartphone apps can be a useful and innovative means of delivering medical education.

Antimicrobial stewardship promotes the appropriate selection, dose, route, and duration of antimicrobial therapy.1 A recent Cochrane review demonstrated that antimicrobial stewardship programs were associated with a reduction in the duration of antimicrobial therapy, hospital length of stay, and possibly Clostridium difficile infection.2 Hospitals often develop institution-specific treatment algorithms in the context of their local antibiogram as part of an antimicrobial stewardship intervention. However, multiple studies have shown that clinicians are usually unaware these resources exist or cannot easily access this information.3,4

Smartphone use amongst health-care professionals has rapidly increased over the past ten years.5,6 Approximately 80% of doctors and 85% of medical trainees use smartphones.7 For this reason, they represent an innovative opportunity in the field of medical education.7-9 For example, a smartphone application, or 'app', that provides point-of-care information about local antibiotic resistance patterns and treatment guidelines could provide trainees with accurate and up-to-date information from the patient's bedside. A previous study demonstrated that access to an antibiotic decision management guide on the internet improved prescribing accuracy amongst critical care fellows.8 One limitation of this study, and similar studies,9 is that these tools are seldom evaluated from a medical education standpoint. When medical apps have been evaluated, the studies have been limited by inadequate, or non-existent, control groups.10-12 Given these issues, we developed a smartphone app and prospectively evaluated it.

This study aimed to assess the change in knowledge of prescribing antimicrobials (e.g., antibiotics, antivirals) among medical trainees. Specifically, our research question was: Does access to a smartphone app improve medical trainees' knowledge of antimicrobials compared to medical trainees without access to the smartphone app? We also evaluated whether the app improved confidence in prescribing, made antibiogram and treatment data more accessible, and was easy to use. We hypothesized that the smartphone app would be associated with enhanced antibiotic-related knowledge, improved confidence in prescribing and that the information contained in the app would be accessible and easy to use.

Study design

We conducted a prospective, controlled pre-post study between May 1, 2015, and September 24, 2015, on the general internal medicine ward at St. Michael's Hospital. St. Michael's Hospital is a tertiary care teaching hospital in Toronto, Ontario. The general internal medicine ward is a 66-bed unit cared for by five medical teams. Each team comprises two senior medical students (i.e., in their final 1-2 years of medical school), four resident physicians and a staff physician. An individual team will care for approximately 20 patients each day.

Senior medical students and residents rotating on general internal medicine at St. Michael's Hospital were recruited to participate in the study. Participants enrolled between May 1, 2015, and June 30, 2015, did not have access to the smartphone app (control group) while those enrolled between August 4, 2015, and September 24, 2015, had access to the app (app group). Participants were excluded if they did not have either an Apple or Android-based smartphone. Individuals could not participate in both the control group and the app group. All participants who completed the study received a $10 gift card. The study was approved by the institutional research ethics board.

Baseline survey

Participants in the control group and app group completed the baseline survey (Appendix). This was a 27-item survey divided into four sections: demographics, knowledge assessment, self-reported confidence, and data accessibility. The demographic section included information about the participant's sex, smartphone type, training program, and level of training. Knowledge assessment comprised a total of 12 multiple-choice or true/false questions developed for this study (Appendix). The knowledge assessment questions were trialed on a group of 30 medical students and residents at a separate teaching hospital to ensure the questions were clear, unambiguous, and had only one correct answer. Self-reported confidence was determined using a 5-point Likert scale used previously.11,13 The accessibility section of the survey assessed how often participants were using hospital antibiogram data and treatment algorithms at baseline. This was assessed because, prior to the development of the app, similar information was already available on the hospital's local intranet.

Table 1. Baseline characteristics of enrolled participants

Follow-up survey

Participants in both groups completed the follow-up survey approximately 30 days after completing the baseline survey. The follow-up survey was identical to the baseline survey with the exception that for the app group, the follow-up survey had an additional section to assess the usability of the app. The follow-up survey was paper-based, and participants did not have access to their phone or any other resources during this time.

Statistical analysis

The primary outcome was the change from baseline in the knowledge score (score range, 0 [lowest possible score] to 12 [highest possible score]) for the app group compared to the control group.

The secondary outcome was the change from baseline in self-reported confidence (score range, 1 [not at all confident] to 5 [very confident]). To account for confounding factors (i.e., sex, baseline knowledge, baseline confidence) both unadjusted and adjusted multivariable linear regression analyses were performed for analysis of the primary and secondary outcomes. Descriptive statistics were used to characterize participant-level characteristics and app use. Continuous data were compared with the Student's t-test with unequal variance, and categorical data were compared with the chi-square test. The analyses were conducted using SAS/STAT® 14.1 software (SAS Institute Inc., Cary, NC). All reported p-values were two-tailed.
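The adjusted analysis described above can be illustrated with a minimal sketch: the week-0 to week-4 change in knowledge score is regressed on a group indicator while controlling for sex and baseline score. The data below are simulated stand-ins (not the study dataset), and the variable names and the assumed ~1-point effect are hypothetical; the original analysis was run in SAS, whereas this sketch uses ordinary least squares via NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 62  # number of enrolled participants in the study

# Simulated stand-in data (all values hypothetical, not the study dataset)
app_group = rng.integers(0, 2, n)                 # 1 = app access, 0 = control
male = rng.integers(0, 2, n)                      # sex indicator
baseline = rng.integers(3, 11, n).astype(float)   # week-0 knowledge score (0-12)

# Assume a ~1-point app effect plus noise for the week-0 -> week-4 change score
change = 1.0 * app_group - 0.2 * baseline + 1.5 + rng.normal(0, 1.5, n)

# Adjusted model: change score regressed on group, sex, and baseline knowledge
X = np.column_stack([np.ones(n), app_group, male, baseline])
beta, *_ = np.linalg.lstsq(X, change, rcond=None)

# beta[1] is the adjusted between-group effect, analogous to the paper's
# reported 1.1-point difference in change scores
print(round(beta[1], 2))
```

Regressing the change score (rather than the follow-up score alone) on the group indicator is what lets the model compare the improvement in the app group against the improvement in the control group while holding the listed covariates fixed.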

Table 2. Comparison of mean knowledge scores and overall mean confidence scores between the control group and the app group

Of the 85 medical trainees approached, 62 (73%) participated in the study. Most participants were male, approximately half were residents, and most used an Apple smartphone (Table 1). Of the residents, 17 (55%) were in first year, 8 (26%) in second year, and 6 (19%) in third year. This included 20 internal medicine residents (65%), five family medicine residents (16%), and one surgical resident (3%); the remainder were from other specialties. The remaining participants were senior medical students.

At the time of study enrollment only 35% of participants were aware that St Michael's Hospital had an antibiogram. Amongst trainees in the control group, 20% reported the use of an antibiogram in the 30 days prior to enrolling in the study, and by the end of the study, 31% had used the St Michael's Hospital antibiogram. Amongst trainees in the app group, 22% reported the use of an antibiogram in the 30 days prior to enrolling in the study, and by the end of the study, 81% had used the St Michael's Hospital antibiogram.

The average number of questions answered correctly on the knowledge assessment test for the app group significantly improved over the duration of the study (6.2 points, SD=2.1 vs. 8.1 points, SD=2.2, t(26)=4.6, p=0.0001) but did not improve in the control group (7.1 points, SD=1.7 vs. 7.5 points, SD=2.0, t(25)=-1.2, p=0.23) (Table 2). In the unadjusted linear regression analysis, use of the app was associated with a 1.5 point (95% CI: 0.46, 2.48) [β = 1.46, t(1)=2.86, p=0.006] higher change in knowledge score compared to the control group. In the adjusted multivariable linear regression analysis, app use was associated with a 1.1 point (95% CI: 0.10, 2.1) [β = 1.08, t(1) = 2.08, p = 0.04] higher change in knowledge score compared to the control group (Table 2). Self-reported confidence in prescribing antibiotics improved over the duration of the study for the app group, but the change in confidence between the two groups was not statistically different in the unadjusted and adjusted analyses (Table 2).

Amongst app users, nearly 90% found it easy to use and 85% agreed that it was useful (Table 3). One-quarter of participants used the app daily or multiple times a day and 41% used it weekly.

Table 3. Self-reported app utilization

In this prospective, controlled, pre-post study, use of an antibiogram and treatment algorithm app was associated with higher use of hospital-specific antibiogram data and a greater improvement in knowledge scores compared to the control group. Self-reported confidence in prescribing, however, was similar in both groups.

Multiple past studies have demonstrated that there is a strong interest in, and use of, smartphone apps among medical trainees.11,14-21 An increasing number of medical apps are being developed, but many are never evaluated to see what, if any, educational impact they have. Instead, many are simply studied to assess patterns of use (e.g., number of downloads, screen views).14,20,21 A recent single-arm case study of an antimicrobial prescribing app found that 100% of the target population had downloaded the application within 12 months and that 71% of the participants experienced a self-reported improvement in antibiotic knowledge after accessing the application.14 A similar single-arm study assessed the impact of a smartphone app to aid family doctors (N=14) in treating depression.11 That study found that smartphone app use improved the doctors' knowledge of treating depression, but it was limited by its small sample size and the lack of a comparator group that did not have access to the app.11 Similar studies exist for apps related to other medical subspecialties (i.e., hematology, infectious diseases, plastic surgery).8,11,13,18

The advantages of our study were its prospective design and the inclusion of a control group. Both are crucial when trying to assess the impact of an educational program.15,16 Without an adequate control group and a method that accounts for the change in knowledge over time for other reasons, any observed findings might be due to unmeasured confounding factors. In our study, the inclusion of this control group was important since we expected that knowledge related to antimicrobial prescribing might increase over the course of an internal medicine rotation, even without access to an app.

One limitation of our study is that clinically relevant endpoints such as appropriateness of antimicrobial therapy were not measured. This endpoint was deemed a priori to be beyond the scope of our study and would have required a substantially larger sample size to power the study adequately. Furthermore, assessing the impact of antimicrobial stewardship strategies is challenging because the definition of "appropriate therapy" is varied and controversial. One potential endpoint that has been suggested is the frequency of inappropriate use of antimicrobials, but the definition of "inappropriate" can be subjective and may require consensus by expert reviewers.1

Our study has several other limitations. First, the knowledge assessment test was developed for the purposes of our study and has not been previously validated. Second, our study took place at one hospital which limits the generalizability of our results. Third, the duration of follow-up in our study was relatively short, and thus it is unknown whether any gains in knowledge were maintained long-term. Finally, since our study was not randomized, our results may partially be explained by unmeasured confounding factors, such as differential educational opportunities or exposure to infectious diseases related cases during the study period for the two groups.

A theoretical concern of using apps is that people might become reliant on the app and not retain important basic facts because they know the information is always at hand.17,18 The results of our study showed the opposite. The knowledge assessment scores were unchanged for the control group and improved in the app group. This might be because participants in the app group were exposed to the content more frequently since it was literally at their fingertips. Since medical trainees often use electronic clinical resources during their training, it is reassuring to know that these can directly improve their medical knowledge. This highlights the importance of these resources being credible and providing accurate and up-to-date information.

Another theoretical concern of practitioners using apps is that they might overstate or overestimate what they know since the knowledge is readily accessible.17,18 Our study demonstrated that the app group and the control group expressed a similar level of confidence in their knowledge of various infectious disease topics.

Antimicrobial stewardship requires clinicians to be aware of both local antibiotic resistance rates and hospital-specific treatment guidelines.2,19 Hospitals invest considerable time, energy, and money in developing these documents, but if they are not readily accessible, they often go unused. Our study found that the use of a smartphone app improved trainee rates of accessing antibiogram data and improved their knowledge about antibiotic prescribing. These findings reinforce the notion that smartphone apps can be a useful and innovative means of delivering medical education. In the case of this project, we created our own app, but this is not necessary if there are existing high-quality apps with accurate, up-to-date, and relevant information. Considering the busy schedules of medical trainees, a final benefit of using apps as an adjunct to traditional forms of teaching (e.g., lectures, tutorials) is that they can be used seamlessly during clinical rotations.

Acknowledgments

We thank Jeff Alfonsi for his help with app programming and Rosane Nisenbaum for her assistance with statistical analyses. All authors have no financial or personal relationships or affiliations that could influence the decisions and work on this manuscript. Dr. Fralick receives funding from the Eliot Phillipson Clinician-Scientist Training Program at the University of Toronto, the Clinician Investigator Program at the University of Toronto, CIHR and The Detweiller Traveling Fellowship funded by the Royal College of Physicians and Surgeons of Canada.

Conflict of Interest

This work was supported by an unrestricted grant from Medbuy. Medbuy is a Canadian healthcare group purchasing organization. The funding organization had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; and preparation, review, or approval of the manuscript. The app is free, does not contain advertisements, and does not provide revenue to any of the study investigators (with the exception of John Bartlett who was paid to program the app). The authors report no conflicts of interest.

  1. Dellit TH, Owens RC, McGowan JE, Gerding DN, Weinstein RA, Burke JP, Huskins WC, Paterson DL, Fishman NO, Carpenter CF, Brennan PJ, Billeter M and Hooton TM. Infectious Diseases Society of America and the Society for Healthcare Epidemiology of America guidelines for developing an institutional program to enhance antimicrobial stewardship. Clin Infect Dis. 2007; 44: 159-177.
  2. Davey P, Marwick CA, Scott CL, Charani E, McNeil K, Brown E, Gould IM, Ramsay CR and Michie S. Interventions to improve antibiotic prescribing practices for hospital inpatients. Cochrane Database Syst Rev. 2017; 2: CD003543.
  3. Evans CT, Rogers TJ, Burns SP, Lopansri B and Weaver FM. Knowledge and use of antimicrobial stewardship resources by spinal cord injury providers. PM R. 2011; 3: 619-623.
  4. Mermel LA, Jefferson J and Devolve J. Knowledge and use of cumulative antimicrobial susceptibility data at a university teaching hospital. Clin Infect Dis. 2008; 46: 1789.
  5. Robinson T, Cronin T, Ibrahim H, Jinks M, Molitor T, Newman J and Shapiro J. Smartphone use and acceptability among clinical medical students: a questionnaire-based study. J Med Syst. 2013; 37: 9936.
  6. Oehler RL, Smith K and Toney JF. Infectious diseases resources for the iPhone. Clin Infect Dis. 2010; 50: 1268-1274.
  7. Ventola CL. Mobile devices and apps for health care professionals: uses and benefits. P T. 2014; 39: 356-364.
  8. Bochicchio GV, Smit PA, Moore R, Bochicchio K, Auwaerter P, Johnson SB, Scalea T and Bartlett JG. Pilot study of a web-based antibiotic decision management guide. J Am Coll Surg. 2006; 202: 459-467.
  9. Moodley A, Mangino JE and Goff DA. Review of infectious diseases applications for iPhone/iPad and Android: from pocket to patient. Clin Infect Dis. 2013; 57: 1145-1154.
  10. Christensen S. Evaluation of a nurse-designed mobile health education application to enhance knowledge of Pap testing. Creat Nurs. 2014; 20: 137-143.
  11. Man C, Nguyen C and Lin S. Effectiveness of a smartphone app for guiding antidepressant drug selection. Fam Med. 2014; 46: 626-630.
  12. Johansson PE, Petersson GI and Nilsson GC. Personal digital assistant with a barcode reader--a medical decision support system for nurses in home care. Int J Med Inform. 2010; 79: 232-242.
  13. Barnes J, Duffy A, Hamnett N, McPhail J, Seaton C, Shokrollahi K, James MI, McArthur P and Pritchard Jones R. The Mersey Burns App: evolving a model of validation. Emerg Med J. 2015; 32: 637-641.
  14. Charani E, Kyratsis Y, Lawson W, Wickens H, Brannigan ET, Moore LS, et al. An analysis of the development and implementation of a smartphone application for the delivery of antimicrobial prescribing policy: lessons learnt. J Antimicrob Chemother. 2013; 68: 960-967.
  15. Sigmund AE, Stevens ER, Blitz JD and Ladapo JA. Use of preoperative testing and physicians' response to professional society guidance. JAMA Intern Med. 2015; 175: 1352-1359.
  16. Justo JA, Gauthier TP, Scheetz MH, Chahine EB, Bookstaver PB, Gallagher JC, et al. Knowledge and attitudes of doctor of pharmacy students regarding the appropriate use of antimicrobials. Clin Infect Dis. 2014; 59(Suppl 3): S162-S169.
  17. Lau C and Kolli V. App use in psychiatric evaluation: a medical student survey. Acad Psychiatry. 2017; 41: 68-70.
  18. Dimond R, Bullock A, Lovatt J and Stacey M. Mobile learning devices in the workplace: 'as much a part of the junior doctors' kit as a stethoscope'? BMC Med Educ. 2016; 16: 207.
  19. Nason GJ, Burke MJ, Aslam A, Kelly ME, Akram CM, Giri SK and Flood HD. The use of smartphone applications by urology trainees. Surgeon. 2015; 13: 263-266.
  20. Zaki M and Drazin D. Smartphone use in neurosurgery? APP-solutely! Surg Neurol Int. 2014; 5: 113.
  21. Kessler C, Peerschke EI, Chitlur MB, Kulkarni R, Holot N and Cooper DL. The Coags Uncomplicated App: fulfilling educational gaps around diagnosis and laboratory testing of coagulation disorders. JMIR Med Educ. 2017; 3: 6.