Original article
New approach to learning medical procedures using a smartphone and the Moodle platform to facilitate assessments and written feedback
Sang-Shin Lee1,*, Haeyoung Lee2,*, Hyunyong Hwang3
Kosin Medical Journal 2022;37(1):75-82.
DOI: https://doi.org/10.7180/kmj.22.010
Published online: March 25, 2022

1Department of Psychiatry, Kosin University College of Medicine, Busan, Korea

2Department of Thoracic and Cardiovascular Surgery, Kosin University College of Medicine, Busan, Korea

3Department of Laboratory Medicine, Kosin University College of Medicine, Busan, Korea

Corresponding Author: Hyunyong Hwang, MD, PhD Department of Laboratory Medicine, Kosin University College of Medicine, 262 Gamcheon-ro, Seo-gu, Busan 49267, Korea Tel: +82-51-990-6371 Fax: +82-51-990-3010 E-mail: terminom@gmail.com
*These authors contributed equally to this work as first authors.
• Received: February 4, 2022   • Revised: March 10, 2022   • Accepted: March 10, 2022

Copyright © 2022 Kosin University College of Medicine.

This is an open-access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

  • Background
    To overcome communication obstacles between medical students and trainers, we designed serial learning activities utilizing a smartphone and web-based instruction (WBI) on the Moodle platform to provide clear and retrievable trainer feedback to students on an objective structured clinical examination (OSCE) item.
  • Methods
    We evaluated students’ learning achievement and satisfaction with the new learning tool. A total of 80 fourth-year medical students participated. They installed the Moodle app (the WBI platform) on their smartphones and practiced an endotracheal suction procedure on a medical simulation mannequin while a trainer evaluated their clinical skills using the smartphone app. Students’ competency was evaluated by comparing scores between the formative and summative assessments. The degree of satisfaction with, and perceived usefulness of, the smartphone and WBI system was also analyzed.
  • Results
    The means (standard deviations, SDs) of the formative and summative assessments were 8.80 (2.53) and 14.24 (1.97) out of a total of 17 points, respectively, reflecting a statistically significant difference (p<0.05). The degree of satisfaction and perceived usefulness of the smartphone app and WBI system were excellent, with means (SDs) of 4.60 (0.58) and 4.60 (0.65), respectively.
  • Conclusion
    We believe that the learning process using a smartphone and the Moodle platform offers good guidance for OSCE skill development because trainers’ written feedback is recorded online and is retrievable at all times, enabling students to build and maintain competency through frequent feedback review.
Introduction

Observation and procedural practice in inpatient and outpatient care are an important part of medical education [1,2]. These processes involve corrective feedback that is typically provided verbally, and most learning is obtained through observation and verbal feedback [3]. In general, students with limited clinical experience do not easily achieve competence through observation and verbal feedback alone, particularly because such feedback is not always clear. To facilitate procedural skill training, trainer feedback needs to be retrievable and available regardless of time and place [4-6].
Lack of timely feedback from teacher to student is a critical communication obstacle to overcome in medical education [7]. Immediate feedback allows students to refine clinical skills as they are being practiced [8,9]. Currently, feedback is typically provided verbally and not always in a timely manner, and without the student’s continued engagement with the guidance provided, proficiency can be lost. Furthermore, memory can become distorted over time, which can alter a student’s understanding of verbal feedback. Therefore, timely written feedback is necessary so that students can retrieve it later for self-directed learning [4,10,11]. The medical education field needs a reliable and tangible tool to ensure that clear written feedback is continuously utilized by students in preparing their clinical skills.
Furthermore, due to the coronavirus disease 2019 (COVID-19) pandemic, in-person education has become limited, and virtual education has been accepted as a new normal form of education [12,13]. Nevertheless, some aspects of medical education, such as clinical skills training, are difficult to implement virtually because they require direct feedback. In this context, we needed a teaching tool that would minimize in-person verbal explanation and thereby reduce the risk of participants contracting COVID-19 through respiratory droplets. We believed that a modular object-oriented dynamic learning environment (Moodle) platform and a smartphone app would provide an educational environment in which trainers could give students written feedback in a timely manner.
Therefore, we designed serial learning activities within the smartphone app and the Moodle platform to guarantee clear and retrievable trainer feedback. We then evaluated students’ learning achievements and their satisfaction with the new learning tool.
Methods

Ethical statements: The study was approved by the Institutional Review Board of Kosin University Gospel Hospital (KUGH 2020-11-028). The requirement for written informed consent was waived due to the retrospective nature of this study.
A total of 80 fourth-year medical students participated in this study. Students were asked to complete a formative assessment, self-reflection, a summative assessment, and a learning-tool survey (Fig. 1). All assignments were created using Moodle version 3.0 software (Martin Dougiamas, Perth, Australia; http://www.moodle.org/). Students were instructed to install the Moodle app, a web-based instruction (WBI) platform, on their smartphones (Fig. 2). They were asked to practice an “endotracheal suction” procedure (one of the objective structured clinical examination [OSCE] items used to measure a student’s clinical competence) on a medical simulation mannequin while a trainer evaluated them in person using the smartphone app. We used the results from the students’ submitted assignments for this study.
1. Formative assessment
Students were asked to learn the endotracheal suction procedure using a provided manual and video clips on an e-learning consortium website. A formative assessment was then performed to evaluate the current status of each student’s skills, during which the trainer in charge of the students’ learning provided feedback based on their performance. Before the procedure, students were required to log in to the Moodle app on their smartphones and open a quiz designed as the formative assessment. The students then handed their smartphones to the trainer, who was responsible for filling out the assessment, and began the endotracheal suction procedure in front of the trainer. Because the quiz was locked with a passcode, the questions remained hidden from the students. To begin the assessment, the trainer entered the passcode on the landing page of the quiz and completed the formative assessment based on the student’s demonstrated skills.
The quiz consisted of 17 questions covering critical checkpoints for a proficient endotracheal suction procedure. Each question had rated answer options corresponding to the level of student competence demonstrated. When the trainer selected one of the options, the corresponding feedback was generated automatically; the trainer could also add constructive comments if necessary. After completing the formative assessment, students could read both the automatically generated feedback and the trainer’s on-the-spot written comments.
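As an illustration only, the following Python sketch models the assessment logic described above: a rated checklist item whose selected option yields points and canned feedback, plus an optional free-text trainer comment. It is not the actual Moodle quiz configuration; the item text, point values, and function names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RatedOption:
    """One selectable answer for a checklist question, with its points and canned feedback."""
    label: str
    points: int
    feedback: str

@dataclass
class ChecklistItem:
    """A single quiz question covering one checkpoint of the suction procedure."""
    prompt: str
    options: list  # list of RatedOption

# Hypothetical example item; the real quiz contained 17 such checkpoints.
preoxygenation = ChecklistItem(
    prompt="Did the student pre-oxygenate the patient before suctioning?",
    options=[
        RatedOption("Performed correctly", 1, "Good: pre-oxygenation reduces the risk of hypoxemia."),
        RatedOption("Performed incompletely", 0, "Review the recommended duration of pre-oxygenation."),
        RatedOption("Not performed", 0, "Pre-oxygenation must precede every suction pass."),
    ],
)

def record_response(item, choice_index, extra_comment=""):
    """Return (points earned, feedback shown to the student).

    The canned feedback comes automatically from the selected option;
    the trainer may append a free-text comment, as in the system described above."""
    selected = item.options[choice_index]
    feedback = selected.feedback
    if extra_comment:
        feedback += f" Trainer note: {extra_comment}"
    return selected.points, feedback

points, note = record_response(preoxygenation, 1, "Hold the oxygen mask on a little longer next time.")
print(points, note)
```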
2. Self-reflection
After completing the formative assessment, students could view the quiz questions, answers, and feedback online. The quiz results and feedback gave the students guidance regarding their competency in the endotracheal suction procedure. They reflected on their performance and on the trainer’s written feedback and then wrote self-reflection essays. We used the “assignment” activity within Moodle, on the WBI website, for students to submit their self-reflection essays, in which they had the opportunity to consider their performance and correct their weaknesses.
3. Summative assessment
Three days after completing the formative assessment, the students performed another round of endotracheal suction to demonstrate their learned skillset. As in the formative assessment, students presented their smartphones right before the procedure, and the trainer assessed their performance using the Moodle app. The quiz questions on the summative assessment were the same as those on the formative assessment. The trainer compared students’ performances with those recorded during the formative assessments via quizzes and checked whether their levels of competency had improved.
4. Survey
We used the “survey” activity in Moodle to obtain student feedback about the satisfaction and usefulness of the educational system used in this study. The survey consisted of one subjective question and 14 objective questions. Of the 14 objective questions, half asked about the degree of satisfaction and half about the usefulness of the educational system. Students were asked to specify their level of agreement with each statement on a 5-point Likert scale: 1, strongly disagree; 2, disagree; 3, neither agree nor disagree; 4, agree; 5, strongly agree [14]. The subjective question was intended to collect additional opinions not covered by the objective portion of the survey.
5. Statistical analysis
Cronbach’s α was calculated to gauge the internal consistency of the survey questions. A paired-samples t-test was used to analyze the difference in mean scores between the formative and summative assessments. Statistical analyses were performed using SPSS version 26 (IBM Corp., Armonk, NY, USA). Statistical significance was defined as a p-value <0.05.
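For readers who want to reproduce this type of analysis outside SPSS, the sketch below shows the same two computations in Python: SciPy’s paired-samples t-test and the standard Cronbach’s α formula. The score and Likert-response arrays are hypothetical placeholders, not study data.

```python
import numpy as np
from scipy import stats

# Hypothetical paired scores; the study compared each student's formative and
# summative assessment scores (n = 80, maximum 17 points).
formative = np.array([8, 10, 7, 12, 9, 6, 11, 8])
summative = np.array([13, 15, 12, 16, 14, 13, 15, 14])

# Paired-samples t-test, as used in the study (performed in SPSS there).
result = stats.ttest_rel(formative, summative)
print(f"t = {result.statistic:.3f}, p = {result.pvalue:.4f}")

def cronbach_alpha(responses):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert responses."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]
    item_var = responses.var(axis=0, ddof=1)       # variance of each survey item
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return k / (k - 1) * (1 - item_var.sum() / total_var)

# Hypothetical 5-point Likert responses (rows = students, columns = survey items).
likert = np.array([
    [5, 4, 5, 4, 5, 4, 5],
    [4, 4, 5, 5, 4, 4, 5],
    [5, 5, 5, 4, 5, 5, 5],
    [3, 4, 4, 3, 4, 3, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(likert):.3f}")
```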
Results

1. Difference in students' learning achievement between the formative and summative assessments
We analyzed the difference between the formative and summative assessment scores to determine how much students’ learning had improved. The assessment questions were designed to check that the essential requirements for performing a correct endotracheal suction procedure had been met. Students were required to demonstrate that they had sufficient knowledge and skill to complete the procedure safely and correctly, without adverse effects to the patient such as hypoxemia or pneumothorax. Many aspects must be considered when performing endotracheal suction: depth of suction, saline instillation, pre-oxygenation, tube disconnection, suction catheter size, suctioning, aseptic technique, adverse effects, and so on. The quiz questions were designed to cover all of these concerns relevant to a safe and successful procedure. From this perspective, the formative and summative assessment scores were assumed to represent the students’ overall proficiency in the procedure.
The means (standard deviations, SDs) for the formative and summative assessments were 8.80 (2.53) and 14.24 (1.97) out of a total of 17 points, respectively (Table 1). There was a significant difference in scores between the two assessments (p<0.001). Students’ proficiency improved markedly: the summative mean was approximately 162% of the formative mean, a roughly 62% increase. Only two students (2.5%) scored more than 14 out of 17 points on the formative assessment; this number increased to 56 students (70%) on the summative assessment (Fig. 3).
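As a check, the relative improvement follows directly from the two reported means:

```latex
% Worked calculation from the reported means (8.80 -> 14.24 out of 17 points)
\[
  \frac{14.24 - 8.80}{8.80} \approx 0.618
  \qquad\text{and}\qquad
  \frac{14.24}{8.80} \approx 1.618 .
\]
```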
2. Students' satisfaction with the new learning tool
A total of 56 students (70%) reported their degree of satisfaction with and the usefulness of the smartphone app and WBI system (Table 2). The mean (SD) for the degree of satisfaction was 4.60 (0.58), and the mean (SD) for the usefulness of the smartphone app and WBI system was 4.60 (0.65). The internal consistency of the survey items for satisfaction and usefulness was greater than 0.9 (Cronbach’s α=0.918 and 0.919, respectively).
Students were highly satisfied with the newly introduced written feedback and assessment system delivered via a smartphone and the WBI website (question no. 3, mean=4.72). They indicated that the written feedback was helpful (question nos. 4 and 5, mean=4.80 for both). Students were also asked whether the feedback provided during practice was consistent with that provided in a real-world situation; they responded positively, although the degree of satisfaction was somewhat lower than for the other questions (question no. 6, mean=4.30).
Regarding the usefulness of the smartphone and WBI system, students felt that it was easy to operate (question no. 8, mean=4.58) and that it helped them reflect on their performance and maintain competence (question nos. 9 and 11, mean=4.73 and 4.66, respectively). They also indicated that the assessments and feedback could be reused for their portfolios in the future (question no. 13, mean=4.41).
Discussion

Effective feedback on medical student performance from formative or summative assessments can be a cornerstone of improved learning outcomes [15-18]. We evaluated a practical and implementable method of providing retrievable and clear feedback to medical students. To accomplish this goal, a virtual learning environment was created in Moodle [19,20]. Smartphone use in clinical and medical education has been reported to be effective for enhancing patient care and medical education [21-23]. We therefore combined these two tools, the Moodle platform (WBI website) and the smartphone, using several Moodle functions that could be implemented on both the WBI website and the smartphone app.
During the formative assessment, the trainer entered comments as feedback on the smartphones provided by the students, allowing accurate and efficient feedback. We also embedded online comments that the trainer could use in response to each quiz item, which saved time and facilitated the assessment process [24]. With this approach, the students’ primary responsibility during the procedural practice class was to concentrate on the procedure based on the trainer’s feedback. We used a quiz as the basis for feedback because many checkpoints in the endotracheal suction procedure can be presented as questions, and various degrees of competency could be evaluated through the quiz responses. Extra feedback could also be included as personalized comments; trainers had the option to provide feedback by selecting pre-provided answers or by writing comments.
After completing the formative assessment, students reflected on their activities based on the feedback in the smartphone app and were able to respond by strengthening their weak points [25]. Our goal was for students to correct their mistakes during reflection; they were asked to submit a reflective essay, which the trainer assessed for the accuracy of the students’ insights before providing additional online feedback.
Table 1 shows that students’ mean procedural skills improved significantly from 8.80 (SD, 2.53) in formative assessment to 14.24 (SD, 1.97) in summative assessment (p<0.001). Translating feedback into action through self-reflection is a process for catalyzing positive behavioral change [21]. Timely and retrievable feedback should increase self-reflection by enabling immediate corrective behavior change.
Use of a smartphone and a WBI system could be perceived as inconvenient for both students and trainers compared with in-person education. However, the system used in this study was described as easy and helpful by students (Table 2). The degree of satisfaction reported by students was excellent (4.30–4.80), with excellent internal consistency (Cronbach’s α>0.9). Trainer activities in this teaching environment are more intensive than in traditional teaching because the trainer is required to record feedback on the smartphone consistently and accurately while observing the students’ performance. The automatically generated feedback helped alleviate this burden.
The COVID-19 pandemic has had a significant effect on life, including society, culture, the economy, and education. It has accelerated online education, which is not common in medical training [26-28]. It is essential to guarantee quality education while maintaining social distancing during the COVID-19 pandemic [29,30]. To achieve this, student performance could be evaluated and critiqued in a hybrid way via offline practice and online communication. The method introduced herein, harnessing the Moodle online platform and a smartphone for the written feedback, can help achieve this two-pronged goal. Digital communication in medical education is gaining importance [31]. Interactive online medical activities can be a rich source of personal reflection and learning and can contribute to a student’s portfolio to be used for future professional development [32,33]. Written feedback and activities using the WBI and smartphone can be included as useful items in the portfolio because they are stored online and are easily extracted.
We tested the system developed at our institution on only one OSCE item, which limits generalizability to the remaining OSCE items. Further studies applying this educational system to a larger number of items are necessary. The improvement in students’ proficiency in the endotracheal suction procedure could be partly ascribed to well-written feedback or to a well-organized learning environment supported by the Moodle platform and the smartphone; however, a detailed analysis was not performed to weigh their respective contributions. As stated, the Moodle platform and smartphone app facilitating written feedback and assessments were the main factors considered in this study, and other factors that could have affected the results were not analyzed. Although the difference between the formative and summative assessments was significant, it cannot fully establish whether the improvement in procedural learning resulted from the combination of written feedback and the Moodle platform or simply from repeated assessment and practice.
We believe that a learning process using a smartphone and the Moodle platform can offer good guidance for building OSCE skills because the online written feedback is retrievable at all times, allowing students to increase and maintain competency through feedback review even after completion of the OSCE class. Because the teaching method used in this study has not previously been reported, and because teaching and learning environments vary depending on the course subject, further studies applying this newly designed teaching method are warranted. The educational system evaluated herein could be a good alternative for clinical procedural education during periods when social distancing is necessary, such as the COVID-19 pandemic.

Conflicts of interest

Hyunyong Hwang is an editorial board member of the journal but was not involved in the peer reviewer selection, evaluation, or decision process of this article. No other potential conflicts of interest relevant to this article were reported.

Funding

None.

Author contributions

Conceptualization: HL, HH. Data curation: SSL, HL, HH. Formal analysis: HL, HH. Methodology: HL, HH. Visualization: SSL, HL, HH. Writing - original draft: SSL, HL. Writing - review & editing: HH. Approval of final manuscript: all authors.

Fig. 1.
WBI website for formative assessment, self-reflection, summative assessment, and feedback. Students accessed the web-based instruction (WBI) website for online endotracheal suction practice (A), where they completed the formative assessment (B), self-reflection (C), summative assessment (D), and survey (E). OSCE, objective structured clinical examination.
kmj-22-010f1.jpg
Fig. 2.
Smartphone app screen for the endotracheal suction practice class.
kmj-22-010f2.jpg
Fig. 3.
Frequency distribution of student grades on the formative assessment and summative assessment.
kmj-22-010f3.jpg
Table 1.
Formative and summative assessment scores
Assessment No. of participants Mean Standard deviation t value p-valuea)
Formative assessment 80 8.80 2.53 –17.149 <0.001
Summative assessment 80 14.24 1.97

A total of 80 participants completed the formative and summative assessments.

a) The statistical significance of differences between the scores for the two assessments was evaluated with the paired-samples t-test, and the mean scores were significantly different.

Table 2.
Degree of satisfaction and usefulness of the smartphone app and WBI system in an OSCE class
Questions Min Max Mean SD Cronbach’s α (n=64)
Degree of satisfaction
 1. I was pleased with the endotracheal suction practice. 2 5 4.50 0.73 0.913
 2. I was confident after the endotracheal suction practice that I could perform endotracheal suction on a real patient. 3 5 4.44 0.73 0.922
 3. The feedback and assessments via the smartphone app and WBI system during the OSCE class were helpful for practicing endotracheal suction. 3 5 4.72 0.49 0.917
 4. The teacher pointed out my mistakes and gave me proper feedback. 4 5 4.80 0.41 0.920
 5. I think the teacher was well prepared for class. 4 5 4.80 0.41 0.918
 6. The feedback obtained from practice was consistent with that provided in treatment with real patients. 3 5 4.30 0.75 0.923
 7. I am satisfied with this endotracheal practice class as a whole. 3 5 4.69 0.53 0.913
Usefulness of the smartphone and WBI system
 8. It was easy to operate the smartphone app and WBI system for endotracheal suction practice. 2 5 4.58 0.69 0.914
 9. This system was helpful for reflection by providing feedback on needed corrections prior to the next assessment. 4 5 4.73 0.45 0.914
 10. After formative assessment, utilizing the feedback and self-reflection activities in the app, I noted academic improvement between the two assessments. 3 5 4.66 0.54 0.913
 11. I think the feedback from the smartphone app could help maintain my competence, even after completion of the OSCE class. 2 5 4.66 0.60 0.916
 12. I could systematically learn endotracheal suction skills with the help of the smartphone app and WBI system. 3 5 4.58 0.59 0.914
 13. The assessments and feedback for endotracheal suction practice were stored online and could be useful for my portfolio design. 1 5 4.41 0.87 0.921
 14. Other OSCE practice classes could utilize the smartphone app and WBI system that were applied to this practice class. 1 5 4.30 0.85 0.926

One trainer gave feedback to all students during all classes in order to eliminate the possibility of bias caused by differences in teaching style. This written feedback was given using the smartphone app.

WBI, web-based instruction; OSCE, objective structured clinical examination; SD, standard deviation.

References

  • 1. Fromme HB, Karani R, Downing SM. Direct observation in medical education: a review of the literature and evidence for validity. Mt Sinai J Med 2009;76:365–71.
  • 2. Holmboe ES, Kogan JR. Observation in medical education: time for a broader conversation? Fam Syst Health 2018;36:17–9.
  • 3. Johnson CE, Weerasuria MP, Keating JL. Effect of face-to-face verbal feedback compared with no or alternative feedback on the objective workplace task performance of health professionals: a systematic review and meta-analysis. BMJ Open 2020;10:e030672.
  • 4. Burr SA, Brodier E, Wilkinson S. Delivery and use of individualised feedback in large class medical teaching. BMC Med Educ 2013;13:63.
  • 5. Ende J. Feedback in clinical medical education. JAMA 1983;250:777–81.
  • 6. Roediger HL 3rd, Butler AC. The critical role of retrieval practice in long-term retention. Trends Cogn Sci 2011;15:20–7.
  • 7. Calleja P, Harvey T, Fox A, Carmichael M. Feedback and clinical practice improvement: a tool to assist workplace supervisors and students. Nurse Educ Pract 2016;17:167–73.
  • 8. Branch WT Jr, Paranjape A. Feedback and reflection: teaching methods for clinical settings. Acad Med 2002;77(12 Pt 1):1185–8.
  • 9. Hewson MG, Little ML. Giving feedback in medical education: verification of recommended techniques. J Gen Intern Med 1998;13:111–6.
  • 10. Ference K, Mackesy BL, Reinert P, Foote EF. The importance of written feedback on the individual and team performance of student pharmacists. Am J Pharm Educ 2020;84:ajpe7870.
  • 11. Kim JY, Na BJ, Yun J, Kang J, Han S, Hwang W, et al. What kind of feedback do medical students want? Korean J Med Educ 2014;26:231–4.
  • 12. Chick RC, Clifton GT, Peace KM, Propper BW, Hale DF, Alseidi AA, et al. Using technology to maintain the education of residents during the COVID-19 pandemic. J Surg Educ 2020;77:729–32.
  • 13. Sahi PK, Mishra D, Singh T. Medical education amid the COVID-19 pandemic. Indian Pediatr 2020;57:652–7.
  • 14. Jamieson S. Likert scales: how to (ab)use them. Med Educ 2004;38:1217–8.
  • 15. Anderson PA. Giving feedback on clinical skills: are we starving our young? J Grad Med Educ 2012;4:154–8.
  • 16. Klaber B. Effective feedback: an essential skill. Postgrad Med J 2012;88:187–8.
  • 17. Lerchenfeldt S, Mi M, Eng M. The utilization of peer feedback during collaborative learning in undergraduate medical education: a systematic review. BMC Med Educ 2019;19:321.
  • 18. Watling CJ, Ginsburg S. Assessment, feedback and the alchemy of learning. Med Educ 2019;53:76–85.
  • 19. Popovic N, Popovic T, Rovcanin Dragovic I, Cmiljanic O. A Moodle-based blended learning solution for physiology education in Montenegro: a case study. Adv Physiol Educ 2018;42:111–7.
  • 20. Seluakumaran K, Jusof FF, Ismail R, Husain R. Integrating an open-source course management system (Moodle) into the teaching of a first-year medical physiology course: a case study. Adv Physiol Educ 2011;35:369–77.
  • 21. Martínez F, Tobar C, Taramasco C. Implementation of a smartphone application in medical education: a randomised trial (iSTART). BMC Med Educ 2017;17:168.
  • 22. Payne KB, Wharrad H, Watts K. Smartphone and medical related App use among medical students and junior doctors in the United Kingdom (UK): a regional survey. BMC Med Inform Decis Mak 2012;12:121.
  • 23. Valle J, Godby T, Paul DP 3rd, Smith H, Coustasse A. Use of smartphones for clinical and medical education. Health Care Manag (Frederick) 2017;36:293–300.
  • 24. Hwang H. A computer-assisted, real-time feedback system for medical students as a tool for web-based learning. Kosin Med J 2016;31:134–45.
  • 25. Pelgrim EA, Kramer AW, Mokkink HG, van der Vleuten CP. Reflection as a component of formative assessment appears to be instrumental in promoting the use of feedback; an observational study. Med Teach 2013;35:772–8.
  • 26. Ahmed S, Zimba O, Gasparyan AY. Moving towards online rheumatology education in the era of COVID-19. Clin Rheumatol 2020;39:3215–22.
  • 27. Seymour-Walsh AE, Bell A, Weber A, Smith T. Adapting to a new reality: COVID-19 coronavirus and online education in the health professions. Rural Remote Health 2020;20:6000.
  • 28. Singhi EK, Dupuis MM, Ross JA, Rieber AG, Bhadkamkar NA. Medical hematology/oncology fellows’ perceptions of online medical education during the COVID-19 pandemic. J Cancer Educ 2020;35:1034–40.
  • 29. Gaur U, Majumder MA, Sa B, Sarkar S, Williams A, Singh K. Challenges and opportunities of preclinical medical education: COVID-19 crisis and beyond. SN Compr Clin Med 2020;2:1992–7.
  • 30. Radu MC, Schnakovszky C, Herghelegiu E, Ciubotariu VA, Cristea I. The impact of the COVID-19 pandemic on the quality of educational process: a student survey. Int J Environ Res Public Health 2020;17:7770.
  • 31. Kuhn S, Frankenhauser S, Tolks D. Digital learning and teaching in medical education: already there or still at the beginning? Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz 2018;61:201–9.
  • 32. Dalton CL, Wilson A, Agius S. Twelve tips on how to compile a medical educator’s portfolio. Med Teach 2018;40:140–5.
  • 33. Heeneman S, Driessen EW. The use of a portfolio in postgraduate medical education: reflect, assess and account, one for each or all in one? GMS J Med Educ 2017;34:Doc57.
