
Enhancing Learner Engagement through Experiential Learning with Learner-Generated Data

Open Access | Published: November 07, 2022 | DOI: https://doi.org/10.1016/j.acap.2022.11.002

      Abbreviations

      Educational Scholars Program (ESP)
      confidence interval (CI)

      Background

      Learner engagement is strongly associated with teaching effectiveness [1]. Therefore, strategies to increase learner engagement, such as evidence-informed educational design, experiential learning, feedback and reflection, and longitudinal design, are important tools to increase teaching effectiveness [2,3]. Kolb's experiential learning theory [4] and Schon's reflective practice [5] suggest the importance of active learning [6-8] and reflection [9] in learning. Guided by these frameworks, we wondered whether adding learner self-generated data as the basis for active participation and reflection could further enhance learner engagement.

      Educational Approach and Innovation

      We developed a 12-week course as a component of the Educational Scholars Program (ESP), a 3-year national faculty development program designed to develop educational scholarship skills in pediatric educators [10]. Our overall goal was to increase participants’ ability to design, implement, and evaluate quantitative educational research. Our innovation was using learner-generated data as the basis for hands-on exercises to increase learner engagement.
      The course consisted of 4 modules: (1) course overview and program evaluation; (2) study design and data management; (3) validity; and (4) data analysis and presentation. Modules included pre-readings, recorded didactic videos, hands-on exercises to reinforce learning (with sample completed exercises released after module completion), discussion boards, and 2 synchronous videoconferencing sessions to facilitate discussion of challenging material with instructors and peers. Learners completed a pre-course survey and evaluations after each module. Because we planned to use learner-generated data, the surveys included more questions than curriculum evaluation alone would require, allowing learners to reflect on how the way a question is asked shapes the resulting analysis.
      Rather than using a generic dataset for course activities, learner-generated data (e.g., a dataset containing learners’ responses to survey questions from the pre-course survey and post-Module 1 evaluation) were used throughout the course to engage learners (Figure 1). For example, in Module 2, learners explored data management by using survey questions they had completed in the pre-course and post-Module 1 surveys to create a coding worksheet with variable names, variable descriptions, and a coding schema (a sketch of such a worksheet follows the figure below). In Module 4, learners practiced data analysis and data presentation using their de-identified self-generated data. Reflection in action [5] encouraged reflection while the activity was happening, as learners completed surveys and analyzed their own responses, allowing them to test their knowledge through active experimentation and practice [4]. Annotated answer keys and Zoom calls to discuss answers facilitated reflective observation [4] and reflection on action [5], encouraging learners to reflect on the learning process, including the impact of using learner-generated data, after activity completion. Hands-on use of a dataset created from their own and their peers’ responses allowed scholars to engage in both types of reflection.
      Figure 1. Educational quantitative research and evaluation course integrating self-generated learner data into hands-on curricula to enhance engagement with material through Kolb's experiential learning cycle and Schon's reflective practice.
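      To make the Module 2 exercise concrete, the following is a minimal sketch in Python of the kind of coding worksheet learners built. The variable names, descriptions, and coding schema shown are hypothetical illustrations, not the actual ESP survey items or course materials.

```python
# Hypothetical sketch of a Module 2-style coding worksheet: mapping survey
# items to variable names, descriptions, and a coding schema. These items
# are illustrative only, not the authors' actual survey questions.
import csv

coding_worksheet = [
    {"variable": "comfort_quant",
     "description": "Comfort performing a quantitative educational study",
     "schema": "1=very uncomfortable ... 6=very comfortable (ordinal)"},
    {"variable": "years_teaching",
     "description": "Years of experience as a pediatric educator",
     "schema": "non-negative integer (continuous)"},
    {"variable": "prior_stats",
     "description": "Completed a prior statistics course",
     "schema": "0=no, 1=yes (dichotomous)"},
]

# Write the worksheet to a CSV file that learners could share and discuss.
with open("coding_worksheet.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["variable", "description", "schema"])
    writer.writeheader()
    writer.writerows(coding_worksheet)
```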

      Evaluation

      Participants completed a pre-course survey and a post-intervention retrospective pre/post survey on comfort performing a quantitative educational study, using a 6-point scale (1 – very uncomfortable, 2 – uncomfortable, 3 – somewhat uncomfortable, 4 – somewhat comfortable, 5 – comfortable, 6 – very comfortable). To better understand the impact of using learner-generated data, our post-intervention survey asked, “In this course, we chose to use data generated from the ESP Cohort's pre-course survey and Module Evaluations as a basis for the exercises rather than using other example datasets. How valuable was this approach to your engagement with the course?” using a 5-point scale (1 – not at all valuable; 2 – of little value; 3 – somewhat valuable; 4 – moderately valuable; 5 – very valuable). We then asked participants to explain their response in an open-ended question. For quantitative analysis, we used univariate statistics (means) with paired t-tests and differences in proportions, using SAS 9.4. Comments mentioning the use of participants’ own data were compiled from all open-ended evaluation questions. Two authors (TML and CL) analyzed comments using content analysis; each author coded comments independently, and then, through discussion, the authors created a single codebook yielding key themes. The University of California Davis Institutional Review Board classified this study as exempt.
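      The authors analyzed their data in SAS 9.4. Purely as an illustration, the following minimal Python sketch (using scipy) shows the form of the paired pre/post comparison described above, run on hypothetical 6-point comfort ratings.

```python
# Illustrative paired pre/post comparison on the 6-point comfort scale; the
# ratings below are hypothetical (the authors' actual analysis used SAS 9.4).
import numpy as np
from scipy import stats

pre = np.array([3, 3, 4, 2, 3, 4, 3, 3, 4, 3])   # hypothetical pre-course ratings
post = np.array([4, 4, 5, 4, 4, 5, 4, 4, 5, 4])  # hypothetical post-course ratings

t_stat, p_value = stats.ttest_rel(post, pre)      # paired t-test
diff = post - pre
mean_diff = diff.mean()

# 95% confidence interval for the mean paired difference
half_width = stats.t.ppf(0.975, df=len(diff) - 1) * stats.sem(diff)
print(f"mean change = {mean_diff:.2f} "
      f"(95% CI {mean_diff - half_width:.2f} to {mean_diff + half_width:.2f}), "
      f"p = {p_value:.4f}")
```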

      Results

      Thirty-five learners enrolled in the course, and 32 completed it. Of learners who completed the course, 87.5% (28/32) completed the post-course evaluation survey. The vast majority (92.9%; 26/28) rated the use of their own data as the basis for course exercises as moderately or very valuable to their engagement (mean 4.5; standard deviation 0.69). In pre/post analysis, learners increased their comfort in performing a quantitative educational study by 1.00 point (95% confidence interval [CI]: 0.52-1.47), from a mean of 3.35 to 4.18 (p<0.001), with the proportion rating themselves comfortable/very comfortable improving from 12% to 56% (difference: 44%; 95% CI: 11.7%-76.3%; p=0.008). Results were even more pronounced in retrospective pre/post analysis, in which learners increased their comfort by 1.61 points (95% CI: 1.28-1.93).
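      For readers who want to see the form of the difference-in-proportions estimate, the sketch below computes a Wald 95% CI in Python using hypothetical counts chosen only to match the reported 12% and 56%; because the actual denominators (and any pairing in the authors' analysis) are not given here, it will not reproduce the published interval exactly.

```python
# Illustrative Wald 95% CI for a difference between two proportions. The
# counts are hypothetical (chosen only to match the reported 12% and 56%),
# so the interval will not match the published SAS estimate exactly.
import math

def wald_ci_diff(x1, n1, x2, n2, z=1.96):
    """Point estimate and Wald 95% CI for p2 - p1."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p2 - p1
    return d, (d - z * se, d + z * se)

diff, (lo, hi) = wald_ci_diff(x1=3, n1=25, x2=14, n2=25)  # 12% vs 56%
print(f"difference = {diff:.0%} (95% CI {lo:.0%} to {hi:.0%})")
```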
      Participants’ comments reflected two key areas of strength for the use of their own data. First, participants recognized that using their own data increased their engagement because they found the data more relevant and meaningful. One participant stated, “it is easier to engage/practice with data that feels relevant to us.” Second was the ability to see directly how the wording of survey questions affected the data collected, and thereby the analysis. One participant noted that using their own data “added another layer of understanding to look at the data set first as a respondent, and then as a researcher.” Directly answering questions phrased in different ways and then analyzing those data led to a deeper understanding of the complexities of data analysis. Several participants also mentioned that this provided a “better understanding of importance of how questions are asked.”

      Discussion and Next Steps

      Our quantitative educational research course found that utilizing learner-generated data as the basis for hands-on learning activities was valuable for learner engagement and contributed to self-assessed skill development. Utilizing learner-generated data allowed us to capitalize on the value of experiential learning [4], as learners manipulated their self-generated data to learn through reflection and active experimentation [5]. Our approach utilizes principles of case-based learning, in which learners apply what they are learning to real-world scenarios, a strategy known to be effective in health professions education [11]. However, our approach took that one step further by having learners not only analyze a real-world scenario but also participate in creating the data for that scenario. Quantitative and qualitative evaluation data support the idea that adding this additional layer to case-based learning deepened learners’ understanding.
      The main limitation of our study is its small sample size, which limits generalizability. However, our learners were from multiple institutions across the United States, and we had a high response rate. In addition, we assessed scholars’ self-reported engagement with learning and self-assessed comfort in performing a quantitative educational study; we did not objectively measure changes in knowledge, skill, or behavior. As this was a de novo course, we had no data from a prior iteration without learner-generated datasets against which to compare, so we cannot isolate the incremental value of learner-generated data over conventional case-based learning. Next steps include gathering additional qualitative data on how self-generated data led to increased engagement and learning, as well as objective outcome data on knowledge acquisition and application.
      We present initial evidence of the benefits of using learner-generated data to enhance learner engagement and believe this approach has widespread applicability for the design of diverse educational curricula, including topics such as research design, medical education, and quality improvement.

      Sources of funding

      The authors report no external funding source for this study.

      Author contributions

      All authors are responsible for this research and have participated in the concept and design, analysis, interpretation of data, and drafting and revising of the manuscript.

Conflicts of interest

      The authors declare they have no competing interests.

      Acknowledgements

      The authors would like to thank the Academic Pediatric Association Educational Scholars Program scholars who participated in this survey.

      References

      1. Stephenson CR, Bonnes SL, Sawatsky AP, et al. The relationship between learner engagement and teaching effectiveness: a novel assessment of student engagement in continuing medical education. BMC Med Educ. 2020;20:403.
      2. Steinert Y, Mann K, Anderson B, et al. A systematic review of faculty development initiatives designed to enhance teaching effectiveness: a 10-year update: BEME Guide No. 40. Med Teach. 2016;38:769-786.
      3. Cook DA, Steinert Y. Online learning for faculty development: a review of the literature. Med Teach. 2013;35:930-937.
      4. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. 2nd ed. Upper Saddle River, NJ: Pearson Education; 2015.
      5. Schon DA. The Reflective Practitioner: How Professionals Think in Action. London: Routledge; 2017.
      6. Graffam B. Active learning in medical education: strategies for beginning implementation. Med Teach. 2007;29:38-42.
      7. Freeman S, Eddy SL, McDonough M, et al. Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci U S A. 2014;111:8410-8415.
      8. Yardley S, Teunissen PW, Dornan T. Experiential learning: AMEE Guide No. 63. Med Teach. 2012;34:e102-e115.
      9. Winkel AF, Yingling S, Jones AA, Nicholson J. Reflection as a learning tool in graduate medical education: a systematic review. J Grad Med Educ. 2017;9:430-439.
      10. Baldwin CD, Gusic ME, Chandran L. The impact of a national faculty development program embedded within an academic professional organization. Acad Med. 2017;92:1105-1113.
      11. Thistlethwaite JE, Davies D, Ekeocha S, et al. The effectiveness of case-based learning in health professional education: a BEME systematic review: BEME Guide No. 23. Med Teach. 2012;34:e421-e444.