Address correspondence to Kristina Dzara, PhD, MMSc, Instructor of Pediatrics, Harvard Medical School, MassGeneral Hospital for Children, 175 Cambridge St, Boston, MA 02114
The traditional journal club (JC) format of reviewing an article followed by group discussion may be misaligned with millennial learners’ needs and may not draw on best principles of adult learning. Our objective was to deliver an interactive JC allowing pediatric residents to critically engage with medical education research without previous preparation.
Methods
We conducted 4 one-hour “interactive, no-prep” medical education JCs for pediatric residents in a medium-sized program in 2018. Without previous reading, participants developed methods to answer the study question, compared these with the actual methods, analyzed the results, and extrapolated the findings. We developed a simple, anonymous evaluation tool to determine perceived educational impact and analyzed responses using mixed methods.
Results
A total of 52 of 59 participants (88% response rate) indicated on a 7-point scale that the JC helped them think about how to analyze a paper (mean = 5.32), use a paper to inform further study questions (mean = 5.42), and understand medical education research (mean = 6.00). Four qualitative themes indicated that, although improvement was possible, the JC provided a strong interactive learning experience.
Conclusions
Our JC approach, which uses active learning principles and requires no advance preparation, provides proof of concept that faculty objectives to teach critical literature evaluation and millennial learners’ need for engagement can be met simultaneously.
We describe the creation and implementation of an innovative, interactive journal club model that requires no prior preparation, relies on adult learning principles, and is adapted to the needs of the millennial learner.
Journal clubs (JCs) are a versatile graduate medical education tool to develop critical appraisal skills, encourage evidence-based practice, support research training, and disseminate new findings.
Anecdotally, faculty at our institution have noted that residents are increasingly not reading articles for JC and are either coming unprepared or not attending at all. This aligns with what is known about millennial, as well as Generation Z, learners: they appreciate engaging in active, team-based learning, desire convenience, have short attention spans, like to use technology when learning, and prefer multitasking.
Recognizing the problem of waning trainee engagement in JCs, educators have developed innovative methods to improve them, for example, by focusing on peer mentorship.
In one approach, faculty facilitators provided participants with the study title, tables, and figures, and prompted participants to offer their own interpretation.
Neither method was formally evaluated. Overall, very little literature addresses the challenge of unprepared attendees. Furthermore, limited literature addresses the use of JCs to expose learners to medical education research.
We transformed the traditional JC format into an “interactive, no-prep” series for the MassGeneral Hospital for Children pediatric residents. Our aim was to deliver a time-efficient JC using active learning techniques to allow pediatric residents to critically engage with medical education research without previous preparation. The aim of our educational evaluation was to determine participants’ perceived learning impact of the JCs using a mixed methods analysis.
Methods
Setting and Participants
We conducted 4 one-hour “interactive, no-prep” medical education JCs for pediatrics residents in a medium-sized program between February and September 2018. Sessions were held during the noon conference hour in a conference room with a whiteboard and projector. Medical students, pediatrics residents of all levels including chiefs, and a small number of faculty attended. Residents were expected to attend if clinical demands permitted.
Implementation
Learning objectives were to have participants develop a plan for answering the article's study question, compare that with the actual study methodology, analyze the results of the study, and extrapolate the study findings to their own experiences. Participants were expected to analyze medical education research methodology to determine strengths and weaknesses of the design and analysis, interpret the practical significance of results, and recognize further questions the study raised.
Each JC explored a medical education research article chosen by the session leaders (K.D., A.F.V.). Purposefully selected articles fit 4 criteria aligning with adult learning principles: 1) relevant to pediatrics residents’ real-world training experience (fits with learners’ previous experience); 2) focused on graduate medical education (aligns with learners’ self-concept); 3) well-described research methodologies (encourages learners’ need to know about research methods for research projects); and 4) published within the past 3 months (supports learners’ motivation to review new literature).
The articles collectively employed quantitative, qualitative, and mixed methodologies to expose participants to varied medical education research methods.
We introduced participants to medical education research, provided the article title, discussed the background to the current study, and presented the research question. Participants were divided into small groups to brainstorm how to design a study methodology for the research question, focusing specifically on what they could measure to answer the question and how they would collect the data.
After 15 minutes of discussion, each group reported their proposed methods and perceived challenges. In the large group, we revealed and scrutinized the study methodology, results, and conclusions. We focused on interpreting medical education terminology and techniques in distilled, approachable language. We then discussed results and implications. Post-session, the article was e-mailed to the residents. The Figure displays an implementation flowchart.
We developed a simple, anonymous evaluation tool to determine participants’ perceived educational impact, which was not pre-tested before use. Hand-written responses were collected anonymously at the end of each session; responses were voluntary. Participants rated 3 critical appraisal outcome measures anchored from 1 (not at all helpful) to 7 (extremely helpful) (Table 1) and were asked 2 qualitative, open-ended questions about what went well and what could have been done differently.
Table 1. Quantitative and Qualitative Educational Evaluation Results

How Helpful Was This Journal Club in: (1 = not at all helpful; 7 = extremely helpful) | Low Score | High Score | Mean Score | Modal Score | Standard Deviation | Interquartile Range
Helping you to think about how to analyze a paper | 3 | 7 | 5.32 | 6.00 | 1.11 | 1.25
Helping you to think about how to use a paper to inform further study questions | 3 | 7 | 5.42 | 6.00 | 1.24 | 1.00
Helping you to understand what medical education research is and how it can be conducted | – | – | 6.00 | – | – | –
Quantitative data were entered into Microsoft Excel (Office 365; Microsoft, Redmond, Wash) and analyzed with descriptive statistics, including mean, mode, standard deviation, and interquartile range.
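For illustration only, the same descriptive statistics could be computed with a short script such as the sketch below; this is not the authors’ actual analysis (which used Excel), and the ratings list is hypothetical example data on the 7-point scale.

# Minimal sketch: descriptive statistics for 7-point Likert ratings.
# The ratings below are hypothetical example responses, not study data.
import statistics

ratings = [5, 6, 6, 4, 7, 5, 6, 3, 6, 5]  # hypothetical 7-point responses

mean = statistics.mean(ratings)
mode = statistics.mode(ratings)                  # most frequent rating
sd = statistics.stdev(ratings)                   # sample standard deviation
q1, _, q3 = statistics.quantiles(ratings, n=4)   # quartile cut points
iqr = q3 - q1                                    # interquartile range

print(f"mean={mean:.2f} mode={mode} sd={sd:.2f} iqr={iqr:.2f}")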
To analyze open-ended questions, the authors conducted a thematic analysis following the “five stages to qualitative research” framework.
Dedoose (version 8.1; SocioCultural Research Consultants, LLC, Los Angeles, Calif) was used to facilitate data management. First, 2 coders (K.D. and A.F.V.) familiarized themselves with the data, each independently reading the responses to the 2 qualitative questions and creating 2 independent preliminary code lists with 40 (A.F.V.) and 26 (K.D.) codes, respectively. The coders then compared codes, collapsed codes, and refined definitions to develop a codebook containing 29 final codes that were then applied to all data. After all codes were applied, the coders discussed recurring data patterns and combined codes into categories (n = 11), followed by themes (n = 4). The coders independently reread the coded data within each theme to ensure coding consistency.
The educational evaluation was designated “clinical quality improvement/measurement” by the Partners Human Research Committee, requiring no additional institutional review board review.
Results
In total, 59 participants attended at least 1 of the JCs. Fifty-two evaluations were returned (88% response rate). Overall, the JC was well-received, with mean scores for the 3 objective outcome measures falling between 5.32 and 6.00 on a 7-point scale (Table 1).
A total of 45 respondents (86.5%) answered at least one qualitative question. Thematic analysis resulted in 4 themes:
Theme #1: Strong Interactive Learning Experience
Respondents felt that developing methods via active learning was a strength, the small group experience was valuable, and discussion enabled a positive learning experience. Although the JC was not perceived as easy, the session leaders were viewed as strong teachers providing an innovative and engaging learning experience (Table 1).
Theme #2: Preparation Not Required
Participants perceived the lack of required preparation before the JC session as a strength and a welcome change to their training (Table 1).
Theme #3: Improvement Is Possible
Potential improvements included a desire for additional input from the session leaders and an adequate mix of different training levels within small groups. Respondents wanted group facilitation to keep small group discussion on track. Some suggestions included preferences that were outside the scope of the JC, such as including case studies or vignettes (Table 1).
Theme #4: Desire for Further Learning
Respondents indicated an interest in continued learning. They would have preferred that printed copies of the article be available at the JCs. They also suggested that the large group discussions led by the session leaders might offer a deeper dive into the results. Post-session, respondents desired additional resources on research terms and methods (Table 1).
Discussion
We designed an “interactive, no-prep” approach to JC targeted toward millennial learners. The JCs were well received; participants indicated having a strong overall learning experience. This approach adds to the literature on innovative JCs by combining interactive and no-preparation techniques, both reported in other papers, and by assessing not only residents’ quantitative ratings of their learning but also their qualitative feedback. This mixed methods approach allows for a better understanding of which JC aspects were helpful to participants.

Based on their qualitative feedback, we attribute our success, in part, to the session leaders recognizing the learning preferences of clinically busy millennial learners in 3 key ways: we did not require previous preparation, we allowed for active engagement with the material, and we made the information relevant. Because participants did not prepare in advance, we needed to explain educational research terminology and methodology in real time. As participants developed ideas for study methods, we explained how their ideas fit into research methodology and offered terminology for the concepts they described. We applied active learning techniques by having participants engage in study design, analysis of results, and application of findings to their own experiences; by having participants create their own study methodology, we used the highest level of learning objectives in Bloom’s taxonomy: creating.
We used articles that aligned with adult learning principles and made the information relevant by choosing articles on familiar graduate medical education topics such as patient feedback, milestones-based ratings, organizational culture and feedback, and remote supervision. The discussions offered opportunities for participants to demonstrate autonomy, share expertise, and work in teams. Participants desired additional learning resources and more engagement with the material, such as additional in-depth discussion, which suggests they found the material meaningful. Because we formally evaluated the JCs, we add to the limited literature regarding the utility of both interactive and no-preparation approaches to JCs.
Limitations
The JCs took place at a single pediatric residency program over a short time period. Resident schedules limited participation. Because we conducted a routine educational evaluation, our tool was not validated and only assessed participants’ perception of learning, which lacks objectivity; there is evidence that physicians have a limited ability to accurately self-assess.
We also did not conduct a pre-survey to demonstrate change in participants’ perception of learning. To mitigate this, we used thematic analysis to understand participants’ qualitative reactions to their learning experience. Finally, using 4 different articles limits generalizability but demonstrates wide application potential.
Next Steps
This JC strategy could be implemented in other medical fields and for other research types. It is reproducible; the overall time commitment to develop each session was 2 to 3 hours. The implementation flowchart aids educators in adapting this model for their own institutions (Figure). Faculty also could consider encouraging this model for self-study. Participants could read the article title and introduction, stop and detail a possible methodology for the study, and then read the methods and critique them. Participants could then compare their proposed methodology with the actual methodology and consider why the authors made their methodologic decisions. They could then read the results and conclusions and determine, for themselves, the paper's implications and whether they agreed with the authors’ conclusions. By practicing this in a large group setting and then trying it out individually, participants might learn a literature review technique which forces them to actively analyze while reading.
We plan to improve our JC by introducing a faculty moderator in each group, ensuring the groups have a mix of learners, leaving more time for detailed discussion of results and implications, and providing resources for additional self-directed learning. Future studies could assess the impact on residents’ approach to reading literature and their knowledge of medical education research methodologies by giving residents a medical education article and having them individually analyze the methodology, results, conclusions, and implications to determine their ability to apply the process individually.
Conclusions
We designed an “interactive, no-prep” approach to JC targeted toward millennial learners using the principles of adult learning and combating the challenge of engaging unprepared learners. We focused on exposing pediatric residents to medical education research using active learning principles, making the experience relevant to them by choosing articles focusing on graduate medical education. Our approach was well-received by participants, who noted having a better understanding of what medical education research is and how it is conducted and left the sessions better equipped to analyze a medical education research paper and use that information to inform further study questions. Our experience is proof of concept that the need to teach critical evaluation of the literature and millennial residents’ needs can be met simultaneously in an “interactive, no-prep” JC.
Acknowledgments
We acknowledge Benjamin Nelson, MD, for his helpful comments on an early draft of this paper.
References
Deenadayalan Y, Grimmer-Somers K, Prior M, et al. How to run an effective journal club: a systematic review.