Abstract
Background: Patient and family engagement is thought to improve the quality and relevance of child health research. We developed and evaluated the usability of Patient Engagement 101, an e-learning module designed to strengthen the patient-oriented research readiness of health care professionals, researchers, trainees and other stakeholders.
Methods: The development of Patient Engagement 101 was co-led by a parent and a researcher and overseen by a diverse multistakeholder steering committee. The module was refined and evaluated using a mixed-methods usability testing approach with 2 iterative cycles of semistructured interviews, observations and questionnaires. We collected module feedback by way of semistructured interviews, the validated System Usability Scale, and satisfaction, knowledge and confidence questionnaires. Thematic coding of transcripts and field notes, informed by team discussions, guided the module revisions.
Results: Thirty end-users completed usability testing (15 per cycle). In each cycle, we modified the module with respect to its content, learner experience, learner-centred design and aesthetic design. Participants were highly satisfied, and System Usability Scale scores indicated the module had the best imaginable usability. Substantial increases in participants’ knowledge test scores and confidence to engage in patient-oriented research, but not in self-rated knowledge, were observed after module completion.
Interpretation: Codevelopment with patients and caregivers, and refinement through comprehensive end-user testing, resulted in a training resource with exceptional usability that improved knowledge and confidence to engage in patient-oriented research in child health. Patient Engagement 101 is openly available online, and the methods used to develop and evaluate it may facilitate the creation and evaluation of similar capacity-building resources.
Plain language summary:
Why we did this research
Research teams that include patients and family members can create research that is more useful and better quality. However, some researchers do not know how to work with patients and families. We wanted to teach researchers how to involve patients and families on their team.
What we did
Our team built an online course called “Patient Engagement 101.” We wanted this course to teach researchers the knowledge, skills, and attitudes they need to involve patients in research. To test how well the course worked, we asked 15 people to complete it and tell us what they learned, how satisfied they were with the training, and how confident they felt to involve patients in the future. We changed the course based on their feedback and asked another 15 people to complete it to see if it improved.
What we found
People who completed the new course were happy with it and thought that it was user-friendly. They also said that they felt confident and knowledgeable to work with patients and families in the future.
Take-home message
Our team created a useful online course for clinicians and researchers who want to learn how to involve patients and family members on their research team. To take the course, please visit www.porcch.ca.
Patient-oriented research aims to improve health outcomes by engaging patients and the public as active partners in all stages of research.1 Partnering with patients and their caregivers in child health research can add value and applicability to the research, improve study design, increase its relevance and broaden dissemination.2–4 However, patient engagement involves unique challenges, such as recruiting diverse partners, managing power imbalances and skill gaps, merging varying perspectives and priorities, and sustaining engagement throughout the research process, all of which require additional competencies of clinician-investigators and other research team members.2,5,6 Ensuring all stakeholders are trained and supported to partner effectively is critical to expanding patient-oriented research capacity.7
Tailored training for researchers has been associated with several benefits, including a deeper understanding of patient-oriented research, improved awareness of relevant tools and strategies, and greater self-efficacy to engage patients and carry out specific engagement activities.8–10 Although surveys of researchers show broad interest in patient engagement–related training, several barriers, such as lack of time, unavailability of training opportunities and uncertainty regarding the relevance of patient engagement to their research, have been identified.5 In addition, there is a need for pediatric-specific resources that address the complexities of patient engagement in a child health context.2,11 To help bridge this gap, we developed the Patient-Oriented Research Curriculum in Child Health (PORCCH), an open-access online curriculum to strengthen capacity in patient-oriented research in child health, with specialized modules for different stakeholder groups.12 Online learning (e-learning) confers several benefits compared with in-person learning, such as scalability and cost-effective dissemination, remote and asynchronous learning, and the capacity for learners to tailor education to their learning needs.13 Usability, which denotes the effectiveness, efficiency and satisfaction with which users interact with a system for a particular purpose, is key to the evaluation of e-learning quality.14 Our study aimed to develop, refine and evaluate the usability of Patient Engagement 101, the PORCCH e-learning module intended to explain key principles, review practical aspects and best practices, and stimulate research readiness for patient engagement among health care professionals, researchers, trainees and other stakeholders. The module was codeveloped through a shared responsibility between clinicians, researchers, and patients and caregivers, in accordance with the International Association for Public Participation (IAP2) definition of collaboration.15
Methods
Study design and setting
We used a mixed-methods usability testing approach, with 2 cycles of semistructured interviews, observations and questionnaires, to develop, refine and evaluate the module. This approach, which has been previously employed for e-learning resources,16,17 involves implementing a design, identifying issues and areas for improvement through end-user testing and thematic analysis of feedback, and making modifications in an iterative manner.18 Our study was conducted at an academic children’s hospital in Toronto, Ontario, with virtual meetings and phone calls used to connect steering committee members and participants residing in other provinces. This report follows the Guidance for Reporting Involvement of Patients and the Public (GRIPP2) reporting checklist.19
Development of Patient Engagement 101
Patient Engagement 101 comprises 2 complementary parts. Part 1 (Foundations of Patient Engagement) includes an overview of international initiatives promoting patient engagement, values and goals of patient engagement, and key elements of effective engagement. Part 2 (Patient Engagement in Practice) discusses practical aspects, unique challenges and various methods of patient engagement. The module was created with Storyline 360 (Articulate Global Inc.), an e-learning development application, following Mayer’s principles of multimedia design20 and plain language writing recommendations.21 Each part takes about 30 minutes to complete and includes interactive tools, video vignettes, assessment exercises, links to additional resources and certificates of completion.
Participants
Through study advertisements distributed by way of newsletters and email lists, English-speaking health care professionals, researchers, research staff, trainees, patients and family members were recruited from the Canadian Child Health Clinician Scientist Program, CHILD-BRIGHT Network, Strategy for Patient-Oriented Research (SPOR) SUPPORT Units and family advisory networks across Canada. A maximum variation purposive sampling approach was employed to ensure diversity in the testing group, particularly with respect to participants’ role (e.g., researcher, clinician-researcher, patient or caregiver), experience with patient engagement in research and geographic location.22,23 Researchers and clinician-researchers, the module’s target audience, were predominantly sampled, alongside a smaller number of patients and caregivers, who were included for their complementary knowledge and perspectives as key stakeholders.
Study procedures
Usability testing sessions were carried out by a research assistant with graduate training in psychology (G.A.M.), either in person or over the phone, between February and May 2020. At baseline, participants completed a demographic form (Appendix 1A, available at www.cmajopen.ca/content/10/4/E872/suppl/DC1) and a questionnaire on confidence to engage in patient-oriented research, developed according to Bandura’s framework for constructing self-efficacy scales (Appendix 1B).24 In addition, participants completed a multiple-choice knowledge test on patient engagement designed to assess the “knows how” level of Miller’s framework for assessing clinical competence (i.e., interpretation, application of knowledge)25 (Appendix 1C) and self-rated their knowledge of patient engagement on a 5-point Likert scale. These questionnaires were pilot-tested by 3 child health researchers or clinician-researchers to verify clarity and content validity.
Participants subsequently undertook a usability testing session during which they were asked to complete Patient Engagement 101 and think aloud about what they liked and disliked about the module. At various points, the research assistant asked questions to facilitate interpretation of participants’ verbalized thoughts, elicit their understanding of the module and solicit suggestions for improvement. Field notes were also recorded.
After completing Patient Engagement 101, participants engaged in a semistructured interview (about 20 min) designed to elicit their perceptions of the module’s usability (Appendix 1D). Participants then completed the same confidence questionnaire, knowledge test and overall knowledge rating as they had done at baseline. In addition, they completed an e-learning satisfaction questionnaire (Appendix 1E) and the System Usability Scale (SUS), a validated 10-item questionnaire for evaluating user satisfaction with technologies.26 System Usability Scale scores range from 0 to 100, with scores of 71.4, 85.5 and 90.9 corresponding to overall usability ratings of “good,” “excellent” and “best imaginable,” respectively.27
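For readers unfamiliar with how the 10 SUS item responses are converted to a 0–100 score, a minimal sketch of the standard SUS scoring formula is shown below in R (the language used for our quantitative analyses); the function name and sample responses are illustrative assumptions, not part of the study materials.

# Minimal sketch of standard SUS scoring (assumed; not study-specific code).
# Each of the 10 items is rated 1-5; odd-numbered items are positively worded,
# even-numbered items negatively worded.
sus_score <- function(responses) {
  stopifnot(length(responses) == 10, all(responses %in% 1:5))
  odd  <- responses[c(1, 3, 5, 7, 9)] - 1     # positive items: score - 1
  even <- 5 - responses[c(2, 4, 6, 8, 10)]    # negative items: 5 - score
  (sum(odd) + sum(even)) * 2.5                # rescale the 0-40 sum to 0-100
}
sus_score(c(5, 1, 5, 1, 5, 1, 5, 1, 5, 1))    # uniformly favourable responses give 100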
After the first usability testing cycle, changes were made to the module prototype based on themes identified through content analysis of the usability testing sessions, field notes and questionnaires. A second cycle was then conducted to garner any further suggestions.
Patient engagement
Development of Patient Engagement 101 was co-led in equal partnership by a parent (F.B.) and a researcher with expertise in patient-oriented research (C.M.). Module co-leads met monthly over 9 months to identify and collate relevant peer-reviewed and grey literature, integrate it into the curriculum and draft the module. Feedback on module content, design and development was provided by a diverse steering committee that included 2 clinician-researchers, 2 SPOR SUPPORT Unit leads, 3 parent partners, a knowledge translation expert, an educational researcher and 2 instructional design experts. The parents on the steering committee and the parent co-lead all had lived experience with a child with a long-term health condition and were selected from established family advisory networks at 2 children’s hospitals. During usability testing, the entire study team met 2–3 times per cycle to clarify and critique module feedback and discuss and refine the evolving themes and framework.
Data analysis
Recorded testing sessions were transcribed verbatim, de-identified and coded using a qualitative data analysis program to facilitate data organization and analysis (Dedoose, SocioCultural Research Consultants). After each testing cycle, 2 coders (G.A.M., L.A.) read the transcripts to identify preliminary codes regarding usability (e.g., satisfaction, efficiency, learnability and errors), which were then refined through systematic iterative coding and sorting using the constant comparison method and grouped into usability-related themes and subthemes.28 We used the thematic framework from a previous e-learning usability study17 and published usability attributes29–32 to guide the analysis. To enhance the trustworthiness of the findings,33 we encouraged reflexivity by having the team of patients, clinicians and nonclinician researchers engage in a dialogue to question and challenge each other’s assumptions throughout the analysis. Team members not directly involved in coding reviewed the emerging themes and used their expertise in education, child health and research to help clarify and critique the findings. We purposively selected a sample size of 15 participants per usability testing cycle, a number considered sufficient from a usability testing perspective to achieve thematic saturation.34,35
Statistical analysis
We summarized data from the demographic, satisfaction, confidence and knowledge questionnaires using means, standard deviations (SDs) and proportions. Paired t tests were used to assess changes in confidence and knowledge before and after module completion (α = 0.05, 2-sided). Quantitative analyses were conducted in R version 4.0.0 (R Core Team).
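As a hedged illustration of this analysis, the R sketch below runs a paired, 2-sided t test at α = 0.05 comparing pre- and postmodule scores for the same participants; the score vectors are hypothetical placeholders, not study data.

# Illustrative only: hypothetical pre- and post-module confidence scores
# for the same 6 participants (not study data).
pre  <- c(6.0, 5.5, 7.0, 6.5, 5.0, 6.0)
post <- c(8.0, 7.5, 8.5, 8.0, 7.0, 7.5)
# Paired, two-sided t test at alpha = 0.05
t.test(post, pre, paired = TRUE, alternative = "two.sided", conf.level = 0.95)
# Mean (SD) of the within-participant change, as summarized in the tables
mean(post - pre); sd(post - pre)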
Ethics approval
The study received approval from the Research Ethics Board at The Hospital for Sick Children (Toronto).
Results
Usability testers for Patient Engagement 101 comprised 14 clinician-researchers in child health, 12 researchers in child health, 2 former pediatric patients and 2 caregivers (Table 1). We found that module completion time was similar across cycles, with a mean completion time of 65 (SD 12) minutes in cycle 1 and 71 (SD 12) minutes in cycle 2.
Table 1: Participant characteristics
E-learning module usability
Qualitative analysis identified 4 key themes related to usability (outlined below). Illustrative quotes and example module changes are presented in Table 2 and Appendix 1F.
Table 2: Usability testing feedback and corresponding module changes
Content
We categorized the content domain into subthemes of quantity, completeness, quality and trustworthiness, relevance, understandability and usefulness. Participants thought the module included the most important topics relevant to patient engagement and discussed them in a manner that was complex enough to appeal to clinicians and researchers yet understandable by interested youth or adults without a formal research background. Gaps in participants’ understanding and requests for additional information during testing led us to add more glossary terms (e.g., tokenism, patient-reported outcome measures), additional external resources (e.g., budget tools, patient engagement plan templates and patient-oriented research vignettes) and a new slide highlighting unique considerations for patient engagement in child health research, to support learning (Figure 1).
Figure 1: A new slide and an additional-resources tool box on the unique considerations for patient and family engagement in child health research, added to part 1 of the module based on comments from participants in cycle 1.
Learner experience
Learner experience comprised subthemes of satisfaction, module length, motivation, engagement and preference for information access. Participants liked the brevity of the module and felt it would appeal to a wide audience in child health research. They also liked having to advance the module frequently by clicking, which helped sustain engagement. Participants varied in how quickly they wanted to proceed through the module, with some speed reading and others frequently pausing to reflect. Speedier participants were sometimes frustrated by sections that required them to wait (e.g., latencies before interactive elements became clickable); we modified these to be immediately interactive.
Participants valued the additional-resources tool boxes, which summarized patient engagement tools and publications in plain language and linked to the original materials; these provided extra information on key topics without detracting from the experience of those who chose not to view them.
Attributes of learner-centred design
Attributes of learner-centred design encompassed ease of use, intuitive design and learnability. Overall, participants found Patient Engagement 101 to be user friendly. They appreciated its division into 2 complementary parts, as well as the ability to pause and resume the module at a later date. Although most participants quickly learned how to use the navigational controls and interactive features, we identified a few sections that were not sufficiently intuitive, as well as a need to provide subsequent reminders about the additional-resources tool boxes.
Design aesthetic
Aspects of design aesthetic included multimedia components, module features, navigation and visual assets. Most participants enjoyed listening to the module’s narration, which was recorded by a professional voice talent, as this slowed their progression through the module, providing natural opportunities between slides to reflect on the material and its applications to their own research. Participants appreciated the module’s navigational aids (e.g., colour-coded sections, animated checkmarks to show completed sections), which facilitated navigation and permitted greater focus on the content.
Visual assets in the module (e.g., diagrams, animations, videos) were appreciated for adding diversity to the learning materials. In particular, the videos of stakeholders’ experiences were well received by participants for being engaging and providing powerful insights regarding perspectives, motivations and challenges to patient-oriented research.
In cycle 1, a few participants inadvertently skipped sections by double-clicking a button that required only a single click. To prevent this, we throttled the click rate in the module so that any number of rapid clicks registered as a single click, and this issue did not recur. Other minor issues, such as language-related errors and audio-volume imbalances, were also identified and subsequently fixed.
Overall, SUS scores for Patient Engagement 101 were high (cycle 1: mean 92.0 [SD 6.0]; cycle 2: mean 91.5 [SD 8.6]), corresponding to best imaginable usability.
E-learning evaluations of Patient Engagement 101 were positive (Table 3). In cycle 2, the mean overall satisfaction with the education was 4.7 (SD 0.5), out of 5.
Table 3: E-learning module feedback
In each cycle, participants’ knowledge test scores and confidence to engage in patient-oriented research increased significantly after completing Patient Engagement 101 (p < 0.05, Table 4). Self-reported knowledge of patient-oriented research did not change significantly (p > 0.05).
Table 4: Pre- and postmodule completion differences in confidence to engage in patient-oriented research, knowledge test scores and self-reported knowledge
Interpretation
Patient Engagement 101 was codeveloped with patients and caregivers and refined through comprehensive end-user testing, resulting in an open-access, e-learning resource with exceptional usability that substantially increased knowledge and confidence to engage in patient-oriented research in child health. In the first 12 months (June 2021 to May 2022), the PORCCH website (www.porcch.ca) had over 50 000 unique site visitors, with over 380 users enrolled in Patient Engagement 101.
Patient Engagement 101 adds to a growing landscape of effective capacity-building resources on patient and public involvement for researchers and other stakeholders. In-person and online workshops have traditionally been a popular approach to providing support.8–10,36,37 In the United Kingdom, a program of workshops to build confidence and skills, delivered at a large biomedical research centre, attracted more than 700 attendees across 72 workshops over 5 years.10 Pre- and postworkshop surveys showed marked increases in attendees’ understanding of, and confidence in, carrying out engagement activities. INNOVATE Research delivered a youth engagement capacity-building intervention by way of a 1-day workshop, run at 3 Canadian academic research institutions.8 Six months later, attendees reported greater familiarity and self-efficacy to engage youth, and more frequent engagement of youth in their teaching and as conference copresenters.8
Longer-term initiatives, such as engagement-related training embedded into graduate programs,38,39 coaching programs and communities of practice,40 and standalone accredited programs,41 are emerging. A 2020 study evaluated a 1-year studentship program for graduate health sciences students in Alberta that included funding, specialized training, and self-selected networking and mentorship opportunities.39 Awardees’ final impact narrative reports were analyzed to identify program benefits, which included the development of engagement-related skills and leadership, collaborations and partnerships, new perspectives on patient-oriented research and modifications to their research projects and career goals. In a 2018 study, these same authors also evaluated an 18-week, 56-hour blended learning program intended to integrate patient-oriented research into clinical trials, initially completed by 22 clinical trials staff.41 At program completion, 15 learners reported changes to increase patient engagement in their trials.
These findings, in concert with the improvements in knowledge and confidence after completion of Patient Engagement 101, suggest that brief training opportunities are useful to disseminate best practices and increase researchers’ confidence to carry out engagement-related activities. However, deep integration of patient engagement into a research program may require more intensive capacity-building opportunities that incorporate mentorship, networking opportunities and practical guidance.42 Although participants’ knowledge test scores increased after module completion, their self-reported knowledge did not, which suggests that the knowledge test may not have captured all relevant domains or that participants had unperceived needs related to patient-oriented research.43
The development and provision of institutional training to build capacity in patient engagement requires substantial financial and human resources,37 which can result in tensions between patient-oriented research values such as inclusivity and operational concerns in relation to cost recovery.44 It should also be noted that the training resources in the peer-reviewed literature likely represent the tip of the iceberg of relevant training materials, given that capacity development is a cornerstone of national frameworks on patient-oriented research.1,36 To efficiently build capacity in patient-oriented research, there is a growing need to coherently evaluate training materials, identify high-quality resources and determine how best to integrate them into larger curricula.7
Limitations
Some limitations to our study should be noted. The sample size, although sufficient from a usability testing perspective, was not large enough to permit investigation of the effect of participant characteristics on usability ratings or other outcomes. Participants, who were recruited through pediatric research and family advisory networks for their familiarity with patient-oriented research, may not fully represent the intended end users of Patient Engagement 101. In addition, the self-report measures used to evaluate the impact of the module on knowledge and confidence were chosen to minimize participant burden. Evaluating long-term outcomes, such as whether completion of Patient Engagement 101 is associated with greater engagement of patients and caregivers in research, in a larger and more representative sample that reflects the diversity (e.g., education, ethnicity, gender identity, sex, people with disabilities) of those involved in patient-oriented research, would be useful. Finally, PORCCH is available only in English at present; however, French translation efforts have begun.
Lessons learned from patient involvement
Patients and caregivers on the steering committee contributed important insights on key topics, such as the representativeness of patient partners, the plurality of perspectives in child health (e.g., patient, parent, caregiver, siblings, grandparents, teachers), and how to build and sustain authentic and meaningful partnerships. They also provided feedback on the content and design of module prototypes, and helped interpret the findings and review emerging themes throughout testing. Codeveloping a patient-oriented research curriculum with patients and caregivers enhanced the quality and credibility of the curriculum.37 The parent co-lead of the module (F.B.), who was a member of the family advisory network at an academic children’s hospital, began graduate work in patient engagement during the module development, has subsequently completed a PhD and has been hired as a patient and family engagement in research coordinator at the same hospital.
Conclusion
Patient Engagement 101 — part of PORCCH, an open-access, online curriculum — may be useful in a variety of educational contexts, including graduate curricula, research institute onboarding programs and professional development for researchers who are new to patient-oriented research in child health. In addition, the methods used to create and evaluate PORCCH may help guide the development and evaluation of other online resources. It is our hope that Patient Engagement 101 will help build capacity in patient-oriented research in child health among health care professionals, researchers, trainees and other stakeholders.
Acknowledgement
The authors wish to thank Pathways Training and eLearning for their e-learning support.
Footnotes
Competing interests: None declared.
Contributors: Francine Buchanan and Colin Macarthur are co–senior authors. Both of these authors contributed equally to the work. Catharine Walsh, Nicola Jones, Francine Buchanan and Colin Macarthur conceived and designed the study. Catharine Walsh, Graham McCreath and Veronik Connan acquired the data. Catharine Walsh and Graham McCreath drafted the manuscript. All of the authors analyzed and interpreted the data, revised the manuscript critically for important intellectual content, gave final approval of the version to be published and agreed to be accountable for all aspects of the work.
Data sharing: Study data are available upon request by contacting the corresponding author (Catharine Walsh).
Supplemental information: For reviewer comments and the original submission of this manuscript, please see www.cmajopen.ca/content/10/4/E872/suppl/DC1.
Funding: The study was funded by a Canadian Institutes of Health Research Strategy for Patient-Oriented Research (SPOR) — Patient-Oriented Research Collaboration Grant no. 397481 (matching funds were provided by CHILD-BRIGHT, the BC SUPPORT Unit, the Canadian Child Health Clinician Scientist Program, and the Ontario Child Health SUPPORT Unit). Catharine Walsh holds an Early Researcher Award from the Ontario Ministry of Research and Innovation. The funder had no role in the design and conduct of the study, decision to publish, or preparation, review or approval of the manuscript.
This is an Open Access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY-NC-ND 4.0) licence, which permits use, distribution and reproduction in any medium, provided that the original publication is properly cited, the use is noncommercial (i.e., research or educational use), and no modifications or adaptations are made. See: https://creativecommons.org/licenses/by-nc-nd/4.0/
References