
Unintended consequences of technology in competency-based education: a qualitative study of lessons learned in an OtoHNS program

Abstract

Background

Formative feedback and entrustment ratings on assessments of entrustable professional activities (EPAs) are intended to support learner self-regulation and inform entrustment decisions in competency-based medical education. Technology platforms have been developed to facilitate these goals, but little is known about their effects on these new assessment practices. This study investigates how users interacted with an e-portfolio in an Otolaryngology-Head & Neck Surgery (OtoHNS) program transitioning to a Canadian approach to competency-based assessment, Competence by Design.

Methods

We employed a sociomaterial perspective on technology and grounded theory methods of iterative data collection and analysis to study this OtoHNS program’s use of an e-portfolio for assessment purposes. All residents (n = 14) and competency committee members (n = 7) participated in the study; data included feedback in resident portfolios, observation of e-portfolio use in a competency committee meeting, and a focus group in which residents explored how they used the e-portfolio and visualized interfaces that would better meet their needs.

Results

Use of the e-portfolio to document, access, and interpret assessment data was problematic for both residents and faculty, but the residents faced more challenges. While faculty were slowed in making entrustment decisions, formative assessments were not actionable for residents. Workarounds to these barriers resulted in a “numbers game” residents played to acquire successful EPA assessments. Themes prioritized needs for searchable, contextual, visual, and mobile aspects of technology design to support use of assessment data for resident learning.

Conclusion

Best practices of technology design begin by understanding user needs. Insights from this study support recommendations for improved technology design centred on learner needs to provide OtoHNS residents a more formative experience of competency-based training.

Introduction

Programmatic assessment in competency-based medical education (CBME) is a multi-faceted approach intended to support the development of competency [1,2,3]. When learners know what they need to do to improve, they can begin to self-regulate their learning [4,5,6]. Formative feedback from frequent assessments of entrustable professional activities (EPAs) should provide learners this growth-oriented information [3,4,5,6]. Entrustment ratings on EPAs also contribute numerous data points to assist competency committees in making summative decisions about learner progression [2, 3, 7,8,9]. However, due to the ongoing nature of EPA assessment in clinical learning settings, there is a burgeoning amount of data to document and analyze. Advocates for CBME have urged the development of e-portfolios to facilitate the aims of programmatic assessment [10,11,12,13]. Are these technologies living up to their promise?

Complex educational reforms require nuanced attention to core values to implement with fidelity in diverse contexts [1, 2, 14, 15]. There is a strong focus in the CBME literature on ways to mitigate human factors that may thwart intended outcomes. Discussions after implementation have shown the need for improvement in change management [1, 12, 14, 15], faculty development [2, 16, 17], and orientation of learners [18, 19]. When it comes to technology, solutions are presented for the human challenges of managing the data generated by EPA assessments [20,21,22,23]. However, little research has explored the role technology may play in creating unintended consequences for CBME [24, 25]. This leaves a blind spot for implementation, because technology often has unpredictable or unintended effects in practice [26,27,28].

In Canada, Otolaryngology-Head & Neck Surgery (OtoHNS) residency programs were among the first to implement Competence by Design (CBD), an approach to competency-based postgraduate medical education mandated by the Royal College of Physicians and Surgeons of Canada. As early adopters, OtoHNS programs were required to use an e-portfolio developed by the Royal College. Since implementation began in 2017, a longitudinal survey of learner experience with CBD has identified unintended effects on residents, including difficulties that residents in OtoHNS and other programs have encountered with use of this e-portfolio [29].

Research that investigates the effects of technology in practice may contribute insight to this problem. For example, Suchman, a pioneer in the field of human-technology interaction, drew on ethnographic methods and sociomaterial perspectives to show that unproductive workarounds are the result of communication barriers between users and technology [27]. A sociomaterial lens draws attention to the agency of material ‘actors’ to position us in practice, acting with and on us [30,31,32]. Following Suchman’s example [27, 28], this study sought an in-depth understanding of the effects of the e-portfolio on assessment practices in a Canadian OtoHNS program.

Methods

This study employed situational analysis, a sociomaterial approach to grounded theory using multiple methods of data collection such as document review, observation, interviews, and visualizations to understand the effects of actors in a practice [33, 34]. Grounded theory is suited to the study of educational situations because it generates understandings of how practice emerges through social processes. A sociomaterial approach to research considers material, such as technology, as an actor in social processes because material has agency in structuring how humans interact with it and with one another. Like other grounded theory methodologies, situational analysis takes an iterative approach to data collection and analysis, seeking conceptual depth by theoretically sampling new data for developing insights [35,36,37]. Themes and relationships in the data are identified to develop an explanatory concept of how human and material actors interact to shape practice in the situation under study [33, 34].

Study context

Since OtoHNS was among the first specialties to implement CBD, we chose a single focal program in which to study new assessment practices, affording a rich picture of the implications for faculty and learners. The OtoHNS program at Western University is mid-sized by Canadian standards, functions as its own academic department, and is a sought-after training program that attracts high calibre residents. The size, organizational structure, and educational conditions of this program made it an ideal ‘critical case sample’ [38] for an in-depth study of a ‘best-case’ scenario.

Participants and data sources

All residents (n = 14) and competency committee members (n = 7) in the program consented to participate in the study. Four residents consented to analysis of anonymized feedback from their e-portfolios. Faculty consented to collection of anonymized observational fieldnotes. Verbatim quotes from members of the competency committee are indicated by S1, S2, and so on in the results. The audio-recorded focus group with residents occurred during an academic day, with 11 residents present. Because some residents contributed to more than one data source, the total number of resident participants was 14. The focus group transcript was anonymized, with residents indicated by numbered references in the results (R1, R2, and so on).

Data collection and analysis

Analysis included the content of EPA feedback in the e-portfolios, the discussion of these data in the competency committee meeting, and the resident focus group. Questions for the focus group concerned use of feedback to self-regulate learning, such as how residents decided who to ask and when to ask for EPA assessments, how they used the feedback to gauge their progression, and how they used the e-portfolio to facilitate this process. Since the observation of the competency committee raised faculty concerns with the design of the e-portfolio, the process of theoretical sampling compelled us to further explore the agency of this technology in the documentation and use of EPA data. As part of the focus group, we asked the residents to draw pictures of e-portfolio interfaces that could improve the experience of using EPA data for learning. Conceptual depth was achieved through ongoing analytic meetings with other members of the research team to consider the adequacy of the data to support the developing theory. We checked for resonance of the findings [38] with the faculty and residents in the program through presentations at department research days both during the study and after it concluded. The figures of e-portfolio screens in the results encapsulate the visual and textual data from the focus group and competency committee observation, prepared with the assistance of a graphic artist.

Results

The e-portfolio created barriers to the assessment practices of documenting, accessing, and interpreting EPA data for both faculty and residents, but the residents faced more challenges. Difficulties collating and visualizing EPA data slowed the capacity of competency committee members to review resident performance in depth. Residents, however, faced obstacles in three areas of e-portfolio use: requesting assessments, receiving formative feedback, and using data to self-regulate learning. The workload of trying to manage EPA data led to unintended workarounds for these barriers, resulting in a “numbers game” (R7) residents played to acquire successful EPA assessments.

The findings are organized to detail each technology barrier and resulting workarounds. Figure 1 illustrates themes in the data that prioritized needs for searchable, contextual, visual, and mobile technology solutions to overcome these challenges.

Fig. 1. Envisioning a resident dashboard

Requesting EPA assessments

Residents described difficulties using the e-portfolio to request EPA assessments that mapped onto the procedures they were learning. One commented, “the stem of the problem is, the website is not friendly, EPAs are not designed properly” (R8). Another explained, “you have to find which is the appropriate EPA, which is not always very obvious. For example, this one about hearing loss is actually the tube insertion one” (R2). The design visualizations of improved interfaces showed that how an EPA is structured in the curriculum plan matters for how residents search for it in the e-portfolio. Their sketches illustrated that organizing EPAs in the database in the same way they are laid out in their program curriculum map, by level of training and rotation, would helpfully narrow the search field to “the appropriate EPA”. The searchable schematic in Fig. 1 encapsulates these ideas.
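
To make the residents’ searchable design concrete, the following Python sketch filters a small EPA catalogue by training stage and rotation before applying a keyword search, mirroring the curriculum map. The EPA codes, titles, and stage/rotation mappings are illustrative assumptions, not the actual Royal College catalogue or e-portfolio interface.

from dataclasses import dataclass

@dataclass
class EPA:
    code: str      # hypothetical EPA identifier
    title: str
    stage: str     # stage of training, e.g. "Foundations of Discipline"
    rotation: str  # rotation where the EPA is typically observed

# Illustrative catalogue entries only.
CATALOGUE = [
    EPA("F-2", "Assessing pediatric hearing loss and performing tube insertion",
        "Foundations of Discipline", "Otology"),
    EPA("C-1", "Performing tonsillectomy and adenoidectomy",
        "Core of Discipline", "Pediatric OtoHNS"),
    EPA("C-5", "Managing acute airway obstruction",
        "Core of Discipline", "Head & Neck"),
]

def find_epas(stage: str, rotation: str, keyword: str = "") -> list[EPA]:
    """Narrow the search field the way residents sketched it: first by stage
    and rotation (as laid out in the curriculum map), then by keyword."""
    return [
        epa for epa in CATALOGUE
        if epa.stage == stage
        and epa.rotation == rotation
        and keyword.lower() in epa.title.lower()
    ]

if __name__ == "__main__":
    for epa in find_epas("Foundations of Discipline", "Otology", "hearing"):
        print(epa.code, "-", epa.title)

Filtering by curriculum position first would mean a resident searching during an otology rotation sees only the handful of EPAs that could apply, rather than the full specialty list.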

Residents also felt hampered by the extra work required to manage EPA feedback requests and notifications within the e-portfolio: “You hit request, and then it generates a note to the staff who then sees, ‘Request for EPA Number 3.7’… they don’t get any other information than that” (R2). To prompt faculty to recall the case for feedback, the residents developed communicative workarounds: “you have to either communicate that to them in person and they have to remember, or you have to send a separate email telling them, hey, I’m sending you an EPA, it’s about patient X, Y, Z” (R2). Residents agreed that this tactic of sending extra emails and reminders was essential both to ensure that faculty understood which procedure they were being asked to assess and to prompt them to complete the assessment documentation. However, while residents had to shoulder the workload of filling in contextual gaps for faculty, they faced the same problem of lack of context when receiving feedback from the system.

Receiving formative feedback

The residents described the feedback notifications as “generic”, which made it difficult to remember which cases they related to.

It’s just a very generic email so, it doesn’t say what you’re being evaluated, it just says so-and-so completed an evaluation on your form. You have to think, 30 days later, you have to think about what did you send them? (R7)

The default setting of up to “30 days” from sending a request to receiving feedback was intended to allow faculty time to complete assessments. However, both the delay of information and lack of context rendered the feedback uninformative, as the following conversation between residents highlights:

R4: I, over time, I just stopped reading the EPA feedback… I mean, just delete it from my inbox. I guess, yeah, it doesn’t tell me the exact contextual context. I know that I achieved it when I asked so I don’t read the feedback for it.

Residents gave two reasons why feedback documented in the e-portfolio did not support their learning. First, as Resident 7 pointed out, “30 days later” the assessment served a purely summative, “black or white”, purpose. Residents found the feedback lacked specificity on ways to improve performance. Our own analysis of EPA feedback in the residents’ e-portfolios confirmed that details were scant: “good economy of motion”, “good understanding of relevant anatomy”, “knew all components of the procedure”.

But there was another reason feedback was uninformative. Resident 4’s commentary on why emails with feedback notifications were deleted is telling: “I know that I achieved it when I asked for it”. The residents described a process of waiting until they were reasonably assured they would receive a ‘successful’ entrustment rating before sending feedback requests for an EPA: “When?—confident on procedure, never first-time doing procedure” (R1).

It might seem that residents were hesitant to receive more improvement-focussed assessment. However, when residents were asked how they decided who to ask and when to ask for EPA assessments, their answers pointed to a tactic of efficiency rather than avoidance of constructive feedback:

It’s basically a numbers game. You’re like, are you going to send one out that’s got 30 days and then you’re going to have to re-request it? Probably not. It’s not a good use of your time, it’s not a good use of their time. (R7)

Given the high stakes of the “numbers game” for progression, it might also seem that residents would seek feedback strategically from faculty known to give higher entrustment ratings. However, as the following exchange outlines, this was not the case.

R11: I would say, I just ask based on people who I know will get it back to me and people who are willing to do it.

R2: And most staff will really only happily do one EPA for you a day.

To protect their own time and faculty time amid the workload of CBD, residents managed the numbers game by deciding who to ask based on faculty approachability and efficiency, and by deciding when to ask once reasonably confident of success. The contextual features in the second schematic in Fig. 1 show how the workload of managing the numbers game could be reduced, and the value of feedback increased, by a technology design that includes contextual details in feedback requests.
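
As a hypothetical illustration of what such contextual detail could look like, the sketch below bundles the case information residents currently relay by separate email into the request record itself, so that the notification names the case rather than only an EPA number. The field names and rendering are assumptions for illustration, not the data model of the Royal College e-portfolio.

from dataclasses import dataclass
from datetime import date

@dataclass
class EPARequest:
    """A hypothetical request record carrying the case context that
    residents currently supply through separate emails and reminders."""
    epa_code: str
    resident: str
    assessor: str
    case_date: date
    procedure: str              # e.g. "Myringotomy and tube insertion"
    setting: str                # e.g. "Main OR, day surgery list"
    note_to_assessor: str = ""  # brief free-text reminder of the case

def notification_text(req: EPARequest) -> str:
    """Render a notification that identifies the case, not just the EPA number."""
    note = req.note_to_assessor or "none"
    return (f"{req.resident} requests feedback on EPA {req.epa_code}: "
            f"{req.procedure} ({req.setting}, {req.case_date:%d %b %Y}). "
            f"Note: {note}")

if __name__ == "__main__":
    req = EPARequest("F-2", "Resident A", "Dr. B", date(2023, 5, 2),
                     "Myringotomy and tube insertion", "Main OR, day surgery list",
                     "Left ear, recurrent effusions")
    print(notification_text(req))

A notification built this way would also give faculty enough context to complete the assessment in the moment or shortly afterwards, rather than relying on recall up to 30 days later.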

Using assessment data to self-regulate learning

Faculty and residents also shared that the technology design made it difficult to track progression towards entrustment on EPAs. In the CBD curriculum, EPAs have a number of components; for example, entrustment of the tonsillectomy EPA required a variety of patient indications and multiple assessment observations. In the competency committee meeting, members had to toggle between different screens to see how many EPA observations were complete, which contextual variables were incomplete, and to read the feedback on the observations. Faculty struggled to interpret this data holistically. The following exchange between two committee members indicates that faculty understood accessibility of EPA data was a problem for the residents as well:

S2: They need a system for logging themselves—they can’t see them, we struggle because we’re flipping back and forth between screens.

S6: If they could just have a personal dashboard so they know! It’s so hard to keep track.

The problem residents faced in tracking their progression was piecing contextual variables together, finding opportunities on different days to complete EPA requirements. The challenge to “keep track” was compounded by the absence of interpretive details in the reporting system.

As Resident 11 explained, notifications of a “pass” for an EPA assessment did not provide information on which contextual variables “contributed to your pass”. The schematic for visual features in Fig. 1 illustrates the residents’ requests for a more accessible snapshot of EPAs in progress and more informative metrics on completion of contextual variables.
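
As a hypothetical illustration of the kind of metric residents asked for, the sketch below tallies logged observations against the contextual variables an EPA requires, so a dashboard could show at a glance which variables still need observations. The completion rule is an assumption for illustration, not the actual CBD requirement for any EPA.

from collections import Counter

# Hypothetical completion rule for one EPA: observations required
# per contextual variable (e.g. patient indication).
REQUIRED = {"infection": 2, "obstruction": 2, "other": 1}

def progress(observed_contexts: list[str]) -> dict[str, tuple[int, int]]:
    """Return (achieved, required) per contextual variable so the
    dashboard can flag what remains outstanding."""
    counts = Counter(observed_contexts)
    return {ctx: (min(counts[ctx], need), need) for ctx, need in REQUIRED.items()}

if __name__ == "__main__":
    # Observations logged so far for, say, a tonsillectomy EPA (illustrative).
    logged = ["infection", "infection", "obstruction"]
    for ctx, (achieved, need) in progress(logged).items():
        print(f"{ctx}: {achieved}/{need}")

Surfacing these counts alongside each “pass” notification would tell a resident not only that an observation counted, but which contextual variable it contributed to and what remains.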

Discussion

Barriers posed by the e-portfolio to requesting EPA assessments, receiving actionable feedback, and self-regulating learning had unintended consequences for the residents in this study. The formative intentions of CBME did not translate to practice, and the work of achieving EPAs was reduced to a numbers game for acquiring successful observations before assessment requests timed out in the system. While the burden of workarounds to the design of the interface fell mainly on the residents, their experiences struggling to access and interpret EPA data were shared by the competence committee. This study resonates with research on workload for a competence committee at a different institution [24]. While that committee used a different technology interface, similar issues with visualization of data slowed its capacity to review trainee performance. The Royal College’s own program evaluation of CBD confirms these findings, taking note of calls to improve the design of e-portfolios [39]. However, to avoid the quick fixes of ‘solutionism’ [26, 40], it is essential to understand that the unintended consequences of technology in this case are more than a problem of workload. Insights from our study support recommendations for enhanced design of technology to improve the quality and accessibility of EPA data for learning.

For example, lack of context in EPA requests and notifications posed the largest obstacle to assessment workflow for residents. The time delays this created, and the communicative workarounds needed to fill in contextual gaps, led to efficiency strategies for acquiring EPA assessments that had little to do with seeking growth-oriented feedback. While improved search functions, contextual details, and push notifications would ease the workload of CBD, we suggest that mobile designs for EPA assessment could go furthest to address the problem of formative assessment. A possible productive workaround using the current version of the e-portfolio is to have residents pre-fill EPA assessments to the point that faculty feedback can be documented in the moment. However, in an example of truly mobile design for CBME, researchers found that an assessment app significantly increased context specificity and engagement with feedback [21]. A pilot study of a mobile assessment app in an OtoHNS program likewise demonstrates the feasibility of this approach for operative settings [41].

We also showed that lack of clarity in how EPA data was displayed in the e-portfolio limited the ability of residents to monitor their progression. The problem of human–machine communication is well known [27]. Visual dashboards have been developed as a sign language of sorts to facilitate this interaction [42, 43], and user design has emerged as an important field relying on qualitative research methods to empathize with user needs and optimize solutions [20,21,22,23, 44]. In a noteworthy example in the CBD context, researchers co-designed a visual dashboard with residents to support self-regulated learning through improved functions to access and interpret their EPA data [23].

Since best practices of user design begin by understanding user needs, this raises a critical issue. The needs of competence committees making entrustment decisions are different from the needs of learners for assessment data that can support their learning. We see this in research focused on meeting competence committee needs [10, 20, 24], and in our own experience. Due to the challenges with the Royal College’s e-portfolio, our institution is developing a different technology solution and has prioritized design of a visual dashboard for competence committees. Designing technology to meet competence committee needs first may have the unintended consequence of raising the stakes of summative assessment over formative assessment. This priority may communicate to residents that what matters most is efficient acquisition of ‘successful’ EPAs. If we value the importance of learner-centred medical education [1, 4, 6] and the role that formative feedback plays in this process [3,4,5,6], then we must design solutions for CBME that prioritize assessment for learners. This resonates with the recent Royal College Resident Pulse Check 2022 [19], which documented issues with electronic portfolios and workplace-based EPA assessment. The impact of EPA acquisition on resident wellness is an immediate priority arising from that document, and our observations highlight some feasible solutions.

Limitations

This research employs a purposefully small sample of a particular context for competency-based medical education. While this allows for in-depth analysis, transferability is limited to programs with similar contexts. Additionally, focus groups may converge on similar experiences, which can exclude disconfirming data. Research using individual interviews could provide further confirmation.

Conclusion

Competency-based medical education has the potential to improve residency training through better feedback and entrustment practices, and technology plays a key role in managing assessment data to support these goals of programmatic assessment. However, this study of an OtoHNS program transitioning to a competency-based curriculum demonstrates that technology design may unintentionally obstruct these purposes by making EPA data difficult for faculty and residents to document, access, and interpret. We also showed that the challenges for residents can have a weightier impact, increasing their workload and making it more difficult to self-regulate their learning. This study provides insight into how technology design centred on learner needs could provide residents a more formative experience of competency-based training.

Availability of data and materials

Due to the identifiable nature of handwriting in the visual data and setting details included in the unredacted focus group transcript, we have not made these data publicly available. Instead, we have opted to use representative images and anonymized quotes in the results.

References

  1. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J, et al. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019;94(7):1002–9. https://doi.org/10.1097/ACM.0000000000002743.

  2. Carraccio C, Martini A, Van Melle E, Schumacher DJ. Identifying core components of EPA implementation: a path to knowing if a complex intervention is being implemented as intended. Acad Med. 2021;96(9):1332–6. https://doi.org/10.1097/ACM.0000000000004075.

  3. Lockyer J, Carraccio C, Chan M-K, Hart D, Smee S, Touchie C, et al. Core principles of assessment in competency-based medical education. Med Teach. 2017;39(6):609–16. https://doi.org/10.1080/0142159X.2017.1315082.

  4. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, Harris P, Glasgow NJ, Campbell C, Dath D, Harden RM, Iobst W, Long DM, Mungroo R, Richardson DL, Sherbino J, Silver I, Taber S, Talbot M, Harris KA. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–45. https://doi.org/10.3109/0142159X.2010.501190.

  5. Swing SR; International CBME Collaborators. Perspectives on competency-based medical education from the learning sciences. Med Teach. 2010;32(8):663–8. https://doi.org/10.3109/0142159X.2010.500705.

  6. Dagnone JD, Bandiera G, Harris K. Re-examining the value proposition for competency-based medical education. Can Med Educ J. 2021;12(3):155–8. https://doi.org/10.36834/cmej.68245.

  7. ten Cate O, Schwartz A, Chen HC. Assessing trainees and making entrustment decisions: on the nature and use of entrustment-supervision scales. Acad Med. 2020;95(11):1662–9. https://doi.org/10.1097/ACM.0000000000003427.

  8. Oudkerk Pool A, Govaerts MJB, Jaarsma DADC, Driessen EW. From aggregation to interpretation: how assessors judge complex data in a competency-based portfolio. Adv in Health Sci Educ. 2018;23(2):275–87. https://doi.org/10.1007/s10459-017-9793-y.

  9. Pack R, Lingard L, Watling CJ, Chahine S, Cristancho SM. Some assembly required: tracing the interpretative work of clinical competency committees. Med Educ. 2019;53(7):723–34. https://doi.org/10.1111/medu.13884.

  10. Friedman KA, Raimo J, Spielmann K, Chaudhry S. Resident dashboards: helping your clinical competency committee visualize trainees’ key performance indicators. Med Educ Online. 2016;21:29838. https://doi.org/10.3402/meo.v21.29838.

  11. Sonnenberg LK, von Hauff P, Lemieux L. Electronic portfolios for assessment in your postgraduate medical education program: essential questions to ask when selecting a platform for competency-based medical education (CBME). MedEdPublish. 2017;6(2). https://doaj.org/article/a18061a0017443beb37882897fc9e3fe. Accessed 7 Feb 2022.

  12. Frank JR, Snell L, Englander R, Holmboe ES, ICBME Collaborators. Implementing competency-based medical education: moving forward. Med Teach. 2017;39(6):568–73. https://doi.org/10.1080/0142159X.2017.1315069.

  13. Peters H, Holzhausen Y, Boscardin C, ten Cate O, Chen HC. Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Med Teach. 2017;39(8):802–7. https://doi.org/10.1080/0142159X.2017.1331031.

  14. Hall AK, Rich J, Dagnone JD, et al. It’s a marathon, not a sprint: rapid evaluation of competency-based medical education program implementation. Acad Med. 2020;95(5):786–93. https://doi.org/10.1097/ACM.0000000000003040.

  15. Dagnone JD, Chan MK, Meschino D, Bandiera G, den Rooyen C, Matlow A, McEwen L, Scheele F, St CR. Living in a world of change: bridging the gap from competency-based medical education theory to practice in Canada. Acad Med. 2020;95(11):1643–6. https://doi.org/10.1097/ACM.0000000000003216.

  16. Englander R, Frank JR, Carraccio C, Sherbino J, Ross S, Snell L, ICBME Collaborators. Toward a shared language for competency-based medical education. Med Teach. 2017;39(6):582–7. https://doi.org/10.1080/0142159X.2017.1315066.

  17. Carraccio C, Englander R, Van Melle E, Ten Cate O, Lockyer J, Chan MK, Frank JR, Snell LS, ICBME Collaborators. Advancing competency-based medical education: a charter for clinician-educators. Acad Med. 2016;91(5):645–9. https://doi.org/10.1097/ACM.0000000000001048.

  18. Hall J, Oswald A, Hauer KE, Hall AK, Englander R, Cheung WJ, ICBME Collaborators. Twelve tips for learners to succeed in a CBME program. Med Teach. 2021;43(7):745–50. https://doi.org/10.1080/0142159X.2021.1925233.

  19. Resident Doctors of Canada and Royal College of Physicians and Surgeons. Competence by design: resident pulse check report executive summary. 2022. https://www.royalcollege.ca/rcsite/cbd/cbd-program-evaluation-e

  20. Thoma B, Bandi V, Carey R, Mondal D, Woods R, Martin L, et al. Developing a dashboard to meet competence committee needs: a design-based research project. Can Med Educ J. 2020;11(1):e16-34. https://doi.org/10.36834/cmej.68903.

  21. Young JQ, Sugarman R, Schwartz J, McClure M, O’Sullivan PS. A mobile app to capture EPA assessment data: utilizing the consolidated framework for implementation research to identify enablers and barriers to engagement. Perspect Med Educ. 2020;9(4):210–9. https://doi.org/10.1007/s40037-020-00587-z.

  22. Hsiao CT, Chou FC, Hsieh CC, Chang LC, Hsu CM. Developing a competency-based learning and assessment system for residency training: analysis study of user requirements and acceptance. J Med Internet Res. 2020;22(4):e15655. https://doi.org/10.2196/15655.

  23. Carey R, Wilson G, Bandi V, et al. Developing a dashboard to meet the needs of residents in a competency-based training program: a design-based research project. Can Med Ed J. 2020;11(6):e31–45. https://doi.org/10.36834/cmej.69682.

  24. Alberhamy Y, Connell ER, Ladonna K, LeBlanc V, Chaput AJ, Hohnson C, Watterson J, Seabrook C, Rangel C. Evaluation of technology platforms: competence committee’s perspectives and needs. Paper presented at the CBME Evaluation Summit [virtual], September 22, 2020. https://www.kaltura.com/index.php/extwidget/preview/partner_id/1688662/uiconf_id/22517242/entry_id/1_s9cdk0g1/embed/dynamic? Accessed 7 Feb 2022.

  25. Sebok-Syer SS, Goldszmidt M, Watling CJ, Chahine S, Venance SL, Lingard L. Using electronic health record data to assess residents’ clinical performance in the workplace: the good, the bad, and the unthinkable. Acad Med. 2019;94(6):853–60. https://doi.org/10.1097/ACM.0000000000002672.

  26. Nerland M, Hasu M. Challenging the belief in simple solutions: the need for epistemic practices in professional work. Med Educ. 2021;55(1):65–71. https://doi.org/10.1111/medu.14294.

  27. Suchman LA. Plans and situated actions: the problem of human–machine communication. Cambridge: Cambridge University Press; 1987.

  28. Suchman LA. Human–machine reconfigurations: plans and situated actions. 2nd ed. Cambridge: Cambridge University Press; 2007.

  29. Fédération des médecins résidents du Québec. Year 3 of implementation of competence by design: negative impact still outweighs theoretical benefits. July 2020. https://fmrq.qc.ca/files/documents/af/93/fmrq-report-cbd-implementation-year-3_1.pdf. Accessed 19 May 2021.

  30. Latour B. Reassembling the social: an introduction to actor-network-theory. Oxford University Press; 2005.

  31. Fenwick T. Sociomateriality in medical practice and learning: attuning to what matters. Med Educ. 2014;48(1):44–52. https://doi.org/10.1111/medu.12295.

  32. MacLeod A, Ajjawi R. Thinking sociomaterially: why matter matters in medical education. Acad Med. 2020;95(6):851–5. https://doi.org/10.1097/ACM.0000000000003143.

  33. Clarke AE. Situational analysis: grounded theory after the postmodern turn. Thousand Oaks: Sage Publications; 2005.

  34. Washburn R, Clarke AE, Friese C. Situational analysis in practice: mapping research with grounded theory. Walnut Creek: Routledge; 2016.

  35. Apramian T, Cristancho S, Watling C, Lingard L. (Re)Grounding grounded theory: a close reading of theory in four schools. Qual Res. 2017;17(4):359–76. https://doi.org/10.1177/1468794116672914.

  36. Nelson J. Using conceptual depth criteria: addressing the challenge of reaching saturation in qualitative research. Qual Res. 2017;17(5):554–70. https://doi.org/10.1177/1468794116679873.

  37. Mills J, Bonner A, Francis K. The development of constructivist grounded theory. Int J Qual Methods. 2006;5(1):25–35. https://doi.org/10.1177/160940690600500103.

  38. Kuper A, Lingard L, Levinson W. Critically appraising qualitative research. BMJ. 2008;337:a1035. https://doi.org/10.1136/bmj.a1035.

  39. CBD Program Evaluation Operations Team. Competence by design (CBD) implementation pulse check. 2020. Royal College of Physicians and Surgeons of Canada. https://www.royalcollege.ca/rcsite/documents/cbd/cbd-pulse-check-annual-report-2020-e.pdf Accessed 7 Feb 2022.

  40. Ajjawi R, Eva KW. The problem with solutions. Med Educ. 2021;55(1):2–3. https://doi.org/10.1111/medu.14413.

  41. Chen JX, Kozin E, Bohnen J, George B, Deschler DG, Emerick K, Gray ST. Assessments of otolaryngology resident operative experiences using mobile technology: a pilot study. Otolaryngol Head Neck Surg. 2019;161(6):939–45. https://doi.org/10.1177/0194599819868165.

  42. Sedig K, Parsons P. Design of visualizations for human-information interaction: a pattern-based framework. Morgan & Claypool Publishers; 2016.

  43. Ruiz J, Serral E, Snoeck M. Unifying functional user interface design principles. Int J Hum Comput Interact. 2021;37(1):47–67. https://doi.org/10.1080/10447318.2020.1805876.

  44. Lee Y. Design participation tactics: the challenges and new roles for designers in the co-design process. CoDesign. 2008;4(1):31–5. https://doi.org/10.1080/15710880701875613.

Acknowledgements

The authors gratefully acknowledge the time and insights contributed by the faculty and residents in this study. We also acknowledge graphic design work by Sandra Poczobut for the figures (sandra.poczobut@gmail.com).

Funding

This research was funded by a Royal College Education Research Grant (Number: 17-SRP-09) to support data collection, analysis, and dissemination.

Author information

Contributions

TA and KR conceptualized the study and applied for funding. MO conducted data collection and preliminary analysis and drafted the manuscript. SC and KR also contributed to data analysis as the theory was refined. All authors contributed to revisions to the manuscript and take responsibility for the conduct of the study and findings.

Corresponding author

Correspondence to Mary Ott.

Ethics declarations

Ethics approval and consent to participate

The study was approved by Western University’s Research Ethics Board. The consent process was conducted by a member of the research team with no supervisory relationship to the participants.

Consent for publication

Consent to include anonymized quotes from participants in publications was part of the research consent process.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Ott, M., Apramian, T., Cristancho, S. et al. Unintended consequences of technology in competency-based education: a qualitative study of lessons learned in an OtoHNS program. J of Otolaryngol - Head & Neck Surg 52, 55 (2023). https://doi.org/10.1186/s40463-023-00649-2
