
Evaluation of educational interventions often focuses on immediate or short-term metrics of knowledge and skills acquisition. We developed an educational intervention to support international medical graduates working in rural Victoria. We wanted an evaluation strategy that captured participants' reactions and also considered the transfer of learning to the workplace and the retention of learning. However, with participants in distributed locations and limited program resources, this was likely to prove challenging. We have reported the outcomes of this evaluation elsewhere. In this educational development report, we describe our evaluation strategy as a case study: its underpinning theoretical framework, the strategy itself, and its benefits and challenges. The strategy sought to address issues of program structure, process, and outcomes. We used a modified version of Kirkpatrick's model as a framework to map our evaluation of participants' experiences, their acquisition of knowledge and skills, and their application in the workplace. The predominant benefit was that most of the evaluation instruments allowed personalization of the program. The baseline instruments provided a broad view of participants' expectations, needs, and current perspectives on their role. Immediate evaluation instruments allowed ongoing tailoring of the program to meet learning needs. Intermediate evaluations provided insight into the transfer of learning. The principal challenge was the resource-intensive nature of the evaluation strategy: a dedicated program administrator was required to manage data collection. Although resource-intensive, we recommend baseline, immediate, and intermediate data collection points, with multi-source feedback being especially illuminating. We believe our experiences may be valuable to faculty involved in program evaluations.

Authors: Debra Nestel; Melanie Regan; Priyanga Vijayakumar; Irum Sunderji; Cathy Haigh; Cathy Smith; Alistair Wright

Source: Journal of Educational Evaluation for Health Professions, 2011, Vol. 8


Tags: Educational measurement; Medical education; Medical students