Conference Publication Details
Mandatory Fields
Kropmans T, Magnus H, Morrison J, O'Leary E, Kennedy KM
European Board of Medical Assessors Annual European Conference on Assessment in Medical Education
Sharing reliable and valid OSCE stations so as to improve cross-institutional assessment strategies: Are we equipped for it?
2017
November
Published
1
Optional Fields
Egmond aan Zee, The Netherlands
14-NOV-17
15-NOV-17
Sharing reliable and valid OSCE stations so as to improve cross-institutional assessment strategies: Are we equipped for it?
Kropmans T1,4, Magnus H2, Morrison J1, O'Leary E3, Kennedy KM1
1 School of Medicine, College of Medicine, Nursing & Health Sciences, National University of Ireland Galway. 2 Umeå University, Umeå, Sweden. 3 University College Cork, Cork, Ireland. 4 CEO R&D, Qpercom Ltd.
Background
Sharing quality assured assessment outcomes and reliable and valid OSCE stations, in an integrated fashion throughout Europe, has the potential for considerable mutual benefit, yet it is rarely undertaken.
Method
European institutions that use an electronic OSCE Management Information System (OMIS) were invited to participate. Written informed consent was obtained from each institution, thereby respecting existing mutual non-disclosure agreements. Institutions that embraced the idea of sharing quality assured assessment results from OSCE stations were included in this study; those that declined to participate were excluded. Mixed methods were used to compare quantitative and qualitative assessment outcomes in terms of quality assurance measures, including pass mark/standard setting, internal consistency (Cronbach's alpha, CA), generalisability coefficients, the standard error of measurement (SEM) and station goals/names.
Results
Eight of twelve institutions shared their penultimate (pre-final) year clinical skills assessment results. Student numbers varied from 50 to 250 within the 8 EU institutions, and pass marks ranged from 50% to 86%, with a median pass mark of 70%. Internal consistency (CA) of the assessments varied from 0.02 (single item) to 0.98, with a median CA of 0.65, and G-coefficients varied from 0.42 to 0.87 within and between institutions. The SEM around the observed scores varied from 4% to 15% on a scale from 0 to 100%.
Discussion
Medical educators strive to develop the best possible assessment criteria. This comparison of EU clinical skills assessments may open the opportunity for widespread sharing of valid and reliable assessment stations among participating institutions. Quality assured assessment outcomes vary widely within and across EU institutions. More emphasis on transparent outcome (big data) analysis is suggested in order to transform and open up mutual EU clinical skills assessment strategies.
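For readers unfamiliar with the quality assurance measures compared above, the two reliability statistics can be illustrated with a short sketch. This is not the study's actual analysis code, and the station data below are entirely hypothetical; it only shows the standard formulas: Cronbach's alpha = k/(k-1) x (1 - sum of item variances / variance of total scores), and SEM = SD(total) x sqrt(1 - reliability).

```python
import math
import statistics

def cronbach_alpha(scores):
    """Internal consistency of one station.
    scores: one row per student, one column per checklist item."""
    k = len(scores[0])                                  # number of items
    items = list(zip(*scores))                          # transpose: per-item score lists
    item_vars = sum(statistics.pvariance(it) for it in items)
    totals = [sum(row) for row in scores]
    return k / (k - 1) * (1 - item_vars / statistics.pvariance(totals))

def sem(scores, reliability):
    """Standard error of measurement around observed total scores."""
    totals = [sum(row) for row in scores]
    return statistics.pstdev(totals) * math.sqrt(1 - reliability)

# Hypothetical station: 5 students, 4 checklist items (scores in % points)
data = [
    [20, 18, 15, 22],
    [15, 14, 12, 18],
    [22, 20, 18, 24],
    [10, 12,  9, 14],
    [18, 16, 14, 20],
]
alpha = cronbach_alpha(data)
print(f"Cronbach's alpha: {alpha:.2f}")
print(f"SEM: {sem(data, alpha):.1f} % points")
```

A single-item station makes total-score variance equal the item variance, driving alpha toward zero, which is one plausible reading of the 0.02 (single item) lower bound reported in the Results.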
Grant Details
Publication Themes