Objectives: To explore inter-rater agreement between reviewers comparing the reliability and validity of checklist forms that claim to assess the communication skills of undergraduate medical students in Objective Structured Clinical Examinations (OSCEs).

Methods: Papers describing the rubrics of OSCE checklist forms were identified from PubMed, Embase, PsycINFO, and the ProQuest Education Databases up to 2013. Studies reporting empirical validity or reliability values for the communication skills assessment checklists used were included; papers that did not report reliability or validity were excluded.

Results: Papers focusing on generic communication skills, history taking, physician-patient communication, interviewing, negotiating treatment, information giving, empathy, and 18 other domains (ICC = 0.12-1) were identified. Regarding the validity and reliability of the communication skills checklists, agreement between reviewers was 0.45.

Conclusions: Heterogeneity in the rubrics used to assess communication skills, together with a lack of agreement between reviewers, makes comparison of student competences within and across institutions difficult.

Practice implications: Consideration should be given to adopting a standardized measurement instrument for assessing communication skills in undergraduate medical education. Future research will focus on evaluating the potential impact of adopting such an instrument.

(C) 2015 Elsevier Ireland Ltd. All rights reserved.