Abstract
Context: A programmatic approach to assessment entails gathering and aggregating ‘rich information’ on candidates to inform progress decisions. However, there is little guidance on how such an approach might be implemented in practice.
Objective: We describe an approach to aggregating rich information across assessment formats to inform committee decision-making in a specialist medical college.
Methods: All examination items (n = 272) were blueprinted against 15 curriculum modules and 7 proficiencies. We developed a six-point holistic rating scale with detailed rubrics outlining the expected performance standard for every item. Examiners used this scale to judge performance on each item, generating rich performance data for each candidate.
Results: A colour-coded ‘mosaic’ of patterns of performance across modules and proficiencies was generated, along with frequency distributions of ratings. These data allowed examiners to visualise candidate performance at a glance and to inform deliberations on borderline candidates. Committee decision-making was facilitated by maintaining the richness of assessment information throughout the process. The data also supported detailed and useful feedback to candidates.
Conclusions: Our study demonstrates that incorporating aspects of programmatic thinking into high-stakes examinations by using a novel approach to aggregating information is a useful first step in reforming an assessment program.
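To make the aggregation step described in the Methods and Results more concrete, the sketch below is a minimal, hypothetical Python illustration. It assumes per-item ratings on a six-point scale, each tagged with a curriculum module and a proficiency, and summarises them into a module-by-proficiency grid (the kind of cell values a colour-coded mosaic could be built from) plus a frequency distribution of ratings. The data, field names, and the median-per-cell rule are illustrative assumptions only, not the authors' implementation.

```python
# Illustrative sketch only: aggregating per-item examiner ratings into a
# module-by-proficiency "mosaic" and a rating frequency distribution.
# All names, data, and the median-per-cell summary are assumptions made
# for illustration; they are not taken from the paper.
from collections import Counter, defaultdict
from statistics import median

# Each blueprinted item carries a module, a proficiency, and the examiner's
# six-point holistic rating (1 = lowest, 6 = highest) for one candidate.
item_ratings = [
    {"module": "Cardiology", "proficiency": "Diagnosis", "rating": 5},
    {"module": "Cardiology", "proficiency": "Management", "rating": 3},
    {"module": "Respiratory", "proficiency": "Diagnosis", "rating": 2},
    {"module": "Respiratory", "proficiency": "Management", "rating": 4},
    {"module": "Cardiology", "proficiency": "Diagnosis", "rating": 6},
]

def build_mosaic(ratings):
    """Group ratings by (module, proficiency) and summarise each cell."""
    cells = defaultdict(list)
    for r in ratings:
        cells[(r["module"], r["proficiency"])].append(r["rating"])
    # One summary value per cell; a colour band could then be mapped to it.
    return {cell: median(values) for cell, values in cells.items()}

def rating_distribution(ratings):
    """Frequency of each point on the six-point scale across all items."""
    counts = Counter(r["rating"] for r in ratings)
    return {point: counts.get(point, 0) for point in range(1, 7)}

if __name__ == "__main__":
    for (module, proficiency), value in sorted(build_mosaic(item_ratings).items()):
        print(f"{module:12s} | {proficiency:10s} | median rating {value}")
    print("Rating frequencies:", rating_distribution(item_ratings))
```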
Original language | English
---|---
Pages (from-to) | 567-574
Number of pages | 8
Journal | Medical Teacher
Volume | 43
Issue number | 5
DOIs |
Publication status | Published - 08 Feb 2021