Rasch Analysis to Examine and Validate First-Year Chemistry Exams, and Provide Individualised Feedback to Students on Their Subject Mastery

Research output: Contribution to conference › Abstract › peer-review


Abstract

Do you use multiple-choice tests? How do you know whether a test balances difficulty against coverage of material? Do male students perform better than their female counterparts? Is the exam valid for both DE and internal students? Should the same exam be given to every course cohort?

This presentation reports on a Rasch analysis (Rasch, 1960) performed on two final exams for first-year, service-taught chemistry subjects. These subjects targeted students of vastly different educational backgrounds, but the exams were designed and proofed by the same team of academics.

Using this approach, we were able to assess the performance of the students as well as that of the exam questions (Bond and Fox, 2007). We identified misfitting students and questions in both exams.

We were able to ascertain that one of the exams would benefit from removing several questions, and that the other would benefit from integrating one or several “harder” questions. Similarly, we were able to review the performance of students by gender, compare students from different courses, and, in one subject, compare their mode and location of study. This information will be presented in a graphical and non-mathematical manner.

Importantly, we were pleased that, for both exams, the performance of students correlated well with their grades. We observed, however, evidence of “guessing” in these exams.

We suggest that the use of Rasch analyses yields important information regarding the performance of both exams and students. From a student’s perspective, we argue that these analyses can provide student-specific feedback. For academics, we propose that this method can contribute towards banking validated exam questions for the purpose of randomly populated online exams. We suggest, however, that these data be used in combination with other data analytics, such as CSU Interact reports, to enrich the subject coordinator’s reflections. This is because statistical analyses describe only the “performance” of students, while the Learning Environment data provide “behavioural”, and therefore valuable complementary, information.
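For readers unfamiliar with the method, the dichotomous Rasch model places each student’s ability and each question’s difficulty on the same logit scale, so both can be assessed from the same response matrix. The sketch below is an illustrative toy fit on simulated data, assuming a simple joint maximum-likelihood scheme; it is not the analysis used in this study, which would typically rely on dedicated Rasch software with proper fit statistics.

```python
import math
import random

def rasch_prob(theta, b):
    """Probability of a correct response under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Simulate 200 students answering 3 items of known difficulty (easy/medium/hard).
random.seed(0)
true_b = [-1.0, 0.0, 1.0]
abilities = [random.gauss(0.0, 1.0) for _ in range(200)]
resp = [[1 if random.random() < rasch_prob(t, d) else 0 for d in true_b]
        for t in abilities]

# Toy joint maximum-likelihood estimation by alternating gradient ascent.
theta = [0.0] * len(resp)        # person abilities
b = [0.0] * len(true_b)          # item difficulties
lr = 0.1
for _ in range(300):
    for i, row in enumerate(resp):               # update abilities
        grad = sum(x - rasch_prob(theta[i], b[j]) for j, x in enumerate(row))
        theta[i] += lr * grad
    for j in range(len(b)):                      # update difficulties
        grad = sum(rasch_prob(theta[i], b[j]) - resp[i][j]
                   for i in range(len(resp)))
        b[j] += lr * grad / len(resp)
    mean_b = sum(b) / len(b)                     # anchor mean difficulty at 0
    b = [x - mean_b for x in b]

print("estimated item difficulties:", [round(x, 2) for x in b])
```

On this simulated data the estimated difficulties recover the easy-to-hard ordering of the three items; in practice, person and item estimates that fit the model poorly (misfit) flag questions worth reviewing.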
Original language: English
Publication status: Published - 2016
Event: RACI Chemical Education Symposium 2016 - Monash University, Melbourne, Australia
Duration: 31 Mar 2016 – 01 Apr 2016
https://www.raci.org.au/document/item/2255 (Conference program)
https://www.raci.org.au/events/event/chemical-education-symposium-2016-2 (Conference website)

Conference

Conference: RACI Chemical Education Symposium 2016
Country/Territory: Australia
City: Melbourne
Period: 31/03/16 – 01/04/16
Other: This symposium has two aims: to showcase some of the innovative thinking around chemistry education in Australia, and to facilitate rich discussion across institutions. The second aim will be achieved by using an open program, spearheaded by four key themes, and an intensive workshop focusing on meaningful research in chemistry education.
