Abstract
BACKGROUND: A major challenge for assessing students’ conceptual understanding of STEM subjects is the capacity of assessment tools to reliably and robustly evaluate student thinking and reasoning. Multiple-choice tests are typically used to assess student learning and are designed to include distractors that can indicate students’ incomplete understanding of a topic or concept, based on which distractor the student selects. However, these tests fail to provide the critical information uncovering the how and why of students’ reasoning behind their multiple-choice selections. Open-ended or structured-response questions are one method for capturing higher-level thinking, but they are often costly in the time and attention required to properly assess student responses.

PURPOSE: The goal of this study is to evaluate methods for automatically assessing open-ended responses, e.g. students’ written explanations and reasoning for their multiple-choice selections.

DESIGN/METHOD: We incorporated an open-response component into an online signals and systems multiple-choice test to capture written explanations of students’ selections. The effectiveness of an automated approach for identifying and assessing student conceptual understanding was evaluated by comparing the results of lexical analysis software packages (Leximancer and NVivo) with expert human analysis of student responses. To understand and delineate the process for effectively analysing text provided by students, the researchers evaluated strengths and weaknesses of both the human and automated approaches.

RESULTS: Human and automated analyses revealed both correct and incorrect associations for certain conceptual areas for some questions, associations that were not anticipated or included in the distractor selections, showing how multiple-choice questions alone fail to capture a comprehensive picture of student understanding. The comparison of textual analysis methods revealed the capability of automated lexical analysis software to assist in identifying concepts and their relationships in large textual data sets. We also identified several challenges in using automated analysis, as well as in manual and computer-assisted analysis.

CONCLUSIONS: This study highlighted the usefulness of incorporating and analysing students’ reasoning or explanations to understand how students think about certain conceptual ideas. The ultimate value of automating the evaluation of written explanations is that it can be applied more frequently and at various stages of instruction to formatively evaluate conceptual understanding and engage students in reflective learning.
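The paper itself does not include code; as a rough illustration of the kind of automated lexical analysis described above (surfacing concepts and their co-occurrence relationships in a set of written explanations), the following self-contained Python sketch counts content words and within-response word pairs. The sample responses, stop-word list, and function names are invented for illustration and are not taken from the study; tools such as Leximancer and NVivo build far richer concept models than this.

```python
import re
from collections import Counter
from itertools import combinations

# Hypothetical sample of students' written explanations (not data from the study).
responses = [
    "The output is scaled because convolution with an impulse shifts and scales the signal",
    "I chose B because the system is linear so superposition applies to the two inputs",
    "Time shifting the input only shifts the output, the frequency content does not change",
    "Convolution in time is multiplication in frequency, so the spectrum is the product",
]

# Minimal stop-word list; real lexical analysis software uses far richer language models.
STOPWORDS = {"the", "is", "a", "so", "to", "and", "with", "in", "of",
             "because", "i", "does", "not", "only", "two", "b"}

def concepts(text):
    """Return the set of content words (candidate concepts) in one response."""
    words = re.findall(r"[a-z]+", text.lower())
    return {w for w in words if w not in STOPWORDS}

# Count how often each concept appears, and how often pairs of concepts co-occur
# within the same explanation (a crude stand-in for concept relationships).
concept_counts = Counter()
pair_counts = Counter()
for response in responses:
    terms = concepts(response)
    concept_counts.update(terms)
    pair_counts.update(frozenset(pair) for pair in combinations(sorted(terms), 2))

print("Most frequent concepts:", concept_counts.most_common(5))
print("Most frequent concept pairs:", pair_counts.most_common(5))
```

In the study this role is played by the concept maps produced by Leximancer and NVivo, which are then compared against expert human coding of the same responses.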
Original language | English |
---|---|
Title of host publication | Proceedings of the 25th Annual Conference of the Australasian Association for Engineering Education (AAEE 2014) |
Place of Publication | Barton, ACT |
Publisher | Massey University |
Pages | 1045-1053 |
Number of pages | 9 |
ISBN (Print) | 9780473304287 |
Publication status | Published - 2014 |
Event | 25th Annual Conference of the Australasian Association for Engineering Education: AAEE 2014 - Te Papa Tongarewa National Museum of New Zealand, Wellington, New Zealand. Duration: 08 Dec 2014 → 10 Dec 2014. http://www.aaee.net.au/index.php/resources/send/7-2014/171-25th-annual-aaee-conference-handbook (conference handbook); https://search.informit.com.au/browsePublication;isbn=9780473304287;res=IELENG (published papers) |
Conference
Conference | 25th Annual Conference of the Australasian Association for Engineering Education |
---|---|
Abbreviated title | Engineering the Knowledge Economy: Collaboration, Engagement & Employability |
Country/Territory | New Zealand |
City | Wellington |
Period | 08/12/14 → 10/12/14 |