The purpose of this study was to explore how university students read their online assessments, with a focus on students' attention to the essential elements of the assessment task. The objective was to investigate how online assessments are scaffolded by assessment designers to support information searching by students. This research is needed because designing effective online assessment tasks is now an integral part of a university lecturer's work in meeting the required learning outcomes. Existing literature indicates that effective assessment in online university study is a contemporary field of study (Weller, Pegler, & Mason, 2005), and previous research on online reading and information searching in a higher education context suggests that:
• students typically use a limited array of information searching tools and do not make effective use of the tools they use (Qayyum & Smith, 2015);
• students resort to quick and selective reading because of (a) the huge volumes of information generated during online searches (Weinreich, Obendorf, Herder, & Mayer, 2008) and (b) the distractions inherent in online reading, which may lead to behaviours that undermine effective learning (Coiro, 2011; Konnikova, 2014);
• there is a need for scaffolding of online reading and information search skills for university students (Smith & Qayyum, 2015).
In view of these challenges, this study was designed to investigate and document both the online tools and the search methods that university students employ to find relevant information for assessment tasks. A mixed methods approach was used across two phases to investigate and compare the information search behaviours of novice and experienced students in the context of their learning behaviours. In each phase, online user behaviour was digitally recorded using an eye-tracking system, followed by retrospective interviews.
In Phase I, ten students enrolled in a transition-to-university subject undertook two assessment tasks embedded in the subject guide's learning modules. In Phase II, five experienced third-year students worked on an essay-type assessment with a rubric-based assessment structure. The recorded observations and interview transcripts of both phases were analysed using a constant comparative audit of the data to discover emerging themes, as per the grounded theory approach (Strauss & Corbin, 1998), with a particular focus on assessment design in higher education. A key finding from the two phases of this study is that both first-year and third-year university students miss important elements in assessment task descriptions, and their online reading of the descriptions was often unfocused. The observed information-seeking behaviour suggests that most students searched for terms or keywords they perceived to be important as they speed-read the online resources. This behaviour leads the researchers to conclude that there is a heightened risk of students misunderstanding the intent of the assessment and/or employing ineffective information search strategies in attempting to answer the assessment question. Thus, it is argued that improving the scaffolding of online assessment will result in improved student comprehension of the task and of subsequent online information searching.

References
Coiro, J. (2011). Talking about reading as thinking: modeling the hidden complexities of online reading comprehension. Theory Into Practice, 50(2), 107-115.
Konnikova, M. (2014). Being a better online reader. The New Yorker, July 16. Retrieved from http://www.newyorker.com/science/maria-konnikova/being-a-better-online-reader
Qayyum, M. A., & Smith, D. (2015). Learning from student experiences for online assessment tasks. Information Research, 20(2), paper 674.
Smith, D. J., & Qayyum, M. A. (2015). Using technology to enhance the student assessment experience. International Journal of Social, Education, Economics and Management Engineering, 9(1).
Strauss, A., & Corbin, J. (1998). Basics of qualitative research: techniques and procedures for developing grounded theory. Thousand Oaks, CA: Sage Publications.
Weinreich, H., Obendorf, H., Herder, E., & Mayer, M. (2008). Not quite the average: an empirical study of Web use. ACM Transactions on the Web (TWEB), 2(1), 5.
Weller, M., Pegler, C., & Mason, R. (2005). Use of innovative technologies on an e-learning course. The Internet and Higher Education, 8(1), 61-71.
|Publication status||Published - 06 Dec 2016|
|Event||12th Research Applications in Information and Library Studies Seminar: RAILS 2016 - Victoria University, Wellington, New Zealand|
Duration: 06 Dec 2016 → 08 Dec 2016
https://railsconference.com/conference-archive/rails-2016/ (Conference website)
|Conference||12th Research Applications in Information and Library Studies Seminar|
|Abbreviated title||Hono Tangata: Rangahaua kia mārama—Bridging the gap: From research to practice in information studies|
|Period||06/12/16 → 08/12/16|
|Other||The 2016 RAILS conference will be hosted by the School of Information Management, Victoria University of Wellington, and held at Victoria University of Wellington's Pipitea Campus, Wellington, New Zealand, from 6–8 December 2016.|
The conference will include:
• the Australasian Information Educators' Symposium 2016 (AIES 2016) on the morning of Tuesday, 6 December;
• a Doctoral Workshop on the afternoon of Tuesday, 6 December; and
• the formal RAILS conference on Wednesday, 7 December and Thursday, 8 December 2016.
RAILS 2016 will be followed by a one-day InterPARES International Symposium (https://interparestrust.org), also hosted by the School of Information Management.
Educators, research students and practitioners are encouraged to submit papers on the conference theme, ‘Hono Tangata: Rangahaua kia mārama—Bridging the gap: From research to practice in information studies’, which focuses on building partnerships between researchers, practitioners, and educators to ensure that a culture of research-led, theoretically informed, innovative practice is nurtured in the information studies field.