Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates
Affiliation: 1. Institute of Educational Technology, The Open University, UK; 2. Bank Julius Baer & Co, Zurich, Switzerland
Abstract: Many researchers who study the impact of computer-based assessment (CBA) focus on the affordances or complexities of CBA approaches in comparison to traditional assessment methods. This study examines how CBA approaches were configured within and between modules, and the impact of assessment design on students' engagement, satisfaction, and pass rates. The analysis was conducted using a combination of longitudinal visualisations, correlational analysis, and fixed-effect models on 74 undergraduate modules and their 72,377 students. Our findings indicate that educators designed very different assessment strategies, which significantly influenced student engagement as measured by time spent in the virtual learning environment (VLE). Weekly analyses indicated that assessment activities were balanced with other learning activities, which suggests that educators tended to aim for a consistent workload when designing assessment strategies. Since most of the assessments were computer-based, students spent more time on the VLE during assessment weeks. By controlling for heterogeneity within and between modules, learning design could explain up to 69% of the variability in students' time spent on the VLE. Furthermore, assessment activities were significantly related to pass rates, but no clear relationship with satisfaction was found. Our findings highlight the importance of CBA and learning design for how students learn online.
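The fixed-effect models mentioned above control for stable differences between modules so that the estimated link between assessment activity and VLE time is identified from variation *within* each module over time. A minimal sketch of this idea is the "within" (demeaning) estimator below; the variable names, the synthetic data, and the true slope of 1.5 are illustrative assumptions, not the study's actual data or results.

```python
import numpy as np

# Synthetic panel: 5 modules observed over 30 weeks each (hypothetical numbers).
rng = np.random.default_rng(0)
n_modules, n_weeks = 5, 30
module = np.repeat(np.arange(n_modules), n_weeks)

# Weekly hours of scheduled assessment activity (the learning-design variable).
assessment_hours = rng.uniform(0, 4, n_modules * n_weeks)

# Each module has its own fixed baseline of VLE engagement.
module_effect = rng.normal(0, 2, n_modules)[module]

# Outcome: weekly VLE hours = true slope * assessment + module effect + noise.
vle_hours = 1.5 * assessment_hours + module_effect \
    + rng.normal(0, 0.5, module.size)

def fixed_effects_slope(y, x, group):
    """Within estimator: demean y and x inside each group (absorbing the
    group fixed effects), then compute the pooled OLS slope."""
    y_d = y.astype(float).copy()
    x_d = x.astype(float).copy()
    for g in np.unique(group):
        mask = group == g
        y_d[mask] -= y_d[mask].mean()
        x_d[mask] -= x_d[mask].mean()
    return (x_d @ y_d) / (x_d @ x_d)

beta = fixed_effects_slope(vle_hours, assessment_hours, module)
print(f"estimated within-module slope: {beta:.2f}")  # close to the true 1.5
```

Because the module means are subtracted out, any time-invariant module characteristic (subject area, cohort size, overall design) cannot bias the slope; only week-to-week variation in assessment activity within a module drives the estimate.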
Keywords: Computer-based assessment; Learning design; Learning analytics; Academic retention; Learner satisfaction; Virtual learning environment
Indexed in ScienceDirect and other databases.
