The Mass Implementation and Evaluation of Computer‐based Assessments
- 1 January 1998
- journal article
- research article
- Published by Taylor & Francis in Assessment & Evaluation in Higher Education
- Vol. 23 (2), 141-152
- https://doi.org/10.1080/0260293980230203
Abstract
An interactive, computer‐based assessment system has been developed at the University of Luton. The system uses ‘Question Mark Designer’ software to deliver end‐of‐module examinations, formative assessment and self‐assessment. The development of the university‐wide system from its inception and implementation to its current level of activity is described. Detailed procedures for the design, compilation and delivery of tests have been developed and implemented. Student and academic staff responses to computer‐based objective testing were favourable. Initial evaluation of student experiences and the reactions of academic staff to computer‐based examinations suggests that this form of assessment can be highly desirable both pedagogically and economically. The benefits of no marking, fast feedback and comprehensive statistical analysis are offset against the time taken to design objective tests. The investment of resources in staff development and central support is a critical factor in determining the effectiveness and efficiency of the system.