Software Usability Testing: Do Evaluator Intervention and Task Structure Make any Difference?
- 1 October 1992
- research article (conference proceedings)
- Published by SAGE Publications in Proceedings of the Human Factors Society Annual Meeting
- Vol. 36 (16), 1215-1219
- https://doi.org/10.1177/154193129203601607
Abstract
The purpose of the present study was to evaluate the effect of evaluator intervention, task structure, and user experience on the user's subjective evaluation of software usability. The study employed a 2 × 2 × 2 factorial between-subjects design with two levels of Evaluator Intervention (Intervention vs. Non-Intervention), two levels of Task Structure (Guided-Exploration [free-form] vs. Standard Laboratory), and two levels of User Experience (Novice vs. Experienced). Users were asked to learn to use, and then subjectively evaluate, a restricted subset of 12 common word processing features over four hours of participation; Day 1 was a training day and Day 2 was a test day. The major finding was that the user's subjective impression of the software was affected by both user Experience and evaluator Intervention. For difficult-to-use word processing features, experienced users rated the features as more difficult to use under the Intervention condition than under the Non-Intervention condition; for novice users, this difference was in the opposite direction but not significant. The same pattern of results was obtained for the subjective ratings of ease of learning, overall evaluation of the software, and confidence in ability to use the software. These results were interpreted within the context of attribution theory. The effect of task structure, although not as prevalent, interacted with user experience in the evaluation of screen features and system capabilities. The relative lack of task structure effects was attributed to the difficulty of implementing free-form learning and to the number of problems encountered in using the software under Guided Exploration, which counteracted any of its benefits.
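As a minimal illustrative sketch (not part of the original publication), the eight between-subjects cells implied by the 2 × 2 × 2 design can be enumerated as the Cartesian product of the three two-level factors named in the abstract:

```python
# Illustrative sketch only: enumerate the eight between-subjects cells
# of a 2 x 2 x 2 factorial design with the factor levels named in the
# abstract. Factor names/levels are taken from the text; the code
# structure itself is an assumption for illustration.
from itertools import product

intervention = ["Intervention", "Non-Intervention"]
task_structure = ["Guided-Exploration", "Standard Laboratory"]
experience = ["Novice", "Experienced"]

# Each tuple is one experimental condition; in a between-subjects
# design, each participant is assigned to exactly one cell.
for cell in product(intervention, task_structure, experience):
    print(cell)  # e.g. ('Intervention', 'Guided-Exploration', 'Novice')
```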