Understanding User Evaluations of Information Systems
- 1 December 1995
- journal article
- research article
- Published by Institute for Operations Research and the Management Sciences (INFORMS) in Management Science
- Vol. 41 (12), 1827-1844
- https://doi.org/10.1287/mnsc.41.12.1827
Abstract
Organizations spend millions of dollars on information systems to improve organizational or individual performance, but objective measures of system success are extremely difficult to achieve. For this reason, many MIS researchers (and potentially MIS practitioners) rely on user evaluations of systems as a surrogate for MIS success. However, these measures have been strongly criticized as lacking strong theoretical underpinnings. Furthermore, empirical evidence of their efficacy is surprisingly weak. Part of the explanation for the theoretical and empirical problems with user evaluations is that they are really a measurement technique rather than a single theoretical construct. User evaluations are elicited beliefs or attitudes about something, and they have been used to measure a variety of different "somethings." What is needed for user evaluations to be an effective measure of IS success is the identification of some specific user evaluation construct, defined within a theoretical perspective that can usefully link underlying systems to their relevant impacts. We propose task-technology fit (TTF) as such a user evaluation construct. The TTF perspective views technology as a means by which a goal-directed individual performs tasks. TTF focuses on the degree to which system characteristics match user task needs. We posit that higher task-technology fit will result in better performance, and further that users can successfully evaluate task-technology fit. This latter proposition is strongly supported in a survey of 259 users in 9 companies.