Academic Journals Database
Disseminating quality controlled scientific knowledge

Task dimensions of user evaluations of information retrieval systems

Author(s): F. C. Johnson | J. R. Griffiths | R. J. Hartley

Journal: Information Research: an international electronic journal
ISSN 1368-1613

Volume: 8
Issue: 4
Start page: 157
Date: 2003

Keywords: information retrieval systems | search engines | evaluation | user satisfaction

ABSTRACT
This paper reports on an evaluation of three search engines using a variety of user-centred measures grouped into four criteria of retrieval system performance. This exploratory study of users' evaluations of search engines took as its premise that indicators of system success, as perceived by users, derive from the retrieval task the system supports (in its objective of facilitating search). This led to a definition of user evaluation as a multidimensional construct, providing a framework that links evaluations to system features in defined user contexts. Our findings indicate that users' evaluations vary across the engines, and the dimensional approach to evaluation suggests the possible impact of system features. Further analysis suggests that a characterization of the user and/or query context moderates the strength of the evaluation. Developing this approach to user evaluation may contribute to a better understanding of how system features and context shape user evaluations of retrieval systems.