"Remote evaluation for post-deployment usability improvement"

H. Rex Hartson, Professor, Computer Science

October 27, 1998
Computer Science Graduate Seminar
CS 5944 Index 9743
Tuesday 3:30 - 4:45 pm
Newman Library 6th Floor Boardroom

Sponsored by:
Department of Computer Science
Information Systems
University Libraries
Digital Library Research Laboratory
Internet Technology Innovation Center

Abstract:
Although existing lab-based formative evaluation is frequently and effectively applied to improving the usability of software user interfaces, it has limitations that have led to the concept of remote usability evaluation. Perhaps the most significant impetus for remote usability evaluation methods is the need for a project team to continue formative evaluation downstream, after deployment. The usual kinds of alpha and beta testing do not qualify as formative usability evaluation because they do not yield detailed data observed during usage and associated closely with specific task performance. Critical incident identification is arguably the single most important source of this kind of data. Consequently, we developed and evaluated a cost-effective remote usability evaluation method, based on real users self-reporting critical incidents encountered in real tasks performed in their normal working environments. Results show that users with only brief training can identify, report, and rate the severity level of their own critical incidents.

Biographical Sketch:
Ancient degrees from University of Michigan and Ohio State University. Some industrial research experience at Xerox. Current research interests include human-computer interaction, usability methods and tools, and interaction design development methodology. Started working in HCI at VT in 1979. Co-author with D. Hix of "Developing User Interfaces: Ensuring Usability Through Product and Process," John Wiley. Series co-editor of "Advances in Human-Computer Interaction," Volumes 1-4, Ablex Publishing.
hartson@cs.vt.edu phone: (540) 231-4857