Comparison of Evaluation Methods Using Structured Usability Problem Reports

Darryn Lavery

Abstract

Recent HCI research has produced analytic evaluation techniques that claim to predict potential usability problems in an interactive system. These methods have been validated by matching predicted problems against usability problems found during empirical user testing. I will argue that matching predicted and actual problems requires careful attention, and that current approaches lack rigour and generality. Requirements for more rigorous and general matching procedures will be presented, together with a solution to one of these requirements: a new report structure for usability problems, designed to improve the quality of matches between problems found during empirical user testing and those predicted by analytic methods.

For more information about this talk please contact darryn@dcs.gla.ac.uk.