Supporting Heuristic Evaluation for the Web

dc.contributor: Lively, William
dc.creator: Flores Mendoza, Ana
dc.date.accessioned: 2010-01-15T00:17:20Z
dc.date.accessioned: 2010-01-16T00:16:11Z
dc.date.accessioned: 2017-04-07T19:54:39Z
dc.date.available: 2010-01-15T00:17:20Z
dc.date.available: 2010-01-16T00:16:11Z
dc.date.available: 2017-04-07T19:54:39Z
dc.date.created: 2009-08
dc.date.issued: 2010-01-14
dc.description.abstract: Web developers are confronted with evaluating the usability of Web interfaces. Automatic Web usability evaluation tools are available, but they are limited in the types of problems they can handle, so tool support for manual usability evaluation is needed. Accordingly, this research focuses on developing a tool that supports the manual processes of Heuristic Evaluation inspection. The research was conducted in three phases. First, an observational study was performed to characterize the inspection process in Heuristic Evaluation: videos of evaluators applying a Heuristic Evaluation to a non-interactive, paper-based Web interface were analyzed to dissect the inspection process. Second, based on the findings of that study, a tool for annotating Web interfaces during Heuristic Evaluation was developed. Finally, a survey was conducted to evaluate the tool and to learn the role of annotations in inspection. Recommendations for improving the use of annotations in problem reporting are outlined. Overall, users were satisfied with the tool, and the goal of this research, designing and developing an inspection tool, was achieved.
dc.identifier.uri: http://hdl.handle.net/1969.1/ETD-TAMU-2009-08-7210
dc.language.iso: en_US
dc.subject: Heuristic Evaluation tool support
dc.subject: inspection tool support
dc.subject: evaluation process analysis
dc.subject: paper interface evaluation
dc.subject: video analysis
dc.subject: Web usability evaluation tools
dc.title: Supporting Heuristic Evaluation for the Web
dc.type: Book
dc.type: Thesis