Browsing by Author "Shanklin, Roslyn Ayanna"
Now showing 1 - 2 of 2
Item
Measuring Subjective Usability by Watching Others Use the Product (2021-12-03)
Shanklin, Roslyn Ayanna; Kortum, Philip

The COVID-19 pandemic instituted a new norm for usability practitioners and researchers by limiting their ability to safely conduct in-person, contact-intensive usability testing protocols. This study explored one promising remote usability assessment method, Watching Others Using Video, wherein users watch videos of others using a product and then rate its usability. Previous studies found that this method results in inflated usability ratings. This study sought to mitigate this inflation by showing users different levels of product use difficulty. Participants watched videos of several products being used (a website, a digital timer, and an electric can opener) and rated them with the System Usability Scale and After-Scenario Questionnaire. Usability score inflation was consistent across products. Participants may not have reliably detected the portrayals of difficulty; alternatively, the error severities may have been negligible. Further research is needed to understand how Watching Others Using Video can be accurately used for usability testing.

Item
Measuring Usability by Watching Others With Directed Attention and Curated Instruction (2023-08-09)
Shanklin, Roslyn Ayanna; Kortum, Philip

Usability assessment is an important part of the product design process that helps ensure people can easily use a product to accomplish their goals. There are several well-developed usability assessment methods, but a need exists for remotely testing physical products. Watching Others Using Video (WOUV) is a potential remote testing solution, wherein people watch videos of another person using a product and then rate its usability. However, previous studies have shown that this method yields inflated usability scores compared to in-person testing. The purpose of this dissertation was to increase the accuracy of Watching Others Using Video and assess how it can best be used as a viable usability scoring and comparison tool. To this end, the current research adapted WOUV to increase the viewer's acquisition of information critical to accurate usability assessment, as important details may be missed through video. The first experiment established a diverse product selection, which was used to evaluate the WOUV adaptations. A second experiment evaluated visually directed attention adaptations of WOUV, which highlighted and explained experience information in several product interactions (e.g., success, error recovery, and failure). The third experiment assessed instructional adaptations of WOUV, which showed how to use a product and demonstrated errors with explanation. For both WOUV adaptation experiments, participants watched videos of several products being used and rated them with the System Usability Scale. The results showed comparable usability ratings between WOUV adaptations and in-person testing for several products when adequate information was provided (i.e., showing and describing errors), but significant score inflation for products with low usability. WOUV adaptations did not reliably maintain relative differences between products compared to in-person testing, though rank order comparison was supported. Practical implications and future directions of Watching Others Using Video are discussed.
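Both items score usability with the System Usability Scale. For reference, below is a minimal Python sketch of the conventional SUS scoring rule (ten Likert items rated 1-5, odd-numbered items positively worded, even-numbered items negatively worded, with the 0-40 sum rescaled to 0-100). The function name and example responses are illustrative, not drawn from the studies above.

    def sus_score(responses):
        """Convert ten 1-5 Likert responses into a 0-100 SUS score."""
        if len(responses) != 10 or any(r < 1 or r > 5 for r in responses):
            raise ValueError("SUS expects ten responses, each between 1 and 5")
        # Odd-numbered items (index 0, 2, ...): contribution is response - 1.
        # Even-numbered items (index 1, 3, ...): contribution is 5 - response.
        contributions = [
            (r - 1) if i % 2 == 0 else (5 - r)
            for i, r in enumerate(responses)
        ]
        return sum(contributions) * 2.5  # rescale the 0-40 sum to 0-100

    # Example: hypothetical responses yielding a mid-to-high score.
    print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0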