Start with an Hour a Week: Enhancing Usability at Wayne State University Libraries
Instead of pursuing traditional testing methods, Discovery and Innovation at Wayne State University Libraries settled on an alternate path to user-centered design when redesigning our library website: running hour-long “guerrilla” usability tests each week for two semesters. The team found immediate successes with this simple, cost-effective method of usability testing, leading to similar redesign projects for other online resources. Emphasizing the importance of iterative design and continuous improvement, this article will detail the authors’ experience conducting short weekly tests, suggestions for institutions looking to begin similar testing programs, and low-stakes testing as a pathway to improved design for the library as a whole.
Main Authors: | Maria Nuccilli, Elliot Polak, Alex Binno |
---|---|
Format: | Article |
Language: | English |
Published: | Michigan Publishing, 2018-01-01 |
Series: | Weave: Journal of Library User Experience |
id |
doaj-57f277aaa55947b498c9500c7b4b4a04 |
record_format |
Article |
doi |
http://dx.doi.org/10.3998/weave.12535642.0001.803 |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Maria Nuccilli, Elliot Polak, Alex Binno |
title |
Start with an Hour a Week: Enhancing Usability at Wayne State University Libraries |
publisher |
Michigan Publishing |
series |
Weave: Journal of Library User Experience |
issn |
2333-3316 |
publishDate |
2018-01-01 |
description |
Instead of pursuing traditional testing methods, Discovery and Innovation at Wayne State University Libraries settled on an alternate path to user-centered design when redesigning our library website: running hour-long “guerrilla” usability tests each week for two semesters. The team found immediate successes with this simple, cost-effective method of usability testing, leading to similar redesign projects for other online resources. Emphasizing the importance of iterative design and continuous improvement, this article will detail the authors’ experience conducting short weekly tests, suggestions for institutions looking to begin similar testing programs, and low-stakes testing as a pathway to improved design for the library as a whole.
In spring 2016, the Discovery and Innovation web team at Wayne State University Libraries began a redesign of our main library homepage, making the first major updates to the site since 2013. After launching the initial design, we wanted to improve the new website with feedback from real users, following Jakob Nielsen’s decades-old assertion that redesigning websites based on user feedback “can substantially improve usability” (Nielsen, 1993). Instead of using remote and large-scale testing options, we chose a lean, iterative approach, and committed to two semesters of brief, weekly “guerrilla” usability sessions with student users. Not only did this method produce powerful results, but its attainable, iterative approach ultimately led to an increased user focus beyond the library homepage. If you have an hour a week, a laptop, and at least one other colleague willing to help, you can start usability testing, guerrilla-style.
Though we found success with this method, using such a stripped-down approach wasn’t immediately obvious. At the beginning of this project, we didn’t know a whole lot about conducting user testing, having relied mostly on survey responses, search logs, and Piwik website usage data to inform our initial redesign. Although we ran informal tests during the initial design process, participation was limited to library staff or student workers only. This feedback had been helpful early on, but we suspected that testing only employees may have skewed our findings. When compared to seasoned staff, many users would likely not be as familiar with library resources and website structure.
The web team considered a host of research methods before embracing lean usability. We considered running focus groups, but discovered our budget rules prohibited us from offering even small gift cards as compensation, generally a requisite for participation in such activities. As we explored focus groups further, we also realized that the amount of preparation needed to run a session would make it hard to get regular feedback on any changes we did end up making. Hiring an expert to run the kind of large-scale user test commonly outlined in the literature, or outsourcing testing to a website like UserTesting.com, were other methods we eliminated. We wanted a way to test users that would allow us to spend less time planning and more time testing, so we could get feedback and deliver usability improvements as fast as possible. The remainder of this piece will detail our experience conducting weekly tests and offer suggestions for institutions looking to begin similar testing programs. |
work_keys_str_mv |
AT marianuccilli startwithanhouraweekenhancingusabilityatwaynestateuniversitylibraries AT elliotpolak startwithanhouraweekenhancingusabilityatwaynestateuniversitylibraries AT alexbinno startwithanhouraweekenhancingusabilityatwaynestateuniversitylibraries |
_version_ |
1716765567503826944 |