Iterative User-Centered Design (Strategy, Process, Stories)

Proposed by Stu Snydman

Notes

Questions
What is "iterative" user-centered design?
How do you test with your users?
How many times do you go to the well?
When can users expect that usability issues will be fixed?
Why do you do user-centered design?
How can user-centered design solve political and organizational issues manifest in the website?
Can user-centered design solve design by committee?
How can iterative user-centered design influence stakeholders and decision-makers?
The user is the kingmaker.
How do you use user-centered design to improve search?
Library Website Redesign
(Process is documented at http://library.stanford.edu/blogs/library-website-redesign)
Started with a long discovery process: 17 interviews with various users. ("How did you choose?") Called in favors; asked students and faculty they knew. Stood outside the library offering $5-10 Coupa Cafe gift cards. This works on faculty too!
Used this to develop personas (archetypes) of users. Give them a personality, a name, a picture (fictitious). Give them goals for the site.
Created napkin sketches (wireframes).
Did card sorts for information architecture.
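A common way to analyze open card-sort results like these is a co-occurrence count: how often participants put each pair of cards in the same pile. A minimal sketch, assuming hypothetical card names and pile groupings (none of this data is from the actual Stanford study):

```python
from itertools import combinations
from collections import Counter

# Hypothetical card sorts: each participant grouped topic cards into piles.
# Card names here are illustrative, not from the real study.
sorts = [
    [{"Hours", "Locations"}, {"Databases", "Journals"}],
    [{"Hours", "Locations", "Journals"}, {"Databases"}],
]

def cooccurrence(sorts):
    """Count how often each pair of cards lands in the same pile."""
    counts = Counter()
    for piles in sorts:
        for pile in piles:
            # sorted() makes the pair ordering deterministic
            for a, b in combinations(sorted(pile), 2):
                counts[(a, b)] += 1
    return counts

pairs = cooccurrence(sorts)
# Pairs grouped together most often suggest candidate menu sections.
print(pairs[("Hours", "Locations")])   # grouped together by both participants
```

Cards that co-occur across most participants are candidates to share a section in the site's information architecture.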
Had dedicated User Experience Designer who was the consumer of the user experience research. Can do user research yourself, or can hire out.
Did paper prototyping with wireframes at Library Open House. Users completing a protocol of tasks on paper.
At every stage, took results and used them to inform the design (iterative).
Recruiting was a challenge. Recruited in person, and online with an Ethnio popup. At this point, started trying to recruit a more representative sample.
Long-form talk-aloud user interviews, recorded with ScreenFlow and/or WebEx; 45 minutes to 1 hour. Used a consistent task protocol. Coded user success on each task (red, yellow, green). Into the design and engineering phase by this point. Challenge: had an external developer vendor who was expecting pixel-perfect comps, but comps were constantly changing based on user feedback.
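The red/yellow/green coding described above can be tallied across sessions to show which tasks are failing. A minimal sketch, assuming hypothetical task names and result codes (illustrative only, not the study's data):

```python
from collections import Counter

# Hypothetical sessions: each maps a protocol task to a success code --
# "green" (succeeded), "yellow" (struggled), "red" (failed).
sessions = [
    {"find_article": "green",  "renew_book": "yellow", "locate_branch": "red"},
    {"find_article": "green",  "renew_book": "red",    "locate_branch": "yellow"},
    {"find_article": "yellow", "renew_book": "red",    "locate_branch": "green"},
]

def summarize(sessions):
    """Tally success codes per task across all recorded sessions."""
    tallies = {}
    for session in sessions:
        for task, code in session.items():
            tallies.setdefault(task, Counter())[code] += 1
    return tallies

summary = summarize(sessions)
for task, counts in sorted(summary.items()):
    print(task, dict(counts))
```

Tasks dominated by red codes become the evidence bundled for the development queue and for stakeholders.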
Guerrilla tests: atomic tests of a particular issue. 5-15 minute tests, with lottery tickets and candy bars as incentives.
Recruiting is a daunting task. Based a lot of recruiting out of Green Library. Limitation is that they were self-selecting based on a population that comes to the physical library. Had some remote users via Ethnio. Branch librarians also recruited people at loan desks. For faculty it involved calling in favors. Tried not to go back to faculty more than once.
Personas evolved over time. Started with faculty, students; evolved to new researcher, advanced researcher, undergraduate student, graduate student.
Search: the #1 task of graduate students and faculty using library web resources is finding articles. User testing revealed that search results (dependent on an external API) were not working well for users, which forced them to re-architect search results. Search testing is difficult if you don't yet have all the content in the website.
How did user testing influence content strategy? Did some specific testing once content was in. Information architecture changed based on user testing; sections and menus were re-worked. Tests were done on the live site.
How did you manage backlog (that was influenced by user testing)? Had to be strategic about getting changes into development queue. Bundle some together and present to developers. Long sprints (month long).
User testing results can be political. Presenting results to stakeholders can be complicated.
How long did this entire process take? 2 years. Expensive in terms of financial and human resources.
If there was only one piece of user testing that you could do, what would it be? Hard to choose. Long-form, talk-aloud 30-40 minute recorded sessions.
Did you change your old site along the way based on user testing results, or put all focus into new site? All in new site.
How do you get buy-in for financial resources to conduct user testing? Start small, show results. Results speak for themselves.
What was the mix between user research and usability testing? 4-6 months of user research, then shifted into tactical usability testing once a live prototype was up.
Do you feel that the effort put into initial user research was worth it? Can you re-use those personas? Are now re-using personas for other web applications. Trying to come up with a canonical set of library personas.
Analytics: what changes have you noticed between the new and old? Don't have a 1:1 match. Peak use time is mid-October.
"Expert proxies" for the user - no replacement for true user research.