By Scott W. H. Young
LITA Forum 2013
We Love Experiments
Using A/B/n Testing to Improve the User Experience of Library Websites
Scott Young
@hei_scott
What is this talk about?
How to make better decisions with better data
How to build a website that generates trust and satisfaction
UX
Trust + Satisfaction*
*Casaló, Luis V. (2010). "Generating Trust and Satisfaction in E-Services: The Impact of Usability on Consumer Behavior". Journal of Relationship Marketing 9 (4), p. 247.
UX Questions
What do users think they will get?
What do users actually get?
How do they feel about that?
UX
Establishing trust by meeting expectations
Long-term positive effects of informed* design
*by user data
What is A/B/n Testing?
A listening technique
Real-time experiments on a site’s live traffic
A = Original Version
B = Variation 1
n = Additional Variations
Isolate a design variable on the page, then serve different variations randomly to a portion of users.
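In practice, that random serving can be as simple as a cookie plus a random pick. Here is a minimal client-side sketch in TypeScript; the variant labels, cookie name, and #nav-button element are illustrative assumptions, not the mechanics of any particular tool:

    // Candidate labels for the navigation button under test (hypothetical names).
    const VARIANTS = ["interact", "connect", "learn", "help", "services"];
    const COOKIE_KEY = "abn_variant";

    function assignVariant(): string {
      // Return the stored variant so a returning visitor sees a consistent page.
      const match = document.cookie.match(new RegExp(`${COOKIE_KEY}=(\\w+)`));
      if (match) return match[1];

      // Otherwise pick one variant uniformly at random and remember it for 30 days.
      const variant = VARIANTS[Math.floor(Math.random() * VARIANTS.length)];
      document.cookie = `${COOKIE_KEY}=${variant}; path=/; max-age=${60 * 60 * 24 * 30}`;
      return variant;
    }

    // Swap the button label to the assigned variant before tracking clicks.
    document.querySelector("#nav-button")!.textContent = assignVariant();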
Who else tests their live site with live users?
Etsy
Google
Twitter
and many others
Key Quotes
Etsy
“We love experiments.”
Google
“A/B testing can be really helpful.”
Twitter
“It’s rare for a day to go by when we’re not releasing at least one experiment.”
A/B/n Testing
Beloved
Time-honored
R.L. Deininger, “Human Factors Engineering Studies of the Design and Use of Pushbutton Telephone Sets,” Bell System Technical Journal (July 1960): 995–1012.
http://archive.org/details/bstj39-4-995
Key Quotes
“Experimental Approach”
“Design Variables”
“Perhaps the most important factor in the information processing is the individual himself.”
A/B/n testing allows designers* to be guided by users
*the web committee
What does an A/B/n test look like for libraries?
The A/B/n Process
Ask a question about your website
Research the question with user interviews
Formulate a hypothesis
Define & run the experiment
Collect data & analyze results
Report to web committee & make decision
Find! Request! Interact!
We thought we were so brilliant.
Guess who we didn't ask?
Our users.
Interact
2% of total homepage clicks
Ask a question about your website
Why are Interact click actions so low?
Do users understand Interact content?
Which other words could describe that button?
Connect. Learn. Help. Services.
Research the question with user interviews
Does Interact accurately describe the content that you find underneath?
“Not so much.”
“I didn’t know that ‘About’ was under Interact.”
“What am I interacting with?”
“Connect is too vague.”
“Connect is better than Interact, but neither are very good.”
“Learn doesn’t work.”
“Learn doesn’t really work. I just think, what am I learning? I think of reading a book or something.”
Research the question with user interviews
“Services is more accurate. Help is stronger.”
“I am not an English speaker, so I look for strong words. I look for help, so Help is the best, then Services too.”
“Help makes sense. When I’m in the library, and I think I need help, it would at least get me to click there to find out what sort of help there is.”
Formulate a hypothesis
Help or Services will generate more clicks and engagement than Learn, Connect, and Interact.
Define & run the experiment
Google Analytics “Experiments” for the mechanics of A/B/n
CrazyEgg for click data & visualization
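For reference, a sketch of how the Content Experiments JavaScript API of that era could wire a test into a page. The variation functions and the #nav-button element are hypothetical, and EXPERIMENT_ID is a placeholder for the ID Google Analytics assigns:

    // Loaded on the page via:
    //   <script src="//www.google-analytics.com/cx/api.js?experiment=EXPERIMENT_ID"></script>
    declare const cxApi: { chooseVariation(): number };

    // Hypothetical variation functions, indexed to match the experiment
    // setup in Google Analytics; #nav-button is an assumed element.
    const pageVariations = [
      () => {},                                                                   // 0: original (A)
      () => { document.querySelector("#nav-button")!.textContent = "Help"; },     // 1: variation B
      () => { document.querySelector("#nav-button")!.textContent = "Services"; }, // 2: variation C
    ];

    // chooseVariation() asks Google Analytics which variation this visitor
    // should see (the choice is sticky across visits) and logs the exposure.
    const chosen = cxApi.chooseVariation();
    // Guard: visitors outside the experiment sample may receive a sentinel value.
    if (chosen >= 0 && chosen < pageVariations.length) {
      pageVariations[chosen]();
    }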
Collect data & analyze results
Experiment Results
More users are clicking through
and following through with Services.
Report to web committee & make decision
Let's see another example.
The A/B/n Process
Ask a question about your website
Research the question with user interviews
Formulate a hypothesis
Define and run the experiment
Collect data & analyze results
Report to web committee & make decision
Ask a question about your website
Users want easy access to discovery tools.
How can we best enable that?
Ask a question about your website
What are the primary actions on this page?
Are users misled by the “Special Collections” link?
Is our landing page text too wordy?
What if we streamlined our language?
Research the question with user interviews
“What’s a digital collection?”
“These descriptions don't make sense to me.”
“Maybe I can go to Special Collections to find things that I want?”
“Honestly I don't read the page. My eyes just go past all that text.”
“I like search.”
New version: “It’s clearer. There’s just… less.”
Formulate a hypothesis
A streamlined design will generate more searches.
Define & run the experiment
Google Analytics “Experiments” for the mechanics of A/B/n
CrazyEgg for click data & visualization
Collect data & analyze results
Original (A) vs. Variation (B)
30% increase in Search
68% increase in Pages per Visit
27% increase in Average Visit Duration
Experiment Results
Users are staying longer and viewing more pages with the variation
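A lift is only persuasive if it clears statistical noise. One standard check is a two-proportion z-test, sketched below in TypeScript; the visit and search counts are invented for illustration, since the slides report only percentage lifts:

    // Two-proportion z-test: is B's search rate significantly higher than A's?
    function zTest(convA: number, visitsA: number, convB: number, visitsB: number): number {
      const pA = convA / visitsA;
      const pB = convB / visitsB;
      // Pool the two samples to estimate the standard error under the null hypothesis.
      const pooled = (convA + convB) / (visitsA + visitsB);
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
      return (pB - pA) / se; // z > 1.96 ≈ significant at the 95% level (two-sided)
    }

    // e.g. 400 searches from 2,000 visits (A) vs. 520 from 2,000 visits (B): a 30% lift
    console.log(zTest(400, 2000, 520, 2000).toFixed(2)); // ≈ 4.51 → significant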
Report to web committee & make decision
UX Lessons from A/B/n Testing
Be open to surprises.
User behavior insights can be unexpected.
We don’t have all the answers.
Experimentation helps inform decisions.
UX Lessons from A/B/n Testing
Good UX is built on good user data.
Experimentation in Libraries
Conceptual
UX Improvements
Technical
Google Analytics
Organizational
Open Communication
Experimentation in Libraries
Constant beta is real.
Experimentation in Libraries
Communicate with public services
Communicate with administrators
Communicate with users
Experimentation in Libraries
There will be disruption
on the road to improvement.
Client-side A/B/n Testing
www.lib.montana.edu/index3.php?utm_expid=23556620-27
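The utm_expid parameter in that URL is the marker Google Analytics appends to a variation page’s address. A small sketch for detecting it, for example to keep experiment pages out of unrelated scripts (the URL API here is a modern-browser assumption):

    // Extract the Google Analytics experiment marker from a page URL, if present.
    function experimentId(url: string): string | null {
      return new URL(url).searchParams.get("utm_expid");
    }

    console.log(experimentId("https://www.lib.montana.edu/index3.php?utm_expid=23556620-27"));
    // -> "23556620-27"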
Experimentation in Libraries
It's worth it because it works.
The A/B/n testing process
provides the structure
to ask and answer questions
about our websites and our users.