Illustration by Kevin Cornell
Like everyone else who 1) has performed user research, and 2) is over age 40, I spent the requisite decade or two wandering a wilderness inhabited by misguided folks who assumed that, at best, users' behaviors and opinions were but minor considerations in the design process.
So imagine my shock when, about five years ago, I found myself trolling (AKA consulting) down the corridors of a large Silicon Valley tech company. You most definitely know this company; in fact, you've likely complained bitterly about your experience with their products. Naturally, I expected to find precious little sensitivity there to users' needs, much less any actual user research.
Instead I encountered a series of robust, expensive, well-staffed teams of researchers, many with doctorates, employing just about every imaginable method to study the user experience, including (but not limited to):
Clickstream analysis
Field studies
Focus groups
Market research
Search analytics
Usability testing
The company had all this research into what their users were thinking
and doing. And yet their products were still universally despised.
Why?
You've heard this one before. Some blind men walk into a bar… Later, they happen upon an elephant. One feels the trunk and pronounces it a snake. Another feels a leg and claims it's a tree. And so on. None can see the Big Picture.
Each of those teams is like one of those blind men. Each does an amazing job at studying and
analyzing its trunk or leg, but none can see the elephant. The result is a disjointed, expensive
collection of partial answers, and a glaring lack of insight.
Forget Big Data; right now, our bigger problem is fragmented data that comes from siloed user research teams. Here's a simple example: one team may rely upon behavioral data, like a shopping cart's conversion rate, to diagnose a major problem with their site. But they can't come up with a solution. Meanwhile, just down the hall, another team has the tools to generate, design, and evaluate the required solution. Unfortunately, they don't know about the problem. How come? Because these two teams may not know that the other exists. Or they aren't encouraged by their organization to communicate. Or they don't share enough common cultural references and vocabulary to have a reasonable dialogue, even if they wanted to. So synthesis doesn't happen, the opportunity for game-changing insight is missed, and products and services continue to suck.
I've since encountered the same problem in all sorts of industries and places outside the Valley. Even relatively small companies like Aarron Walter's MailChimp (/article/connectedux) struggle with fragmented user research.
Organizations that now invest in user research must resist the urge to congratulate themselves; they've only achieved Level 1 status. How can we help them reach a higher stage in their evolution, one where the goal isn't simply to generate research, but to achieve insight that actually solves real design problems?
We can create conditions that get those blind men talking together. Consciously exploring and addressing the following four themes (balance, cadence, conversation, and perspective) may help researchers and designers solve the problems all that precious (and expensive) user research uncovers, even when their organizations aren't on board.
BALANCE
If you're only listening to one blind man, you'll be stuck with an incomplete and unbalanced view of your customers and the world they inhabit. That's risky organizational behavior: you'll miss out on detecting (and confirming) interesting patterns that emerge concurrently from different research silos. And you likely won't learn something new and important.
A healthy balance of research methods and tools will give you a chance to really see the elephant. Sounds simple, but it's sadly uncommon in large organizations.
Plenty of good books can introduce you to user research methods outside your comfort zone. For example, Observing the User Experience (http://www.amazon.com/ObservingUserExperienceSecondPractitioners/dp/0123848695/) and Universal Methods of Design (http://www.amazon.com/UniversalMethodsDesignInnovativeEffective/dp/1592537561/) will help you inventory research methods from the human-computer interaction world, while Web Analytics: An Hour a Day (http://www.amazon.com/WebAnalyticsAnHourDay/dp/0470130652/) will do the same for web analytics methods.
But a laundry list of different research methods won't, by itself, tell you which methods you should
use to achieve balance. To make sense of the big picture, many smart researchers have also begun to
map out the canon.
One of the most extensive and useful maps is Christian Rohrer's Landscape of User Research Methods (http://www.nngroup.com/articles/whichuxresearchmethods/). It depicts research methods within four quadrants delineated by two axes: qualitative versus quantitative, and attitudinal (what people say) versus behavioral (what they do):
Use Christian's landscape as an auditing tool for your user research program. Start with what you already have, using this diagram first to inventory your organization's existing user research toolkit. Then identify gaps in your research methodology. If, for example, all of your user research methods are clustered in one of these quadrants, you need to find yourself some more (and some different) blind men.
CADENCE
User research, like any other kind of effort to better understand reality, doesn't work well if it happens only once in a while. Your users' reality is constantly in flux, and your research process needs to keep up. So what research should happen when?
Just as a map like Christian's can help you make sense of user research methods spatially, a research cadence can help you understand them in the context of time. A cadence describes the frequency and duration of a set of user experience methods. Here's a simple example from user researcher and author Whitney Quesenbery (http://www.wqusability.com/):
Whitney's cadence incorporates a mix of research methods, gives us a sense of their duration, and, most importantly, maps out how frequently we should perform them. It helps us know what to expect from an organization's upcoming research activities, and figure out how other types of research might fit timewise.
To establish a cadence, first prioritize your organization's research methods by effort and cost. Simple, inexpensive methods can be performed more frequently. You might also take a shortcut: look for (and consolidate) the de facto cadences already employed within your organization's various user research silos.
Then consider how frequently each method could be employed in a useful way, given budget,
staffing, and other resource constraints. Also look for gaps in timing: if your research is coming in
on only a daily or annual basis, look for opportunities to gather new data monthly or quarterly.
Here's a sample cadence. Given that your organization will employ a different mix of research methods, your mileage will vary:
Weekly
Quarterly
Annually
I've added in the categories from Christian's two axes to ensure that our cadence maintains balance.
CONVERSATION
Balance and cadence can help organizations get the right mix of blind men talking, and make sure they're talking regularly. But how do we enable dialogue between different researchers and get them to actually share and synthesize their work?
CREATE A PIDGIN
To make that conversation more likely to succeed, it's helpful to identify at least a few shared references and vocabulary. In effect, look to develop something of a user research pidgin that enables researchers from different backgrounds to understand each other and, eventually, collaborate.
While Dave's process will help you determine common concepts and vocabulary, it's still a Big Win to get broad acknowledgment that, while you and your colleagues may be speaking (for example) English, you're really not speaking the same language when it comes to user research. That realization will make it much easier to meet each other halfway.
The analytics team at a large U.S. clothing retailer found, when analyzing its site search logs, that there were many queries for the company's product SKUs, and that they were all retrieving zero results. Horrified, they quickly added SKUs to their catalog's product pages (an easy fix for a big problem), but they still couldn't understand how customers were finding the SKUs in the first place. After all, they weren't displayed anywhere on the site.
The analytics team could tell what was going on, but not why. So they enlisted the team responsible for performing field studies to explore this issue further. The field study revealed that customers were actually relying on paper catalogs (an old, familiar standby) to browse products and obtain SKUs, and then entering their orders via the newfangled website, which was deemed safer and easier than ordering via a toll-free number.
The story may be an interesting example of cross-channel user experience. But for our purposes, it's a great way to show how two very different user research methods (search analytics and field studies, wielded by completely separate teams) deliver compounded value when used together.
Samantha's guerrilla efforts soon bore fruit: her team developed relationships not just with marketing, but with everyone touching the customer experience. Informal lunches led to regular cross-departmental meetings and, more importantly, to sharing research data, new projects, and customer-facing design work across multiple teams. Ultimately, Samantha's prospecting helped lead to the creation of a centralized customer insights team that unified web analytics, market research, and voice-of-the-customer work across print, digital, call center, and in-store channels.
PERSPECTIVE
So far, we've covered the need for a balanced set of user research tools and teams, coordinating their work through orchestration, and getting them to have better, more productive conversations. But that's quite a few moving parts. How do we make sense of the whole?
Maps like Christian Rohrer's landscape can help by making sense of an environment that we might find large and disorienting. You'll also find that the process of mapping is, in effect, an exercise in putting together things that hadn't been combined before.
But maps are also limiting: they are hard to maintain, and more importantly, you can't manipulate them. To overcome this, the MailChimp team took a very different route to sense-making, employing Evernote as a shared container for user research data and findings (see Aarron Walter's article, also in this issue of A List Apart, Connected UX (/article/connectedux)). It's actually an incredibly functional set of tools, all pointed at MailChimp's collective user research, but, unlike a map, it struggles to make visual sense of MailChimp's user research geography.
Would it make sense to combine your map and your container? Dashboards are both orientational, like maps, and functional, like containers. They're also attractive to many leaders who, when confronted with their organization's complexity, seek better ways to make sense and manage. But before you get your hopes up, remember that there's a reason you don't steer your car from its dashboard. Like any other design metaphor, dashboards tend to collapse as we overload them with features.
Perhaps some smart team of designers, developers, and researchers will be able to pull off some
combination of user research maps and containers, whether presented as a dashboard or something
else. In the meantime, you should be working on developing both.
BLUE SKIES
Once you've done that, you'll be armed to bring senior leadership into the conversation. Ask them what evidence would ideally help them in their decision-making process. Then show them your map of the imperfect, siloed user research environment that's currently in place. Balance, cadence, conversation, and perspective can help make up the difference.
Lou Rosenfeld
Lou Rosenfeld is the founder of Rosenfeld Media (http://rosenfeldmedia.com/), co-author of Information Architecture for the World Wide Web (http://oreilly.com/catalog/9780596527341/), and author of Search Analytics for Your Site (http://rosenfeldmedia.com/books/searchanalytics/).