Case study: Tactics of User Experience Research at ADP

This is part two of two, focusing on the tactical details of ADP’s UX research. For the high-level strategy, see Case study: Strategy of User Experience Research at ADP.

I first got into user experience research by reading “Rocket Surgery Made Easy” by Steve Krug. This book was empowering because it was a step-by-step guide. It covered all aspects of running a usability test. As a beginner in the field, learning the tactics gave me the confidence to dive in and try them out for myself.

I’ve built on the “Rocket Surgery” technique over the years, but my research repertoire hasn’t grown far beyond usability testing.

Hearing an interview with ADP’s head of User Experience Research, Jesse Zolna, felt like another “Rocket Surgery” moment for me. I now feel empowered to push beyond usability testing. Yes, Zolna and his team do classic usability testing. But they are strong in exploratory research as well. They also work hard to spread the eye-opening experience of research beyond the UX team.

Hearing ADP’s CTO Keith Fulton discuss research was also interesting because it gave an executive’s point of view.

I’ve summarized some tips and tricks from those two interviews here. This is not a step-by-step instruction guide. But I hope that you can add some of these tactics to your repertoire. I know I will!

Usability testing

Like I said, ADP does usability testing. If you haven’t read “Rocket Surgery” or run a test yourself, here’s a good description of how it works.

We put a system concept in front of them and we ask them to perform a series of tasks to see where they struggle and where they don’t. Then, we listen. Watch. And learn.


Exploratory research

ADP also does exploratory research, where they focus less on their products and more on a domain as a whole.

Here’s a big domain that we want to really understand deeply… we don’t necessarily have a specific piece of the product that we’re going to improve from that… it gives everybody a better understanding and it gets the silos together.


This is usually a hard sell because it has no obvious short-term value for the company. Usability testing results in rapid product improvements, leading to happier customers, more sales, and lower support costs. Exploratory research takes time, but it can lead to innovations because it looks beyond incremental improvements.

Get out of the building

ADP meets their users in person so they can understand the real context of their customers.

We do client site visits, where we are literally sitting in the cube with the payroll administrator for days on end. When you are watching and studying what they are doing, the interesting part becomes what they are not doing. What are the post-it notes stuck around the outside of their monitor? What have they tacked onto their cubicle walls, because the system does not tell them what they need to know at the time?

[Cluttered desk with computer. Photo by Robert Bye on Unsplash]

“Teach me how”

During these client visits, ADP uses a great research technique: they get their users to train them.

We just go watch people during their payroll process and say, “teach me how to do payroll from your company”… If you say “how do you do payroll” they might say “this person is from ADP, I’m going to tell them how I use ADP products.” Or, “I’m going to tell them the right way.” Whereas if I say “teach me” they’re going to show me the real way.


This is a great way to broaden your research beyond your product. It allows you to understand the whole context of how your product fits into someone’s life.

Stupid questions are valuable

Zolna uses a great line to start off an interview: “I’m probably going to ask you some stupid sounding questions, I’m just making sure that I understand what you’re saying. You might feel like you just answered that question, but I’m going to ask it anyway. Please just forgive me in advance, bear with me on that.”

This disclaimer helps break the ice. The so-called obvious questions can lead to great insights.

It gives me permission to ask some really dumb sounding questions, but sometimes I get surprised and it’s like the best question I’ve asked all day.


How they track insights

Once ADP’s research team has talked to, observed, and asked stupid questions of a handful of clients, they synthesize their findings. Here’s their process.

1) Observation

The team first summarizes what they observed. “Each of these observations is like a thing that we saw a few times, or like a theme,” Zolna says.

2) Recommendation

For each observation, the team comes up with a suggestion. According to Zolna, this “usually follows really logically from the observation. And sometimes it’s so obvious [but] I say dumb things just to make sure that we all sort of agree on things.”

There he goes again. First it was asking dumb questions. Now it’s saying dumb things. It’s hard to do but it’s a valuable skill. As a researcher, it’s important to speak the obvious because it reveals assumptions and prevents misunderstandings.

3) Action

Next, the research team runs a workshop with stakeholders to establish an action based on the recommendation.

“It should be descriptive, not prescriptive,” says Zolna. “It’s not like ‘make the button a brighter color so people can see it’. It’s ‘make the button more visible’… the team figures out what’s the right way to do it, that fits in with the rest of the product and is feasible.”

Why don’t they just prescribe a solution? Shared ownership leads to better ideas and a better chance of action.

“If the team comes up with the action together they’re much more likely to actually implement that because like they can be like ‘that was partly my idea so I’m going to make it happen.'”
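To make the three-step structure concrete, here is a minimal sketch of how one might model a single research finding. This is purely illustrative and my own invention, not ADP’s actual tooling; the field names are assumptions based on Zolna’s description.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Insight:
    observation: str               # a theme the team saw a few times
    recommendation: str            # descriptive, not prescriptive
    action: Optional[str] = None   # decided later, in the stakeholder workshop

# The research team fills in the observation and recommendation...
finding = Insight(
    observation="Users missed the primary button on the review screen",
    recommendation="Make the button more visible",  # not "make it brighter"
)

# ...but the action is chosen together with the product team,
# so ownership is shared and implementation is more likely.
finding.action = "Redesign the review screen's primary call to action"
```

Keeping the `action` field empty until the workshop mirrors the point Zolna makes: research describes the problem, and the team that owns the product decides the fix.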

Metrics of user research

Like me, Zolna and his team use Airtable to store their research findings. Unlike me, he uses this to quantify the impact of the research.

we have an Airtable that houses all of our insights from the year and I can then report to my boss, you know we had 750 insights and 350 of them like actually had impact on the products. And that’s the kind of stuff he cares about most.


Interestingly, Zolna is still not sold on how useful this particular metric is.

I really love the idea of maybe redefining the metric. I report up to my boss the number of insights that we’ve had, or observations that led to an action [but] there’s a million reasons they didn’t happen… it’s not a great measure.
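As a rough illustration of the metric Zolna reports, here is a sketch of how the “insights that led to action” number could be computed. This is not ADP’s Airtable setup, just plain records standing in for the same idea; the example data is made up.

```python
# Hypothetical insight records; in practice these would live in Airtable.
insights = [
    {"observation": "Payroll deadlines tracked on post-it notes", "led_to_action": True},
    {"observation": "Admins re-enter data from spreadsheets",     "led_to_action": True},
    {"observation": "Settings page rarely visited",               "led_to_action": False},
]

total = len(insights)
actioned = sum(1 for i in insights if i["led_to_action"])
print(f"{actioned} of {total} insights had impact on the products "
      f"({actioned / total:.0%})")
```

As Zolna notes, this is an imperfect measure: an insight can fail to become an action for reasons that have nothing to do with the quality of the research.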


Stakeholder involvement

Zolna’s team brings in developers, product managers, and designers as note-takers for their research. This spreads the knowledge of research and makes research findings more tangible.

It just becomes more real. It’s no longer a rumor, or somebody else saying this… It’s no longer words on a page. It’s a person feeling pain in front of them.


I’d guess that even executives, such as CTO Keith Fulton, observe user research first-hand. It’s clear that Fulton “gets” it.

User research shows us where to make our products better… everyone is a critic—in a good way. Users can always make us better.


Well said, Keith. Well said.

Still reading?

If you haven’t already done so, read Case study: Strategy of User Experience Research at ADP. You might also like my other posts on UX Strategy.

If you want to learn more about ADP, check out the full interviews that I based this case study on:
