
Back in early March, when it was still safe to gather in person, we convened a panel of UX strategy experts at the ConveyUX conference in Seattle.
- Cora Cowles is Associate Experience Director at Huge Detroit
- Laura Joss is Global Director of the Design Research team at Motorola
- Brent Summers is Director of Marketing at Blink UX
We had a broad-ranging discussion about how to balance quantitative and qualitative inputs into UX strategy. Each panelist brought a unique perspective. And the audience asked some great questions.
This was the first “live” recording of this podcast. The format is a little different from the interviews I usually conduct. What do you think? Should I continue to experiment with different formats for the show?
(Apologies for the audio issues from about 12:30 until 14:00. We had a technical problem, which the on-site staff resolved very quickly. Thanks to Laura for powering through this part of the conversation.)
Cora, Laura, Brent, and I discussed:
- each of the panelists’ professional background and approach
- common research challenges that UX strategists encounter
- different research set-ups, from fancy dedicated labs to remotely facilitated operations
- how alternating between quantitative and qualitative approaches benefits both
- the importance of stepping away from the research process to ask why you’re doing it in the first place
- the difference between intuition and professional judgment – and how to go forward when you have a gut feeling about a course of research
- surprises that arise in the course of conducting research
- how to deal with financial data and other sensitive information
- how to convince stakeholders to support quantitative testing
- how to know when you’ve done enough research to move on to the next stage
- how to cope when you are tasked with “slash” duties: slash architect, slash designer, etc.
Panelist Bios
Cora Cowles leads the user experience discipline for Huge Detroit in her role as Associate Experience Director. With over 13 years of varied experience across many industries—financial services, automotive and consumer packaged goods—and an MSI in human-computer interaction from the University of Michigan, she specializes in information architecture and user research. Inquisitive when it matters most, Cora keeps an eye on new trends, but considers herself a purist in her role; she shies away from anything that does not directly benefit a user’s experience and is always searching for underlying root causes and contrary positions. Outside of work, she enjoys spending time with her two young children, and teaches dance as well as children’s cooking classes.
Laura Joss, Ph.D. is the global director of the Design Research team at Motorola. Her team, with members located in Chicago, São Paulo, and Beijing, is tasked with uncovering unmet user needs and translating those needs into actionable outcomes for product teams and senior leadership. During her time at Motorola she has helped ship numerous smartphone innovations, including Moto Maker, Moto Mods, and other pain-point-solving software innovations. Her work has included developing an on-site living lab for in-person research, and creating digital panels of users for continuous user engagement throughout the product life cycle. She holds a Ph.D. in Cognitive Neuroscience and a master’s degree in Judgment and Decision Making. In her spare time she loves exploring new wines, as long as it’s not barleywine.
Brent Summers is an award-winning marketer and strategist with more than 15 years of experience in leading cross-functional teams in increasing brand awareness, driving business growth, and delivering innovative new products and services to increase customer engagement. As director of marketing at Blink, Brent makes evidence-based recommendations related to market segments, marketing technology, and go-to-market messaging for Blink and our incredible portfolio of clients. Before joining Blink, Brent led digital projects for clients in many sectors, including B2B SaaS, travel and tourism, professional sports, and education. His work has been recognized by CommArts, HubSpot, Mashable, and Forbes.
Video
Here’s the video version of our conversation:
Podcast Intro Transcript
A long time ago in a galaxy far far away, back when people were still gathering at conferences, I convened a panel on user experience strategy. In early March of 2020, three very experienced UX strategists joined me on stage at the ConveyUX conference in Seattle. Cora Cowles from the agency Huge, Laura Joss from Motorola, and Brent Summers from Blink UX had an insightful and edifying conversation about how UX professionals balance quantitative and qualitative research inputs into their strategy and design processes.
Panel Transcript
Larry:
Good morning everyone. Welcome to our panel on balancing quantitative and qualitative inputs in UX strategy. My name is Larry Swanson. I host a podcast called Content Strategy Insights, and we’re actually recording this conversation for that podcast. So, your applause and questions will be on the air as well.
Larry:
So, one thing about that: at the end, we’re going to save a little time for Q and A, because we’d love to hear what you are curious about, and if you can, I will have somebody circulating with a microphone. And just for the purposes of the recording, if you can make sure we talk into the mic, that would be great.
Larry:
Well, I’m really happy today to have this panel with us. I’ve got a great balance of… well, actually let me quickly introduce the panelists and I’ll have them talk a little bit more about themselves.
Larry:
So, immediately to my right, to your left, is Cora Cowles. Cora heads an experience design team at Huge, a big agency with multiple locations. She works in the Detroit office. Next to her is Laura Joss. Laura is in house at Motorola. She works on product there and has another perspective.
Larry:
And then we also have Brent Summers, who is with Blink, the organization putting on this conference. So, thanks for that. Oh, and he’s getting his own mic now. Great. And Brent is Director of Marketing there, and lives down in San Diego.
Larry:
So, I want to welcome everybody in. What I’d like to do is go through, in the same order, and have each of you talk a little bit more about your role, your position, and specifically talk a little bit about how strategy figures in. We’re all going to kind of dive down into the weeds and come back up. Just kind of tell me about the balance between strategy and other aspects of your job. So, Cora.
Cora:
Sure. So, I do lead a team of experience designers at Huge in Detroit, but we work closely with the strategy team on every project and on every product. Both planning, but also long-term road mapping and strategy in that way as well.
Cora:
And so, we work with them, we work with researchers and project managers to help bring our project to fruition. And at every step we’re getting those inputs into the design to help make a product that is cohesive in the end for the user.
Larry:
Laura?
Laura:
Hi everybody. I’m Laura Joss, I am the Global Director of Design Research at Motorola. So, I direct team members in China, Brazil and Chicago. My team does usually the research side of things, and then my role is to sit on several different strategic leadership teams, whether that be hardware strategy or software strategy.
Laura:
We also work with our go to market teams and our selling teams to try to make sure that the research that we’re gathering around the world is being infused into our product and marketing strategy throughout our product development cycle.
Brent:
Cool. Hello, I’m Brent Summers, the Director of Marketing at Blink. My focus at Blink is on Blink itself, primarily. And so, I work with executives to figure out what we want to be in three years, in five years, so that our marketing plans for the day are laddering up to some bigger and broader goal as we continue to grow and evolve as a 20-year-old company.
Brent:
But since I’ve joined, we have grown almost a hundred percent and added four additional locations. So, we are in growth mode, and want to continue to grow, and want to think about activities that are going to serve us in the long run.
Larry:
Great. Thanks, Brent. And I’ve got to say one quick thing. I just came from Brent’s presentation, and he immediately revealed himself as my soul brother. I’m a big boxing fan, and he used the Rumble in the Jungle as his example of strategy in there. And you’re also a content strategist at heart.
Larry:
So, that kind of leads into the next thing I want to ask about. There are these sort of generic challenges that we get into, and we each have an approach to them. One of those challenges is like… well, first of all, just getting people to value research. That’s a big one: getting budget and time, and building it into the schedule.
Larry:
There’s also this cult of data now, where everybody seems to think that if you just immerse yourself enough in the voluminous data that’s out there, you can figure out any research challenge. And there’s often the overarching problem that users don’t know what they want, and we’re doing a lot of discerning about that.
Larry:
So, I guess, I just kind of want to throw those out as some of the universal challenges, and then maybe have each of you talk about if any of those grab you, in your approach to your work. And let’s start with you Laura, because you’re in the middle.
Laura:
Sure. In the middle, I like it. I think the thing on the list you just gave that stood out to me the most is the piece about the… like, the type of data and what people are really leaning towards and buying into.
Laura:
And yesterday I talked a lot about big data and how that’s become such a big trend in the research industry, and how we balance that with our knowing that we also need that qualitative piece, and the why and understanding what people are doing but also why they’re doing it.
Laura:
And I think that that is something that is going to continue to be a challenge for us. And I think the more we frame that conversation as an “and,” the qualitative and the quantitative, the big data and the small-scale deep dives into what people are doing.
Laura:
I think we’ll be able to better position ourselves for success to get that buy in, to get people to see the power of research and why it’s so necessary. Instead of trying to always kind of defend the way we want to do things or trying to make it be a versus conversation.
Larry:
Do you have one quick success story about that? Because you spoke with great authority there, and I’m assuming that comes from experience.
Laura:
Yeah, it’s something that I’ve experienced a lot, in several different ways and over many years. So, it’s not a new conversation. I think it’s just, instead of it being quantitative in general, now it’s big data. So, an example I gave yesterday was a very specific product example, how we used big data to understand what users were doing.
Laura:
And then use that to prioritize what we then asked from a qualitative perspective. But there are other ways and since I’ve already talked about the example, there’s other things we’ve done too. One of the things when you’re doing product design is you can’t always put all your resources into every single feature, right?
Laura:
Sometimes you have to prioritize where you’re going to spend your money. And so, understanding what users see as the most important features and what they prioritize as important is something that you really want to get your hands on, because then you can make the best product decisions.
Laura:
So, we do have the analytics that will show here’s how often people are using different things on their phone, and that’s very valuable to us. But it doesn’t always get at the thing that gives them the most satisfaction with their product.
Laura:
And so, we’ve combined that initial quantitative dataset with then doing deep dives with users, doing things like making it a game, where we’ll bring them in and give them game pieces and use game theory from econ to say, okay, with a limited set of resources, how do you distribute those resources?
Laura:
And then they can put in… I’d put one coin on my display size, and one coin on my battery, and then in the end we can make them a little fake phone and say, is this right? And we can get to see how they respond to that, and if that fits what they need.
Laura:
And so, that helps us combine that big-data, quantitative dataset of what they’re saying matters with what their visceral reaction is when they see it, and that’s been very helpful for us in figuring out the right form factor and the right feature set to put in our devices.
Larry:
That’s kind of exactly what I’m getting at, that interplay between the two. And Cora, when we spoke in our first kind of preface conversation about this, you talked about that, and kind of to what Laura just said a little bit: that quantitative data can tell you what to do, whereas qualitative research can tell you why to do it. Can you tell me, is that a good starting point for your approach to this?
Cora:
Yeah. So, I think it’s important to look at both sides of it, both the qualitative and the quantitative kind of for that reason, and a broad range of inputs. So, in our work, when we deliver a product or recommendation to a client from our team, we like to make sure that we kind of have all of our bases covered.
Cora:
So, we’ve considered why something is happening, what exactly is happening and looking at it from several different angles. So, maybe it is data and usage and conversion analytics and KPIs, but maybe it’s also some user verbatims as well. So, being able to bring all of those as part of our recommendation bolsters our recommendation.
Cora:
It makes it a little more palatable for clients on the other end of it. I think one of the other things you said that sort of struck me is that users don’t know what they want. But the funny thing is, as an experience designer, I don’t really know what users want if I can’t talk to them, right?
Cora:
So, if they don’t know what they want on their own, and I don’t necessarily know what they want if I’m standing over here in my bubble, it takes all of us coming together to come to a cohesive solution, and to be able to bring that forward rather than each of us standing separately on our sides of the line.
Larry:
I just got to say, I love that somebody from Detroit made that observation, because that’s where the famous story comes from about Henry Ford saying people would have asked for faster horses; they didn’t know they needed a car.
Larry:
But do you have any other examples of how you’ve bridged that gap between your… your kind of mutual ignorance of each other? You don’t know what they want, they don’t know what they want, and yeah.
Cora:
It really does come down to research. We’re fortunate at Huge to have a dedicated research team, and each time we go to a client and decide whether or not we’re going to work together, research is always part of the conversation, research, and data, and analytics because we’d be doing them a disservice if we came to them with a recommendation or a point of view without having those inputs.
Cora:
So, making sure that right from the start, if this is to be a successful relationship, if we’re going to work together, you all have to agree to include this as part of the writeup, and we have to agree to deliver it.
Cora:
And this number of deliverables, this number of respondents, and all of that is agreed upon upfront, so that when it comes down to it and the timeline is shortened, those are the things that are not cut, because they are imperative.
Larry:
Great. That’s a great example of exactly what I was hoping for, that dive into the weeds, coming back up, and yeah, figuring it out. Brent, when we first talked about the panel, you were the first person to use the term oscillating, and this has kind of come up already. Can you tell me, is that a good way to start on your sort of approach, how this work manifests?
Brent:
Yeah, I think that’s a good start. I took that philosophy or metaphor of oscillation actually from a CEO of a company that I used to work for called Segment, that focuses primarily on quantitative customer data. And of course, now I work at Blink where we have a strong qualitative practice.
Brent:
And so, I’ve had the experience, the pleasure, I guess, to work with firms who are strong in, and sort of biased towards, one end of the spectrum or the other. And I think there is a sweet spot in the middle that’s about recognizing the thing that you’re trying to understand, clarifying your goals, and then figuring out what’s the right subject matter and method to collect the data that’s going to help you answer that the best.
Larry:
I just came from your presentation and you use this flywheel model that I guess, that’s sort of how you operate most of the time, working your way around that? Maybe talk a little bit about that?
Brent:
Yeah, I think with the flywheel, for me, I’m focused on marketing, and as I’ve learned UX design, working design-adjacent, I’ve thought about how similarly we’re approaching the same problem, and I just have a different lexicon for how I describe it.
Brent:
And so, the flywheel has three parts to it: desire, trust and action. I think those are universal in terms of being able to be understood, and things that are pro-user, that I as a marketer can get behind and the UX designers that I work with can understand.
Larry:
That kind of starts to get into the toolkit you work with. And I think we all kind of have envy for Laura’s setup at Motorola, because she has this huge fancy lab and you can do anything you want. Right? I mean, tell us a little bit about your research setup at Motorola.
Laura:
I can do anything I want.
Laura:
We have a really great research facility at our offices now, but we didn’t start with that kind of office. We first started doing research at our office with a very low-fidelity webcam setup, with lots of wires everywhere.
Laura:
This is fun. We can keep this up. I think I can always shout very, very loud. But we have sound that is coming in and, yeah… [inaudible]
Cora:
. . . okay.
Laura:
All right, there we go. Let’s do that. Sorry. Thank you. But we had so many more partners coming in and watching the research sessions, and wanting to be there, which is really awkward when there’s no mirror.
Laura:
And you just have product managers sitting there, watching a user you’re trying to talk to. But that built up the interest, and we got investment into building a lab, and now we have that space.
Laura:
The key thing for us with that space is that not only do we have a usability table set up with cameras, so I can see what you’re looking at and what you’re doing, but we also have a living room area set up that’s more comfortable, a little bit more organic.
Laura:
Where you can just sit and have conversations with people, or co-create with people, or bring in big whiteboards and just draw what you’re thinking, or hold sessions where your users and your product teams are interacting together, just talking to each other.
Laura:
And that might not be technically research. It’s not rigorous, but you’re getting real user perspective into your product team’s heads. And that is invaluable because now they’re bought in and they’re not just listening to me give a report. They’re experiencing it for themselves, and they’re internalizing it and it becomes part of their process to think about the user.
Larry:
Cool. And then, it’s kind of not the other end of the spectrum, but a different approach. Cora, you’re at Huge, and you end up doing… oh, grab the mic. You end up doing a lot of remote work, but I guess the point of this is you’re both getting good results. Tell us about how the remote research activities work for you.
Cora:
Before I tell you about that, I will tell you that I am insanely jealous of Laura’s setup. Very.
Laura:
You guys come on down.
Cora:
Oh, no. So, in a past life I did have a setup like that, where we had a dedicated space for users, and I completely, 100% see the value in it, but I also love the remote setup. Because we have clients that are everywhere, all around the world, we do end up doing a lot of remote research, where our clients are around the world, and so are our users.
Cora:
We use tools like Lookback to do some of our remote testing, where the facilitator could be in one place, the clients in another place, and we’re observing in another place and taking notes, communicating directly with the facilitator, our researcher, and taking notes all within the system.
Cora:
Everything gets tracked together. The video is all there together. I mean, that’s invaluable: being able to look at everything in real time, but then go back and see everything all in one place, is amazing. But then we also use tools like Indeemo for diary studies. And those give good readouts that all of us can lean on afterward, where we see what the user is seeing and what they’re doing in their video takes.
Cora:
One quick story: we had a user not too long ago. This was for tires, for a tire purchase, and he was doing a usability test, but he was on his phone. So, we’re sitting in our office, and we got to log on to observe the test, and we’re like, okay, he’s on mobile. This is cool, because sometimes we get mobile, but a lot of times users feel a bit more confident on desktop.
Cora:
So, when we get mobile, we’re really into it; this will be great. He’s in his car, he’s not driving, but he’s in his car and his child is in the back seat, and his child is asking for McDonald’s, and we’re like, this test is not going to go well.
Cora:
It’s not going to go well, this kid’s going to have a breakdown, it’s just not going to end well. It ended up being fine, but it also gives you the real world aspect of it that you don’t necessarily get from sitting in a lab. So, I like both sides of it. I’m coming to appreciate more and more the real world aspect of remote testing, and what you get with that.
Larry:
I love that. There’s a sort of contrivance to a lot of research; you kind of control the environment. But that’s much more ethnographic. You’re out there, like the anthropologist sitting in the passenger seat, observing that.
Larry:
Hey Brent, tell us a little bit about the kind of setup you have… because you mentioned in your presentation, which I just came from, you integrate, you kind of… I’m not sure how to say it, but you’re like a UX-ey marketing person. Is that right?
Brent:
Sure. Yeah. Well, I’m a marketing person that works at a UX agency, so I have to speak the lingo and talk about it in a way that people who are not necessarily UX designers can understand, right?
Brent:
They’re product managers, marketing people; a CTO is one of my favorite clients. People who have an appreciation for UX, and a lot of humility around their own personal capabilities, can be really great clients. Can you repeat the question for me, though?
Larry:
Well, just sort of your setup because you mentioned that you do a couple of different approaches and I’m just wondering, not the necessarily of the physical setup, but sort of how you approach your way to answer questions.
Brent:
Sure. So, I think the approach is variable depending on the specific context or project. A really easy example for me to provide is a website redesign. We’ll typically approach that with three rounds of research: a first round focused on foundational research to understand the user and their motivations, their goals.
Brent:
Second, an evaluative study where we’re looking at different information architectures or visual concepts, to understand which people tend to appreciate more or trust better. And then a final, more usability-type study, where we’re ensuring that people can complete common tasks on that website.
Brent:
So, that kind of project might take anywhere from 10 weeks to six months. But we find that those iterations of research, interspersed in an ongoing design program, are what help continue to build confidence that this massive website redesign is going to actually move the needle forward.
Larry:
Right. I love how iterative everything is nowadays. You get more frequent feedback about how well you’re progressing there. Hey, now, time always elapses so quickly, so one of the things I want to make sure that we talk about a little bit: I asked each of you ahead of time to think about one kind of take-home or takeaway that people can get from this session, in terms of their research practice, and especially as it can inform their strategy formulation. Anybody want to go first? Anybody have one?
Brent:
I’ll go first and then I’ll pass it to you. I rely on that flywheel model again, desire, trust and action. And the difference between qualitative and quantitative, I find that quantitative data is most effective at measuring the actions that people take. You can’t measure the action they didn’t take, I guess you could, but qualitative data is maybe the why.
Brent:
So, in my framework, qualitative data is really good at helping us understand what people want and why they want it. And qualitative and quant can help us understand whether they like the thing that you’ve put in front of them, but quantitative data is sort of the empirical evidence that the strategy you put forward worked or didn’t work.
Larry:
Got it. Laura.
Laura:
I think we’re talking a lot about quantitative data and qualitative data. One thing we have not touched on is why you’re looking at any data. And I think the best technique I’ve experienced in my life in this industry is to always take your stakeholders and take a step back, and talk about what it is you’re trying to learn.
Laura:
What you want to walk out of the room knowing that you didn’t know when you walked in, and having that discussion be more based on what your question is, and the type of data that you can collect and the type of research that you can do will follow from that. And you can then take that discussion away from a debate, or argument about qualitative versus quantitative, and make it be more about what are we trying to learn and how best can we do that.
Laura:
And then you can fill in: here’s how quantitative data will help us get at the what, and here’s how qualitative data will help us get at the why. But when you get into those really fierce debates about what type of research you should do, taking that step back and getting on that same page, kind of grounded, generally helps you then move forward without so much contention.
Larry:
I was going to say, whenever you’re in a contentious situation like that, it helps to have something to reflect or refer back to. That’s like a tactical thing, more at the level of, okay, we’ve got to actually make this happen; here’s how I’m going to do that.
Laura:
Yeah, I mean, if you can put the question up on a whiteboard in the room and just always point to it as you’re discussing, that’s the strongest experience I’ve had with getting a research plan in play and moving forward, and kind of stopping the cycle that you get into about why qualitative is better or why quantitative is better. Together they’ll be better; they’re better when they’re combined.
Larry:
Okay, great. Cora, what, oh, yeah, sorry.
Cora:
I think for me it is, for sure, bringing a holistic view of the data and of the inputs to the work, but then on top of that, knowing the triggers for each stakeholder in the project. So, we have a particular CEO; we did a readout to him of some of our user testing and showed some of the clips, gave some of the verbatims.
Cora:
And this particular user in the verbatim was not exactly happy with the experience, but when the CEO heard it, he felt like it was inflammatory, and it was, oh, this is just one person, this can’t be, we can’t take this to anyone else until we clean it up, that kind of thing.
Cora:
So, it’s knowing who the audience is for each of those data types and outputs, where a designer might respond more favorably to something like a verbatim, because they can relate more to that person for whatever reason. So, I think having that full body of evidence, but then knowing which trick to pull out of the bag when, is important.
Larry:
Thanks. I wanted to follow up with you on one of the things we talked about at the very first, in our first preparation for this: the idea of intuition, and how that’s probably not the best label, especially if you’re in a meeting with high-powered executives.
Larry:
And I think it was you who suggested the term “professional judgment,” and I think that’s sort of an important link, often, between these qualitative and quantitative things. Can you talk a little bit about that?
Cora:
Sure. So, sometimes it has happened where we have a full suite of data, qualitative, quantitative, we’ve done all of the things, checked all of the boxes and the output of that or the deliverable says, “All right, you should march forward doing this thing.” But as a team we look at it and say, okay, that’s what it says, but something, it just doesn’t feel right.
Cora:
And so, right, maybe intuition is not quite the right word, but there is a place when you’ve had the experience to call on for professional judgment. And that’s a point where you stop and take a step back and think, I see exactly what the data is telling me, but you have to look at the data and say, am I interpreting it wrong? Did I ask the wrong question upfront?
Cora:
Did I do the wrong type of research? Why am I getting this gut feeling? This gut reaction something isn’t quite right, because when you have that much experience yourself or across the team, there’s certainly something to be said for that. And it’s not always just as black and white as it’s looking at the data.
Larry:
Thanks. And for some reason that reminds me of something that Brent said, both in our preparatory conversations and also in your presentation, about the term you use, “language-market fit.” And that seems like another tool, I guess, for stitching things together. Can you talk a little bit about that?
Brent:
Sure. So, I had the pleasure of working with Eric Ries on the launch of his book, The Lean Startup, and a second book, The Startup Way. And in that book, in The Lean Startup, he talks about this concept of product-market fit, which is where you’ve got something that a market finds valuable.
Brent:
And I think product-market fit is definitely one dimension. But if a can opener was not called a can opener, you would not intuitively know what it does. And so giving that a descriptive name, right, is language-market fit. Whereas maybe something like Elixir is an evocative name.
Brent:
There’s a term, it’s something that you’re evoking out of that name; it’s hearkening back to something else. And so, understanding users, and also doing some reflection on your brand, helps you describe your product in a way that the market can understand and appreciate.
Larry:
Got it. Thanks. Well, I’ve got a ton more questions I would love to ask, but we’ve got a lot of time for Q and A, and I just want to make sure we use it: I’m much more curious about your questions than about mine at this point. Anybody? Just raise your hand or-
Brent:
Yeah, over there?
Larry:
Okay. Way back there. Okay. Yeah. Sorry, I didn’t see you back there. We have a mic coming towards you; I’ll ask you to speak into the mic so we can pick it up for the recording.
Q&A question:
Can you give an example of something that was a huge surprise to you? Anything that comes to mind, anywhere. Something that you learned, something that you would be dying to share with someone?
Laura:
Yeah. I touched on this a little bit yesterday in my talk. Back when we were doing some ethnographic research, we were trying to understand where things just weren’t working well for people throughout their day, and how their tech impacted that.
Laura:
We observed people getting really stressed out and frustrated by what their phone was doing when their little light was blinking. But they weren’t telling us; they never stopped to say, “I’m really annoyed by this.” But we could see the stress on their faces, and we could see how annoyed they were getting.
Laura:
And so, it’s a small surprise. I don’t think it’s this shocking, groundbreaking surprise. But I think that’s probably more true to what you end up seeing with research anyway. It’s usually not these big, groundbreaking, shocking things; it’s these small nuances where you see there’s an unarticulated pain point here.
Laura:
They don’t know to tell us this isn’t working for them, because they have a workaround that they’re dealing with. But from making that observation, we were able to translate it into this idea of information, not just notifications, and then design a whole suite of experiences around that, which really did transform our software experience.
Laura:
And it let us create an entire suite of Moto software experiences that have been really successful for us. So, I think that’s my example of something surprising. Again, not shocking, but very influential for us, and very nuanced and very small.
Larry:
Right. And sometimes the surprises can be the thing. Like you said, there are a lot of subtle nuances unfolding, and then it can just be some unexpected or unpredictable result that leads to another line of inquiry. Brent, I saw you nodding your head a little bit.
Brent:
Yeah, I was thinking more about the language-market fit question, and I’ve got an example that surprised me. We worked with Moen on a Smart Faucet, a Smart Shower first and then a Smart Faucet. And one of the things with the faucet was that they knew they had this technology, but were having trouble articulating the value proposition.
Brent:
And we thought, is it about being able to dispense accurate amounts of water, or water at a very specific temperature in order to activate the yeast in your bread? But through some foundational research we learned that they really needed a sanitation message. Repeatedly, we heard people talk about “chicken hands”; being able to turn on the faucet to rinse your hands without touching it was the number one benefit.
Brent:
So, that’s a surprising insight that in hindsight, like most UX insights, is kind of self-evident, but you don’t know it until you see it. And that’s why it’s important to go through the cycle.
Larry:
That’s interesting, I love it. And I have to observe for the folks who are only hearing this on the podcast that all the guests were kind of cringing at the notion of “chicken hands.” So, I want to make sure that gets in there. Cora, go ahead.
Cora:
I think for us, on one of our latest projects, something that was a pleasant surprise, though one we had anticipated, so it was still good to see. In our industry, across the board, every website typically has an area on the homepage where you enter four fields and then go to a results page, and it’s like that across all of our competitors, just industry-wide.
Cora:
And we wanted to shake it up. We knew it would be risky, but the client was on board. So, we took those four or five fields that were in that one drop-down area and broke them out into separate views. Now the user steps through one question on each page, and it’s radically different from what anybody else in the field is doing.
Cora:
And so, we tested it, and it worked out okay in user research. Then, when it went live, it did amazing; it worked out really well. So, we put ourselves on the line there a little bit, and it was a very nice surprise when it worked out so well in the end.
Larry:
Well, that sounds like a great example of your professional judgment coming into play, having the courage to flout convention and do it slightly differently. I remember years ago, I can’t remember who it was, it might’ve been Jakob Nielsen, somebody talking about: if in doubt on the web, just do it like everybody else is doing it, because that’s usually the safe thing.
Larry:
But you had the courage to break out of that. And again, coming back to this: you had this intuition, you did it differently. Do you have data about how much better that page is performing now?
Cora:
Sure. So, I’m not going to remember the exact numbers off the top of my head, but prior to the site relaunch, when we’d been doing it like everyone else, the conversion rate was fine. But since December, it’s been climbing every month. And so, that’s reassuring: not only better, but markedly better.
Larry:
Great. Any other questions in the… I see a hand back there? Yes, thanks.
Q&A question:
I love these options, but it’s one thing when you’re dealing with e-commerce or a public transaction. We’re in insurance, so how do you handle doing these things when you’re exposing someone’s personal information? Do you have any suggestions?
Laura:
Just to follow up on your question: is it more about how you do your data analysis and share what you’re learning? Or is it more about how you talk to the people in the first place?
Q&A question:
How do you handle capturing screens full of their personal information? How do you do these things without compromising their data?
Laura:
Yeah, it’s definitely tricky, especially for the situation you’re in, where you’re probably seeing a lot of very vulnerable personal information, not just information like, this is what I do with my phone. Right? When we have things that are a little more sensitive, we talk about what we capture: what do we video, what do we take notes on? What do we discard, or what do we black out?
Laura:
And I think it comes down, project by project, to talking about what it is that you’re doing, and then also being very transparent with your user: this is what we’re looking at, this is what we’re capturing, this is why, and this is what will happen to it when we’re done.
Laura:
So, we always tell every user you’re on video, it’s for me because I can’t remember everything everyone says. You’re not going to see yourself on YouTube, and we’re going to delete your data as soon as our analysis is done.
Laura:
And they need to feel comfortable with that. We don’t proceed, don’t start, don’t hit record, nothing, until they say they’re okay with it. I think that’s extremely important, particularly in sensitive data situations.
Larry:
I’m wondering, Brent and Cora, you both have done a lot of client work. Have you ever worked with healthcare or financial clients, or other clients with sensitive data?
Cora:
I have. I’ve worked in mortgages, insurance, healthcare, you name it. A lot of times we handle it with our vendors, sometimes our recording vendors. We’ll work with them directly: if we know what fields we’re accepting from users, we obscure those fields and just don’t record them to begin with.
Cora:
So, we’ll identify them ahead of time and obscure them. We also like, if it’s live user testing, either remote or local, to provide the user with a sort of cheat sheet of content that they can use to fill in, so that they don’t have to use their own.
Cora:
Sometimes we’ll create test accounts that already have sort of dummy information and so again, they don’t have to use their own, because I do think that’s important, especially because, in testing, half the battle is winning the trust of your participants so that you can get some good outputs.
Cora:
So, if you don’t have the trust of your participant, then why even do the testing? If they feel comfortable, though, because they’ve either gone through that approval or because you’ve given them some other data set that’s not their own to use, that’s when you get the good information.
Brent:
Yeah, I think for me it just comes down to preparation and rigor. One of the things that was surprising for me when I first joined Blink was the time we allocate to the session guide; it was more than I was used to at prior agencies. But when we’re looking at things in healthcare or finance, all of those things, the cheat sheet, obscuring the fields, take time and preparation.
Brent:
That’s part of what we call developing the session guide. It’s framing the shot and figuring out which things you want to obscure. It’s writing instructions for users: “Share this window, not this window.” “Please don’t keep your email open, so that I don’t inadvertently look at your bank statement.”
Brent:
So, it’s just about… and I’m oversimplifying as the marketing guy; we have Michael Harding as our research expert, who was on a panel here yesterday talking about something similar. So, it’s about rigor for me.
Larry:
Cool. Thanks. Let me look around and see if there are any other questions. Oh yeah, I see you, one in the back, and then we’ll take the one up here next.
Q&A question:
Yes. So, I believe that qualitative testing in general has been well received within companies and by stakeholders, but I feel that it’s much harder to convince stakeholders to actually do quantitative testing.
Q&A question:
I don’t know if you have any tips for how to make that case to our customers or our stakeholders. Because I feel that qualitative testing is always the go-to first, and then quantitative testing is something on the back end, whenever we can or whenever we’ve got budget.
Larry:
I want to follow up a tiny bit on your question. One of the issues I didn’t include in that list of concerns at the top is convincing stakeholders that research is even necessary, which is kind of what you’re getting at. So, anybody?
Q&A question:
Yes.
Brent:
I’ll go first real quick. I think: start. Be scrappy and run a test to demonstrate the things that you’re trying to learn, because it’s very humbling to sit in a usability lab and watch a participant struggle with a prototype or with a live product.
Brent:
And I think it’s hard to argue with empirical evidence. So, if you can figure out some way to actually run a test on the thing and use that as part of your pitch, I think that will help win stakeholders over.
Larry:
Cora, did you . . .?
Cora:
Sure. We used to have sessions called Watch the Film every Friday. Beforehand, I would pull a bunch of sessions from our user data for the week, sessions where we’re actually watching the user click through and struggle on our site.
Cora:
And so, the Watch The Film session was an opportunity for anyone in the company to come and sit and experience the pain with the rest of us, and see that we’re not crazy. What we’re talking about really does happen and that we’re on a mission to fix it. But we have to have the buy in first, and we have to have you believe it. So, come sit right here next to me and take a look while we walk through these together.
Laura:
I will point out, though, that there are times when you can’t even get to the point where you’ve got a prototype, or something to work on, because there’s so little buy-in for doing research. And I think that’s something a lot of people experience.
Laura:
And so, if that’s what you’re experiencing, or you can’t get to a place where you could do guerrilla testing or whatever (I completely agree that getting scrappy is really key if you can), also try having a discussion with your stakeholders. Maybe not necessarily about research, but framed as, “I want to understand what’s happening here better. I want to talk through the quantitative data.”
Laura:
Try to have an organic conversation where you might be able to identify some gaps in understanding, almost so that you get your stakeholder to realize: there is something we thought we knew, but we don’t actually know it.
Laura:
And then that’s your opportunity to jump in and say, “Well, we could do a really quick round of interviews. Let’s go grab people off the street and find out what they’re doing.” Sometimes you need to get that start before you can get to guerrilla testing, or to something like Watch the Film Friday, which sounds awesome; I kind of want to talk to you more about that.
Laura:
But if that’s where you’re at, that’s an option too: trying to get your stakeholder’s head to go there themselves, without you telling them that’s the gap.
Larry:
All right, thanks. And we had a question up here. Who’s got the mic? Yeah, thank you. And actually, while he’s coming up with the mic, I’m just going to observe that I’ve always been in scrappy little startups, so it’s gratifying to hear that even big enterprise people and big agency people have to do guerrilla stuff.
Cora:
All the time.
Q&A question:
That’s a good lead-in, because my question is about working for a startup. I’m a designer on a team of two designers, and one of the questions we constantly get when we’re trying to run research is, when do you know you have just enough so you can move on to the next thing?
Q&A question:
So, just wanting to hear from this group, what are some of the tactics that you’d use to determine when you’ve got enough that you can move on and start on the next project?
Laura:
Do you mean enough data? Like we’re good, we’ve tested this prototype enough. It’s time to either ship it or fail?
Q&A question:
Probably more with discovery data, actually, like, should we even do this in the first place? But if you’re validating as well, that’s all useful.
Laura:
Got it. Okay. Yeah, we have experienced that a lot. And that’s a question that comes up all the time. How do you know when you’re done with research? Because you can get into an endless loop, right? And I think part of it is having tight collaboration with your work partners, and your stakeholders, and getting to a point where you can kind of feel like we’re hearing the same thing over and over again.
Laura:
We’re not seeing anything new, or the things that are new are very, very tiny, and in reality not going to make or break this product. And there’s another thing that has no data, and it’s time to pivot.
Laura:
So, I think over time you start to develop kind of an instinct for it. Maybe that’s also where professional judgment comes into play, where you realize these are not big findings anymore. It’s time to kind of put this to bed.
Larry:
I just want to observe that this plays back to the whole strategy part. Part of formulating a strategy is asking: what are our success metrics? How will we recognize success when it’s here? And as you just observed, that’s super tricky with research. Brent, what do you think?
Brent:
Well, sometimes the success metric is getting something to market, be it a prototype, a minimum viable product, or a version two. And so your strategy is couched in the objective of “I’m going to release the thing.” I’m originally a project manager, a business analyst and project manager.
Brent:
So I’m personally good about time, budget, and scope. And working on the agency side, that’s how we enter agreements: this is the process we’re going to commit to. Unless we learn something in the first series of foundational research that causes us to deviate from that plan, we proceed, knowing that we’ve learned a lot more through that round of research. Of course there’s always more work to do, but unless there’s a red flag, we proceed with the plan that was intended to achieve the goal.
Cora:
That’s exactly how we proceed. At the beginning of the engagement, leading up to the signing of the paperwork, we have an idea about what’s there and what we might find. But once we sign that paperwork, both sides have agreed to this amount of time, so it’s time-boxed, for one, and both sides have agreed to it.
Cora:
And then at the end of that, say, two- to four-week period, depending on what it is, we have whatever research we have. We work fast and furious during that time period, sometimes long days, but it’s okay, because we know at the end of those two weeks we’re going to have something good and actionable to deliver and work on. After that, unless it’s a super-duper red flag, it’s going to have to wait. It’s just going to have to wait.
Larry:
Great. I think we have time for two more questions. The gentleman back there.
Q&A question:
Our industry isn’t any different from other industries where employees are being asked to do more. So, I wanted to ask: what are your thoughts about folks being asked to become slashers: slash architect, slash researcher, slash designer, when the experience isn’t necessarily there? Not all of us are in a situation where we have specialists, so some people have to do everything.
Cora:
I’m passionate about this. I’m a little bit old school, in that user experience to me was always user experience. I think it’s partly the market where I am, in Detroit, but I started to see a lot of people who were UXDs, User Experience Designers, but really they were designers who maybe needed a job: I’m just going to throw this UX in front and figure it out as I go.
Cora:
In my experience, I have worked with a handful of people who are good at being the slash. But being the slash takes a special kind of person, who can really understand the user experience and also design something beautiful. And if you’re not that person, just putting the two together and saying that that’s it, is not it.
Cora:
And so, you sometimes end up with a product that is beautifully designed that no one can use, or that people can use and is functional, but doesn’t look all that great. And it became a problem for hiring too, where I’m looking to hire someone who is really sharp on features and functionality and hierarchy and information architecture.
Cora:
And those people are hard to come by, because everyone is just a UXD, and it’s hard to discern which way they skew and where the true skill is. So, for me, I still prefer a separation of the two, or at least a distinction in someone’s work and portfolio. Finding someone who can do both is like finding a unicorn.
Larry:
It looks like Laura?
Laura:
Yeah, I have something to add to that, for sure. I agree, the slash is concerning. It can happen; it can work. But the reality of companies today is that they are getting smaller, or they’re globalizing, and they’re trying to get the most bang for their buck. So, that is the reality we face.
Laura:
I think, in my experience, the more you make people feel what happens when they don’t have the right things, the easier it is to get the resources and the headcount that you want. If you fill that gap, and you fill it badly, but it gets filled anyway, your company is never going to believe it needs that experience or that expertise.
Laura:
And so, it’s going to create a cycle that you’re not going to be able to stop. It comes down to figuring out how to negotiate, how to say no respectfully, in a way that doesn’t jeopardize your career. But I think it is really important to make them feel that miss; otherwise they’re not going to realize there’s something they need to be filling.
Larry:
Thanks Laura. I just realized we’re coming up on time. I know we had one more question. We’ll be around a little bit afterwards if you want to catch us. I just got the hook from Joe, so we should wrap this up. Well, thanks so much everybody. Cora Cowles, Laura Joss, and Brent Summers, great panel. I appreciate all the questions. We’ll stick around for a little bit if anybody has additional questions. And thanks so much for coming out.
[applause]