
NN/g UX Podcast

The Nielsen Norman Group (NNg) UX Podcast is a podcast on user experience research, design, strategy, and professions, hosted by Senior User Experience Specialist Therese Fessenden. Join us every month as she interviews industry experts, covering common questions, hot takes on pressing UX topics, and tips for building truly great user experiences. For free UX resources, references, and information on UX Certification opportunities, go to: www.nngroup.com


This is the Nielsen Norman Group UX Podcast.
Welcome to today's episode, which features a conversation between Therese Fessenden and Steve Portigal. Therese and Steve discuss user interviews. They talk about how this research method has changed over the years, what the main benefits and biggest challenges are,
and towards the end of the episode, Steve gives some practical advice for how to avoid some common
mistakes many interviewers make. Join us as we dive into the conversation with Therese and Steve
and explore the art of user interviews. Steve, welcome to the podcast. I'm excited to
chat with you today about user interviews. It blows my mind that we haven't had an episode
on this topic, and who better to have on the podcast? Yeah, well, thanks for having me. I'm looking forward to our conversation. So, interviews are probably one of the more popular methods in UX research, if not the most popular, right? After all, if we're going to make better products and services, better experiences, who better to talk to than the people who are going to be using them? I would love to know a bit, just to give an introduction for folks who've maybe never done one before, or never really even heard of these: what would you say a user interview is? You know, when you look for definitions, it's a good way to start an
argument: well, that doesn't count as this, that doesn't count as that. So I'll give my kind of sloppy perspective. People may have variations, and I tend to be a little generous in what I include as an interview. It's an activity as well as a method, right? You can do lots of quote-unquote methods that still involve an interview. You might run a usability test where you have somebody walk through a set of scripted tasks with a prototype, but you might interview them before or afterwards. And that's where I get a little sloppy with my definition: that's not an interview overall, but you are using interviewing as a process or a tactic within it. But if you step back, an interview is a conversation, or it looks like a conversation. We use very different tactics to have this quote-unquote conversation between one or maybe two researchers and one or maybe two users, customers, participants, call them what you want. There's jargon that comes up, like semi-structured, which I don't even want to try to define, or open-ended. When you do an interview, you have some prepared questions, but then the conversation goes in different places and you ask lots and lots of follow-ups. So it's not like a verbal survey, where it's question, give me your answer, next question, give me your answer. You are trying to explore. And so interview A and interview B are going to be a little bit different, even if you have the same kind of population, the same sample, the same discussion guide. There is an emergent, exploratory nature to the conversation. Even if your topic isn't exploratory, you still want to explore with that person, ask more and ask more, and try to get to something before you move on. So that's what an interview is like. I think there are other aspects, like where does the interview take place? Because so much is online now, like the conversation we're having, a lot of it is happening in remote, online, virtual environments. If we go back to in-person research: if you bring somebody in, or have them meet you in a market research facility or usability lab, it's still an interview, but you're changing some of the nature of it. Ideally there's an in-context aspect to it, where you are asking that person about their experience and you are seeing their things, how they work on their devices, what their process is in their environment, and so on. So you're getting as far into their context as you can, depending on the logistics of how you approach it. Yeah, I appreciate what you said about this method
seeming like a conversation. I mean, it is a conversation, but I appreciate that you frame it this way, because I feel like there's this expectation around interviews being a certain way. And part of that is, for one, we're on a podcast right now, right? We're doing a podcast interview. Podcast interviews are going to be totally different from a research interview, where your goal is going to be very different. And similarly, when you're having a conversation with a friend, relating with them on a deep level, that's also going to be a little bit different from, say, a research interview. And I think it's important to note that it's not just any old conversation, but one where you're really trying to get to know the person on the other side, the user, the participant in this case. So actually, I kind of want to get to know: how did you get into research? How did you get so fascinated by user interviews in particular and research in general? I was lucky to, I guess I would call it, stumble upon this way of working early in my career. I had a degree in human-computer interaction, in the days before the web, in the days before we talked about user experience, where this was a kind of design activity and people had portfolios and processes and so on. So I had a degree, but not really a skill or any expertise in making a thing. And I ended up working in a design consultancy that was doing a little bit around software. We didn't really call it design, and it wasn't really practiced like design, but elsewhere in this organization was an emerging practice in helping the organizations they were serving figure out what to make, as opposed to how to make it. It was really saying, hey, there's work to be done in understanding the problem space, understanding the users. And it was cool, because we didn't call it user research. We called it based on the outcome: what value were we bringing, and research is a method to get to that. So in doing this work, and learning on the job in this sort of quasi-apprenticeship experience that I had, I got to see what happened in interviews, I got to tag along in interviews, I got to practice asking a question, and I got to eventually lead interviews and teach other people how to run interviews, and work on a process for how you make sense of the stuff you get in a quote-unquote conversation. I think it was with the interviews where, and I really had to work hard to learn it, but once I felt like I was learning something, a switch kind of flipped for me. Like, oh, not only is this a thing, but I think this is my thing. I'm still learning. You learn about people every time you do it, but you also learn about the practice of asking questions, of interviewing people, every single time, through making mistakes or just having some reflections about it. So it continues to excite me and challenge me, you know, as I've been doing it all these years. That's amazing. And I agree. This is one of those
methods that I started learning, though it was not the first method I had learned. A lot of the data gathering I did initially when I got into this field was surveys. I guess that tends to be a popular choice for people who work from, say, a marketing type of perspective, and that was what I did a lot of my undergraduate class work in. So it was the natural thing I turned to when I thought, hey, I want feedback from people, let me blast out a survey. And granted, there is a time and place for surveys. I'm not trying to badmouth them; they're certainly a great tool for getting some quantitative feedback. But I'd love to use that as an opportunity to talk about why someone would choose to do an interview versus blast out a survey. Because interviews, to your point, are a bit challenging to run, right? Which feels like a meta moment, because I'm facilitating an interview right now. But there's a lot to consider, like body language. There's a lot to consider about how you ask your questions or how you probe on a particular topic. Why would someone intentionally choose this method, which also takes longer and costs more, versus blasting out a survey, which is obviously going to be a bit less involved?
Right. I mean, both surveys and interviews may look easy. It's just having a conversation; it's just typing up a bunch of questions into a Google form or whatever platform. And I think they're both deceptive that way, because they take a lot of thought and intention and practice to do well. And in both cases you run the risk of, I guess as with any kind of research, garbage in, garbage out, if you don't craft your approach. Now, you're a survey expert, so you may disagree with me, but I'm going to go out on a limb and say that what a survey doesn't necessarily do is tell you why. An interview can tell you why, because you have the opportunity to ask follow-up questions. So you might ask that same initial question: what's the motivation? Well, go back up a little bit. Even in a survey, you might give people four choices of something or let them write in another example, and you might see some patterns or some groupings. But in an interview, you might ask it in an open-ended way. You might not say which of the following four has directed your preference or your behavior, but rather, what is it that informs your choice? So you ask this open-ended question. Sometimes our questions have huge amounts of assumptions built into them about how people do things, or how they think about it, or what the choice is, or what actually comes before something else. You can hear those moments in an interview. You can switch direction and ask follow-ups, or let the person reframe the whole context of what we're even talking about. You can adapt on the fly, and ask more and ask more and ask more, and continue until you understand why. You know, one of the things I do like about interviews, and I was getting at this a little bit, is that it's a method that changes the researcher. It changes their understanding of people, of the problem, of the opportunity. And it does that in this experiential, immersive way. If I'm going to talk to a number of people over the course of a week, I'm going to be scratching my chin on the dog walk or thinking in the shower. It gives you a lot of experiential stuff to chew on. The conclusions that you take away are not obvious; they're not in the interview. For me, it's a very rewarding experience to be pushed into this sort of sustained creative state as you're thinking about the people you met, how they talked, how they view their work, and how they view their lives. Because even if it doesn't directly go there, it goes there indirectly: you start to understand something about other kinds of people. So it's really rich and rewarding, which is nice on its own, I guess, but it's also a really powerful way to stimulate thinking about what it is we're trying to answer. So I get a lot out of it with the data and I get a lot out of it with the experience, and that is different from other methods. And again, it can complement and be complemented by other methods, because they answer different kinds of questions.
Absolutely. I have to say, when I started running interviews, the thought of experiential learning, like you're just saying, was really powerful for me. And that sounds like a big word, powerful, but it really was, because it was an opportunity to talk to another person and to connect with them in a very different way than you can in a survey. And part of that is, sure, in a survey I could certainly ask, why did you think that, if someone chooses some sort of satisfaction rating. Fine, but the likelihood that I'm going to get any more than maybe a sentence or a couple of sentences is pretty low, right? It's usually going to be some sort of short, stunted statement. And if someone's really angry, maybe they'll write a little bit more, but I'm still not necessarily going to get the nuance of what someone has really experienced. Being able to talk to someone is going to give you so much richer data. You can connect emotional sentiment, and even some of the more complicated emotional sentiments that maybe people won't talk about, or won't think are worth talking about, in a survey. And to your point, you can also drill a little bit deeper, ask questions like, well, what about this particular topic? Let's explore that a little bit more. Like you were saying, with a survey, when it goes out, it goes out; there's no "interesting comment from this person here, let's follow up with them." I mean, you might be able to try, but the likelihood that you'll have any sustained conversation after that is low, if not zero. So I do think interviews have a value that is sometimes underappreciated, in the sense that surveys can sometimes be a bit over-prioritized, or seen as the easy-button solution to getting user feedback. So yes, while I do still use surveys occasionally, I usually treat them a bit differently, and I don't rely on them purely as the main source of user feedback. I acknowledge that, hey, there might be other methods that could be beneficial here. Maybe an interview would help supplement and add a little bit more context than the survey is currently getting me. Or maybe the survey is going to be a good starting point, whereas the interview is where I focus on some really specific questions that I want to dig deeper on. So I totally agree with that. I guess, related to what you said earlier: why wouldn't someone want to use interviews? Because it does seem like these are excellent methods. Is there a time or place where maybe other types of methods are more appropriate, other than surveys, for example?
Yes, and there are, right? Dozens and dozens of methods, and we make up new methods all the time. This is a great topic that you all have actually published on. I think the classic article on this is by Christian Rohrer; I think it's "When to Choose Which User Experience Research Method," I might not have the title exactly right, but it's a classic. Everyone cites it all the time, and he does a great job of laying out different methods and what kinds of questions they answer. So one example: let's say there are two different navigational paths on a website to complete some task, and we want to understand which one people would choose, or, if people go down those paths, which one people get further down. Let's say we want them to complete something, like make a purchase or register. So, path A versus path B, which leads to more success? Well, that's not something you can do in an interview, but it's something you can do in an A/B test. You put those up, people are randomly sampled on your site and directed toward one version or the other, and, depending on your site and your traffic, you might be able to get a lot of data very quickly about two different design choices or two different implementation choices. Again, you don't understand why. And maybe you don't need to understand why at this juncture; maybe you've done some interviews, or done some design explorations, and you have two different ideas and you just want to see which one works better. I think you put it really well, right? We could combine different methods. If we needed to understand why, if we wanted to dig deeper into what the objective is in going down this path, or what the mental model is about how this site is structured, or what the anticipated reward is at the end of this path, we might want to do an interview. We might want to sit with someone and have them go through this path and choose A versus B, or show them A and show them B. Is that a usability test or an interview? It goes back to your initial question; I don't know. It's sort of using interviewing to ask why. But if that's not our question, if our question is which one, or what, or even how, then we would run an experiment, put these things in front of people, and track that. That's a very different kind of initial question than the ones you'd answer with an interview. But that's a scenario where I think you wouldn't use an interview initially; you would use an A/B test to get that answer. Right, right. And I think
there are obviously going to be certain things that you might not be able to ask somebody, because it's simply too fraught with challenges. What comes to mind is, maybe we ask someone, what is the process you take when buying a house? Certainly someone can talk from memory about the time they bought their house, and maybe they give a lot of detail, but there still might be missing pieces, not because someone's being malicious or withholding information, but maybe because they just didn't think it was worth talking about, or they forgot about it. And I think that's one of the challenges with interviews. Some form of observational data, or, say, combining an interview with a diary study where someone is tracking what they're doing over a longer period of time, might help complement the interview, which, as much as I would love to talk to people all day long, has a time limit to it. Another thing that comes to mind is a scenario that's impractical to observe. There are probably going to be some parents listening to this podcast who would say, yeah, if we were studying something like a baby waking up at three in the morning, is it really practical for a researcher to be present and observing that? Possibly, but maybe that's not something people have the budget for, or it's an invasion of privacy, or, you know, it might not be that interesting. So that's where an interview can fill the gap for something that's really hard to observe. Obviously, this is a topic you're very passionate about, and there are opportunities to interview users not just as its own standalone method, but even within other methods like usability tests or field studies. It's a topic you're so passionate about that you not only wrote a book, but you've also created a second edition 10 years after the publishing of your first edition of Interviewing Users. What would you say has changed over the past 10 years when it comes to interviewing users?
I think the context in which we do research, of any kind, but certainly interviewing, has changed in 10 years. We've seen a lot more organizations building up teams of user researchers who are doing interviews. We've seen a lot more people inside organizations doing research. Kate Towsey has this great term: she calls them people who do research, PWDR. She is a writer about research operations and a leader and teacher in that field, and she talks about how you've got researchers and you've got PWDR. That framework, I think, is really helpful, because we've been struggling, collectively as a field, to ask ourselves, well, who does this work? Is it people with titles? Is it consultants or vendors or partners? Is it anyone across your organization? I think it's still an open question, but it has shifted over 10 years. You have fewer consultants and external practitioners; you have more internal practitioners. And with that, we've also seen leadership. You could go back a number of years, and a company may have had a researcher on staff, but maybe they reported into some other part of the organization. So there's the maturity of research as an in-house function: you have career ladders, you have ways to progress, and you have teams with different methods brought together. And I mentioned research operations. This was not really a term that got used very much at all 10 years ago: this idea of operational support to make an organization ready to effectively perform research. There are lots of tactical aspects, logistical aspects, legal and compliance aspects, which we really didn't talk about, which we kind of tried to avoid, maybe, 10 years ago. Legislation has gotten more serious, and so have the consequences. I have colleagues who are research leaders, and they have a partner in the legal department of their organization, and that legal person is set up to empower a research person to do their work, as opposed to maybe in the past, when legal maybe wasn't told about it so that they didn't block it. And that's an operational mindset that says, what do we have to do as an organization to be effective this way? What tools, what practices, what skills do we need? That's a very mature kind of narrative about how research works inside the organization. It wasn't in the organization as much before, and if it was, it was maybe opportunistic, or small and not fully empowered. So if that's not you, and you're listening to this thinking, that's not me, I don't want anyone to feel bad. You're now part of a more mature practice, and there are better examples of what this looks like when it's built to scale and fully staffed. I don't think everything is rosy for researchers in organizations, or as partners, at all; it's hard work, it's uphill work. But the baseline, the examples where we can look left and right and see where there's a commitment and an investment, and what a difference that makes, we didn't have that 10 years ago. So that context has really changed. So who I'm writing to now and what I want to arm them with, I have much more to pull from, and I have a little more clarity and direction in trying to address the different participants in research, the researchers, the PWDR, that whole system. Yeah, definitely. I think overall,
I totally agree. I think the field has really matured to the point that we're not only saying research is important, which I think was the chant that we often had to repeat over and over and over in order to get seen or even taken seriously in a lot of meetings. We've moved beyond that. Generally speaking, again, to what you said earlier, there are some organizations that are still a little bit less mature as far as what they're currently incorporating into their research practice, but overall the field knows the importance of research and has started doing it often. And so now the question is: if we have people who are trained researchers, and other people who are maybe not trained researchers but still see the value, and maybe their organization doesn't happen to have that researcher role, can they still do these? I think there's still room for all of that. But I think there's also a challenge now, where if we have all these different people doing research and researchers doing research, how do we de-conflict, or handle the overlap? And that's a whole other can of worms. I know Kara Pernice has also done a lot of work in research ops and has her own class on it, and Kate Towsey, like you were saying, is a great research ops thought leader as well. But yeah, that's a huge change and a huge shift. And I think it's a good shift overall, even though it is more complicated. I think it bodes well, as a whole, for the idea that interviews are something people don't balk at as an extra step, but see as something where, hey, that's an excellent idea, let's go and do that. A follow-up I have is actually about how interviews are maybe also more readily accepted in part because of the catalyst of the COVID-19 pandemic, which has now made remote research the norm, right? I think we were slowly moving toward that, but now it is very much the norm. So what are your thoughts on remote interviewing? Is it better or worse? I feel like there are different schools of thought on this, and I would love to know what you
think. Hmm, yeah. I mean, hopefully it doesn't have to be a binary, like we're only doing one or only doing the other. There are certainly some benefits to remote research. It takes less time, because you're just at your computer to do it. And, I don't know if this is totally right, but I think there is some narrative that it provides more opportunity for inclusion, that if you're the researcher, or planning the research, you might be able to, quote, get to places, not that you couldn't go, but maybe you wouldn't go. One example is people in rural environments. If you're going to do in-person research, you might go to a city, where very quickly you can get a bunch of different kinds of neighborhoods or a different demographic spread. But to go to a low-population area, where you've got to fly to a small airport and drive and so on, the cost for that is high. That's one area where you can bring in people that you might not be able to access as easily, or you would have chosen not to. I think there are more complex issues around the fact that this is very technology-based. So are we including people, or are we excluding people who don't have access to this technology? I suspect we're just replicating the biases that were in place, but maybe we're introducing new ones. And maybe there are people we can find who aren't office-based, who aren't laptop-based, but who are carrying around a phone and using a phone, and we can involve them. So I guess I think there's an inclusion question, and I don't really want to come down hard on "this is what's happening," because it is still shifting, but it's an opportunity for people to consider those things. I grieve, and I'm using that word pretty strongly, I grieve the loss of in-person interviews, because, and you were saying this really well about the connection we have with people, it's different over a computer. It creates challenges, and we know this because of all of our meetings and keeping in touch with our extended family; we've lived our lives this way, to a greater or lesser extent, for several years. And it's created new opportunities, and it's limited other kinds of things.
And I really miss it. There's just an excitement and a fear and a challenge and an inspiration that comes to me from being in the same room with somebody and seeing what their environment looks like, being shown around their workspace, getting a little tour of their house, peeking in their fridge, or whatever that is. And yes, there are workarounds for how we get that data; I guess I'm talking more about the human, experiential aspect of research. I enjoy talking to people, and I get something out of it when I can go meet them. And I'm an introvert; I don't want to do that all the time, it takes energy, et cetera, et cetera, but something really amazing happens. So I would love to see us move to choosing when and where and how we still do this. We've been talking all along about how we combine different methods; I think some remote, some in person, so that we are sampling broadly and being as inclusive as possible. And hey, it's not just me. When I take my clients out to do research, to see what happens with their own eyes, to drive back from the interview in the car with them and hear what they're talking about, I can see that they're changed. The debriefs and discussions we have from these fairly profound experiences, even about very ordinary topics, like, stuff happens, stuff happens to us, and we talk about it, and I learn about them. I learn about how they think about their customers, I learn about the problem space, I learn about their decision-making culture. I get this interesting kind of access, and we don't get that when we run a 15-minute debrief on Zoom after we hang up the Zoom call with the person. And I'm not deriding those things; there's a lot of really good optimizing that's happened in remote research, it's really grown up a lot over a few years. But I like the messier, interpersonal aspects as well, and what I can get out of them. And yeah, I haven't done an in-person interview in several years, and I would really, really like to get that back into my life, because I just find it really joyful, ultimately. Yeah, it has been a few years for me as well. And, like you're saying,
part of it is because it is cheaper. You don't have to travel, and you can get easy access to people. And yeah, regarding that inclusivity question, there are lots of people who suddenly can do interviews who maybe couldn't in the past, because they couldn't leave wherever they're working to get to where the interview is, or we couldn't talk to them as easily outside of office hours. When thinking about the past 10 years, obviously this is one big shift, and there are many other big shifts. What hasn't changed in the past 10 years, do you think?
I mean, this is still an enterprise of connecting with people, listening to them deeply, learning ourselves how to be good at listening deeply, understanding what's not being said as much as what is being said, learning where to follow up, and doing the hard, heads-down, collaborative work to connect these disparate dots into something new that we can bring together and recommend as an implication or an action we can take. That's the work. The tools that we use to do it will change, and who's involved in doing it will change, but at a fundamental level, it is this human-to-human, creative, connecting-disparate-dots sort of work, and I can't imagine that changing. That is what we do. So yeah, it's fun to revisit in rewriting, or writing an iteration. I don't think any of my advice from the past was wrong, but I have lots of my own experiences to update it with, better ways of explaining things, or things that I've tried and messed up, where I feel like the context has changed. The fundamental backbone of this is the same, but I have 10 more years of thinking about it that has helped me feel like, oh, I can explain it better or explain it in a fresh way. So that's kind of fun to do, as a writer or a thought leader on this stuff; that was part of my thought process and my practice. I agree. I think over the last 10
years there have been a lot of changes, but human beings, and human-to-human conversation, haven't changed in eons, right? For many, many decades. And I think there's always going to be this fundamental aspect of how we converse with one another, how we connect with one another, and that largely won't change. But I also agree with you in the sense that it's taken me many years to get better at interviews. I do teach a class about it, and I do know what some of my kryptonite moments are, some of my weak spots, and I continue to work on them. But it often does take quite a bit of time to build awareness around some of these more logistical challenges, like: how do I ask these questions? How do I convey that I'm listening? How can I be an active listener? How do I analyze the data? There are so many more things that I know I'm continuing to explore as well in our own research here at NN/g. So Steve, to wrap up, I did have one final question for you, which is: what advice would you give to people who are looking to level up their interviewing skills? I liked your description of kryptonite. I think
that we all have those. And one of the things that we get from listening to podcasts or reading books, or just any kind of practice-and-reflect, is a chance to see what those are, reflect on them, and try to work on them. I think everybody is like that, and you really put it so well. So I'm going to highlight one that I think is fairly common and kind of challenging, and that is putting the answers in the question. I'll give a good and a bad example. What people might often do is say: what did you have for breakfast today? Did you have toast or juice or cereal? And then even kind of trail off; sometimes they just end with an "er," or the voice kind of holds onto it. The expert question is: what did you have for breakfast today? And I want to unpack a little bit about why we do that and why it's bad. We do that because the moment at which you stop asking your question and hand it over to the other person is fraught. It's actually much worse in remote sessions, because turn-taking isn't well supported by the microphones; they cut each other off, and a little bit of lag in audio or video makes it harder. There's a very automatic human thing here: we pick up on intake of breath and body language when we're face to face with somebody, and that makes us better at turn-taking. That's really what this is, right? I'm done talking now; I want you to talk. So in remote, it's even worse, but in general, it's still uncertain. You don't know this person. Some people nod while the other is talking, which you and I have been doing this whole interview. Other people keep their bodies entirely stock-still and just stare at you. So you don't know; at a sort of gut level, it's uncertain. Is it okay to ask this question? What's the person going to do? And so I think our naive self tries to be, quote, helpful: let me be helpful to the person and lower the intimidation factor. I'm going to help you with the question by making some suggestions as to what I might mean. So that's why we do it. It's anxiety-provoking to ask the question "What did you have for breakfast?" and stop. That is tougher than the drawn-out
question. But the problem with that is that it turns it into a multiple-choice question, and it starts to position you as the keeper of the right answers. And what we don't realize as interviewers, unless we've been on the other side, is how much power the interviewer has in the participant's eyes, and how much that participant wants to do a good job. And that goes against what you're trying to accomplish. You don't want that power. You don't want them to be pleasing you, to be reflecting your mental model, your list of options. If you're new to this, you might hear my example and say, well, yes, you said juice, toast, cereal, but if the person had pizza for breakfast, they can just say, no, no, I didn't have any of those things, what I had was pizza. And maybe the first time they'll push back. But eventually, over the course of the interview, the more you tell them what a good answer looks like, the more they are going to, without really thinking about it, adapt to the rules that you're training them on. So you're reinforcing your power, and you're going to get answers that reflect back what's in your question. And you miss the chance to get an answer that falls outside of that list. You're not trying to give them a complete list, you're trying to give them a few examples, but it doesn't come across that way. So again, this is very, very hard, which is why I'm unpacking it at such length. I think people will find, oh yeah, I do that. I hope people recognize it, either now or the next time they do it. Not because I want anyone to feel bad, but because I want someone to do exactly what you're talking about: be aware of your kryptonite moments, just try once not to do that, and feel that discomfort. That's the thing to work on. So anyway, that's one sort of simplistic and yet, I think, deceptively challenging piece that can make a big difference in the quality of the interview and what you're learning from it.
I absolutely agree. And for those who are listening and not watching the video version of this podcast: the entire time Steve was mentioning this particular quirk, I was smiling and laughing, because I do this precise thing all the time. Well, not all the time; I've been more aware of it lately. But it is one of those things that, especially when I was a less experienced researcher, I did way more often, and it's something I've been trying to do a lot less. It's hard, because like you're saying, turn-taking is hard, but when you're also hyper-aware of that turn-taking, even in person, but especially remote, it is hard to stop. And it's also hard to not help. One of the things I noticed about my interviewing is that it's not just helping by offering examples; sometimes helping also looks like, oh, I want to be a relatable person, I want to connect with this person even more, so I'm going to share a little bit about my life experiences. Which, for a podcast, is fine. But for a research interview, that might not be fine, because then I'm starting to offer examples from my life that might taint this person's actual account of what they do in their life. Or maybe they'll realize that they don't relate to me after all, because they had a totally different experience. And I like how you framed it as trying to help, because I don't think we do this maliciously or narcissistically, but out of this desire to help and connect. If we can catch it in the moment and reframe it as: my helping should really be focused on asking this question as clearly and concisely as possible, that's going to be a much better use of my time and certainly a better way to help. So I appreciate you sharing this. And it did give me a moment of pause too, and maybe I'm starting to get a little nervous now, like, oh no, did I do this for the whole interview? But hopefully I've done okay for this batch. Anyway, Steve, I just wanted to say thank you for sharing this advice, for sharing all of these great pieces of insight about interviewing. And I think it's going to be amazing to read your book. So for those listening, the book is Interviewing Users; I'm going to hold it up for those watching. There's a copy. Yes, yes. So this is Interviewing Users, second edition. And, you know, make sure to check that out. But Steve, if anyone wants to follow you and your work, is there anywhere you can point them to, website-wise or social media-handle-wise?
Yeah, two places. My website is my last name, Portigal: portigal.com. I'll post this episode there when it comes out, and I try to keep it up to date on things that I'm doing and that I have to share. And for social media, LinkedIn would probably be the best place, so it's my name on LinkedIn. I'd love people to connect with me there and get to chat with them.
Awesome. Well, Steve, thanks so much for your time today. It's been a lot of fun interviewing
you. And it's given me a lot of food for thought as I think about my own interviewing practice. So
thank you so much for your time and hope you have a great day.
Thanks. Thank you. And same to you.
Thanks for listening to today's episode. If you want to learn more about UX, a great place to go is our website. There you can find thousands of articles, videos, and
reports about UX. This website is www.nngroup.com. On that note, if you want to stay up to date on our
latest research and publications, we do have a weekly email newsletter, which features our latest
articles, videos, and upcoming online courses. And if you enjoy this show, please follow or subscribe
on your podcast platform of choice. This episode was hosted by Therese Fessenden and produced by me, Tim Neusesser. All editing and post-production is by Larry Moore Production Company.
That's it for today's show. Until next time. And remember, keep it simple.