
NN/g UX Podcast

The Nielsen Norman Group (NNg) UX Podcast is a podcast on user experience research, design, strategy, and professions, hosted by Senior User Experience Specialist Therese Fessenden. Join us every month as she interviews industry experts, covering common questions, hot takes on pressing UX topics, and tips for building truly great user experiences. For free UX resources, references, and information on UX Certification opportunities, go to: www.nngroup.com


This is the Nielsen Norman Group UX Podcast.
Happy New Year.
I'm your host, Therese Fessenden.
As with every other January 1st, many of us are reflecting on the year before, perhaps
relishing any ounce of distance we can put between ourselves and 2020.
But New Year's Day also gives us the opportunity to do more than put the past behind us.
It's about new beginnings, new habits, and new resolutions to be better versions of ourselves.
UX is an industry that's centered on iterative improvement and finding opportunities to grow.
So I thought it might be fitting to start this year by doing just that.
I'm excited to share an interview with one of my coworkers.
My name is Maria Rosala, and I'm a user experience specialist at Nielsen Norman Group.
I had the opportunity to sit in on Maria's course, User Interviews, which is a lot of
fun.
I mean, as someone who does interviews on a regular basis, it's always kind of nice
to reflect on some of the practices.
She teaches a number of classes with us.
We teach five classes on various topics, mostly UX research methods.
But she has a particularly relevant interest.
Ethics, in particular, research ethics.
So we discussed how being mindful of ethics can help us not just be good people, but can
help improve the world around us with better, more human-centered designs.
In that class, you covered something really fascinating and really important, which is
when you're doing user research, there are lots of considerations, ethical considerations
that you need to factor into your process.
Yeah.
There's a whole field of ethics, which looks specifically at how do we protect people's
interests when participating in research.
So we do cover a little bit about that in the User Interviews class.
Of course, not in a lot of detail.
I do remember, I think it was last year, wasn't it, we filmed an online seminar on this topic
on research ethics.
It is one of the things that I'm fairly passionate about.
I have a particular interest and specialism in research ethics.
I guess one thing that we can talk about today is how does that differ from design ethics
because there are some similarities and of course some differences.
But yeah, we do cover a little bit about that in the User Interviews class.
Not a lot, mainly around collecting informed consent, thinking about researching with people
who might be vulnerable or researching sensitive topics that could cause people to get upset
and how do we manage that and ensure we minimize risk.
So that's really what research ethics is about, thinking about the welfare of people who take
part in research activities and how do we report research afterwards as well and make
sure that it doesn't do any harm to the people who've volunteered their time to take part
in our research.
Yeah.
So we have a whole one hour seminar on research ethics, which I highly recommend as not just
foundational for folks who are new to the research realm, but it was a great opportunity
to learn, even as someone who's been doing this for years, learning how to handle some
of these very unique cases, maybe sensitive topics, things like that.
And it was really, it was a good class.
I guess on that note, you mentioned that design ethics and research ethics are slightly different.
So I understand with research ethics, you're gathering data, you're learning about users
and in that process, in the process of gathering data, you want to ensure that you're protecting
that user.
So how is that different or what would be different when considering design ethics and
what do you think of ethics as a whole?
What even is ethics?
I realize that's a very philosophical question, but I think it's important to really drill
down.
What do we mean when we're saying that we're being ethical in our practices?
Are we making a judgment call on behalf of the user or are we allowing our user to
make that call?
What do you think of this?
Yeah, those are some really good questions.
My early academic training was in philosophy, so we did cover ethics and of course ethics
is a fairly broad discipline, looking at morality, what is right and wrong.
And then of course, there are different aspects of ethics.
There's applied ethics, which is the application of these particular discussions or case studies
looking at specific domains.
So research ethics is one example of applied ethics.
Design ethics is another.
I see them as two distinct disciplines.
One design ethics is looking at the welfare of people who use our designs, how we treat
them as they use our products or services, how we potentially collect and use their data
because that's obviously a big aspect as well.
How we think about the long-term implications of the way that our designs are used, how
they could be abused by people for nefarious reasons and for nefarious results.
And then of course, research ethics is concerning the welfare of people in research activities.
So some people think of research ethics as being part of design ethics because part of
user-centered design is doing research with users and so you can think of design ethics
as an umbrella, but I tend to see them as separate things, separate disciplines that
require different activities, but they are both underpinned by central concepts.
So concepts like respect for people, respect for persons.
So thinking about each person has a right to make their own choices and we have to respect
that and has a right to choose their own actions and to live their life as they would like.
So we should respect that and treat people with dignity.
We should do no harm and this again is a central concept that underpins lots of applied ethics.
So thinking about in research, for example, we don't open up that person to risks.
We don't cause them to get upset or we don't accidentally or inadvertently leak their identities
to people who might take actions based on that.
So employee research, for example, is a tricky one because often it's very difficult to do
anonymous research and as a result, sometimes people admit things to the researcher and
then that becomes known to perhaps their managers and there could be consequences to that.
But in design ethics, doing no harm, there are lots of ways that we could potentially
harm people.
We could cause them to become addicted to the products and services that we create.
We could cause people to be marginalized as a result of the way that our products and
services are designed.
We could cause people to become stressed or overloaded by all of our notifications or
alternatively bullying and harassment on certain platforms as a result of the way the things
have been designed.
So lots of negatives that can come out of design, unfortunately, and that's again, do
no harm is one central concept that underpins both of them.
And the last one really that I think is important is justice.
So ensuring that it's equitable: regardless of which kind of user is using a platform,
everybody hopefully experiences the same burdens and the same benefits by
using your particular product or service.
It's not the case that there's going to be one group that is excluded or carries all
of the cost, whereas another user group carries all of the benefit, that would be unfair.
And the same again is this concept underpins research ethics as well, thinking about who
you're recruiting.
Is it equitable?
Are we having representation from all the various groups out there and they're contributing
to this process to hopefully build a better product and service for everybody, not just
for a select few people.
So similarities, but they're kind of separate domains.
Got it.
So it's basically you take those similar ethical principles, but you're applying them in different
ways.
You're applying it in design in terms of looking at how that design presents information or
how that design takes in information.
Whereas when you're looking at research ethics, you're looking at the process of gathering
that data and how that very process can impact those people, whoever it is you might be researching.
Yeah, exactly.
Yeah.
So this is, I love this topic because it makes my brain hurt, but it also is really important.
And when we have ethical considerations in our design, we often have better designs.
But it's interesting to me because I think people also assume that, oh, if it's a good
design, then it is ethically considerate.
Like if it's user centered, it ideally already is accounting for user desires and user needs.
So if users are picking the technology that best fits their needs, then hypothetically,
it should therefore be the best designs ethically that bubble up to the top or that become mainstream.
But what do you think of this?
Do you think that's true or do you think that companies can sort of game this where maybe
a user need is being met, but somehow it's still an unethical design?
Have you seen any patterns like this?
Yeah.
It really does depend on your definition of user centered.
If user centered is thinking about all of the possible negative consequences that can
come about by people abusing your particular product or service and thinking about perhaps
long-term costs, then yeah, in an ideal world, maybe the design as a result would be ethical.
But the reality is we're not the only person responsible for delivering a product or service.
We work with other people who have different objectives.
Often we don't have control over things like what kind of data is collected about these
users.
How is it used?
Where is it stored?
And we often don't think about long-term implications.
Like maybe we can gather a lot of data about this specific individual and then sell it
to a load of third parties that can use that to profile you and target you.
You as the user are not going to necessarily know that, but that's what's happening when
you perhaps agree to some terms and conditions and you sign on and you use that product or
you use that service.
But that product or service can still meet your needs and perhaps some of that data is
going to be used to improve the user experience by making it more relevant, perhaps the content
of the products that are offered to you are more relevant as a result, it's more convenient.
But maybe some of the negatives, you don't feel that for a while until things get kind
of out of control or until you have a situation where suddenly maybe you're being denied a
mortgage or something along those lines and that data has come from somewhere else.
We see these unintended negative consequences, particularly if it's something like social
media, great, it's connected people initially, and there were lots of benefits and designers
focused on those benefits, but there were lots of unforeseen trade-offs.
Some of these things that we've observed over the last few years are really almost unprecedented.
How people would use technology to censor other people, to bully and harass, to create
fake information.
All of these are negative consequences of the design.
Some of those could have possibly been avoided, some of them possibly not.
And that's the job of an organization and designers to think about how can we solve
these problems in a way that doesn't necessarily remove the benefits, but avoid some of those
harms.
I want to give actually one example of an unforeseen trade-off that we talk about in,
because I teach a class called Design Trade-offs and UX Decision Frameworks and we have a particular
case study that we talk about, which probably many people who are listening to this are
aware of this case study, but it comes from Airbnb.
In the early days of Airbnb, this was a really new way of offering rooms to people or lodging
to people.
In the early days, Airbnb designers really wanted to make it very easy for hosts, people
who are offering up their own homes, to feel comfortable allowing guests into their homes.
Some of the design decisions were around giving hosts enough information, as much information
as possible, about the person who was requesting a particular lodging so that the host could
feel like, yes, this is a decent person, this person's not going to destroy my home.
I'm happy to allow this person to come in.
I can create a human connection with the person who might be staying in my home.
And then what they found was that there were lots of reports of discrimination.
So people who had foreign sounding names or people who were black and had a black sounding
name, a lot of those people were getting rejected a lot of the times by hosts.
The hosts had the option to say, sorry, I'm not going to approve this particular stay.
But independent research, some experiments that were done by Harvard University, found
that people with foreign sounding names or black sounding names were more likely to be
rejected.
So huge problem, completely unforeseen by the design team, but just shows that design
can be used in ways that you don't anticipate.
And it's really important to think about these as much as you can in advance.
Think about what's the worst possible outcome that could come about of this.
How could this particular design be abused by others?
So putting your sort of negative thinking hat on and thinking about what are some potential
unforeseen trade-offs that could occur and how could those impact people that perhaps
use your product or service negatively is going to be really important as well.
Yeah, that's a really interesting case study.
Thanks for sharing that.
And I think really highlights how difficult this topic and this responsibility is for
designers.
So I know that there are some large corporations out there that have design ethicists on staff
that their primary responsibility is to examine the ethics of certain design decisions.
I realize though that is probably a tough ask to ask every single design agency and
organization out there to have a design ethicist, but yeah, short of having a design ethicist
on staff, what tips could you give teams who are looking to ensure that their designs are
ethical or are reducing any amount of harm that they might be unintentionally inflicting?
What can people do to really ensure that they're making good decisions?
Yeah, that's a great question.
And yeah, you're right, a lot of big corporations, big companies, and big organizations are looking
to have people in-house that can think about these really tricky things because a lot of
designers are working on projects that have tight timescales, they need to deliver something.
There's very little time to think at this broad level, thinking about long-term outcomes,
running long-term sort of experiments or ways of capturing good research data about how
these things are being used, or just thinking deeply, like we should spend time thinking
about it, but unfortunately the reality is that a lot of people working on design projects
don't have that time.
So I think it is important to have people in the organization that can start to think
about those things, but I don't think that's the only thing we should do and I don't think
that's sufficient.
So I do think that it's everybody's responsibility.
It's not just the UX designer's responsibility to think about this, it's everybody's responsibility.
At a leadership level, hopefully these conversations are being had and thinking strategically about
how to ensure that we're not setting ourselves up for being in a situation where we have
to reverse certain decisions or spend a lot of time rethinking how we're doing things
to try and avoid harming people or neglecting certain groups or whatever that outcome might
be.
So perhaps considering your personas and looking across all of your personas and say, well,
how does this design decision affect each of these individuals?
Are there any negative consequences?
What are those?
Let's expose them.
Let's go away and do some research to figure out if we can avoid them.
When I was studying my master's, my master's was in human computer interaction with ergonomics
and I remember very clearly the lecturer, the teacher talking about when you design
systems, you also have to design for people who might abuse the system, those abusers
that don't use the way the system is intended.
I think we forget about that.
We often have this sort of rose-tinted view of the world around us and we're thinking,
oh, this would be wonderful for users, but we forget to think about how people could
perhaps exploit the design.
So I think planning for that is really important, but then caring about people and the people
are not just digits, they're not just numbers and I think some organizations fall into that
trap where they think about people as a number and not as a person.
I think that's why it's really important to do qualitative research and to do research
with your real users, go out and speak to them and learn from them.
On that topic, if you could offer any advice to teams about how to keep a long-term perspective,
because as you said, it can be so easy to get tunnel vision on the success of our designs
and metrics are great.
They can be very useful.
They can also be a bit short-sighted depending on how we look at these metrics.
So is there any advice you could give teams on staying focused on long-term effects and
long-term gains as opposed to shorter-term consequences of designs?
A lot of teams are relying on quantitative data or they're obsessed with metrics and
they design purely to improve those metrics and not actually going out and doing qualitative
user research with people who will be using their products and services and really getting
to empathize with them and learn about them and understand how those products and services
affect them.
So I think that's a really important thing that a lot of teams are unfortunately not
doing.
Maybe thinking about bringing in users into the design process, doing participatory design
or co-design, particularly in contexts where there are a lot of ethical issues and this
could affect certain groups.
So therefore, can we not bring them into the design process and allow them to help us create
better designs that are more ethical?
And the last thing I would say also is think about your most vulnerable users or the people
in society that could be the most vulnerable and design for them.
Include them in research and think about them because they're often the people that get
marginalized or have no choice in what kind of products or services they use.
They have very limited options available to them, whether that's because they have certain
identities or accessibility needs or whether they belong to a particular socio-demographic
group but involving them and doing research with them and thinking about how this could
impact them is going to be really important because they're often the people that are
negatively impacted.
So a lot of different ways that teams can think about trying to improve from an ethical
point of view, but there isn't like one quick fix.
You can't just hire a person and suddenly all these problems are gone.
This is a complex domain, requires continuous iteration and research and design to get around
some of these really big systemic problems.
Yeah.
That last phrase you used, systemic problems, is worth taking a moment to recognize:
a lot of these are often larger than the design itself.
There's actually a really great TED Talk I heard once by a fantastic speaker, a computer
scientist and researcher named Joy Buolamwini.
Probably butchered that name, but highly recommend checking it out.
She's the founder of the Algorithmic Justice League and part of her research goes into
how algorithms are really just reflecting the processes and the things that exist in
the world.
Because as we talked about earlier, a lot of the ethical considerations and a lot of
the challenges that designers are facing are often larger than the design itself, right?
Systemic level issues.
And I love that you brought up that bit about vulnerable users, because sometimes it's just
a reflection of the system that we're automating, not necessarily a reflection of any individual's
biases, although certainly that can come into a design as well.
But I think keeping all of that in mind and knowing that each of our designs has a role
in either reinforcing some of the preexisting systems that are in place or in hopefully
equalizing and making an equitable outcome, as you said.
So I think as long as designers are keeping that long-term perspective and understanding
that every item and every little widget that we design has the opportunity to either make
great changes or continue the status quo, then hopefully the great changes are what come.
Yeah, but I think the point that you made is a really good one.
Unfortunately, values pervade everything we do.
And even if you think you're being objective, you're still applying your own values to things.
And unfortunately, if we have a group of designers and they all come from similar backgrounds,
they all have similar experiences, you use that and you apply that without even consciously
realizing that in the things that you design and the way that you design things, which
is why we say at Nielsen Norman Group, you are not the user, right?
And pretty much everyone knows this slogan who works in UX, you are not the user.
So you shouldn't expect that you know how people are going to behave or know what they
need or know how they're going to react to certain things.
It's so important that you have that representation in the design process.
Not only hopefully by recruiting a diverse team to work in your design team, but also
thinking about including as much as possible, a diverse group of people in your research
process.
So hugely important if you want to avoid making these massive mistakes that a lot of companies
and organizations have made, especially using AI algorithms, which can just propagate
stereotypes or continue those biases that we all have, represented in the algorithm.
So we need to do better.
Absolutely.
I think that's an inspiring note to end on.
This will be the first episode of January as design teams are looking to make resolutions.
I think we all can make a resolution to make designs that are truly beneficial for all.
So if others want to follow you, maybe work that you're currently doing or working on,
where would you recommend people follow you or check out some of your work?
Of course, the Nielsen Norman Group website where I publish articles, but I do share some
of those articles and links to reports that I've written on my Twitter account, which
is, let me remember, because one is a hyphen, one is underscore, I think it's Maria Rosala,
my first and last name with an underscore, on Twitter.
And then I'm also on LinkedIn as well, so Maria hyphen Rosala on LinkedIn.
This has been fantastic.
Thank you for giving me delightful brain hurts. As a former high school teacher of mine
used to say, the brain hurts are what make our work worth it.
So I appreciate you.
Thank you for your time.
Thanks for having me.
Thank you for listening to this episode of the NNG UX podcast.
If you want more information on any of the courses or resources that we cited in this
episode, check out the links that we've listed in the show notes found in the description
of the podcast.
We have a number of upcoming UX certification events as well, some as early as late January,
and we publish free articles and videos every single week.
So definitely sign up for our weekly newsletter if you want updates on the latest UX research
that we've been working on.
To learn more, go to nngroup.com, that's N-N-G-R-O-U-P.com.
And of course, if you like this show and want to support the work that we do, please hit
subscribe on the podcast platform of your choice.
Thank you so much for your support.
Until next time, and remember, keep it simple.