DU020 – User Research Analysis

Season 0 - Getting Untangled

What is the mysterious process between doing user research and then doing something with it? Hint – it’s called user research analysis.

Carla, Chris and special guest Christina Li talk about note-taking, affinity mapping and sh*t sandwiches.

Transcript

Episode – DU020 User Research Analysis.

Hosts: Chris Mears and Carla Lindarte. Guest: Christina Li

00:17 Chris: Hello and welcome to Design Untangled with me, Chris Mears, and Carla Lindarte, and a very special guest, a friend of the podcast. I'm not sure if you consider yourself a friend, but you're our most prominent guest now, clocking in appearances across our 20 episodes. Christina Li, how's it going?

Christina: Hello, everyone. Hi.

00:38 Carla: Hello. What? This is the first time we do a... what do you call that in English? A trio?

00:44 Chris: A threesome. Yeah.
00:46 Christina: I’m not sure I want to participate in this.

00:50 Carla: That's so cool. We're live, and, like, the three of us talking at the same time is amazing. I love technology.

00:57 Chris: Yeah. It's pretty cool. Three people talking. Who would have ever thought computers could get this far by 2018? Mental.

01:07 Christina: Who knew, at the start of the Internet era?

01:10 Chris: Not me for sure. I was still in nappies at that time even though I was like 14 probably. What the hell are we talking about today?

01:23 Carla: We are talking about research analysis, and that's why we have lovely Christina joining us, because, well, she's actually a researcher.

01:35 Chris: Unlike us. Imposters over here. Claiming to do research now and then.

01:41 Carla: What do we mean by research analysis? Sorry, Christina, I interrupted you.

01:47 Christina: No, no problem.

01:48 Chris: Someone talk.

01:51 Carla: I am asking you the question.

01:54 Christina: It’s a really good question. What do we mean and when does it start and where does it fit into a process?

02:01 Chris: I think it's one of the more mysterious parts of the whole UX and design process. So you get your users in, you show them some stuff, they do some stuff. How do you turn those observations into stuff that's actually useful for you as a designer?

02:20 Christina: Well, I think a good way to start is, when they are coming in to do some stuff, make sure you have the notes to help you do the analysis you need. So whether that's someone transcribing for you exactly what the participant has said, or whether you as a team make sure you have one insight or one quote per post-it note written down, that will help you later on in the process. And, as suggested, a transcript, because when you're in the lab as a researcher or observing, we tend to jump to conclusions or we introduce our biases, we use a short [inaudible 00:02:56], oh, they said this, it must be so and so. But when you actually read through a transcription again, or look at your notes written down as they are, you realize some of your own biases and the conclusions you've drawn, and you get to look again at what has actually been said and how the participant has behaved.

03:14 Chris: Yeah, I think it's definitely good to have that kind of record of stuff, because you can be doing sessions across three days or something, and then, aside from all the bias stuff, by that point you're probably going to be pretty [inaudible 00:03:28] low on energy, not paying as much attention as you were in session one, which is usually the case. So it's good to have that kind of record to go back on.

03:38 Christina: Exactly.

03:39 Carla: Well, I used to do lots of debriefs after each session. That really helps, especially if you're with a team. I think I've said this before on other podcasts, but I wouldn't do research unless you have at least two people: at least one person taking notes and the other one conducting the research. And then after you finish your session, whether you're in a lab or outside doing some kind of guerrilla testing, you can take a break for 15 minutes and do a debrief together. And you can use affinity mapping, which... let's explain to people what affinity mapping is. So, Christina, how do you define affinity mapping?

04:22 Christina: Affinity mapping? To me it's, as I said, when you have the insights or your notes written down, whether it's on post-its or [inaudible 00:04:30], you can start looking at groupings of similar patterns of behavior or similar things that participants said or things you have observed. And each of these groupings then forms a theme or description of what has happened. And that's how you start seeing themes emerging from your research, or bigger insights coming out of it.

04:53 Chris: Yeah, I find when I'm doing affinity mapping it normally takes a couple of passes. So you probably start the analysis session with millions of random post-its all over the place. The first pass is normally just grouping them by category. So you might have some stuff you saw or observed around, I don't know, the checkout, for example. That's a good way to initially get things grouped. Then you're looking within those groups themselves: are there any themes that come out, either within that category or that map across to other categories as well? So it's normally a two-stage process.

05:35 Christina: Well, I think that's a really good point, because how detailed do you need to go, or should you go? You can do multiple passes, and you start seeing a bigger theme, then smaller and smaller groupings. So it's actually about finding the right balance that works with the team you are with.
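
For readers who want to see the two-pass idea Chris describes in something more concrete than post-its, here is a minimal Python sketch. It is only an illustration: the notes, categories, and theme keywords are all invented, and on a real wall the second pass is human judgement, not keyword matching.

```python
from collections import defaultdict

# One observation or quote per "post-it", captured during the sessions.
# Participants, quotes, and category tags are invented for the example.
notes = [
    {"participant": "P1", "text": "Couldn't find the delivery cost", "category": "checkout"},
    {"participant": "P2", "text": "Expected PayPal as an option", "category": "checkout"},
    {"participant": "P3", "text": "Unsure which items were saved", "category": "basket"},
    {"participant": "P4", "text": "Didn't trust the security badge", "category": "checkout"},
]

# First pass: group the post-its by broad category (checkout, basket, ...).
by_category = defaultdict(list)
for note in notes:
    by_category[note["category"]].append(note)

# Second pass: look for recurring themes within (and across) the categories.
themes = {"trust": ("trust", "security"), "payment": ("paypal", "cost")}
for category, group in by_category.items():
    print(f"{category} ({len(group)} notes)")
    for theme, keywords in themes.items():
        hits = [n["text"] for n in group if any(k in n["text"].lower() for k in keywords)]
        if hits:
            print(f"  theme '{theme}': {hits}")
```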

05:49 Carla: I guess it also helps, and this is just based on my experience: when you plan your research, you have to plan it in a way that you are clear about what you're trying to achieve. So it's kind of like, what's the outcome I want from this research? And then start planning backwards. What I mean by this is that you could come up with a framework from the beginning that allows you to accelerate the analysis process. That framework could be based on a journey, like Chris mentioned: in this research piece we're only going to look at checkout and basket, for example. But it might be a situation where you want to go deeper into other areas. For example, I did a piece of research on how people use mobile in store, so we could look into how they used mobile in the sense of taking pictures, and so on. So you come up with your own hypotheses initially, and that helps you frame a little bit more how you conduct the research, as well as how you actually pull out the insights.

06:59 Christina: Yeah. So you have already defined what your acceptable outcome would be in a way.

07:04 Carla: Exactly.
07:05 Christina: Which helps.

07:06 Carla: I mean, obviously you have to be open-minded though, because sometimes you find things that you didn't expect, but at least it allows you to have some structure.

07:13 Christina: Yeah. So the way I do my discussion guide is, obviously I have the research questions, but I also make notes on what I would like to observe, to check whether they have completed the scenarios as we wanted, or whatever your outcome is. And then we do the debrief, and sometimes we can talk through the results, like you have mentioned, Carla. But I also like to ask the question: what was surprising to everyone? Because that's when you find out where what everyone was thinking didn't quite match their expectations. And you learn from that as well.

07:49 Chris: What do you think? The researcher is obviously the one that generally runs the sessions and stuff, but how much of a front seat should they take when you're going back with the team and actually analyzing those findings? Is it more of a facilitator-type role, or are you driving the analysis based on what you've observed?

08:11 Christina: Well, for me, I like to work with the team, because it really reduces the bias I mentioned before: we only remember bits and pieces of information, or we jump to conclusions. And if your team has spent the time observing the research with you, why can't we talk about it together?

08:37 Carla: I think you could act as a facilitator, especially if you're not working with other researchers, which is normally the case. You could be the researcher on a project but then work with UX designers or visual designers; I used to even bring BAs and PMs into the sessions, just to create some empathy from them towards the users. So you could act as a facilitator, I guess. And obviously, as the researcher, you'd be the one leading and framing the insight generation. But, as you said, it's really valuable to listen to what other people have to say as well, because it removes the biases.

09:26 Christina: Yeah. And I think it's good to bring in different disciplines, because everyone has their own business view, user view, their own agenda if you like, and it's actually about talking those through, from what you have observed as well.

09:42 Chris: I think it can help to guide them if those people are new to UX and research. You might get some people saying, oh, they didn't like the color red or whatever, and you have to guide them a bit deeper to get to the core of what that observation actually was. Was it that they just missed it because something else was distracting, or that they just weren't expecting that button at that point in the flow, or whatever? So there's a bit of mentoring, I suppose, in helping translate those observations into stuff that's actually useful for you to move forward.

10:17 Christina: On that point I would add: when we do analysis, people tend to jump to solutions rather than looking at what we have seen. Why is it that we observed that thing? Why did the participants say those things? So I usually break it into two parts. The first part is: just take it slow and think about what we have seen. Start grouping, and we don't talk about solutions. Our outcome at this point should be: what are the things that have emerged, and why have they occurred? And you park any solutions you have. Don't talk about solutions; take a break, come back and do the second part, and then you can talk about solutions and actions all you like, based on the groupings we did earlier.

11:03 Carla: I think that totally makes sense. The only thing I would say, and it's not a disagreement, I completely agree, is that it depends on the type of research you're doing. I'm just going to talk about an example of a project I was on: we were testing a mobile payment journey for a mobile banking app. We observed that the visibility of some of the icons wasn't great and they needed more contrast. I had already spoken to the visual designer about it, but he kind of refused to change it. So I said, okay, let's just go with that, and we started going through the tasks with the users. I know, and I agree with you, we shouldn't jump to conclusions, but sometimes what we do on an agile project is try to make the changes on the go, not quite on the go, but for the next session. So what we did was, okay, let's just do the first four sessions with this design, and we had the visual designer and a developer sitting in the room next door. And they actually made changes in the prototype for the other sessions that day. I know this is a different type of question that you're trying to answer, right? Because it's just about [inaudible 00:12:26], but still, it was valuable to make the change, and then we realized that people were actually getting it and going through the journey much quicker and more easily. So sometimes, again, it all depends on what you're testing and why you are actually researching; you could make changes and decisions on the go just to try and see if you can improve the experience quicker.

12:52 Christina: Yeah, we do that too in the research we do. But like you say, it really depends on how complex the problem you're seeing right now is. Sometimes we've made silly mistakes on a prototype that are easily fixable, and we can work from that as well.

13:09 Carla: Yeah, exactly. And sometimes the prototype is just broken and people can't get it [inaudible 00:13:14]. It often doesn't work.

13:16 Christina: Like you say, fixing that experience is a relatively easy fix compared to some of the bigger themes that might emerge from the research as well.

13:29 Chris: Yeah. I guess one cautionary note on that is that, as UX designers, we do like to solutionize as much as the next person. We can see a user doing something and think, oh, I just need to tweak this and that will sort it out. But it might be the case that that tweak actually wrecks it for the next three users you see. So I do agree you should probably fix stupid, obvious mistakes in your prototypes while you're testing. But it probably depends on how many users you're testing with, and how deep the problem you're trying to fix on the hoof is, as to whether you do or not.

14:10 Carla: Yeah, totally agree. As I said, it was just another angle, because sometimes you have less budget for testing. Sometimes it's, oh, I managed to convince the client to do one round of testing, and you want to get it as right as possible. But yeah, you're absolutely right: as UX designers we immediately see, oh, I see the problem, the button is too small, whatever. And we have to step back a bit and focus more on what it is we're trying to learn from that piece of research.

14:43 Christina: Yeah. And I think this is where the role of the researcher comes in: it is to help the team be more certain and to answer the questions that the team has. That's why, as a researcher, I like working with the team, not only to do the research together, but to work through what the questions are and how we actually plan the session. Can we do a pilot run ourselves to make sure it works? Then on the day it's pretty much teamwork: you help me with the notes and I do the sessions, and then we come together, do the analysis, and go through what we have observed.

15:18 Chris: Yeah. And the thing is, you're not doing this research just for the sake of it. That analysis has to lead to some kind of outcome. The outcome might be building or changing a feature; it might be raising awareness of a particular issue in the organization. So it comes back to outlining what your research goals [inaudible 00:15:42], which I think I've spoken about quite a few times on this podcast. You have to understand where what you're learning goes, what it does, what it can influence, and who it can influence as well.

15:56 Carla: Yeah. And I guess it's also like, I don't know, we always talk about insights, and I understand what an insight is. But it's hard to get to an insight, if that makes sense. I mean, it is hard to even explain it.

16:15 Christina: It is hard to get insight on insight.

16:15 Carla: Exactly. Isn't it? You kind of go, oh, that's the insight, but you have to go through the process. You have to really understand what you're trying to achieve, and then you suddenly find the insight. But what an insight actually is, is quite hard to define, isn't it?

16:33 Chris: Yeah. I don’t know if I could even do it.

16:36 Carla: I don't know. The insights I like are the more actionable ones, but then again it all depends on the stage of the project you're on and what you're trying to achieve with the research.

16:49 Christina: I agree. If it's more exploratory, or at the beginning stage when you don't know much, to me the insight is more about how people behave, general behavior: they tend to do this or that. So you have some starting point for the next round. But as you get more and more certain about the product or the service you're building, the insights you find will be very different. I think it goes back to what the goal of the research is. What do you think is acceptable?

17:21 Chris: Yeah. I guess in the early stages an insight is probably likely to be something new that you've learnt about how users behave, or what they do. And when you're getting closer towards defining an optimal solution, so you're refining it basically, your insights are more around which approach is more effective than another, I suppose.

17:44 Christina: It could be. I think at the beginning it's like a funnel: you don't know what will come out of it. And then towards the end your insight might be even quite specific things, like what we were talking about: the size of the buttons didn't work.

17:56 Carla: Exactly. I remember I had this project with a retail client many years ago, and we had a research agency involved in the project. And we were testing the design of a homepage, right? First of all, I don't know why we were testing just a homepage, but they really wanted to do some testing. Okay. It was basically a static image. We did three rounds of testing, and the biggest insight was: people don't get the proposition. What do you actually mean? I kept asking them, and they just kept saying, people are not getting the proposition. And I was like, well, what do you mean? What's the insight?

18:39 Christina: What’s the proposition?

18:39 Carla: If it takes you five minutes to write it down on a piece of paper, it's not an insight. I was like, what do you mean? And basically what they were trying to say is that the copy didn't explain the value proposition of the company. I was like, okay, that makes sense.

18:57 Christina: It is not that they did not get it. It is that the content did not help them to get it.

18:59 Carla: Exactly, people didn't understand it. And obviously we needed sign-off from the client, but because they came back saying the design doesn't communicate the proposition, it was really hard for me. I had to say, can you please explain that as an actionable insight? Just: change the copy, because it's confusing. Okay, we're going to change it. But it was really hard. So sometimes we think insights need to be very high-level and very fluffy, but sometimes, especially as a UX designer, you just need a solution, or not a solution, but at least an area of focus for you to find the solution.

19:37 Christina: Yeah. That just reminded me of something. In the lab you might have a sample size of five people, and then people would go, a hundred percent of people said this. And I'm like, no, only five people said this. You can't use a hundred percent. How can you say a hundred percent when your sample size is five? Or they go, we can see four out of five people do this, therefore all my customers are going to do that. No, again, it's a very small group of people. They might tend to be heading that way, but if possible you need to look into some quantitative data to back it up, especially if you're making really important business decisions. We need to recognize the boundaries of behavioural research.

20:27 Carla: Exactly. I mean, it is a way of complementing your thinking about whether or not a product is feasible or desirable to customers. You could do some qualitative research, but if the business is going to make its decisions based on that alone, that is wrong. You really need to go deeper into behavioural data.

20:48 Christina: You need a bigger sample.

20:48 Carla: Exactly, otherwise it is not going to work.

20:54 Christina: Yeah. Don't get me wrong: research is so powerful when we get it right, and we can really bring empathy to the rest of the business, who might not be used to thinking in a user-centred way, or who are a bit removed from their own customers. But I wouldn't go around going, look, a hundred percent of people said this. That's just dangerous.
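
A quick way to sanity-check claims like "four out of five" is a confidence interval for a proportion. A minimal sketch, using the standard 95% Wilson score interval (the numbers are just the episode's five-participant example):

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for an observed proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

low, high = wilson_interval(4, 5)
print(f"4 out of 5 observed -> true rate plausibly {low:.0%} to {high:.0%}")
# Roughly 38% to 96%: far too wide to claim "all my customers do this".
```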

21:17 Chris: And I think the other trap you can fall into sometimes, is only looking for stuff that didn’t work or stuff that disproved what you’re thinking. Sometimes it’s perfectly valid for an insight to be that you are actually right about something or something did work. And I think that can be very useful to play back to stakeholders as well. Particularly those that are less keen on the idea of research.

21:43 Christina: Start with positives.

21:44 Chris: Start with positives, one or two of those and then about 80 negatives, right?

22:04 Christina: You just make them [inaudible 00:22:04] at the end.

22:08 Carla: Going back to the topic of research analysis, I reckon as well that how you organize the data is really important. I think we've said it before, and it's kind of a no-brainer, but sometimes you don't think about it much: you need to find those nuggets, nice quotes or things that are really going to support your findings, especially once you've started looking at patterns and themes and you want to bring them to life. And you know what I used to do as well? If I'm recording, I just write down the timing of that particular nugget. That helps, because you don't have to go back through all your recordings; you just go back to that timestamp, capture it, and put it in your presentation.

22:57 Christina: Another thing I do is, if people are transcribing word for word, like we talked about, Ctrl+F is so powerful. You just do a search for keywords and the quotes come out. I know it's maybe a bit old school for some people, a bit more traditional, and there might be other software to help you, but it does the job. And like [inaudible 00:23:19] you said, it really adds up when you're presenting to, I don't know, senior management and senior leadership; it's just so much more powerful to actually show the quotes. And where possible, do mini video clips of what happened. You don't even have to say anything; just make them watch it. The amount of times I've seen the light bulb go off. It's amazing when they watch the video.

23:43 Carla: They're like, oh, they finally get it. [inaudible 23:46]

23:48 Christina: I would be there going, this is what they did, telling the story of how the participant behaved and what they said, but nothing is as powerful as actually watching them do it.
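
Christina's Ctrl+F trick also scales up with a few lines of scripting. A minimal sketch, assuming your transcripts are plain-text files with one timestamped utterance per line; the folder name and keywords here are hypothetical:

```python
import re
from pathlib import Path

def find_quotes(transcript: Path, keywords: list[str]) -> None:
    """Print every timestamped line that mentions any of the keywords."""
    pattern = re.compile("|".join(map(re.escape, keywords)), re.IGNORECASE)
    for line in transcript.read_text(encoding="utf-8").splitlines():
        if pattern.search(line):
            print(f"{transcript.name}: {line.strip()}")

# Hypothetical folder and keywords -- adjust to your own study.
for path in sorted(Path("transcripts").glob("session_*.txt")):
    find_quotes(path, ["checkout", "confusing", "proposition"])
```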

23:58 Carla: Yeah, that's really cool. I was also looking at a project at Google, and what the research teams there do as well is create lookbooks. Let's say they go and do field research: they take pictures of where they actually go, not necessarily very nice ones, and they have a kind of competition between the teams over who took the better pictures. And they share those with all the people who couldn't go to the field research. It creates more empathy with the user base they're designing for; I thought that was quite nice. It's another way of communicating the value of research, and it creates empathy between engineering and users. Also, at one of my clients before my Google life, we used to do lots of research, and the client started to question: is our design actually getting better or worse? And we wanted a way of quantifying that somehow. So we went back to what the values of the brand were, and how we wanted people to feel: when you look at our brand, we want you to feel safe, and so on. We picked out some of the key values and got people to give a smiley face, sad face, or neutral face. Then, logging all the sessions, we could actually count whether or not we were improving in the way we were delivering the design. I liked the idea, and I think they still do it that way. So you don't just look at the results of one round of research, but also at how it evolves, if that makes sense.

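Carla's smiley-face logging is easy to tally across rounds. A toy sketch with invented scores, just to show the shape of it: count the share of positive reactions per round and watch the trend, with the same small-sample caveats as before.

```python
from collections import Counter

# Invented data: each participant rates "does the design feel safe?"
# at the end of their session, one list per round of testing.
rounds = {
    "round 1": ["sad", "neutral", "sad", "smiley", "neutral"],
    "round 2": ["neutral", "smiley", "smiley", "sad", "smiley"],
    "round 3": ["smiley", "smiley", "neutral", "smiley", "smiley"],
}

for name, scores in rounds.items():
    counts = Counter(scores)
    share = counts["smiley"] / len(scores)
    print(f"{name}: {dict(counts)} -> {share:.0%} positive")
```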

25:47 Chris: Would you say, Google life is better than real life?

25:52 Carla: I can’t comment. Sorry.

25:55 Christina: Oh, that is an insight.

25:59 Carla: Oh, Google life is great. It's great.

26:00 Christina: What she says and what she does.

26:06 Chris: Down the apple store every night after work.

26:09 Carla: No, I do have an Apple computer though. I have a laptop, a Mac [inaudible 00:26:14], and I have a Pixel phone. Yeah, a Mac, sorry, I just forget. But they allow you to have a Mac or an iPhone, stuff like that.

26:28 Chris: That's nice of them. [inaudible 00:26:28] I thought you would get shot on sight if you walked onto their campus with an iPhone or whatever.

26:35 Carla: No, no. A lot of people have iPhones. I have a Pixel just because, even before joining Google, I had a Pixel as well.

26:42 Chris: Yeah. You had it before?

26:44 Christina: Have we gone off topic?

26:45 Carla: Yeah, I think so. No, you just don't do research on Google live, and you're trying to research me.

26:53 Christina: I was actually going to say, wait, hello! When you were talking about how you know your design is improving, on that topic: we touched on hypotheses, and as part of that, the way you define one is, what is an acceptable behavior outcome? You might be looking at reducing call volume or whatever, and then you can actually start tracking those measures throughout the project or [inaudible 00:27:18] you're doing. Once the design has gone out, has that actually happened? And then you know, in a way, whether you have delivered value.

27:26 Carla: Yeah, that's true. That is a good way of doing it as well. I mean, we also came up with the idea of some heuristics: we want the design to be this, this and that, and measure that in every session. But in the end it was like, just make it simple, make it a smiley face. I'm very simple. [inaudible 00:27:44]

27:46 Christina: At the end of a session of, like, an hour, it is quite intense for the participants.

27:51 Chris: Alright, we got anything else?

27:53 Carla: No, we're finally... well, we did more than 26 minutes, Chris.

27:58 Christina: Oh, it is half an hour.

27:58 Chris: Yeah, it's 30 minutes. So for every extra person we get, three minutes of extra content. [inaudible 00:28:03] That is not a great return on investment, is it?

28:10 Christina: Excuse me, you have not invested?

28:13 Carla: Yeah, I hope... I hope we said something useful. I think we did. But if people have questions, or they're not sure what affinity mapping means, or you haven't seen an affinity map before, just contact us. I'm happy to share some pictures, and I'm sure Chris has lots of them, as well as Christina.

28:34 Chris: Yeah, lots of holiday pictures or pictures of my cat [inaudible 28:37]

28:41 Carla: Yeah, you're always on holiday. I'm sure you have lots of pictures.

28:46 Chris: Yeah. So when this goes out I will actually be on holiday. Very nice.

28:51 Christina: Actually, Chris, remember this project we worked on? We did some mind mapping.

28:54 Carla: I like mind maps.

28:56 Chris: Yeah. Mind mapping. Vaguely.

29:00 Christina: That’s kind of [inaudible 29:01] work.

29:02 Chris: Did we work on a project before?

29:04 Christina: Yes. How else did I meet you?

29:09 Chris: That’s a good question. You just keep turning up on my podcast. I’ve got no idea.

29:14 Christina: I kept dropping by.

29:16 Chris: Alright, cool. Let's wrap this ramble fest up now. We don't need to do any plugs, do we? So we will see you next time.

29:25 Carla: Thank you Christina.

29:26 Christina: You are welcome. That was fun.

29:27 Carla: See you next time. Bye.

29:29 Search and subscribe to Design Untangled using your favorite podcast app and leave us a review. Follow us on the web at www.designuntangled.co.uk or on Twitter @designuntangled. Become a better designer with online mentoring @uxmentor.me.