iVoox Podcast & radio

Podcast
Dollars to Donuts
By Steve Portigal
The podcast where we talk with the people who lead research in their organization.
53. Remembering Tomer Sharon
Episode of Dollars to Donuts
Tomer Sharon passed away last week. I’m reposting his appearance on Dollars to Donuts from 2019. Show Links Episode transcript Episode (and show links) from 2019 From LinkedIn, announcement of Tomer’s passing The post 53. Remembering Tomer Sharon first appeared on Portigal Consulting.
01:03:46
Interviewing Users anniversary and new audiobook!
Episode of Dollars to Donuts
It’s the first anniversary of the second edition of Interviewing Users, and it’s now available as an audiobook. Listen to this episode for a sample. Show Links
31:04
52. Emily Sun of Hipcamp
Episode of Dollars to Donuts
In this episode of Dollars to Donuts, I talk with Emily Sun, the head of Design and Research at Hipcamp. We discuss staying engaged in work, designers doing their own research, and research at a small, growing company. There’s actually a big opportunity with smaller companies. At small startups, you are much closer to the people who are making the long term vision for what the company is going to be. Because we have access to that level of leadership, there is a lot that can be influenced through research. – Emily Sun Show Links Episode transcript Make Things That Matter — Steve Portigal: Improving your research process Interviewing Users, second edition Emily Sun on LinkedIn Hipcamp The Fun Scale Alyssa Ravasio, Founder + CEO of Hipcamp Sifteo CHI conference 3DR Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. The post 52. Emily Sun of Hipcamp first appeared on Portigal Consulting.
49:53
51. Tamara Hale of Splunk
Episode of Dollars to Donuts
This episode of Dollars to Donuts features my conversation with Tamara Hale, the Director of Product Experience – Research & Insights at Splunk. We talk about the long tail of impact, being an anthropologist of work, and having a creative practice. The ‘doing the research’ bit is only about a quarter of your job. The rest of it is all the other stuff that goes around it. It’s about storytelling and influence and developing a vision and creating alignment around who the customers are and creating alignment on what actually are the business goals. It’s your stakeholder mapping. It’s your internal research. It’s your knowledge management. It’s improving how we work. All that stuff is part of research, and if you only think of your job as that quarter, you’re missing out on some of the most interesting and also trickiest parts of the job. – Tamara Hale Show Links Episode transcript Tamara Hale Splunk Tracey Lovejoy Catalyst Constellations Blurring the Lines: Building a collaborative dialogue from the intersection of creative practice, ethnography and business Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. The post 51. Tamara Hale of Splunk first appeared on Portigal Consulting.
01:18:04
50. Vanessa Arango Garcia of Delivery Hero
Episode of Dollars to Donuts
In this latest episode of Dollars to Donuts, I talk with Vanessa Arango Garcia, Director UX Research & Research Operations at Delivery Hero. We discuss creating an engaged research community across a global organization, being accountable for impact, and how today’s challenges provide an opportunity for the research profession to grow. We care a lot about our craft and we need to keep the quality up, but we also need to be pragmatic in how we are able to optimize that process of doing research to focus in the next stages. We dedicate too much in doing the research, delivering that report. And later, sometimes it’s very difficult to dedicate time to following up, connecting with the team, sitting together, ideating, thinking about the roadmap, because we don’t have time. We are jumping from research to research to research because every research takes time. – Vanessa Arango Garcia Show Links Episode transcript Rally AMA with Steve Portigal on what makes a research practice mature Interviewing Users, second edition Vanessa Arango Garcia Delivery Hero PedidosYa EuroCopa The new researcher: Navigating the evolving landscape of UX Research Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. The post 50. Vanessa Arango Garcia of Delivery Hero first appeared on Portigal Consulting.
58:54
49. Sarah Gregory of Coinbase
Episode of Dollars to Donuts
This episode of Dollars to Donuts features my conversation with Sarah Gregory, Director of Research for Consumer at Coinbase. We talk about research comms, archiving research, and doing research that no one is yet asking for. Our email is designed for one very specific leadership stakeholder, and it is tailored to how that person likes to consume information. There’s a different stakeholder that hates email. That person, I use Slack. Another stakeholder tends to listen very well when they’re live in a regularly recurring monthly meeting. And so I make sure that research always has one or two slides in that meeting. You have to know exactly who you want to be listening, and you have to change your techniques depending on who that is. Which is really just understanding your users, right? – Sarah Gregory Show Notes Rally AMA with Steve Portigal on what makes a research practice mature Interviewing Users, second edition Sarah Gregory Coinbase The Basics about Cryptocurrency Bitcoin Blockchain Ethereum McDonald’s Theory Sian Townsend Jobs To Be Done Human-Computer Interaction Becoming a Level 1 Sommelier Drops of God NFT Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. I’m Steve Portigal. I recently spoke with Rally’s Lauren Gibson for an Ask Me Anything session about research maturity. I’ll link to the detailed writeup and the recording of the full conversation, but here’s a short clip. Lauren Gibson: I’d really quick, like to go back to the skills point you were mentioning. Are there any group or team skills, like say a team is like, we’re trying to upskill or like hiring staffing. What skills would you say would be good ground zero ones to add to your team or have your team focus on to kind of increase that maturity? Steve: I have a bias here. 
My bias is about the work that I do and the stuff that I write about and that I teach. So I’ll own that, but I wouldn’t say interviewing skills, just like I wouldn’t say survey writing skills or Qualtrics skills. I think there’s that piece above it that makes us better as researchers kind of method aside, domain aside. And those are things like, and these are big words, which you have to chew on a little bit to get to where they’re meaningful. But, you know, we talk about words like empathy and we talk about words like curiosity and listening. I think there’s a big skill around sort of not just knowing yourself, but kind of hearing yourself. We work in fast paced environments. We are asked to be experts. And so developing comfort with not knowing and being able to be confident and curious to be able to honestly ask a question that we think we’re supposed to know the answers to. Like these are, these are about knowing ourselves and kind of hearing ourselves and being that person in the meeting that asks the question that no one’s willing to ask, being able to say something. So this is maybe about storytelling, but there’s an empathy and compassion aspect to it as well. Being able to do some research and bring it back in a way. And I don’t mean what does your deliverable look like? I mean, how do you kind of set yourself and how do you talk to a person that you have a relationship with so that you help them hear something that’s new or that’s slightly new, but that’s impactful and significant and might suck for them to hear, right. We’ve learned that the thing that we’re doing is not going to work. And there’s another new problem to solve. Like that’s a great thing to find out, but presented without compassion and without some nuance can be seen as harmful. And so these are like emotional maturity, hearing your own discomfort, being sensitive to other people’s discomfort, being a good storyteller. These are all in service of relationships. 
I guess that’s maybe what it kind of gets down to the relationships that we can build with people. And sometimes they’re like at a distance relationships, kind of like we’re having with everybody today. We don’t all know each other and we haven’t spent weeks and weeks in the same room and kind of shared ownership, but we’re all trying to connect and share information and help each other learn and draw from this. So we’re in a lot of environments and we work in different ways where we want to use these kinds of self-knowledge and emotional maturity to kind of build relationships because that’s how these things that we’re trying to accomplish. And change. So, yeah, if I want to talk about upskilling, I think those are the things I would kind of work on because they pay off across the board. And you can do that explicitly. Like I have taught a storytelling workshop. Like you can say, lean into that, but these are also side effects from practicing any of the more technical skills of research. You want to practice survey writing and trying to go to survey. You’re going to learn humility. You’re going to learn empathy, right? It is baked into everything, I think that we do, if we are reflective about our own learning. Check out the whole session, if you like. And why not buy your nail salon worker and your tax preparation specialist their very own copies of the second edition of Interviewing Users. If you really wanna help me out, write a very short review of Interviewing Users on Amazon. As always, I’d love to talk with you about the challenges your team is facing and how I can help. Okay, let’s get to today’s episode with Sarah Gregory. She’s the Director of Research for Consumer at Coinbase. Sarah, it’s great to have you on the podcast. Thanks for being on Dollars to Donuts. Sarah Gregory: Thank you, it’s an honor to be here. Steve: It’s an honor to have you. Can we start off with an intro from you? Sarah: Yeah, absolutely. Steve: Are there other directors of research? 
Sarah: My name is Sarah Gregory and I am a director of research at Coinbase, specifically working on our consumer and retail products. Steve: There are currently no other directors? Sarah: I am the only one, but we do have different product groups that could have a director at some point. Right now, I’m technically the only one. Steve: And what is Coinbase? Sarah: So we are a cryptocurrency company. So it can be anything from where you first bought your first Bitcoin. A lot of people tell us that their first Bitcoin ever they bought on Coinbase. But since then, we also have institutional products for institutions that want to custody cryptocurrency. We have developer products for people who are actually building with blockchains. So we have all of those, which are the product groups that I mentioned. So I specifically focus on our retail and consumer presence, which is Coinbase.com, the big blue website, the big blue app, our retail customer. Steve: If that’s retail, does that mean that’s consumers? Sarah: Yes, consumers. Steve: I hesitate to sort of open the lid on Pandora’s box here, but for the context of this conversation, what’s the minimum viable explanation of crypto Bitcoin blockchain that you can provide to people like me that know what you’re talking about, but don’t know what you’re talking about? Sorry. Sarah: Oh, my goodness. I wondered if this one was coming. I hope I do it justice. No, it’s OK. I’ll do my best. So the most famous cryptocurrency is Bitcoin. But there are others. And you can think of it as programmable money. So there are some cryptocurrencies that are just trying to be money, new money. You could use them for paying people. You can trade with them. You can do all kinds of money things with them. So you can have a programmable, borderless money that is not controlled by a nation state. Or there are different other things you can build with some cryptos. You can actually create like different kinds of economic situations. 
You can do things that are unrelated to money. Not censorable, so you can build something that allows you to ostensibly tweet on a blockchain that could never be removed because it is on blockchain. So there’s all kinds of different things that you can do. So yeah, that’s what I got. Steve: And how long have you been working at Coinbase? Sarah: Over six years now, which is crazy. I never would have thought I would last that long. If you had asked me six years ago if I would still be working in crypto in six years, I would have said you’re crazy. You know, it’s just a very fast changing environment. It’s a fast moving industry. I didn’t know very much about it when I started. So I would have thought that there would be some reason to move on and do something different. But it turns out it’s really fun. So there you go. Well, I certainly didn’t know very much when I started. Steve: Going back over those six years, do you recall the journey, if it was a journey, just to sort of understand the specifics of the landscape and be able to understand it and talk about it? Sarah: I’d heard of Bitcoin. I had vaguely heard of Ethereum, which is the second most widely used cryptocurrency. But I didn’t know anything beyond that. And they said, don’t worry, you don’t have to know anything. We really just want a researcher to build out a team. And I said, OK. Later wondered if that was a horrible mistake, because that sounded scary and I had no idea what I was getting into. But I knew that I was going to learn along the way. I knew that if I stayed curious and just anthropological, I suppose, in being able to observe and learn about a new emerging technology, then it was going to be OK. And it was. And emerging technology is fascinating. I mean, it’s never the same. You wake up every day, it’s different innovations, different things happening in the greater environment. 
And so to study that from a research perspective and different communities and cultures that are popping up and different things that are suddenly hitting the scene and trends taking off, there is nothing really like it. Oh, yeah. Yeah, it does. I think that I mean, it’s three times longer than I’ve ever been at any workplace. Steve: You made that comment about at the beginning of the six years, you wouldn’t have imagined that it would be sustainable. Not your words exactly, but I’m getting that because it has changed, it continues to be, I don’t know, interesting, new. Sarah: I think that’s probably why I wouldn’t have thought that I would stick around just because every job I’d had prior, it’s like you kind of hit that two year cycle of like, this was fun. I feel that I mastered some things about this. Am I bored now? And sometimes the answer is yes. And in this case, the answer just kept being no. I felt like I did different tours of duty. The role changed as I went and I felt like I was always learning. So focusing on consumers, we’re focused on regular people who are not companies. Steve: So you’re focused on the retail part of the business. Can you paint a picture of what research looks like? You know, anything about sort of structure, activities, team? Sarah: They’re not developers building apps. They are just regular people who are interested in buying and owning and using cryptocurrency. So because we’re responsible for Coinbase.com and the app, we’re also responsible for Coinbase Wallet, which is our slightly more intermediate crypto app. I’m happy to explain that if that’s interesting, but just, you know, people who are interested in engaging in the crypto space, sometimes as an investment. Sometimes they’re doing other interesting things with it. That is our retail customer. So it’s everything from the super early funnel of, okay, how many people own crypto? Who’s interested and what do they know about it? 
Where are they starting from both in the US and internationally? And then all the way down to, hey, we’re about to release a staking feature. Okay. Like what features should it have? Who’s going to use this and what are the requirements that they’re interested in for this feature? So it can get pretty deep. We have a pretty diverse user base. You can imagine that there’s a really big difference between the really advanced crypto users who are super into this stuff from like a hobbyist perspective, all the way to people who are just hearing about Bitcoin. They got interested in it. They figured they wanted to own or buy some. And what does the experience look like for them and everybody in between? There’s a large range. So we’re doing some foundational and strategic. We’re doing some long term stuff. We’re doing a lot of tactical stuff. We are doing a lot of concept testing. Steve: What does your team look like? Sarah: Working on retail and consumer right now, there’s six, seven, I suppose, including myself, eight, if you’re counting our lovely intern. And then there are a handful of other researchers working in the different product groups, not on retail and consumer. And so I believe it’s 10 or 11. Steve: Before you said, oh, those groups could have directors of research but don’t. Sarah: Right, exactly. Steve: That’s those other researchers. Sarah: It sort of gets into how we’re organized, which I know different research teams may or may not report into design or product or, you know, are you centralized? Where are you located in the business? We are in product reporting through whatever is the specific area of product managers that are our stakeholders. So we have our consumer product group, which is focused on retail and consumer. So that is headed up by a head of product who then has a senior director of design and research in our case, his name is Jeff. And then I report to him. 
So we have, we report into design, I guess, if you wanted to put it that way, who reports into product for retail and consumer. And that would be true of institutional, that’s true of developer. And so it’s interesting because, yes, we report into design, but we have a very close relationship with product as a stakeholder. We used to be centralized. There were previous iterations of the company where we had a VP of design and then, you know, we had a director of research who was across all the different product groups. And that came with its own challenges, as I’m sure anyone who’s worked on a centralized team would tell you. You’re kind of expected to cover everything and everything is your responsibility. Whereas when you are reporting into the product group that is also your stakeholders, obviously that comes with its own limitations and challenges. But at the very least, you’re close to where the decisions are being made. You’re close to the people that are, you know, working on the thing. And so that’s true of, we’re very close to our design stakeholders. We’re very close to our product stakeholders. And so we have a friendly and happy communal relationship with the other researchers and the other product groups. But fundamentally they are studying different audiences. They are working on different products. And so we have like a shared tool stack. We have a shared budget and we engage in some practices together like research crit and things like that. But oftentimes we’ll be working on different things. Steve: For your folks, what’s the mechanism for determining what you’re going to work on? Sarah: So obviously we’re very close to whatever are going to be the priorities of the product group that we’re in. They have OKRs, very common setup, but we’re not necessarily married to exactly what’s on their roap. 
I tend to think that a healthy diet for a researcher is some stuff supporting their roadmap where either they’ve specifically requested it or we have gone in as subject matter experts in research and said, “Hey, we think you should really do this research.” And then also having our own proactive ideas about maybe the things that nobody’s asking for. So we own our roadmaps. Product does not write them for us. We generally say based on what we’re observing of your priorities and what you’re working on, here’s our quarterly roadmap and what we’d like to do. And maybe they have some feedback or suggestions, but generally they trust us to say, “Here’s the research we think you should have.” Steve: Yeah, can we talk a little about the research that you’re proposing that’s not directly tied to what the product’s roadmap is? Sarah: So I think it’s a little bit of keeping your ear to the ground of what are the inklings that you’re starting to hear. They’re not asking for a project, but maybe there’s a particular topic that seems to be coming up a lot or a question that you hear leadership kind of grappling with. And you’re like, “I’m not sure if they know that research can help with this, but I know that research can help with this.” And so we’re going to proactively reserve some room on our roadmap to really give this a shot. And oftentimes, they’re grateful that we did it, even though they didn’t ask for it because you never know when they’re going to react really positively. It’s like, “Oh, I didn’t know you were working on that project, but I’m so grateful that you did. That’s super interesting.” Or maybe they do ask for it six months later. You’re like, “Wow, I’m really glad that six months ago I thought to invest in this research because you correctly anticipated the thing that they would be interested in.” Steve: We could probably draw some lovely two by two because who doesn’t love a lovely two by two? 
But if there’s these categories of, you know, research that’s directly responding to things that are being done like OKRs and things that are about keeping your ear to the ground on one axis, I’m thinking about the other axis because you talked about foundational or tactical and sort of other ways of dividing up the research. I’m curious kind of how that maps out and maybe that’s not a fair question but to me it’s very different, I would guess, to say, oh, we’re going to, you know, on our own go evaluate some design directions or some interaction mechanism versus we’re going to go understand some motivations or some set of behaviors. OK, so I drew with my hands the two by two but I’m not even sure that’s the right way to think about it. Can you say a little more about, you know, how you might characterize or break down these studies that you’re identifying the need for? Sarah: It’s definitely in the latter that you mentioned, which is better understanding a particularly emergent behavior or a motivation. Maybe there’s a competitor name or a particular space that we continue to hear coming up a lot and we’re like, hey, we don’t really have a lot of information about that or I can tell that there are some strategic discussions or decisions that it sounds like they’re kind of getting stopped up on. So it’s very much in the foundational category that we are proactively suggesting things. There are times that we go out there and we say, hey, this is a design or a feature that hasn’t been evaluated in a while and we think that it’s important to go look at this. But I would say more often it’s in the foundational strategic. For example, one that we’re considering right now is there’s this particular trend that we’re seeing happen and we’re trying to figure out whether or not that’s a space that Coinbase wants to play in. This is something new that we’re seeing people do with crypto. We’d like to better understand exactly who’s doing this, why are they doing this? 
Is this an area where we think we have a competitive strategic angle or is this something that we could support in some different way? Or do we just really want to better understand this and now six months from now when a stakeholder comes and asks me, hey, what do we know about this? And I can say, oh, we looked into it. Steve: Well, that leads me to a follow-up in terms of what happens with this research. It’s not responding to a request or it’s not tied to the — I’m sort of floundering even, like, what’s the label for the kind of research we’re talking about? Sarah: I’ve heard it labeled many things because I have heard other research leaders talk about this often, like, what do you call it? Is it forward looking pathfinder, horizontal? I mean, it’s like there’s so many different buzzwords you could throw around. Like we all deal with this, right? We all deal with what do you do when you’re trying to get out in front of a question? So I’m not even really sure we know what we call it, but we certainly know it when we see it. And you can also see different research leaders have different strategies for how do you do that? Do you have a team that is specifically not dedicated to a certain group of stakeholders? They’re kind of like a centralized team. Do you get contractors for that? Do you pay vendors for that? Like how generally do you deal with it? And what I will say is that even though it can be hard to know what are the shots to take, like when do you take your roadmap and reserve a big part of it for this thing that you’re not even really sure if it’s going to land impact or lead to something? Every time we’ve done it, I’ve been so grateful that we really like scraped by and managed to find the space and time to do it in whatever way, because of what inevitably happens. So here’s something about crypto. Crypto is cyclical. So the price of Bitcoin goes up. You get a whole bunch of new people who are very interested. They’re curious about this space. 
They’re hearing it in the news. They may not know a lot, but they’re excited to engage. And so then your doors are flooded with all of these people who are in this like beginner mindset. Then the price of Bitcoin goes down, as it always does. And then suddenly you still have very active users, but it’s the people who are really hardcore in the space. They’re very hobbyist crypto people who never… They came a long time ago. They’re never going to leave. Like they’re just really interested in this space and they stuck around. But there’s fewer of them. They definitely engage with the product in a very different way. They have a very different set of needs. But inevitably, because, you know, last I checked, crypto has not died yet, even though they like to say Bitcoin is dead or whatever they say, it has not died. So inevitably the price goes back up again. Well, now, I mean, all the research that you’ve been doing, that you’ve been looking into and serving for the past year or two or however long it takes, now you’re flooded with the new users again. And so you actually, we invest a lot in lit reviews and summarizing past research, often as like the first step to a research study of, you know, like what have we looked into before? What is relevant? What is evergreen from the last time that we entered the cycle? And the time that we entered the cycle before, you can start to see longitudinal patterns. And that can be a very helpful way to guide where to go next, because you can’t always predict exactly how the market is going to evolve and what’s going to change. But you can say, we’ve been here before and here’s what we think might become relevant later. So you can try to call your shots. Sometimes, yeah, I would say we tend to do them in one of two cases. Steve: Are you doing lit reviews to look for what those shots might be or what those speculative projects might be? 
Sarah: The first is when a stakeholder comes to us and says, hey, what, you know, we want to do a research project on X. What do we know about X? It’s like, well, let’s have the very first step be to check the archive and say, what do we already know? And sometimes we know a lot, sometimes we know a little, but at the very least we can check the box and say, hey, maybe we don’t even need to do research. We actually already have a perspective on this. We’re able to save ourselves a lot of time and have a much more comprehensive strategic point of view for you. And 10 out of 10 times, the stakeholder is like, great, awesome. Like, I’m so glad that’s even better than the thing that I asked for. And we’re able to just machete that request away and move on. And then there’s the kind of more long-term cadence where maybe once or twice a year, I’d say, we’re doing it, where we’re genuinely looking across all the research that we did over the last six months and sometimes even prior and saying, what have we been learning? What do we know? What are the themes that are coming out? Let’s get together as a team. Let’s run that workshop. And we take a different approach every time. It doesn’t always look the same. We kind of experiment with different brainstorm formats or different structures for it, I guess you could say. But then the outcome is, here’s the trends we’re seeing. Here’s what we’ve been learning, big themes. And sometimes it identifies, here’s a big area that we don’t know about. It actually resulted in this recommendation for ourselves of this trend that now we want to go follow with our next six months. We revealed a knowledge gap for ourselves. So that, I’d say, we do maybe once or twice a year. Steve: I want to ask just a very tactical facet of that because, I mean, this is a challenge for so many teams. What information do we have? Where is it? Who knows about it? How do we access it? I think you called it the archive. Sarah: Yes. 
Steve: How have you organized your information such that you can do this? Sarah: I feel very passionately on this topic. I think that my team, if you were to ask them, what are the things I’m best known for? One of them is aggressively making people document in the archive their studies, because I’m such a big believer in pulling out past studies to not only save time for yourself, but also invest in those longitudinal data points and perspectives. It’s literally just a spreadsheet. I think a lot of people have this misconception that it needs to be some fancy tool. We’ve tried that. It didn’t work out because then we lost budget for fancy tool. Fancy tool went away. Now you’re scrambling to try to migrate the entirety of your archive, which in my case, goes back over six years, right, that we’ve been working in this space. And so we literally just use a glorified spreadsheet. And you’d actually be amazed at how many other research leaders I’ve met who also say, yeah, it’s basically just a glorified spreadsheet. It’s like all these tools that have their merits for sure, and yet for what you need, how much better is the glorified spreadsheet, right, Google Docs? And so we do have our glorified spreadsheet. And I think that no matter what tool or what format you’re using, the most important thing is regular accountability for updating it. I think that that’s where a lot of archives fail is you have a nice one, but is it consistent with making sure that you’re putting new stuff in it and then tagging that stuff and making it easy for researchers to go back and look through? And so at the end of every month, we look across what are all the studies that happened this month? And everybody has to update the archive. And there are various ways that we keep ourselves accountable for that. And then we hold each other accountable for when there’s a new study request. Did you check the archive? Have we looked at this before? 
I think that it’s more of a cultural practice than it is any kind of magic bullet tool. Oh, my gosh, let’s pull it up right now, shall we? Steve: What are some of the columns in the glorified spreadsheet? Sarah: So I think that we’ve had a number of org changes, so I actually don’t pay a lot of attention to like what team or, you know, because team names may change, teams may change. I think what’s more important that we have is what was the top key insight? The single top key insight, not a list, the single top key insight and recommendation. And that can give you like your two to three sentence summary of what we found in this study. And if you’re interested, then you can click the link to learn more. Usually that contains all the keywords that you need. There is a spot to add additional keywords if your two to three sentences of your top key insight did not include those keywords. But also the DRIs, sorry, excuse me, that’s such a corporate term, the directly responsible individual who did this study, who was involved. Literally, those are the only things that you need. Yes. Yes, that is the researcher that did the study. Steve: It sounds like very few columns. The DRI is the person that led the study, that did the research? Sarah: Sometimes there’s multiple names listed. We also have the month and the year and the title of the study with the link to the report. And that’s it. Steve: Is that archive just for researchers? Sarah: The primary audience for the archive is researchers, primarily because I like it best when a stakeholder is coming to us saying, what do we think about X? Right? I mean, sure, we could give them the link to the archive and they could search it themselves. But that’s not the same thing as coming up with a research informed point of view, which I find is actually the primary value that we provide and the primary thing that they actually would like from us. 
Sure, they can go and search for past studies, but oftentimes they want to know, what do we think? Do we think we should move forward with this? What do we think the opportunity is? Which comes from a human researcher who has access to the archive, more so than a PM or a designer running around looking for their own studies.

Steve: I was in a conversation recently with people who were running basically a mentorship matching organization, and we were trying to understand how that worked. They described so much about what kind of data was captured, all these different fields that they had, but then they said that most often their ability to match came down to so-and-so who did intake for somebody remembering that they had just talked to somebody else. There was institutional knowledge, knowledge held among the people, as opposed to the data itself. I'm thinking about them as you're talking about this. These tools invite a certain usage, like you've captured everything so you can dive into it, but people work on stuff and they tell stories; culture kind of carries a lot of this. What happens when a stakeholder comes and says, "Oh, I have this," is, as I'm kind of surmising, recall and interactive social, cultural stuff. Does that impact the sense of what we know, versus maybe a more cold-blooded look at data and fields and reports and so on?

Sarah: The short answer is yes, I do think it does. I think there's no replacement for the people who remember the last time that we did this study, and the people who were on the receiving end of that request. And one nice thing, this is just luck, I suppose, is our research team is still very small. And it's also a lot of people who have worked a long time in this space and at the company. And so they do remember the last time.
And I think in those situations, having as a practice to rely on your fellow researchers, and having a strong community of researchers who will phone a friend and say, hey, has anyone ever looked into X? Or does anyone remember a time when a stakeholder asked about X? That's in addition to combing the archive and seeing what exists. I think both are equally powerful and important in their own ways, and a good comprehensive look at what we know is probably going to involve both. And not just researchers, right? I mean, what about also talking to the other important stakeholders and functions, the PMs, designers, and data people in our lives? And so I absolutely am a believer in that institutional knowledge. And I've seen what it looks like when you lose that institutional knowledge. I mentioned that our industry is very cyclical. Price goes up, price goes down. The user base changes. Well, also the size of the company has changed and the people working at the company have changed. I am absolutely a dinosaur by Coinbase and crypto standards. And so I remember the last time. Am I the only person in the room that remembers when we discussed this, right? And we were in a period of hypergrowth where the company's strategy at the time to address the growth was to just hire, just relentlessly hire. And it was like every time you talked to a person, that person was new. Their manager was new. Their team was new. They didn't know anything. And it feels like Groundhog Day. You know, like I'm snapping here. It's like Groundhog Day. You end up just rinsing and repeating the same things. And in those moments, I think that you're not doing your job if you're not looking at what we might have looked at before. You'd end up executing the same study every three months, because every three months, you know, a stakeholder shows up and asks the same question. Where are your personas? That's a common one.
So I think that that's where we really learned how much time we could save by keeping track of the last time that we did it. And fortunately, I mean, fortunately, not fortunately, I don't know. Non-normatively, we are not in a state of hypergrowth at this time. So the nice thing is there's just less turnover with human beings, and so there's less thrash with the same types of questions and, you know, a lot of that chaos.

Steve: You said a bunch of interesting things, but one reaction is that phoning a friend, I tend to think of that, my own bias there, as kind of a failure, but you're characterizing it as a success. That's a valid way to tap into what we, as a collective group of people who work together, know.

Sarah: Yeah, absolutely. Because I think that a lot of knowledge is still in human beings. We can try to automate it away. We can try to make it factoids and bullets. But I think that our knowledge is cumulative. As researchers, we are anthropologists in this space. We are deeply putting together everything in our heads and having a point of view. And that point of view is what we offer. I mean, in 10 years, when all of our jobs have been automated away with all these AI tools or whatever, the last thing to go, I think, will be: yes, I know all the findings, all the data points, all the studies, but what do you think we should do? What is your strategic recommendation? That is the thing that is hardest to automate away. And I think that is the value that we most provide to the company and to our stakeholders and to each other.

Steve: I wonder if the discipline that you've created around updating the spreadsheet, it's like why we take notes about things: then we remember them better. You know, as you're talking about these two approaches, I think you're saying they're interrelated in an interesting way.

Sarah: Yeah.
Steve: So, yes, if you have a discipline of updating the database, the glorified spreadsheet, then that might make you better at recall when someone says, hey, what do we know about X?

Sarah: Absolutely. I mean, even if I were the person who conducted all of the studies for the last few years on a particular topic and I put them in the archive, I still might be going through and using the archive to bring out all of those studies that I have done before: look at them, think about them, come up with a take, maybe do some affinity mapping if you're a Miro enthusiast or something like that. And so it is helping me put together the data points that I have collected, regardless of who collected them. You're kind of engaging in coming up with the perspective. Okay.

Steve: Maybe we'll go back to the kind of research that has no name. You gave the scenario a couple of times of, you know, a stakeholder approaches you and you're able to say, well, we've actually done that research and here it is. And so that makes me think about, oh, you've got sort of the stuff kind of held back. And I was wondering, when you do the research that I'm not effectively categorizing...

Sarah: The research that shall not be named.

Steve: Yes, when you're doing the research that shall not be named, and you get to some point where you're, quote, done with it, and you're not really done, but, you know, wrapping that up, what kind of things happen with it?

Sarah: I think that in the ideal perfect world, somebody says this is revelatory and life-changing and we're going to go do everything you said we should do. And I think we all know that that doesn't always happen. Sometimes it's the opposite, where it kind of is the tree that falls in the forest, and there were people around to hear it, but they were fairly unmoved and it just wasn't the right time, you know, or maybe there were some unanswered questions.
It informed a V2, but it appeared to just stop where it was, and it went into the archive and nothing happened. And I think that's the thing: you kind of have to put faith in second lives and third lives and fourth lives of research that you were kind of bummed about, because maybe it didn't have the revelatory, life-changing impact that you wanted it to have at the time. And then you bring it out again and you say, actually, this is exactly what I needed. And maybe it was just a little bit ahead of its time. We've seen that happen a lot. I'm a big believer in evergreen research. A lot of jobs-to-be-done research is like this, I find. And that's like a whole category, because then you also have to help stakeholders understand exactly what to do with that. But a lot of that stuff, I mean, I cannot tell you how many times we have pulled it out again. We all have our favorite studies, right? One of ours looked at behaviors in banking. I mean, completely outside of crypto, just unrelated. How do people deal with their regular money? What's important to them in banking? At the time, it was like, how relevant is this? I mean, most people aren't seeing crypto as their bank. It's very different. But given the number of times that we've gone back and said, here are the canonical behaviors that we observe with people and money, I would do that study all over again. It was a great investment. And we just didn't realize it one month, two months after we finished it.

Steve: You're painting a picture a little bit of, you know, we talked about the second lives. In that first life, the research is completed and doesn't get the uptake that it might that first go-around. How does it get published or shared or socialized when it's something that is on your roadmap and not on somebody else's?

Sarah: We share it in largely the same way. We have certain established channels, cadences, ways that we summarize recent research.
And it doesn't really matter where it originated, or even sometimes who on the team did it. We're still going to do the roll-up of, here are the biggest insights that came out this month. We have a channel where we share our completed work, and we would share it the same way as anything else. And at that point, it might be, oh my gosh, this is amazing. Or at that point, it might be crickets. And sometimes you have to go further beyond that. What live channels do we have access to? Is there a regular leadership monthly where we can present one or two insights from this? Or are we going to signal boost it in a different channel? I mean, obviously, the way that we work digitally and remotely impacts this a lot. We've had to think a lot about research communications in a variety of different ways. We're never not thinking about research comms, actually, if I'm honest. And so you really try to make sure that you're signal boosting it to the right audiences, in the channels that they pay attention to, in a variety of different formats, in every flavor. And then, if it still feels like it's not generating conversation, you file it away. And it's amazing how, suddenly, in six to twelve months, they start asking again, and you're really grateful you did it.

Steve: This, to me, gets at a complaint or an anxiety that researchers have. And I don't know if this correlates to sort of the overall research maturity. You know, I can imagine someone listening to this and saying, wow, you guys are going off and doing research that you think is important, and I wish I had time for that. But let's say I did have time for it, and I did that.

Sarah: Right.

Steve: Well, no one wants it and no one cares about it, and so it would just be crickets. But you're describing a larger strategy. As you said, you are never not thinking about research comms.
So assuming relevance, assuming that, you know, everybody can understand the relevant research, you have a built-out structure for comms; that's kind of a baseline. And then you're also thinking about, for a piece of material or a set of insights specifically, how to go about that. And then I think, as you said, you know, you bide your time as well, because you have a handle on what the information is. It doesn't have to sink. It can hang around, because you have a way of retrieving it and saying, oh yes, we do know something about X. Or it's something about Y.

Sarah: For sure. And I think you have to know the shots to take. I'm not saying people should just go do all this research and be okay if it doesn't go anywhere. I'm a big believer in trying to make sure that you're taking the most impactful shots that you can take, based on your knowledge about the organization that you're working in and the space, putting together the best information you have at the time and saying, look, I really think this is a knowledge gap that strategically matters to our business and to what we're doing. And if you're right about that, then I do think it's just a matter of time before somebody comes back and is glad that you did the thing. And there is no magic formula. I mean, it really is like you have to sometimes beg, borrow, and steal your time. Yes, we all have a million requests that are coming from stakeholders all the time, so how do you still reserve the space? It's hard. It's hard for everybody. I wouldn't say we've found a magic formula. Although, if I did have one tip, something that has worked: if there is something that you really want to prioritize that is proactively initiated, and not a request from somewhere outside of your research team, what are you going to deprioritize? Are you good at deprioritizing something else?
And that is a muscle that we've been able to work over time: okay, I really want to make sure that we have room for this foundational research this quarter. Can I make sure that my requests from stakeholders are not eating up my entire roadmap? Are there things that I feel they don't need as much as they're saying, or that aren't going to be as impactful as they're making it sound, or that I think can be answered in other, more lightweight ways? And if you're good at doing that, then you'll always be able to reserve your time for the other things. But that is hard to do. That's hard for everyone. I do think that the best nos sound like yeses.

Steve: And you didn't use the phrase saying no to things. That is often the label that's put on that. What are best practices or guidance you might have in the deprioritizing, and telling someone whatever you have to tell them as a result of that?

Sarah: At the end of the day, I think it's important to remember why your stakeholder is coming to you. They're coming to you because they have a question or there's a decision that they need to make. And there are all kinds of different ways to help them with that goal that don't necessarily involve dropping everything you're doing and running a study. It can be, well, can we run an A/B test? Isn't that the better way to handle this? Or, here's an educated guess we can make based on this other study that finished recently. Or we have a process for certain very tactical, not super high-stakes research, a form of democratization we call partner-led research. So it's not them doing it themselves. They are leading it as the partner (the partner is a PM or a designer), and we are helping them, but we're making sure that it's the proper method and tool for the job. We have a way of helping them execute it themselves when it's something that's fairly well scoped to just their area.
And so there are all kinds of different tricks: if they're coming to you because they have a question or a decision they need to make, how do you help them answer that and make the decision in ways that don't necessarily involve a study? There are all kinds. And then they're grateful, because they get what they need. And it usually doesn't sound like, no, we're not going to do that. It's, thank you so much for coming to me. What an interesting question. Let me try to find the best way to answer that for you. And the best way could look a bunch of different ways.

Steve: So you're helping. It's not "Go away."

Sarah: Yeah, I think it's a solutions-orientedness, in the sense that they're coming to you because they want a solution. And so whatever you do, it doesn't necessarily have to fall within the bounds of research. Sometimes they really just need help reframing their question in the right way, and actually realizing that it's a data question, like a data science question. Or it's talking through the options and realizing that actually we can probably just make a guess about this, like use our design instinct. It's actually just a design, how should I say this, we should use our instincts as professional designers to make a call. Are we really going to run a study about whether the button should be red or green? Probably not. But they may not know that. And I can't necessarily expect the stakeholder to always know when is the right moment to do research. That's my job, to know when is the right moment to do research. So if they're asking me something, that is always a conversation that I'm ready to have. And it doesn't matter whether it results in a study or not. In terms of roadmap, I think that they are the experts in their own domain, right?

Steve: How does it work with the researchers that are elsewhere in the organization, that don't have the same kind of leadership structure in terms of managing priorities and managing responsibilities?
Sarah: And so I trust that they are running their roadmap in a way that works for them. I'm not necessarily going to go try to micromanage that for them. But, of course, we are here if they ever want to bounce ideas or get a take. We also are very transparent about what we're working on, and they're transparent about what they're working on. We have a very collegial relationship. So I would say that that works with smaller teams, because they can always just say, "Hey, has anyone looked at this?" But we also do certain shared practices. Like I mentioned, we have a shared crit, where twice a week we get together, and any researcher from any part of the organization can bring a challenge or a topic or something they'd like some feedback on. We have a shared channel. And so if they're really like, "Hey, I'm not sure whether or not I should prioritize this. A stakeholder is coming to me asking this, and I'm not really sure how I should think about it," there is a community of researchers, myself included, who are really open to that and who can always help bounce ideas.

Steve: So crit might include what approach to take about a question.

Sarah: Yeah, absolutely. It can be anything. I mean, crit, I suppose, makes it sound like we're always critiquing research plans or reports of findings. But actually, it's mostly just a shared discussion forum where sometimes people come and they have a very challenging stakeholder management question. Like, "How should I deal with this?" Or, "What's the deal with this tool?" Or, "How do I do this thing?" It's really just an open forum for researchers supporting researchers.

Steve: Do you have any other research comms?

Sarah: Yes, always. Always research comms.

Steve: Do you have some examples of things that you've tried that didn't work and things that you've tried that did work?

Sarah: I think you have to be very careful with regular newsletters.
We've tried a couple of different iterations. I think we still have one monthly, but we really had to iterate many times to get it to be read by the right people. Something that's very top of mind is having strong, crisp product recommendations, which I'm told is actually different from a lot of environments. I mean, I guess it's been over six years since I worked in any other environment. But researchers are saying, "Hey, at other companies, oftentimes they want the insights. They don't want me to tell them what they should do. They don't want me to be specific about what product should do and what design should do." But saying, "I think you should build this, and I think you should make X into Y": that is something we absolutely do. And not only do we do it, we're held responsible for doing it. We're perceived as not doing our jobs if we don't have a strong point of view. And it doesn't always come naturally to a researcher. We're data people. We like hanging out at insights. We like the "how might we" statements. We kind of like the collaborative energy. And definitely there's a way to do that; I wouldn't throw it out. But at the end of the day, especially for a certain level of leadership, what they're looking for is, "But what do you recommend? And I want it in three bullets, and I don't want to have to read very much. And I want you to be very specific." Not just make X easier: how should I be making it easier? What is the specific change? And that means there's a lot of trust that our stakeholders have in us. I don't get into a lot of debates about methodology, for example. I don't have a lot of people who want to sit through an hour-long presentation just so they can question whether I talked to the right audience or not. They trust that we did our jobs right. What they want is to dedicate five seconds to reading my bulleted list of three things that I think product or design should do.
And the more we have put that front and center, the better. The TL;DR is what we call it. It's very internet lingo. The TL;DR goes at the top of the research newsletter, which I mentioned we still have, and really it's the three main TL;DRs from that month. Not everything, not extensive: very short, easy to consume. And it took a while to find the right recipe, and to identify who is the right person to read that. I don't necessarily expect that everybody responds to email. Actually, our email is designed for one very specific leadership stakeholder, and it is tailored to how that person likes to consume information, just based on what we've observed about how that person requests information to be delivered by other teams, like how they like to receive updates. There's a different stakeholder where we actually don't expect that that person is going to read it. That person hates email. For that person, I use Slack. Another leadership stakeholder I want to make sure is listening tends to listen very well when they're live in a regularly recurring monthly meeting, and so I make sure that research always has one or two slides in that meeting. So I think you have to know exactly who you want to be listening, and you have to change your techniques depending on who that is. Which, ultimately, at the end of the day, is really just understanding your user, right? Except that in this case, the product is us as a team and what we provide, and the user is a specific stakeholder.

Steve: I want to follow up a bit on the specific recommendations part. I am always anxious about finding a balance, because I think if you stop at here's the thing that we learned, no one does anything with it, but you can also go too far without the context in saying here's how we think you should address it, or how we think we should address it.
I want to set up a dynamic where someone can say "yes, and" to that, because they know the engineering constraints, they know some piece of software they've bought, or they know what rolls uphill versus downhill, or they know all kinds of stuff. There are other decisions being made and other expertise that they're bringing. When I say we learned X and we think we should do Y, what I don't want is to get shot down, no, Y wouldn't work, and then we're at the end of it. If I have to defend the "how might we" questions, I think it's that they try to lead to the generation of the best solution with what everybody knows. I totally get that the more you can bring people closer to taking action on the research, the more impact you're going to have.

Sarah: No, I completely understand where you're coming from, because I think if you talked to every person on our research team, they'd say the exact same thing. They don't want to get shot down, because they don't want it to be the end of the conversation. We had to completely change how we think about that. We had to become okay with getting shot down, and to realize that that actually was not necessarily a worst-case scenario. I'll give you an anecdote that jumps to mind. We just did one of those regular roll-ups that I was talking about, where we look across the last six months of research and say, "These are the big themes, and this is what we think you should do," and we presented it to a very large leadership audience in a large forum. And we knew we were taking a big swing, where we were being very opinionated with, "Here are the three things we think you should do." And we get into that leadership presentation. We did this one synchronously, live. And I'm like, "Okay, here are the three major themes. Let's talk about this first theme. What are people's thoughts?" And literally a person raises their hand and says, "I think this is a terrible idea.
I do not think that we should do this." And then the next person said, "You know, I mean, maybe we shouldn't do it in that way, but I actually think we could do it if we did it in this different way." And then the next person chimes in and says, "You know, actually, I'm somewhere in between you two. I see the value in this, but maybe..." And it is so funny, because we had been beating our heads against the wall for years, because we felt like our stakeholders were constantly saying, "You need to be more specific with your recommendations. You need more of a TL;DR. You need to have more of a point of view." And we were like, "We don't know what you mean. How are we not being specific?" And we realized that it's because we were afraid to get shot down. And what we didn't realize is that they will perceive us to be not doing our jobs if we don't come out with a spicy opinion. They want the spicy opinion. And if we are disagreed with, then that actually is going to lead them to different ways of coming up with different solutions way faster than a "how might we" statement would. It was crazy. And I think that this could be a way in which the Coinbase environment is different from other environments. It turns out that after all that time, the best way to get them to ideate different solutions was to throw out an idea and a recommendation that they hated. And that did more than a "how might we" statement ever did. It was unbelievable. And so now we've been really encouraging the whole team: it's okay to get shot down. It's okay to get disagreed with. It's okay for people to think that that isn't what we should do. Because to express a point of view, a strongly held opinion, is better than not expressing one at all. Because if we don't express one at all, they're going to say, "Oh, I just wish our research team had stronger recommendations."

Steve: You reminded me of an article, which I'll link to in the show notes.
This guy is basically describing, I think it's called the McDonald's Theory. He's working on site with people, in pre-COVID times, and having this conversation every day at lunch. And he says, "Do you want to go for lunch?" "I don't know, I don't know, I don't know." And he decides to facilitate the resolution of that by proposing McDonald's. And then everyone says, "No, let's go..." and they come up with specific choices, which they weren't doing beforehand. And he says basically the same thing that you do: people will come up with good ideas to fend off bad ideas. And so I don't think you're proposing recommendations that you think are bad ideas, but you're proposing a solution to a problem. And that's what the audience does in that example: they work to resolve it.

Sarah: Yeah, that absolutely resonates: they want to know what we really think. And sometimes they'll find a report, and the report has a very clear, you know, here's what we found, and therefore here's our recommendations and whatever. And that stakeholder will actually DM me and say, what do you think? Even if what I think is actually some version of what the researcher put, what they want is what I think. That is the value that I provide, even if I'm really just restating what they said. And so I think it's a good reminder: they don't have to agree with me, but they do need to know what I think. And if I think something that they don't agree with, at the very least, they know what I think.

Steve: What does shot down mean? They disagreed with the recommendation in that meeting, but there was a robust conversation about what happens next.

Sarah: Totally. I mean, that, to me, is a success. I would say the failure state is where they just don't trust that the research is valid.
And fortunately, that's not a situation we ever, I mean, I'm not saying we never find ourselves in that situation, but it is rare. They trust that our research is valid, and now it's an equally valid discussion about what to do with it. And there could be a variety of different opinions and a variety of different inputs, right? We're not always just going to go do what users said we should do. I think that's where it starts to become: what's the strategic opportunity? What's the business opportunity? What are the other data sources that we have? What about instinct? What about all these other factors that ultimately lead us to a good decision as a company, with research and insights just being one component of that?

Steve: I'm going to go back to something you said early on about these six years. You've described changes in crypto and then the changes in the company as a response to that. And you said you've done these different tours of duty, that your role has changed over those six years. What are some moments or eras that are different from the one that we've been talking about?

Sarah: Well, there was the era of Sian Townsend. Hi, Sian. I hope you're listening. We love you. There was a time, I mentioned, when we had a centralized research team, and there was a leader of that centralized team, back when the team was closer to 30. I lost count. I don't even know. It was much larger than it is now. And the leader of that research team was named Sian Townsend. And that was the first and only time that I have ever reported to another research person. I'd only ever reported to design managers, people in product. And that's great, but ultimately, you're not talking to somebody who deeply understands your practice and your function. And so for me, I got so much out of that couple of years, where I learned how to be a research leader in ways I couldn't have otherwise, because I've only ever worked at small companies.
I will admit I'm a little bit of a bootstrapped case, where I've worked at smaller companies and/or on smaller research teams that just don't have that level of structure and process. I've never worked at Facebook or Google or some of these larger research teams, where you just learn the gold standard of how a lot of these things are done, and where a lot of the problems that small-company people encounter every day have actually already been solved in a different way. But that's why, I think, sometimes you go to these research meetups and you see a bunch of research teams of one, or people who maybe just have one or two other researchers they work with. And they're like, "Hi, how do you address this problem? Please help me." And there's such a community amongst researchers, where we struggle with these challenging problems. And to see the way that she had the tools in her arsenal to be able to handle these conversations, at a level that I had not previously been exposed to, was the best mentorship. I mean, money can't buy that kind of mentorship. So this is a big love letter to Sian. But that is an era, and that was also an era when the team, and by the team I mean the research team, but I also mean Coinbase as a company, was scaling enormously. I mean, we were in that period of hypergrowth I mentioned, where we were three-xing by the end of the year, and it was just constant hiring and constant new people. And tha
01:11:08
48. Jamika Burge of Capital One
Episodio en Dollars to Donuts
In this episode of Dollars to Donuts I speak with Jamika Burge, the head of research for Data and AI at Capital One. We talk about her journey through academia, discovering research, and intersectionality. Doing good – for me, as a researcher, and as someone who wants to do good in the world, it means understanding people’s needs in context and providing opportunities for them to succeed. That’s what that means for me. Success can mean different things to different people. I can guess what success means from a business perspective. I can even guess what success means from a researcher perspective, but ultimately it’s that end user who tells us whether or not we got it right. I want that person, as an end user, to feel free to share with us when we got it wrong, but also when we got it right. – Jamika Burge Show Notes Interviewing Users, second edition Steve Portigal on the UX Podcast Jamika Burge on LinkedIn What is the Positive and Negative Affect Schedule? (PANAS) Center for Human-Computer Interaction at Virginia Tech IBM’s Thomas J. Watson Research Center John M. Carroll at Penn State Mary Beth Rosson at Penn State Office of Naval Intelligence DARPA Spelman College blackcomputeHER #blackcomputeHER conference National Academies Capital One Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. I’m Steve Portigal. I was recently a guest on the aptly named UX Podcast, where, in reflecting on the 10 years since the first edition of Interviewing Users and the new second edition, we chatted about the changes in the research field. I’ll link to the whole episode, but here’s a clip. Per Axbom: As you were describing this shift to in-house teams, I almost felt this sense of jealousy, in that in-house teams then get to work with research over time in a way that I as a consultant cannot. And they have a shared experience.
And they can even reflect back on research they did two years ago, based around the same product or service. So that has to be a lot different, I guess, when you talk to people about research and how you’re doing the same type of research around the same type of product or service for a long time. Steve: Right. And, I’m a consultant myself. I see this with my clients, that they live with some space, they live with some product topics, set of stakeholders, set of users. Some have talked to researchers that meet with the same set of users over years, and they build these kind of longitudinal relationships. And I’m jealous of them, too. I’m also personally happier where I am. I think I bring value not being in that…there’s a rut, I think, that is easy to fall into. And the rut has all these wonderful attributes, like knowing the people and their limitations, like if you know their limitations, their strengths, their preferences. If you’re working with a bunch of different stakeholders, and they have different communication styles, or they have different ways that they engage or different availability, you, the in-house person, do not have to figure that out every time. I mean, yes, as the team changes, you’re going to be constantly adapting your styles. And there’s even a mention of this in the book, I think, because one of the people I quoted describes how they try to have a communication approach that is inclusive, but spread enough to deal with the different needs and expectations of everybody. And so yeah, right, as consultants, we come in, and we don’t have the lay of the land. And I think sometimes that’s an advantage. This may be arrogant sounding, but I think we have the responsibility to speak truth to power. We also have the opportunity; we’re a little less constrained by risk as consultants. Per: And not as biased maybe. Steve: Yeah. You know, my compensation isn’t tied to the success of the product.
Those models create all sorts of interesting incentives. You’re part of an organization, you’re part of a corporation, if that’s the domain that you work in. And your success is tied to its success. And as a consultant, it’s not. Yes, if the company does well, they hire us back; we want things to do well, but I don’t think we have the same kind of incentive model. So I’m glad there still are consultants, because I think there’s a nice triangulation, or a nice partnership, that can happen. I love working with somebody that has the long view inside, and they give me the highlights of who and what, and I don’t have to do everything my way, but we can negotiate the kinds of approaches and practices. And I can be that unbiased voice just by the fact that I’m not part of it. James Royal-Lawson: Yeah, exactly. You’ve got that, you know, the fresh-eyed approach that you can offer as a consultant coming in. But then we’ve got the opposite edge of that. And you mentioned this in chapter 10, Making an Impact, that internal research organizations need to keep track of what they know and what they don’t know. And, what have we already researched? And that opens up a whole different aspect of historical record keeping, I guess you could say, which as a consultant, maybe we didn’t have to deal with. It was an assignment you were given, and then you delivered. Steve: I will say, and I’ve gotta imagine this has happened to both of you, though, that as the consultant you serve as the offshored institutional memory, where somebody writes you and you haven’t worked with them for a really long time, and they’ll say, Hey, didn’t we do a project about this or that? Do you know what it was? Do you still have the thing? That happened to me, like, I don’t know, six weeks ago. Someone had an anecdotal memory of something, and they weren’t involved. And no one was left working there.
So, that organization wasn’t doing a great job of documenting whatever different initiatives and so on, and they were…I could put my finger on it; like, I actually had the document quickly. So, yeah, I think there’s an ideal of, whatever, knowledge management, institutional memory. What have we researched? What have we learned from it? Yeah, I think it’s a hot topic that people are working to try to define. What I get nervous about is where that problem, which is an organizational one and an institutional learning one, is hoped to be addressed with a software solution, without asking some larger questions like, what information do we need to save? Like, is it the existence of that report? Is it the report? Is it the person who, you know, could talk you through it? Is it the raw data? Is it the decisions that we make? Is it the recommendations? That’s just me riffing on, like, what do we want to track? And who’s going to query that information? Is it somebody that is, quote, a researcher? Or is it someone that wants to ask, ‘hey, do we know anything about x?’ and discover it themselves? So I think there’s just a huge amount of challenges there. Like, are you creating different things to archive? Are you archiving them in a way that they’re retrievable, by who, at what point? A number of years ago, people in research organizations that were growing were talking about bringing in somebody who’s a reference librarian, whose job, in other contexts, is to be a human being that interfaces between people who need information and the storage of information. Yeah. And that’s very different than self-service. So this is like a huge, kind of culturally based thing – any two organizations are going to deal with this differently – in what they expect people to be able to do and who there is to do it. There’s a technical need here. But there’s also just a process and understanding of what information, why do we want to look back, to what end, and building those use cases in.
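That riff about what a research archive should store, and who will query it, can be made concrete. Here is a minimal, hypothetical sketch (none of these field or function names come from any real research-repository tool; they simply mirror the questions raised above: the report, the raw data, the decisions, the human contact, and a self-service keyword lookup):

```python
# Hypothetical sketch of a research archive entry that keeps more than just
# the report: a pointer to raw data, the decisions made, and a contact who
# can "talk you through it". The search() function models the self-service
# "hey, do we know anything about x?" use case.
from dataclasses import dataclass, field

@dataclass
class StudyRecord:
    title: str
    year: int
    contact: str                  # the person who could walk you through it
    report_path: str              # the deliverable itself
    raw_data_path: str            # underlying data, if retained
    decisions: list = field(default_factory=list)  # what the org did with it
    tags: list = field(default_factory=list)       # for keyword discovery

def search(archive, query: str):
    """Self-service lookup: match a keyword against titles and tags."""
    q = query.lower()
    return [s for s in archive
            if q in s.title.lower() or any(q in t.lower() for t in s.tags)]

archive = [
    StudyRecord("Onboarding interviews", 2021, "dana@example.com",
                "reports/onboarding.pdf", "data/onboarding/",
                decisions=["Simplified signup flow"],
                tags=["onboarding", "signup"]),
    StudyRecord("Pricing page usability", 2023, "lee@example.com",
                "reports/pricing.pdf", "data/pricing/",
                tags=["pricing", "checkout"]),
]

print([s.title for s in search(archive, "pricing")])
```

The point of the sketch is the schema question, not the code: deciding which of these fields to keep, and who the `search` caller is (a researcher, or anyone in the organization), is exactly the culturally dependent design work described above.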
And I think it’s turning out to be trickier, maybe, than software vendors had promised us. Again, that was from the UX Podcast. Now onto our episode. I had the pleasure of speaking with Jamika Burge, the head of research for data and AI at Capital One. Jamika, thank you so much for coming on Dollars to Donuts. It’s just so lovely to get to speak with you today. Jamika Burge: And so good to be here with you, Steve. Thanks for the invitation. I’ve been looking forward to this. Happy to be here. Steve: I want to start off by asking you how you found research. Jamika: That’s a really good question, and I find myself asking myself the same thing, because I don’t think I came into it knowing that it was research. And as I look back on my experience, you know, I started off as a computer scientist. In fact, even before that, I started as a business major in college. I wanted to study this area called business communication. I didn’t know what it was, but I knew that I really liked English, and I liked my AP classes, English and biology, in high school. And so it gave me an idea of how I approached learning and problem solving and presentation of what I knew in the world. And so I thought, business and communications, okay, I’ll give it a shot. But then I got to college and had a work study assignment my first year there, and it was in the computer science department. So work study is for those of us who needed a little extra money paid to us for working on campus to supplement our education. And for me, it was in the computer science department. And because I showed up to work every day or every other day, whatever my schedule was, I learned a lot about what technology was. I had never taken a tech course. I’d never programmed. But it for me was an opportunity to learn something new. And so I spent my first year developing my university’s first website. This was like in the mid-90s.
And I’m learning about the internet, and I’m learning about what to do to create experiences on the internet, which at the time was web pages, or maybe some Flash, right? Adding a photo or an image online, which were all new things for me and even for us in the field. And so that was my entry into understanding the tech space more broadly, but only in the context of the internet. And after my first year, I changed my major from business communications, which at my college was really business, to computer science. Again, not really knowing more about the field than having spent my first year as a work study student in the computer science lab. But at the end of my first year, I also had my first internship and was able to come back my second year in college as a full-on computer science major. And I spent the rest of my college career as a computer science student, struggling, enjoying, celebrating, rethinking my role in the world, because computer science is not the easiest topic at all, but understood that there was so much to do in the space and so much more that I wanted to do. And even by the time I graduated, I had not learned about human centered experiences or human centered design, or even human computer interaction, which in computer science is the closest thing to user experience and user research. I had, however, learned a little bit more about AI, artificial intelligence. And so I knew that was interesting and thought maybe I wanted to do more, and decided to get my master’s degree and focused on agent based planning and understanding how to create and model agents and experiences using deontic logic, which was, I’m remembering now, the logic of prohibitions, and epistemic logic, the logic of knowledge. How do we know what we know and what does that mean? So I’m building these models and understanding model based design.
And then I learned through an article that I read about human computer interaction, which is not just how to create artificially intelligent systems, but how to ensure that whatever systems we create, intelligent or not, are actually usable. And I was fascinated. And one of the classes that I took that introduced me to that was called usability engineering, where we could actually test any technology that we created from a user point of view. It wasn’t from a tech point of view. It wasn’t from a performance point of view, but it was from a user point of view. And that excited me. And so that’s how I learned. This is my first year in a master’s program in computer science, where I’m learning about human computer interaction through usability engineering. And that was what started me on my way. I was already on my way towards digging into research more broadly in the computing space. But here is when I realized, you know what, I could better understand how to create, build, sustain technologies in the context of those who are using them. And those who are using them don’t have to know how to develop them. And so that was my first time learning more about user research from the lens of a computer scientist. And I think I was certainly in graduate school, maybe three years later, when I decided, you know what, I’m going to focus on this thing called human computer interaction. And for a researcher in that space, that absolutely meant that it was user research, as an opportunity to really broaden my perspectives about what it means to create experiences for users, technical, non-technical, whoever they were. And that was my first introduction into user research as a graduate student, technically. Steve: Was there a point at which some of the work you were doing in grad school looked more like user research, like from a 2024 lens?
Jamika: I think so, because in my program, and I went to Virginia Tech, which had, and still has, a strong program in human computer interaction in different areas of the practice, like visualization and human computer interaction models and frameworks. But I think for me, in the truest sense of research, I learned not just what it was, but how to practice it, through my first project in a course that I took, where we were supposed to gather feedback for a solution that we thought needed to be explored. And I think for this project, I was partnering with one of my colleagues in the program, and we were in a virtual reality class, and we were studying the impact of depth perception in a CAVE environment. And a CAVE environment, for virtual reality, is sort of this pseudo-real experience that feels real because you’re putting on the headgear and you’re in a space that’s projected in front of you, and it’s immersive. And it was also new for me, but it was an opportunity to investigate, okay, how are people really engaging in this space? And that, for me, was not just an opportunity to understand and see research, but to embody it. Like, wow, this really is an opportunity to get real feedback from another person in a way that’s meaningful and in a way that helps me and my partner and the whole field understand more or understand something we don’t already know. And that, for me, was fascinating, because what I didn’t realize is that I’ve always been a researcher. Ask my mom. I’ve always asked questions and always asked why. And so, for me, it was a really nice way to see, oh, I know what this is, and I can apply it to something that I am right now, in the moment, getting trained to do. And it was the first time I think I’d ever really understood the value of loving what you are trained to do. Like, what is a career? Oh, it can be things that are fun and that are really interesting and that are also challenging in ways that I didn’t expect.
But yeah, I think that was my first real introduction to research in a hands-on way, through a course that I took. Steve: And describing it as a lightbulb moment, or maybe a series of lightbulb moments, sounds very powerful in the way you’re describing it. Jamika: For sure. And we think about a graduate program and even a PhD as being research. User research is different, and it is much more applied, and provides a lot more of what I call real-world perspective in a way that is truly unmatched. And so, for me, that was the appeal. And it led to even my research being a little unorthodox, for my dissertation work, but it was all about, okay, what are real people experiencing, and how do we better understand that for the greater good? So yeah, that series of lightbulb moments, I think, really started me on the path that I find myself on even today. Steve: What was unorthodox about your dissertation work? Jamika: Well, I struggled. I’ll tell you this first. I struggled with my topic because I was in a computer science department, but my advisor was and still is a psychologist. And I was doing work that was highly integrated with bits of psychology, sociology, and anthropology. And so I struggled with, well, can I do this kind of work in a computer science department? And my fears were quickly assuaged when my committee said, well, as long as we agree, of course, you can do whatever you need, you know, whatever. And that was all I needed to hear. And so it took me on a path of really understanding how people share high-stakes emotions across tech media and channels, like email and phone and the internet, IM, for example, and even face-to-face. That’s a medium. And so I brought couples in relationships into the lab and effectively had them argue. That was my culminating academic research project, if you will.
And the context around that was, at the time, this was the early to mid-aughts, and there was an influx of technology options to support remote interaction. So we were seeing a lot more instant messaging systems pop up. We were seeing a lot of people engaging in social media before we called it social media, right? There were lots of those activities, and even people who were engaging in relationships and even maintaining relationships via email, right? We didn’t really have video at the time the way we do now. And so people had to stay connected, especially people who were away from their loved ones. It was important to do that. And so I found myself trying to understand, well, even for me in graduate school, I was away from my family, and benefiting from understanding ways to get through some pretty heavy experiences, and maybe even to resolve high-stakes interactions like arguments and even extreme excitement, was important. And so it led to my thinking about, well, what are the opportunities for using communications media effectively to support communications for people in relationships? And the value there, in relationships, is that if people are vested with each other, then we can stress test the tech medium a bit to get at things like, well, how were they able to resolve the conversation, or what were the extremes of the conversations that emoted the most, or what were some barriers in the conversation that kept them from meeting a resolution, or what were some interesting indicators, depending on the medium, for example, where they couldn’t see each other, where one person in the couple behaved a certain way and the other person didn’t get the benefit of seeing that. Those are all interesting questions. And while broadly the opportunity of understanding how we use tech media to connect was interesting, the underlying real question was, can we actually get them to argue in a lab?
And we were able to do that, but it unlocked a lot of learnings about people, about how we communicate, about what we think about what we communicate to others, and the ability to resolve any kind of conflict, especially when it’s not face to face, using technology. Steve: So when couples came into this lab, were they having these interactions mediated through some tech platform? Jamika: Yeah. So they come into the lab not knowing at all what they were going to do, which is by design, of course, and they’re told what they’re going to do, how they’re going to be spending their time, and they are put into one of three groups. They’re told that they’re either going to be in a face to face condition, an IM, an instant messaging condition, or the phone condition. And if they are in the instant messaging or the phone condition, one person would be moved to another room where they could actually engage with each other separate from each other. And in the face to face condition, they were both in the same room. Now there’s some pre and post things that are happening. Like, before we actually get started, there are, of course, informed consent forms to sign, and we can talk later about how I learned the value of communicating consent, but also doing no harm, because this work is really important to understanding that when you’re having people come into the lab and argue, we’re sort of creating a level of stress. And I’ll note here, probably appropriately, that we partnered with the University Counseling Center, and we gave our participants pointers and paraphernalia so that if they needed support as a result of this conversation, or at any time in their conversation, the Counseling Center was available for them. So that’s important. But that said, part of the preconditioning and pre-activities was that I wanted to understand what their current state of mood was. So I used a PANAS inventory, which is a, wow, I’m really digging back here.
It was an inventory that measured people’s mood, their level of happiness. And there were some 50 to 100 questions they had to answer, just so we could get a baseline of how their moods were. So we did that before and after the experiment. And we also did a depression inventory to understand, you know, how are people feeling just in general? Again, what are their baselines? And how might our understanding of their mental state, at least at rest, if you will, change in the course of these conversations? Which also helped us to determine, you know, whether or not they were actually arguing and able to get to some level of resolution. So we did that at the beginning, and we did not take the depression inventory at the end. And before we started, we also had people talk about, you know, what are things that you generally argue about? You know, and there’s some common things; we can get into that in a minute. I think you asked me, you know, what was the process of determining what tech media, but all this is before they’re actually separated or assigned to their condition. But once we’ve taken care of all that, they were ready to go. And they were then separated according to the tech media that they would be using. Steve: So you did this research, and then this is for your dissertation. What is your, I mean, it’s a terrible question to ask someone, but like, what does your dissertation conclude? You know, I can’t imagine you’re talking about it weekly nowadays. Jamika: Well, I’ll tell you, I actually talked about this with a colleague a couple of days ago. So it’s not as unusual to talk about it as we might think. And I have some conjectures about why that is. And I’ll come back to that. But there were a lot of things, actually, that were the results of this research.
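The pre/post mood measurement described here is easy to sketch. The PANAS short form is a published instrument with 20 items rated 1 (very slightly) to 5 (extremely); ten items sum to a positive-affect subscale and ten to a negative-affect subscale. The item lists and scoring below are from the published instrument, not from this conversation, and the before/after comparison is just an illustration of the baseline-versus-post design Jamika describes:

```python
# Minimal sketch of scoring the PANAS (Positive and Negative Affect Schedule).
# Each subscale is the simple sum of its ten 1-5 ratings, so each ranges 10-50.
POSITIVE_ITEMS = ["interested", "excited", "strong", "enthusiastic", "proud",
                  "alert", "inspired", "determined", "attentive", "active"]
NEGATIVE_ITEMS = ["distressed", "upset", "guilty", "scared", "hostile",
                  "irritable", "ashamed", "nervous", "jittery", "afraid"]

def panas_scores(responses: dict) -> tuple:
    """Sum the 1-5 ratings into (positive_affect, negative_affect)."""
    pa = sum(responses[item] for item in POSITIVE_ITEMS)
    na = sum(responses[item] for item in NEGATIVE_ITEMS)
    return pa, na

# Illustrative baseline vs. post-argument comparison (made-up ratings):
pre = {item: 3 for item in POSITIVE_ITEMS + NEGATIVE_ITEMS}
post = dict(pre, distressed=5, irritable=4, enthusiastic=2)

pa_pre, na_pre = panas_scores(pre)
pa_post, na_post = panas_scores(post)
print("PA change:", pa_post - pa_pre)  # -1: slightly less positive affect
print("NA change:", na_post - na_pre)  # +3: more negative affect
```

Comparing the two subscale deltas per couple is one simple way to check, as she notes, "whether or not they were actually arguing" and whether the argument reached some resolution.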
And for those who are a bit more versed and probably closer to this than I am now, know that this research is in an area of media uses and gratifications theory, which means that the technology that we use can help us to deliver news with different kinds of results. And an example is, maybe if you’re feeling uncomfortable about having a conversation with somebody, a friend or otherwise, but mostly a friend, you might, depending on how uncomfortable you are with the conversation, choose the communication media based on that feeling. So if I’m telling someone that I, you know, I’m sorry, I’m not going to make the party tonight, and I know it’s a big deal for you, but I just can’t make it tonight, then to keep from being overly uncomfortable, you might not call, but you might text. It’s not the good thing to do, right? It’s just, it’s not good, at least socially. But personally, the pressure of having to tell somebody that you can’t do something that you said you were going to do, or that is personally uncomfortable, means that folks will likely choose the technology based on what goal they’re trying to accomplish and how it makes them feel in doing that. And so for me, one of the results that we found was that people are able to argue, first of all. So they do argue. But also, let me back up and say this: we recorded them as well in the context of these arguments, and that they were able to argue in front of a camera was also very fascinating. And it’s because of the camera in the room that we were able to learn some other things. And one of the things that we learned is that, you know, people will self-soothe in arguments even when their partners can’t see. And it’s that self-soothing that can aid in the conversation, or might give them cause to check out.
And as an example, there were a few times where the men in the conversation, particularly in the phone condition, would touch their foreheads and sort of, you know, massage their temples as though they were in pain, trying to soothe themselves because they were distressed in the moment. Obviously they’re arguing. But they really weren’t communicating that distress. But they were showing it on video, knowingly to some extent, because there was one person, even in an IM condition, who, when he wrote something back to his wife, said, “She’s not going to like that,” and looked right at the camera. Right? Like, those kinds of things were really fascinating. Not that they necessarily needed to be witnessed by the partner, but that they were part of self-soothing for those who were experiencing or emoting in that way. So that was one of the major things. And the other thing was, not only were people able to argue, they were able to resolve their conversation, resolve those arguments, irrespective of the medium. The difference was the speed, obviously. People in face-to-face conditions were able to resolve their arguments faster; even the time they spent in the argument was shorter than those who used different media. And that is not surprising. Obviously it takes more time to type. And certainly the less rich the medium, the more time it takes to communicate certain feelings that may be more obvious by looking at people, obviously. So that was interesting, that the medium didn’t really have an outsize effect on people’s ability to resolve the conflict. But seeing how people managed in the moment was really fascinating, especially when they weren’t face-to-face. And I think that for me was really interesting, particularly given the range of participants, the range of experiences they had in their relationships, and that they didn’t know what they would be doing when they came into this experiment. Steve: So where did you take things after this program?
Jamika: Yeah. And I wondered what was next for me. During my graduate school career, I interned at IBM Research and got a fellowship through IBM and knew, “Okay, I’m going to go to IBM Research and be there.” But it didn’t quite work out, I think in part because while I knew I didn’t really want to take the tenure track, I think I knew I didn’t really want to go into the research industry either. And part of it was I felt that there was so much more for me to learn. I’ve always felt, and I think in many ways that’s why I continued my education, there’s so much more to learn, there’s so much I don’t know. How can I grow my thinking and my skills in ways that make me worthy to do more of this work, without realizing that I actually had a good handle on things and was pretty good all along, but what did I know? And so it was thinking about, “Well, if I’m not going to be a faculty member someplace, and if I’m not going to go into a research scientist role, what else is there?” I didn’t know. So I actually decided to do a postdoc, and I worked with Jack Carroll at Penn State. At the time, he and Mary Beth Rosson, who was actually my initial advisor, had gone to Penn State. So I ended up going to Penn State. Again, it was not planned, but I worked with Jack. And in that space, I was working on understanding, now that I knew that computer science could be a part of the human experience in ways that I didn’t realize before I started my graduate program, I thought, “Well, what else can I do?” And so with Jack, I worked on understanding how nonprofits might leverage wireless technology. And this was at the time when free Wi-Fi, or walled gardens in cities, was taking off. How might we get access to free Wi-Fi so that people could be connected to the internet, and it helps to increase access to technology?
And so I spent a year and a half, a couple of years, connecting with nonprofits locally and understanding what their technology needs were in the context of internet and access. And it was a really interesting and even fulfilling body of work, because it helped me to understand that while I understand and am trained in technology and all the opportunities that there are, there are so many who don’t have that access. And forget understanding, it’s access. And that reminded me, okay, what else might I do in this world that helps us, the knowers of the technology and of its capacity and its capabilities? How do we bridge the gap between those who know and who have access, and those who don’t, or who rely on technology in ways where maybe they don’t have a choice, or for whom we can make those experiences easier? And so for me, it was an opportunity to learn, well, what does giving back look like? And what does doing good look like? And that was my next journey after finishing my PhD. Steve: I want to pick that up. I also want to go back first though, if I could. It seems like a large contrast to me, as you’re describing it, between people in labs, one on one, very experimental communication. It seems very like, oh, this is academic research. And to go from that to, I don’t know, social problems and entities and institutions and large social forces. And I don’t know if service design is part of the vocabulary there, but something that seems ethnography-y. The aesthetics, if you will, of the two different major studies that you’ve described, it seems like a really significant transition to hear you describe it. And with that, was it even a transition for you? How did you go from that dissertation world to the postdoc approach? Jamika: Yeah, that’s a really good question, because to me, it felt natural. It felt like an obvious point of transition in my story, where I could go from learning about how to do the work to actually doing the work in the field, in practice.
And I think that’s what shifted my thinking about the kind of work that was possible for me. And it absolutely is a shift, right, from being highly academic, very much theory-based and driven, and showing our proof of concept, for example, in experiments and questions that we might ask ourselves that help us to get closer to answering the big questions posed in our dissertation, to, okay, now how do we make this real? How do I make this real? And what am I bringing to bear in this work in the community? They don’t care about my dissertation. They don’t care that I did this other work. They don’t even care about the underlying theories. They’re just trying to feed the homeless, or they’re trying to support youth in the community. So for me, it was, okay, now I can go from understanding the academics to applying those learnings in a practical way. And I think that’s when I realized, you know what, I am much more interested in applied research. How do I practice what I know in real-world experiences? Because that’s, I think, how I can do the most good. Not that I can’t do good in the academic space, or writing a paper, or doing research that changes hearts and minds, right? Whatever that means. I think it was, for me, an opportunity to, if not forsake one, then embrace them both. And that, I think, has been my purview since as well. I’ve never abandoned my academic roots, but I also recognize that practicing the work of research, or even user research, is best when we understand the underlying frameworks and methodologies that help us to get the most, the best, the most useful results, right? So I think they go hand in hand, for sure. So it is a jump, but I try to do both. And it isn’t always easy, but I think it’s important. Steve: You were starting to bring up that this idea of giving back and doing good came out of the things you learned about access, and that that led you towards the next stages. What was that? Jamika: Yeah.
Well, I love the way you’re setting up the story, Steve, because I’m going back down my own memory lane. And I am appreciating that, at the time, I didn’t like my journey. I thought, I mean, I got my PhD in, I think, ’07, ’08, and it was right around the housing crash. And so many of us who were finishing our programs were looking for work and couldn’t find it, because it was also, in many ways, the crash of the dot-coms. And so all of us were sort of competing for a small number of roles. There wasn’t a lot of work to go around, at least for those of us who were looking for very specific kinds of research roles. And so my postdoc was not only an opportunity for me to figure out what I wanted to do when I grow up, it also helped me to find another area of research, even applied research, which wasn’t obvious for me. And that was government research. And so that was what was next for me. I went from my postdoc to doing government contracting work, where I worked at the Office of Naval Intelligence and at DARPA, the research arm of the Pentagon, and did some fascinating research that, in my mind, was the next level, or maybe another opportunity to apply my learnings in some real-world situations to hopefully do some good. And that was my next journey, going into government research, which had completely eluded me my entire educational career. I did not realize, “Wow, I could do this research? I could do this kind of work?” It didn’t occur to me. And so I think those couple of years that I was at Penn State helped me to get in touch with what I really enjoyed, and not be so constricted by what I was expected to do in my career. As a newly-minted PhD, we’re told, “Oh, you should go pursue academic posts and be a faculty member.” But I knew that that really wasn’t what I wanted to do.
And so giving myself the opportunity to explore really helped me to think more about what I can do and what makes sense for me. And so that led me to DARPA, where I did some pretty cool stuff in two different domains. The first was supporting military service personnel who were suffering from traumatic brain injuries or PTSD and developing technologies that could help them; the second was developing game-based learning experiences for K through third graders to teach things like calculus and Newtonian physics, but through games. And for me, that was a nice connection between the human-centered design tech world of supporting the development of technologies that could support our military personnel, and the outreach piece, the doing good, of helping our young people see math and science as a career option. And to use games for that was exciting. So that was my next chapter, if you will, of applied research. Steve: You said that at the time, you didn’t like your journey. It’s hard to reconcile, because you’re telling it as a retrospective, but in the way that you’re telling it now, I can’t see what it is that you didn’t like about it at the time. Jamika: You’re right. There is the benefit of hindsight. At the time, I felt like my path should have been straighter, because I’m already jumping at this point. To back up a bit, I got my BS in computer science and went straight to a master’s program. The summer after I graduated from my master’s program, I actually interned at IBM Watson, which is in Westchester County, New York, an IBM lab, and again, studying HCI, whatever that means, right? I don’t even remember the project. But I enjoyed it so much that I considered staying. But I also knew that there would likely be more opportunities for me if I pursued a PhD. And so instead of staying at IBM Research (I was at Yorktown Heights, for those who might know the area), I decided to actually teach.
I taught for two years at Spelman College, an HBCU, a historically Black college or university in Atlanta, Georgia, for women. And the reason I took that stint was I figured, well, if I’m teaching and if I’m close to a university experience, I’m probably going to be more likely to go back to school for myself. And I’m not sure why I thought that, but two years later, that’s exactly where I found myself. I found myself enrolling at Virginia Tech and was there for the next five years. So already my path is a little crooked. And in many ways, I was behind my peers who decided to keep going to grad school. And so for me, again, in my mind, my experience, we’re always told, go straight through. And there’s a process here. So I’m already breaking the mold. I’m not doing the things that made sense. I finished my PhD after being there five years. And then I’m not getting a job, as many of my friends would say. You don’t have a real job being in grad school, but I beg to differ. But then I finished that and then went on to a postdoc, which was in many ways additional education for me. And so again, a little crooked path, not quite straight. But for me, even looking back, it helped me to really figure out what made sense for me and to be okay that the path was a little broken. Or in my mind, it seemed broken. So I finished my postdoc and then went into government research and thought, okay, well, I could get used to this, but there’s got to be more. And even there, I thought, well, okay, it’s a little crooked still, because I didn’t directly move into working at DARPA when I finished my PhD or my postdoc. I did a little bit of extra contracting work where I was at the Office of Naval Intelligence and some other places. And so that felt a little crooked too. And again, I didn’t know the industry. I didn’t know that that was what research looked like outside of the academy and outside of the research industry. It just was.
And so I had to get my head around understanding that, first of all, it’s my journey and it’s totally fine that it’s not all the way straight. That’s just life and that’s okay. But then also, I’m doing things that a lot of my peers and a lot of the folks whom I called mentors, who came before me, didn’t do. So I’m kind of creating my own path. And at the time, it doesn’t feel that way. It just feels different. And that’s what I remember. I remember that as part of my journey, but I also look back and appreciate that my role was different because it meant that I was meant to do different things. And that’s what you’re hearing. I’m even reconciling those differences, and the cool stuff that I can look back on having done. But in the moment I thought, “Okay, well, this is great for now, but what’s going to be next?” I never knew what was going to be next for me, from graduate school, from being an undergrad. Every next step happened because the stars aligned, not because I planned it. And I think that was part of what I was also feeling in the moment, not really knowing what was next, but not really appreciating what was now. Steve: It seems like that’s part of how we grow up. I think many of us go on these crooked paths, and many of us are not told that that’s how it’s supposed to be. And so I totally relate to you feeling like, “Well, I must be doing something wrong, because what I have exposure to says this is not what to do.” And you can see other people “succeeding.” And at some point, hopefully for all of us, we realize, “Oh, this is actually the way it goes.” Jamika: That’s right. That’s right. And looking back, I’d have it no other way. But yeah, we’re not told in the moment: steady as you go, it’ll work out, and this is exactly the way it needs to be. Completely agree. Steve: So what happened after DARPA?
Jamika: What was great about that work is that not only did I get to participate in some really great, interesting work, I got to learn more about what DARPA is: high-risk, high-reward projects that shape our world. Like, Siri started off as a DARPA project. The internet started off as a DARPA project. And so these opportunities to have impact at a different level were new to me. I knew what the academic route was for understanding research and experimentation and experience, and the frameworks that govern those experiences and the knowledge of those experiences. I even knew what the applied world looks like, particularly in the context of social good and community service. But now I was starting to learn you could actually drive research and research programs in a way that can actually change lives at scale. That’s what I learned in the government work. And that set me on a couple of different trajectories, actually, which I think are the same, but it got me thinking about dual-track opportunities for myself. It was when I decided, you know what? I’m going to keep doing this research thing. I’m going to keep building and growing my skills and learning more, learning more about the community and all these things, especially through my work, my day job. Then I also thought, you know what? I think I want to be an entrepreneur too. I want to start my own business. I want to do more of this social good in the context of tech in a way that allows me to grow in different ways. And so that was when I continued to do research. I continued to be part of organizations that enabled me to use those skills. And I also started my own tech nonprofit, where I was able to apply my academic research background in doing research, but also in developing programs and experiences to build capacity, tech capacity in particular, for Black women and girls, through my organization called blackcomputeHER.
That’s a part of my world now, or at least it has been since the government work. And then I’m also doing research as part of my day job, and negotiating the two, because there’s the day job and then there’s the after-hours work, which is part of the work of an entrepreneur anyway. And so that was next for me: how might I take my learnings and go to yet another level that enables me to use my skills meaningfully and to explore what else was out there, perhaps in ways that my peers, or even others who’d gone before me, hadn’t done. So again, I’m thinking, okay, I’m going to give it a shot. I don’t know what it’s going to look like, but I’m going to go for it. And that was what was next for me, being an entrepreneurially minded researcher who even now is learning to blend the two in everything that I do. So that was what started for me as, okay, let me figure out what this entrepreneurship thing is and what’s possible. Steve: I’m struck by the fact that, you know, in describing your choice and your motivation to launch something like blackcomputeHER, you’ve mentioned a few times being interested in giving back and doing good for the world, but you identified one or one and a half other motivations, which I hadn’t really thought about. Like, why do people do things that are good for the world? And you said this is also a way to develop your own skills, to learn and keep growing as a person and as a professional and as a researcher. And then even the entrepreneur aspect of it, which I think maybe straddles the two. It’s interesting. I guess it wasn’t obvious until you said it, and maybe it should be obvious, that the motivations we have for phrases like giving back are so loaded in motherhood-and-apple-pie terms, but it’s not incompatible with our own selfish pursuits that we can do things that are good for the world but that we’re also hungry for, because of what we’re interested in, in our own development. Jamika: That’s right.
Both things can be true, right? And this was also at the time, and I think it’s less amplified these days, but, you know, it was a big deal eight, 10 years ago to have entrepreneurial thinking as a skillset for a major tech company, right? What does that mean? How do you bring big ideas to bear for the organizations in which you work? And so that was also on my mind at the time, as I’m thinking back and reflecting on what you’re saying here, because I thought, well, I think that way, I’m sure, because you have to think that way from an academic perspective. That’s what professors do. They are entrepreneurs of their own labs, of their own research groups. And it was just a different way of thinking about it. And while I did an adjunct role for a few years and I was able to write grants and secure funding for some of my nonprofit work, the nonprofit became the vehicle for more, right? More learning, more capacity building for others, more expansion of who I am as an individual, how I think and what I wanted to do and contribute in the world. And that doesn’t have to just be about where I work, right? I’m all the things. And I think that’s exactly a good way of thinking about it, Steve. You know, both things can be true; starting your own business or learning can merge in ways that you didn’t expect. But for me, I’ve learned, I think they were quite necessary for me to get to wherever I’m going next. And I’m not sure where that is yet, but, you know, it’s all a journey. Steve: When you started blackcomputeHER, what were some of the things that you were doing? Jamika: Well, when we started, our mission was, and still is, to provide professional development and capacity building in tech for Black women and girls. And the reason that matters is because Black women are among the least represented in tech careers.
And design isn’t so different if we think about, you know, where research often falls in our career spaces. So as a person who identifies as a Black woman and has experienced differentiated experiences as a Black woman in tech, I thought, well, what else can we do? And I have two cofounders who are also Black women, and we thought, well, what can we do to, if not change the conversation, at least create community? And we’ve been able to do both. So we’ve created community by starting out with a conference for Black women and girls and allies. I mean, everyone is invited. But we’re very clear that if we want to provide parity in the field, in the professional workforce, then we have to ensure that there’s parity of people who are in the workforce for the skills that they bring to bear. And so how do we start? Well, let’s figure out who the community is. So we started by having yearly conferences, where we grew from about 12 at our initial planning experience to over 200. We’ve had over 200 members in our meetings. And the value of the community is that we get to know each other. We can point to others who are also in the field. And it ensures and reminds us that we’re not alone, you know, in our growth and in our advancement, and in our pursuits of our careers in tech and design. And the other thing that it’s done is helped to change the conversation and change what we know about women, women of color in tech, and Black women in particular. And my work with blackcomputeHER also led to my being part of a National Academies committee where we focused on that very thing: how do we change the trajectories of women of color in tech, knowing that there isn’t equity, there isn’t parity, but there’s more that we can do to change the conversation. And so that, I think, has been a really great opportunity for me to keep learning, keep pushing, keep changing the conversation, especially through things like intersectionality.
How do we acknowledge that there are different experiences for different people, particularly given the different kinds of discrimination that they feel, but also the way that they identify? And how do we bring that level of awareness to the work that we do, to work that I do? And I can say, now that I’m at Capital One (if I go along that journey: after DARPA, I came to Capital One and have been here since), that if I can bring the blackcomputeHER lens to the Capital One work, I started our intersectional symposium, the first that we’ve ever had, where we talked about our experiences as people coming from different identities and from different dimensions of experience. It all matters. And that actually is a really great way not to separate us, but to actually connect us, because I can have more in common, you know, with a white guy, with you, Steve, than I do with other Black women, because of my experiences, because of my journey, because of the growth opportunities that I’ve had. And that doesn’t make me better, it just makes my experiences different. Steve: What have you seen people who have participated in the blackcomputeHER events go off and do? Jamika: One of our staple programs that we have through blackcomputeHER is our Fellows Program. And our Fellows Program is different from a traditional fellowship. We don’t provide funding, but we do provide an experience. We provide support with a cohort of those who apply. This is an application experience, so folks have to apply. We invite a cohort to participate over the course of the year, where they participate in webinars that feature subject matter experts across a range of tech areas, even executive coaching expertise and executives in the field. So we provide access to experts in a way that our fellows, who are early to mid-career professionals, don’t usually get. So we provide that level of exposure.
And I’ll say that as an example of a good news story, and there are a few of them, but a good news story that I have is about one of our first fellows from our first cohort — we’re moving into our seventh or eighth year now, so we’ve been doing this for a while. One of our fellows who joined our program was in transition. She didn’t really have a role in tech, but was interested in starting, and had done very well in a STEM career. Maybe it was chemistry or something. I don’t remember what it was. But she was interested in moving into a tech career and believed that to do that, she needed to go back to graduate school. And so she participated in our program for a year, and was exposed not just to our mentors, our experts, but also to other fellows in the program. And a couple years later, she sent me a note saying, hey, Jamika, I just wanted to share with you that I have applied to and been accepted into MIT’s program where I get to study more about tech and design experiences, which was really exciting to hear. And that’s exactly what we want to be able to do, to provide a space for women to know that there are more opportunities out there. And if we can provide an extra rung in that ladder, another step up to help them see and find those opportunities, we want to do that. Now, she got into MIT on her own. I’m not going to take credit for that. But I do think there’s value, as she’s communicated to us, and she’s come back and participated in other programs. There’s nothing like community and exposure and connection, especially when you’re one of a very small number of folks in a space. It can feel like you’re the only one ever. But that I think is a mistake for any of us, because it can keep us from doing, being, offering more. And that for me is exciting. We’ve got other experiences and stories like that, but I think that’s one of my favorites.
It hearkens back, for me, to our discussion about being uncertain on our paths because we’ve only been told this is what a path should look like. Steve: And I can imagine for this woman that you validated a path, or she took from that fellowship experience the confidence to do what she was already capable of. Jamika: I hope so. You know, the research tells us, especially in broadening participation and in areas where we’re trying to get even more women in the field, role models and cohorts matter. So again, that’s part of that academic training, knowing that there are frameworks that I’ve applied before in other training programs that I’ve created for young people to participate in. This is what helps us to grow community and capacity. And so applying that in this context is, again, a nice combination of what I know in this space and how I can apply it to create the change that I think we need in the world. And so it’s been a really great opportunity to grow, to fail, to have really great experiences, to celebrate, which I think is what entrepreneurship is, right? Like, would you agree? That’s part of the gig. And so it’s been fun. It’s been a journey, but I hope to keep learning. Steve: I hear both, you know, trying to support individuals, but also exploring how you can change the systemic aspects. I have a naive view of that, so my questions are going to be naive, but it’s clear how you help these individuals, and your example of the symposium at Capital One seems like that’s about the system. And I guess, is this true: the more Black women go into tech, the system does change based on who’s in it. And that, I think, naturally would change the opportunities for Black women in the future. Is that how this works? Jamika: Yeah, I think so. I mean, I’m one of those people who, I mean, systemic change is hard and it takes a long time.
And so I don’t profess that any one thing I do, or even any number of things that I do, can change the system at Capital One or otherwise. I do appreciate that we were able to have a conversation around intersectionality at a tech company that happens to be a bank and is also very highly regulated. That hadn’t happened before. And so in my mind, yeah, it was a way to talk about some real experiences for people who needed to be heard. And it wasn’t just Black women, right? I mean, all of us have experiences and all of us have stories. All of us have identities that make us who we are. Race and gender are the obvious choices, but they don’t stop there. Where did we grow up? What’s our educational background? How many languages do we speak? Are we living in this country, but maybe coming from another country, right? How many degrees do our parents have, right? There are all these dimensions of who we are, including how freely we are able to move in the world. That’s a level of identity that we often don’t think about. But all of these dimensions matter, and the same combination of dimensions is very rarely held by more than one person. There are so many parts of who we are that make that conversation around intersectionality so meaningful for everybody. It is not an exclusive term. It’s actually an inclusive term. And I appreciate that we’ve been able to have that conversation at Capital One, which I hope begins to empower others, or help others feel empowered, to have similar conversations. And I’d be very curious to know if any other organization is having that level of conversation, because I haven’t heard it. I’ve been invited to lead those conversations in other organizations, in universities as well, but it’s also not an easy conversation. So I think my approach, in recognizing that there is an opportunity for systemic change, is to know that the system can be fixed, but that isn’t necessarily my job.
I want to make sure that for those of us who are in the system, we feel supported, that we have what we need to succeed and persist. And that when I have an opportunity as Jamika to lend my own experiences and expertise, in an effort like through the National Academies of Science and Engineering, which is a pretty big organization, then I can also share: well, here’s what else we can do. Here’s what we’ve learned from blackcomputeHER, for example, that really helps our community persist. Or here’s what I’m learning that we’ve discussed in an organization that is not all Black women, but where we see the value of hearing our voices. I think finding ways to, again, be myself in both worlds is what I strive to do, because again, I’m more than the work that I do. I’m Jamika. I happen to be a lot of different things and do a lot of different things and have a lot of different hats that I wear. So I would love to hear other people talk about themselves in that way too, which I think goes back to the user experience. Who are we? That’s really what I think my role as a researcher is: how do we help people to not just tell us who they are, but to be comfortable being who they are, wherever they are? That for me is pretty cool. There’s context in the middle of all that, but at the end of the day, I want to know the people that I’m serving. There are lots of different ways to do that and to do it meaningfully. Steve: I guess to say yes-and to that: yes, we can do a great job at helping people to tell us who they are, but also sharing that information in a way so that people who are otherwise reduced to artifacts on a wiki somewhere, or marketing documents, or trace logs through a tool, are more fully presented, or richly presented, and in a way that looks more like how they’d want to be presented. Jamika: That’s right. That’s right. More of that, please. Yeah. I don’t think that’s always easy or obvious, right?
Sometimes even in recruiting participants or thinking about who might give us the feedback. It’s as simple as ensuring that, on the one hand, our participant pool is diverse, but how are we ensuring that even for those who might be considered the mainstream audience, we’re not pigeonholing them in their experiences either? How are we soliciting the kind of feedback that truly is inclusive, not just through a diverse pool of participants? I think, again, it’s a different way of thinking about user experience and enabling that extra layer of support for the people that we serve. Steve: Since we’re talking about Capital One a little bit, could you just maybe give a little context to what you work on and say a little more about Capital One and your work there? Jamika: Yeah. I am head of research for data and AI, and that really means that the researchers on my team and I are responsible for supporting data-rich and data-inspired products and experiences. And we’re supporting, and beginning to support, AI and machine learning based experiences as well for our customers. Steve: And is that part of a business area or a product area at Capital One? Jamika: Yeah. In particular, Capital One is a highly regulated banking tech company. It has lines of business that it supports, card and bank and such, but it also has enterprise-level support, and the enterprise support really enables all functions of the company, including card and bank. And so my role is in the enterprise area of work, where I in particular am really interested in ways to scale our efforts and to apply our learnings in ways that not only help our customers externally, but support our customers internally, which can be associates, right? Or me or others in the organization. So that’s kind of where I’m situated and the kinds of problems that I’m solving. Steve: When you think about not just Capital One, but the profession that we’re in writ large, and I know you have lots of exposure to researchers from all sorts of different organizations through your own career.
In this world of, I’m going to call it, modern-day corporate research, what do you think doing good looks like in our field right now? Jamika: Doing good. I acknowledge it probably means different things to different people. For me, as a researcher, and as someone who wants to do good in the world, like I just do, it means understanding people’s needs in context and providing opportunities for them to succeed. That’s what that means for me. Success can mean different things to different people. I can guess what success means from a business perspective. I can even guess what success means from a researcher perspective, but ultimately it’s that end user who tells us whether or not we got it right. I want that person to feel, as a user, an end user, free to share with us when we got it wrong, but also when we got it right. Building the right kinds of relationships and the right kinds of processes that support that are really important. Relationships between whom? Well, I think there are lots in the process, but certainly the relationships with our end users. Again, who is our customer? Who are we serving? That requires a level of engagement and relationship that I think allows us to really deeply understand their needs and to support their successes, whatever that looks like. What are they trying to do? What activities or tasks or jobs are they trying to get done? That’s the context that defines what success looks like. I’m very interested in that, and I think doing good means supporting those tasks in context in a way that helps them feel that they can determine what success is and attain that success in context. Steve: This is going to be a slight non sequitur. I just want to go way back to something you said early on, when you were talking about yourself as a child. I think you said you were always a researcher and that you always asked questions, but I wonder, is there more? Do you have a childhood story about a canonical kind of Jamika thing that was like you as a researcher?
Jamika: I know I always asked why, which isn’t so unusual for kids. I remember being insatiable in my questioning. In a way where I remember, and maybe it’s just my mom getting exhausted, but I remember I’d asked a question. I took dance as a kid. Before you really learn proper form, you do a lot of kicks and twirls. I was a little girl; that’s what I did. I remember once I was asked to be part of a choreography for an organization or church or something. I don’t know what it was. I had asked my mom, who was helping to choreograph. She wasn’t a dancer per se, but she supported me and my siblings and was always there. I remember thinking, “Mom, what if we do
01:13:12
47. Akshay Verma of Duolingo
Episode in Dollars to Donuts
This episode of Dollars to Donuts features my interview with Akshay Verma, the head of Research at Duolingo. We talk about being qualitative focused in an experimentation-driven organization, research team structures, team rituals, and sharing knowledge between researchers. I don’t actually want to bemoan or belabor this concept of a room that we’re invited to or not. At Duolingo, I feel it pretty acutely just because we do have a lot of rituals and traditions at Duolingo around how product gets built. And it’s great. It works. It works really well. But, you know, I could spend a lot of time and energy going crazy, being like, “How do I get invited to these rooms?” and then get upset when it doesn’t happen. I actually don’t. I try my best, but I think our energy is probably spent elsewhere. – Akshay Verma Show Links Akshay on LinkedIn Duolingo How Duolingo is using its ‘unhinged content’ with Duo the Owl to make people laugh on TikTok FigJam IDEO Stanford d.school Calendly How the Underground Dance Music Scene Makes Me a Better Researcher Get Familiar With Detroit Techno: 10 Essential Songs Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. I’m Steve Portigal. The other day I debuted a new client workshop on “Boosting Research Impact.” We did this remotely, with me in California and the enthusiastic participants in Europe, who stayed late.
One person used the excellent simile “It’s like I’m being asked to solve global warming” – in this case they were referring to the kinds of research requests they get, but it highlighted for me how the team did such a great job at not spiraling into higher-level, structural, and less easily solved challenges around having impact on stakeholders, but stayed within the framework I gave them, which was to consider specific, practical steps they could take before research, during research, and after research. There’s that cliché that asks, how do you eat an elephant? And the answer is one bite at a time. These kinds of frameworks turn an elephantine problem like having impact with research into something more manageable. For this team, it worked very well to dedicate time and space for collaborating creatively in order to change the approach. We also looked at what’s already going well, and how to extend or expand that. In just a couple of hours they generated a number of new changes that they can prioritize and build from ideas into actual programs and practices. I’d love to talk with you about the challenges your team is facing and how I can help. Okay, let’s get to today’s episode with Akshay Verma. He’s the Head of Experience Research at Duolingo. Steve: Akshay, thanks so much for coming on Dollars to Donuts. Akshay Verma: Thank you for having me, Steve. Steve: I really appreciate getting to speak with you. Akshay: It’s a total joy to be here. Steve: So you are the head of research at Duolingo, is that right? Akshay: That is correct. Everyone’s favorite green owl and unhinged TikTok presence. Yes, that is true. Steve: I want to know about those symbols a little bit too, but what is Duolingo? Akshay: Duolingo is the world’s most popular language learning app. We recently expanded beyond language as well. We’re rebranding ourselves as a learning app. It’s also a bit of a video game. I think we really lean into the fact that it’s a really gamified experience.
It’s a really fun, gamified way to learn a language, or now you can actually learn math and music on Duolingo as well.

Steve: And the owl. I think I know about the owl, but explain the owl.

Akshay: The owl has sort of had a life of its own in the last couple of years and really made a wave on the internet. But yes, our icon, our mascot is a green owl. And somewhere about two to three years ago, our lovely social media manager, Zaria, started experimenting with TikTok and just had the owl take on a life of its own with a lot of really funny, really unhinged TikToks. And so now the owl is kind of like a celebrity. And actually there are videos of the owl sometimes. They’ll be recording a TikTok in Madison Square Park or whatever, somewhere in New York. And it’s like a celebrity. Like people will come up to Duolingo and take photos with him. So he’s become a cultural icon, but he is sort of the shepherd. He shepherds language learning and learning through the app.

Steve: And his name is Duo.

Akshay: His name is Duo, that’s correct.

Steve: What does research look like at Duolingo right now?

Akshay: Actually, we’ve been in a period of flux and change over the last year. I think a really exciting one, but a period of flux nonetheless. So research right now at Duolingo looks like this. And I’ll tell you a bit about what it used to look like about a year ago to also provide a point of contrast. So we actually, about a year ago, moved from being in our product org to being in our design org. And I’ll actually contextualize this a bit further. Duolingo is a pretty small company. We’re about, I would say, 700, 750 employees total right now, which is fairly small. I used to work at Spotify, which I believe at the time I was there was closer to 7,000 to 8,000, for comparison. And our product and design teams are around, let’s say, 150 people total. And there’s eight people on the research team right now. So actually a fairly good ratio, all things considered.
But research at Duolingo right now looks like we’re part of the design org. We’re still settling in, in a way. Change takes time. And so it’s been a year, but we’re still settling into being in the design org. We’re a fairly qualitative team, slowly moving towards being mixed methods. We do a lot of longitudinal research. We exist in a sea of experimentation. And so we have to think a lot about how we coexist in a company that has a lot of A/B tests. And yeah, we’re just a really lovely, lovely team of people who are passionate about languages and about learning and about gamification. And we each kind of have our specialty and flavor that we bring to the team.

Steve: You said that your team shifted from reporting to product to reporting to design. What led to that?

Akshay: Yeah, a few things led to that. We had a bit of internal restructuring with certain individuals and people leaving and people joining. And it was a really good moment for us to reflect on where research should sit. And so part of it was just riding the wave of other organizational restructuring that was happening. And then there was this question around the industry standard alignment that research typically has, which is with design, and the type of impact the team was having, at least about a year ago and leading up to that moment, was probably more closely aligned with design. And so we kind of took that moment to try that out. And so I think it was partly just riding the wave of other moves and reorgs that were happening and then trying to think about, industry standard wise, where should we be? I have a lot of thoughts on reporting to product versus design just at large. I’ve done both before. There are pros and cons to both. I’ve reported to an insights function before, one with multiple insights disciplines, and that’s worked really well too. So I have some thoughts, but I think at Duolingo, it works really well for us to be in the design org.

Steve: You teased us with you have thoughts.
I guess I want to hear what those are.

Akshay: Yeah, let me give you some context about the teams that I’ve been a part of over the last eight to 10 years. So I used to be on the lovely research team at LinkedIn, where we reported into design and design actually reported into product. And so I’ll come back to that in a second, but I sort of grew up, so to speak, in the heyday of our field in that environment of being part of design and then sort of by extension being part of product. I then moved over to Spotify, which I joined back in like 2016, 2017. It was really pioneering in tech in terms of where research could fit in an organization. It was one of the first companies that brought together data science and research, and then eventually other functions like analytics, engineering, and other related insights disciplines into one insights org. They’ve iterated on that model a little bit, but I was at Spotify in that heyday where my teammates were researchers and data scientists, and I would lead work streams and teams across both disciplines. I then moved over to this company called Gong.io, which is this sales product for, it’s an AI product for salespeople. And there I actually reported to our head of product. And so I was very, very closely intertwined with the product org. And now at Duolingo, as I mentioned, I joined when we were part of the product org and we have since moved over to design. So I’ve sort of seen a bit of the design thing, the product thing, the insights thing. And yeah, as you mentioned, not to generalize, but there’s definitely trade-offs in each one of those. And I think, not to give the researcher “it depends” answer, but it does. It depends on the context of an org and it depends on a few things. It depends on what the product culture is. Is it a very experimentation driven company like Duolingo or Spotify, or maybe not so much, like LinkedIn?
Again, LinkedIn had a ton of experimentation as well, but I at least worked in parts of the company that were a lot more strategic and forward thinking and about new product building. So I think that really impacted the role that research could play. It also depends on a lot of organizational rituals, right? Like at Duolingo, we have a lot of rituals. We have product reviews that certain people attend. We have different permutations of leads meetings. There’s a lot of stuff that’s organization specific. And so as a result, it depends where you are, but if I were to generalize, I love being part of product. I actually love being part of product because I think it gives us a closer through line to actually make product impact. And I always say some of the best researchers are product managers who are secretly researchers. And so I think being part of product really helps with that. I felt that definitely at Gong. I was in product leadership meetings. I was working really closely with my product managers, and those were my key stakeholders. But I also like being part of design, because if a company, and Duolingo is a good example of this, if a company really wants us to make very visible, tangible impact on how a product looks and feels and functions and how features are designed, being part of design really, really helps with that. And so I guess this is my jumbled way of saying it depends, and there are pros and cons to both, but I personally love being part of product. And then I didn’t mention the insights disciplines. I loved that. Long-term, I don’t think that works as well. I don’t think being your own discipline that doesn’t ladder up to product or design is actually that effective.

Steve: Why is that?

Akshay: You know, I found that at Spotify, it was interesting because we had so many insights professionals. We had like 150 researchers, probably 400, 500 data scientists, which is crazy.
We just had so much visibility, which was really interesting. But I found that since we didn’t report into product and we didn’t report into design, we didn’t have the same level of influence, because our set of stakeholders were each other, which was great. But then I think we got lost in some of those rooms. We got lost in those leadership meetings with the rest of our discipline, because the rest of our discipline was just other insights folks. And so I think that was a big downside. I’m trying now to place myself many years back and think about the things I used to feel, but the lingering impression, I think, that I left Spotify with was, again, love Spotify, a huge place in my heart for Spotify, I just think it was tricky to have impact as research, as researchers specifically, with being so disconnected from the other functions.

Steve: How does reporting structure shape the way you’re able to influence people?

Akshay: You know, part of it is very literal. Like I said, even just things like the meetings you’re a part of week over week, which permutations of team meetings or leads meetings or leadership forums are you invited to? Part of it is literally just that. It’s like if you report into X discipline, you will go to X discipline’s set of meetings. And so I think that really shapes influence. Being part of a company, especially bigger companies that have lots of people, where a lot of our job is influence and a lot of our job is fuzzy, and it’s the people you meet and it’s the people you can have impact with, it’s truly just, which rooms are you a part of and which rooms are you not? And frankly, reporting structures, as silly as it sounds, have a huge role in that. I think the other thing that’s a little bit more amorphous is the incentive structure that comes with it, right?
The incentive structure for someone who’s in, and I’m no longer explaining to someone who’s three by using phrases like incentive structure, but you know, what are you incentivized for by being in a product org? It is shipping great product, making sure it moves metrics and continuing to do that, right? What are you incentivized for if you’re in a design org? Probably something similar, but from a different lens. And so I think that really impacts our influence as research as well. Ultimately, who is deciding what success is? It is the person who’s leading the org you’re in, and whether that’s product or design, that’s going to impact that. So I think those are the two things that come to mind, one being very literal and one being a little bit more around what are you ultimately incentivized for, or with, rather? What is research as a discipline incentivized for, and what does our success look like as individuals and as teams as a result? Yeah. I mean, I think it’s possible, but again, it’s so dependent on the person and it’s so dependent on the rituals that different teams have. And so I do think that’s possible. And I think that’s the classic conversation around having a seat at the table or being invited to the table. The things we often hear about are almost like tropes in our field: researchers are invited and we have a seat, or we’re not invited. How do we get a seat at the table? And sometimes it’s literally that. It’s like, which org do you report into, and as a result, are you invited or not? Of course, there’s always room for change. Of course, there’s always room for influence, but you’d be surprised. You put people together to accomplish certain goals, and we exist in the structures that we are a part of in a company. And that typically ends up being the function that we ultimately represent or report up to or whatever it may be. So yes and no, I do think there’s opportunity.
And I’ve seen great examples of that at Duolingo, for example, because we have a bit more of a blank slate to work with, to be honest, because the research team is quite nascent. And so I’ve been able to experiment with inviting us to certain, again, rooms or meetings or whatever we want to call them, that we haven’t been part of before, to varying degrees of success. It’s not always been a slam dunk. And that’s not always the case, especially as you get more complex and get to the sizes of a LinkedIn or Spotify, which are like tens of thousands of people, and there’s so much chaos and complexity as a result.

Steve: In a company the size of Duolingo, when someone like you invites themselves to different kinds of meetings, like you said, varying degrees of success, what does a slam dunk look like and what does a failure look like?

Akshay: I’ll share a recent example with someone from my team. One of the researchers on my team, it’s an area, you know, at Duolingo, we have three main areas, similar to most companies, right? We have a learning area that thinks about all the learning content of Duolingo, so all the actual learning related material that the app offers. We have a growth team, which is pretty structured, which is similar to a growth team that you might imagine, which owns all the gamification mechanics, our DAUs, our daily active users, our top line growth metrics that we look at as a company. And then we have the monetization area, which thinks about our subscription, so the ways that Duolingo makes money, everything from the purchase flow to what value we offer users from our subscription, et cetera. So that was just a bit of context, but one of the researchers in my learning area, and this is a slam dunk example, over the years has built great relationships with leads. We typically haven’t been involved in, again, certain permutations of leads meetings, just because research historically has just never been invited.
And recently he mentioned, you know, he was part of a conversation with that group and they were talking about certain product ideas. And the team was pretty amenable to the ideas until said researcher was like, “Hey, actually we have years of data and research around this, and my perspective and intuition is this maybe isn’t the best idea, and we should maybe think of this in a slightly different way.” And the team completely agreed and decided, actually, let’s pause on this new thing that we were exploring. And that to me was a slam dunk of, hey, we’re not typically in this room, we kind of are testing the waters and inviting ourselves in, and as a result, we can shut things down before. And I don’t even want to use the phrase shut things down, because it’s more like, hey, we can actually redirect the team, because we have that knowledge and we have an archive of knowledge we have built over the years as a research team. That to me is a great example of, right, incentive structures, and how do we play a role in these situations? And then let’s think about, and I’m blanking a bit on your original question now.

Steve: You know, you said it could go well or it could go not well. There’s degrees of success.

Akshay: Yeah.

Steve: And so I was curious what going well looks like and maybe what going not well looks like.

Akshay: Yeah, I think examples of not going well. I think a lot of researchers will be able to relate to this, but this is, for better or for worse, something we experience quite often. Duolingo is a very experimentation driven company. That’s the bread and butter of Duolingo: we do a ton of experiments. And oftentimes we are involved in maybe the early parts of planning an experiment that we might run, which can sometimes be pretty laborious. That could take weeks and months of prep and experiment hygiene and analysis.
And there’s been so many times me or my team have been part of those conversations and said, this is not a great idea. And we do it anyway. And then a couple of months happen and what we kind of expected happens. And I’m not here to say, I’m going to make sure that never happens, because that’s impossible. Sometimes you have to let people do what they’re going to do. But that’s a clear example of, we are not involved early enough, or we’re not involved, period. And as a result, we will catch wind of something that we spent two, three months doing. And then a researcher on my team in a one-on-one will be like, had they shown me this, I would have told them exactly this three months ago. And that’s the downside of the “had they shown me this,” right? Things just fall through the cracks if you’re not part of those conversations. And yeah, so I think that’s an example that comes to mind of something I hear a lot. And I know a lot of researchers will be able to commiserate, feeling like we’re not involved enough, or early enough, or with the right depth, or at the right point until it’s too late. Or we never hear of things, and then we go, yeah, we can point to four different studies we’ve done that would have probably shaped how you’re thinking about this, had you told us.

Steve: I was reflecting back a couple things that you’re saying. One is being in the room, is having that information…

Akshay: Exactly.

Steve: …is being aware of it. That seems like that’s the base. And then from that, you need to be able to, for lack of a better way of putting it, be listened to. And so in your slam dunk example…

Akshay: Exactly.

Steve: …that was a new sort of room to be in. That was new. But I think you’re also saying that the researcher in question had built relationships. Even if they hadn’t been in that room, they had relationships.

Akshay: That’s exactly right.

Steve: Coming to that room is not brand new. Everybody, they knew each other.
And then when they made a recommendation or a suggestion, that was taken up.

Akshay: And you know, it’s interesting, just a meta reflection on this. I don’t actually want to bemoan or belabor this concept of a room that we’re invited to or not. I think all of this is very true. And actually at Duolingo, I feel it pretty acutely just because we do have a lot of rituals and traditions at Duolingo around how product gets built. And it’s great. It works. It works really well. But, you know, I could spend a lot of time and energy going crazy, being like, how do I get invited to these rooms, and then get upset when it doesn’t happen? I actually don’t. I try my best, but I think our energy is probably spent elsewhere. You know, and again, this is sort of an existential reflection on it, but I don’t know if this metaphorical room is necessarily the biggest issue right now for my team or anything, but I do think it’s an important one, and it connects back to our conversation around reporting structures and all of the company context that ultimately impacts these metaphorical rooms that we are a part of or not part of. But more to say, it’s funny talking about it, because I’ve intentionally reflected to myself about how I don’t want to belabor this, especially as a leader of a team, because that’s a recipe for a constant downward spiral, if we as a team, or me as the leader of this team, are thinking a lot about just this idea of these rooms and the rooms we’re not a part of. Of course, I do think about it, but I catch myself being like, okay, no, no, no. How do I make the best with what I have and slowly, in part, change over time?

Steve: And then what does that have to do with the rooms that you are in?

Akshay: Exactly. And over time, how do we focus our energy on that? And part of it, again, goes back to reporting structure, right? If we reported into product, you know, I would be in rooms that I’m not in now.
That would be great. I would love that. But the downside is I wouldn’t be in a lot of rooms I’m in now, being, you know, a design leader. So, you know, trade-offs there, but it’s a very literal thing that impacts the rooms you’re in or not.

Steve: And I think you’re saying that it’s healthier to focus on what relationships and what rooms. You and I are never going to say rooms again after this conversation. We’re going to be so sick of that word. The teams and the meetings where you are there…

Akshay: Yeah.

Steve: …and you’re trying to add value, it’s healthier to focus on what you do have access to now…

Akshay: Yeah. Yeah.

Steve: …and where you are working versus, gee, where you could be working.

Akshay: Yeah, exactly. And I, you know, I say that to a lot of people on my team, or just a lot of other researchers, too, because we are a fuzzy discipline. No matter how much, you know, I think it’s important to continually have the impact conversation and to be evolving our baseline of what good impact looks like and be able to measure it as tangibly as we can, but ultimately we are a fuzzy discipline. And so it’s really easy to get stuck in these spirals about, you know, what rooms we’re going to be a part of and what rooms we’re not. And yeah, you’re right. I’m never going to say rooms again after this conversation.

Steve: Sorry, everybody.

Akshay: But I always tell people, we have to make the best of what we have, but also recognize that, yeah, we can still push for change. And that’s where I come in, and that’s where, you know, we all play a role, but that’s where I do come in. But my method is slow and steady, and seeing over time what works and what doesn’t, and not letting myself burn out over the context of the company that I exist in. Right. Yeah. I think about it a lot.

Steve: You said a couple of times that, you know, you’re a qualitative team in an experimentation-oriented company.
And even in the last things that you were just talking about, you said, “This is fuzzy work, but we have to measure impact.” How do you think about what impact looks like for the fuzzy work that you do in a company whose culture and thought process is oriented towards experimentation, A/B testing, and so on?

Akshay: It’s probably the number one thing I think about. And it’s true. I think one of the joys of Duolingo is that it is very experimentation driven. And I say joy because to me it is a good thing. I could probably be like, oh, my team, it’s so hard to be a researcher because we’re so experiment driven at Duolingo. But I don’t think that’s the case. I actually think there’s a lot of work we don’t have to do, because we’re going to do the experiment anyway. And I love that, because that frees up my team to work on things that aren’t so focused on experimentation. However, the other side of the coin is that it’s hard to show impact, right? If the impact is, okay, what experiment did your research specifically lead to, and what were the metrics that it moved? A lot of my job has actually been changing our mindset, starting with my team, then moving beyond that to our design org, and ultimately to our product team and to the company: yes, that’s one form of research impact, but that is by far and away not the biggest form of impact. And so how I think about impact is really three things right now. For the first, let me add some context. We recently restructured the team a little bit, where we have three key horizontal audience level work streams that map onto Duolingo’s three-year company strategies.
So I sort of did this intentionally about three, four months ago, where I was like, I think we just need something much more horizontal that ties very directly to our company goals. I worked directly with our exec team, shopped it around with other function leaders, and made sure everyone was, A, on board and, B, felt excited about this. So as a result of that, the impact I see there is really long-term. It’s, how do we actually shape our company’s thinking around these key goals by studying the audiences that they pertain to? And without talking specifically about what these goals are, a lot of them are audience focused. They’re about specific segments of our audience base, and essentially not so much just around growth, but around really creating value for that segment. And so that to me was, that’s what I need to do, because guess what my team’s really good at? Studying audiences. So as a result of that, that’s the first and the most amorphous form of impact, the one that’s going to take the most time, but the one I’m being really intentional about quarter over quarter, making sure, hey, we have these three-year markers, five-year markers for the company, and I want to be part of that conversation over the next three years. And it’s not all that my team is doing, but a good chunk of my researchers are now working on these work streams that align to those goals, and the work streams are audience understanding. So that’s the first level of impact. I would say the second level of impact is beyond that. I still have at least one researcher working on each of those areas that I had mentioned, you know, the three big areas. And then we actually have some other areas of Duolingo as well, making sure that we have at least a person supporting the month over month, quarter over quarter needs that are very specific to these sub teams.
And so that form of impact is still pretty strategic, and it’s still pretty high level, but it’s about shaping the features that the team might be working on, and iterating on those features. An example for growth might be thinking about our gamification mechanics: short, medium, long-term, what does that journey look like? What else do we experiment with? How do we actually implement it? In our monetization team, it’s thinking about our subscription offerings and the tiers of our subscription. So, making sure that we’re helping the team in those conversations at the area level. That to me is the second level of impact, at the area level, making sure we’re impacting how we’re thinking about feature development. And then frankly, the third, maybe the third and a sneaky fourth, is what I mentioned: the experiments and the very tactical things. And that’s intentionally the third thing and not the first thing, because I still want that to be part of our portfolio. I still want to be able to say every quarter, you know, we impacted these X number of experiments, either because our research directly led to them, or, you know, we played a role in them in some way, and of these, these are some metric wins that we had. But I want that to be a pretty small share of our portfolio, because I don’t want to over-index on that. That’s also a bit of a losing game, I find, if you get too bogged down as a research team on which metrics did you move. That’s the most obvious way to show your impact, right? But I think it’s the hardest as well. And that’s where you get into the game of working backwards and some shoddy calculations around, well, technically I was a part of this conversation, and therefore I moved this metric.
And that’s where I encourage more of, hey, this is a team sport. We should be a part of these conversations, but I don’t want to get, quote unquote, territorial about what we impacted and how much that moved metrics. And then, relatedly, very tactical improvements. That’s sort of the bread and butter of usability research. We don’t do a lot of it as a team, but I have two lovely full-time contractors who basically help keep our lights on. And so every week they have a full slate of features and concepts they’re testing, usually very tactical. The teams find a lot of value in it. I think it’s great. At the end of every quarter, similar to my experiment report, I can say, you know, we tested 13 features across the company and we made these very specific design changes, and these have been implemented or these were put into an experiment. And so that’s my third and maybe fourth category: the tactical research and the experimentation impact. So that’s how I think about impact right now. Company and strategy level, that’s the longest term, with specific work streams that align to that. Feature level, again, helping teams dream up new features or iterate on existing features and helping unblock them at the feature area level. And then experimentation and tactical impact.

Steve: For each of these three plus a sneaky fourth level of having impact, I’m wondering how and who and when, like, the decisions about what you’re going to spend time on doing. Yeah, could we talk through each of those and maybe hear how those decisions get made?

Akshay: So basically, how do I staff my team to ensure that we’re covering those different layers of impact?

Steve: Yeah, and I guess the leading question part of it is, I don’t know, I have this thing about proactive versus reactive that I want to impose as a presumption in that. So, yeah, who’s working on it and where does the need for them to work on it emerge from?
Akshay: I’ll start with the first. That to me is new terrain for us, but to me it’s really obvious. You know, we have these company goals that we talk about constantly, our three to five year strategy as a company. And so those to me are, okay, if we have this North Star as a company, I need to have work streams that align to each one of those. And so the need sort of arises at the top, right? These are our North Star metrics. We constantly refer to them. We talk about them in all hands. You know, we have documentation around them. And so when I actually presented this to senior leadership a couple of months ago, that’s how I framed it. I’m like, hey, I’m trying something new. These are these new work streams, and actually they are all very, very neatly mapped onto the specific goals we have as a company over the next three to five years. And so the need sort of arose in that I identified that and I was like, we need to do that, because as I mentioned, a lot of these, actually all of them, are audience focused. They’re about a specific subset of our audience. And so as a result, about half my team is staffed on those. And here’s the thing. It’s not like these are in isolation, right? Sure, these ladder up to big company goals, but actually, in the medium to long term, they still will have impact, I mean, they have to have impact, for specific teams as well, because that’s how we get to these goals, right? Teams have to act against these things. And so it’s sort of this twofer of, you know, I want to help shape our company vision and strategy long term, but in the short and medium term, these will also serve area level goals. And so that’s where the need arose, so to speak. And that’s how I’m thinking about staffing them, in that about half my team is working on work streams that align to those goals.
And then the other, you know, second category: I think everyone, on top of what they’re doing, is still more or less aligned to an area. That’s the area that they represent. That’s the area that they have some context in. We’re experimenting with some new staffing models, and I now have a research pod of a few researchers who’ll look across two areas, which we can dig into if you’re interested. But nonetheless, I think for the most part, people are aligned to areas. But that doesn’t mean they’ll be working on things that quarter for that area. A great example: I have a researcher on my team who’s leading a work stream that has a lot of buy-in from our CEO. We actually meet with him regularly just to keep him updated on what we’re learning and what the next steps are. As a result, she can’t work on monetization, which is the area that she supports, but that’s okay, because I made sure that anything that’s absolutely critical, we’ll find a way to resource. But I also make sure to set expectations, like, hey, this quarter X person is actually busy working on this other work stream. So I have to play this game of trade-offs, and as a result, this game of working with stakeholders around the expectations around resourcing for the quarter or for the half or whatever it might be. Again, we have to work with a finite set of resources. So I do the best that I can to make everyone happy. Not always possible, but you have to do what you have to do. And then the third, that’s the stuff I’ve tried to offload for my team. That’s where we leverage our contractors, who work full time. And then the experimentation is like an added bonus. A lot of things can lead to experiment ideas, so we as a team just keep a pulse on that. And so that’s sort of a byproduct of all the work that we’re doing. When possible and appropriate, we try to move it to experiment ideas that we can implement.
Steve: I’m thinking back to your use of the term rituals to describe ways that organizations work together. And so within research, what rituals or practices do you have? You’re talking about sharing information and keeping track of the pulse of things. How do you all work together? Akhay: Actually, that’s very, very top of mind for me. We’ve been doing a bit of a ritual reset that I’ve been really excited about. One thing that comes to mind: a few months ago, I was sort of observing. We have a very tight knit team. We work really closely together. As a research team, we have our weekly meeting where we keep in touch and chat about what we’re working on and commiserate over things and get to know each other. But I realized there was so much up-leveling that we were missing, because we as a team weren’t looking at each other’s work nearly as often as we should be. And so I had a bit of a mandate and I said, hey, if anything you are doing is going to get, and this is subjective, more than a few eyeballs on it beyond our team, and you’ll be the judge of how you want to define that, I want to make sure that our entire team does a round of crit for it. So if you’re sending out an email at the end of a project that the whole company reads, which we do at the end of every project (we send out a TLDR email to a list that has pretty much the whole company), I want to make sure that our entire team has a checkpoint to give feedback on it before we send it out. That’s been amazing. I’ve gotten so much great feedback about the process, just because it’s so nice to be able to connect the dots. And we were starting to miss that a little bit up until a few months ago. That’s a ritual I love. And that’s also true for things like a share-out report: absolutely multiple checkpoints where the whole team looks at it.
And I’ve just seen so much work over the last few months be up-leveled. We’re able to connect the dots more, have more of a point of view. It’s just the nature of it: the more smart people who can look at what you’re producing and give you feedback on it, the better your work will be. So I think that’s a really lovely ritual that we have. Another thing I try to do every quarter is these sort of zoom-out weeks, typically to start or end a quarter, where we actually get together and talk about these bigger provocations that we as a team want to pose for the company or for the product team, with as wide an audience as possible. And that’s also an exercise in connecting the dots. We had a really fruitful discussion recently where we dug into the archives of our team’s insights, identified examples of great insights and not-so-great insights, dissected why something was great and why something was not, and then actually came up with principles of what we as a team think a great insight is. And I can’t recite them off the top of my head, and actually I’m not even going to try, because they were lovely and great and I probably won’t do them justice. But we as a team now have a shared definition of what a great insight is. And that can be an anchor, almost a bit of a rubric. It’s not so much a rubric, but a bit of an anchor as we frame our own insights. Would it stand the test of: hey, we defined that this is a great insight, is this true? So I love to do those zoom-out days, just to dig back into the archive of our own team and think ahead. I have one coming up next month that I’m thinking about, and it’s going to be around impact. People think a lot about individual impact, so I think I’ll have everyone write an impact statement for themselves as individual researchers. Then zoom out one level.
What is your impact on the team or work stream that you have been a part of? Then we’ll zoom out another level. What is your impact in the broader team that you’re a part of? Then we’ll zoom out another level. What is our impact as a team? Things like that. I love rituals like that, because I think it’s so important to take a step back, zoom out, and also think about our past. I keep using this phrase of the context that we exist in, but I think it’s really true. So much happens in any given month at a company. People leave, people join teams. We have reorgs. We move to a new discipline. So I always want to make sure, let’s take stock of that, and let’s make sure that we show up the best that we can in all of that. That’s my intention with those rituals: helping us zoom out. What else? And then we’ve been in a bit of flux with joining our design team’s rituals, because we just moved orgs, like I said. So that’s been really fun for the team, having a new set of stakeholders that we are building relationships and rituals with. But I think that’s the stuff that’s really important, and that’s the stuff I love to think about. Steve: When you do those zoom-outs, or that example of what’s a good insight, how are you able to synthesize and align on that? I ask that question with some bias, in that I don’t take it for granted that everyone’s going to talk about what impact is, or everyone’s going to talk about what quality is, and that you’re going to end up somewhere that’s not a giant mess on the wall. How are you collectively able to get somewhere with all this potentially personal and individual judgment, these disparate takes on things? Akhay: You know, frankly, it is a bit of a mess on the wall, and that’s the thing we have to be comfortable with. But pretty tactically, it’s just leaning into our skills as workshop leaders and strategists.
So I do a lot of, you know, we’ll have a pretty collaborative FigJam board with these open-ended questions and provocations. I’ll continuously remind the team, this is not meant to be a judging exercise; no one gets a prize for having the most entries in the great-insights category. And so it’s a lot of intentionality around it. And I think this is a great thing about researchers: our job is to make sense of mess. And so we do make a bit of a mess, but then I give us enough time to distill that mess into whatever it might be. A principle of what makes a great insight, or a set of three really impactful nuggets that we want to share with the whole company. That was a recent brainstorm that we did. And so it’s a lot of, honestly, facilitation skills, and more tactically, using really, really intentionally designed FigJam boards. Steve: Maybe you could take a fictitious or actual example and just walk through some of the touch points in the process. I’m curious, with the number of people and the sort of artifacts being produced, how do you work through group feedback? What kind of feedback are people getting, and how do you structure that interaction to make it feasible and valuable? Akhay: You know, right now it’s actually fairly unstructured, which I like. I’m all for structure where structure is needed, but this has been fairly organic, and it’s been working really well. But tactically, what that looks like is, like I said, I have a fairly loose but still pretty clear definition of: if you’re going to get eyeballs on it beyond this team, let’s go through a team crit, because I want to make sure that we as a team are showing up in the best way that we can, and also learning from each other. So as a result, it’s pretty packed. We basically added an extra calendar slot every week that’s just for crits. And so we now have twice-a-week crit slots for just the research team.
And there’s only eight of us, and almost every week, all of those slots get filled up. So we’ll have anywhere from three to five people presenting different types of projects. Like I said, everything from a final report to a discussion guide to a set of survey questions. And we keep it pretty unstructured, but if we’re being really, really granular, what that looks like is we’ll typically have 20 to 30 minutes. The person will start with: here’s what this is, and this is the type of feedback that I want. Sometimes it’s literally like, I’m about to send this email out, can you just gut-check spelling, grammar, and framing for me? Love that. Guess what? With eight smart people in a room, it ends up never being just that, because we end up getting into all sorts of other stuff as well, which is great. That’s the good byproduct of this. And we will typically just have a live Slack thread. So the person will be presenting, and we will all just have a smattering of Slack notes to the person. And so they’re basically getting feedback and advice from six, seven people as they’re presenting something, or sometimes it’s more discussion based. And then from there, sometimes it’s like, hey, actually, let’s do a follow-up round the next time around, if it warrants it. So it’s fairly unstructured, actually, and it’s worked really well. And again, what I’ve liked is, even just in a few short months of doing this, I see a lot more consistency in how we show up, especially for the high-visibility things like a company-wide TLDR email with our insights. But my favorite part is seeing people connect the dots, right? We’ve had people on the team who’ve been here for many, many years, who have a lot of historical and organizational context that someone else might not have.
And as a result of this, they can then augment their insights or their perspective with that context, or someone else might be working on a project that the other person’s not privy to, and now they can connect the dots and up-level their work that way. So that’s my favorite part of it: making sure that the more dots we can connect, the more our team shows up as polished and as impactful. And so that’s been our crit ritual, and it’s been really, really great to see the team take it in stride and say that they’re learning a lot just from each other by doing that. Steve: In terms of that connecting-the-dots piece, you’re only eight people, and it’s taking lots of face time to get access to each other’s knowledge and put that in other places where it’s going to make the work better. And you’ve worked in research teams of different sizes and organizations of different sizes. How does connecting the dots work in a larger research organization? Akhay: Frankly, it doesn’t. Again, I loved being at Spotify, but the number of times I would work on something and then realize months into a project that someone else on the other side of the org, far, far away from me, had either done this before or was also working on it at the exact same time. And that was the sheer product of organizational design, the fact that we weren’t centralized as a research and insights function. So it didn’t work to the same extent, or you spend a lot of your time gathering context. And we only have, hopefully, 40 hours in our work week. If we’re spending half of it building context, that slows us down for sure. And you miss a lot of context. So in larger organizations, it’s much tougher. I think it worked really well when I was at LinkedIn. We were a very tight knit research team as well.
And frankly, from that experience I adopt a lot of the ethos of this crit practice and making sure that we all stay connected, even as we scale and grow bigger. And again, it’s much more doable with a much smaller company and with a team of eight, but it’s just tough. The more people you throw into the equation, the harder it will be. As you scale companies and as you scale research teams, that’s where we sometimes introduce a lot of chaos and end up spending a lot of our time connecting those dots, to varying degrees of success. So yeah, I don’t know. I think it’s just really tough in larger organizations. Steve: You made mention a moment ago of a research pod. I’d love to hear what that is. Akhay: Yes, this is hot off the press. One thing I am experimenting with on my team is, I mentioned we have the three key areas of our product org, or rather of our company. And I only have so many researchers. We’ve tried the embedded model, and it works pretty well in some areas, but maybe not so much in others. And then on top of that, I also just have people matters to consider. I want to give people a fresh start. I’m a big believer in rotating to new areas, especially after a few years of working on the same thing. I think fresh energy is really, really important. So I’m at this perfect confluence right now of trying to optimize who on my team works on what, so that it’s a good match for them and for the area that they’re in; trying to optimize for the management structures I have on my team; and also trying to optimize for resourcing the highest-priority needs across the entire company.
And so I kind of dreamt up this research pod of three researchers who will look across two of the three key areas, and they will have a manager who’s responsible for building those area-level relationships across both areas: fielding, prioritizing, and kick-starting the right research requests, and then, with the three researchers, figuring out, probably on a quarterly basis, who’s going to work on what. So it’s a semi-embedded model, where I as a researcher in this pod could spend one quarter working with one of the two teams and the other quarter working with the other team. I anticipate, and this is again very much hot off the press, that some of the concern around that will be: well, what will you lose by not being embedded and not being the go-to person for your product director to come to? I actually don’t think much, because they will still have a go-to person as their point person, right? It will be the manager of this pod. And I still think you can build context over multiple areas. What this actually helps free up is that we can work on higher-priority things. If you’re aligned to an area, then you’re kind of beholden to what others in that area need, but another area might have higher-priority needs, if we were stack-ranking at the company level what the most important things are. I actually think there’s more to be had by being able to work on higher-priority things than by being completely embedded in one area. So it’s not necessarily the most revolutionary staffing model, but it’s a little bit different for Duolingo. And I anticipate there will be a bit of pushback and concern, but I’m really excited for it. I think it’s a really nice way to give people the opportunity to work across multiple areas, gain more context, gain more expertise, and then, frankly, just be able to work on higher-priority things. So let’s see how it goes. Duolingo is very much a place where you are defined by the area and the team that you’re on.
So this will be interesting. And maybe I’m wrong, maybe there won’t be any discomfort, but I’m curious to see if there is any discomfort around there not being one go-to IC (individual contributor) researcher supporting the area that someone’s working on. So that’s my research pod experiment. Steve: Maybe we can switch gears a little to learn about how you found the field of research. Akhay: I love this question, because I think every researcher has a really unique story about how they found research. And mine is a bit of a funny one. It goes back to me studying Italian, which, funny enough (you know, language, and now I work at Duolingo), learning a language was actually a really serendipitous way by which I learned about research. I was studying Italian in college, and I did not know what research or design or, frankly, even tech was. And my language instructor told me about a summer program in Italy. And this was in the heyday, the early days of IDEO and the d.school, the early days of research as a discipline. I knew nothing about that, but it was in Italy. There was a summer program in Italy, and it was about design thinking and prototyping. And I was like, this seems interesting. I don’t know what design is, but I like the concept of design. So I went, I did the summer program, and it sort of changed my life, because I learned about prototyping and research. This, again, was in the early 2010s, in the early days of what I call the heyday of the d.school and IDEO. And I bookmarked that summer and was like, I want to do this eventually. But the more I looked into it, the more I felt like you needed a PhD. There weren’t a lot of clear paths into UX research at the time. I mean, it’s still true, but it was especially true about 10 years ago. And then one thing led to another.
I ended up working at a design agency, moved back to Italy, and through the luck of it all ended up finding my way to the research team at LinkedIn. So that was my journey, kind of learning about design and research and realizing, oh, I actually already do this. A lot of my background prior to tech was in ethnography, sociology, urban studies, a lot of people-centered disciplines, but I was not approaching it knowing that there was a field in tech that related to it. And I still remember the moment I was like, oh, this is a job I can have. I can learn about people, but in the context of a business or a product. That seemed too good to be true. And now here we are. I think it’s changed a lot. I think that ethos of people-centered research teams has changed a lot, for better or for worse. I still think at our core, that’s who we are. But I remember the cultural conversation around our field, especially about 10 years ago, was that this is the unique value we bring. I actually believe pretty wholeheartedly that that unique value for us has changed a lot. Now we have to be much more like product strategists and designers, and we have to look across multiple methods, and we have to be fluent in business metrics and experimentation and a lot of things, which, frankly, many years ago was not true. But I think that’s okay, and I think that’s good. But that’s how I found my way to the field. Steve: Do you find yourself on the other end of those conversations? Like you said, it’s still a field that’s hard to find your way into. But given your role, or just things you do in the community, are you exposed to people who want to join the field, or people who are new to the field? Akhay: Yeah, it’s funny, because now there are so many sort of feeder, quote unquote, academic programs or boot camps into this field, in a way that was not true when I was joining.
And I had the great privilege of meeting amazing mentors and people who took a chance on me. I try to be that as much as I can, but as you can imagine, the LinkedIn inbox definitely fills up. But through hiring: Duolingo actually hires pretty heavily from a student pool. We have more people who have never worked anywhere but Duolingo than not. And so I really try to tap into that and make sure that we are getting really good talent that has never worked in research before, and I try to be part of talks and conversations and panels as much as I can. But it’s also a different time. Again, there are all these programs now that are trying to be feeder programs into the field of UX research, for better or for worse, and similarly with design. So I think it creates a lot more supply, and we’re kind of in this flux as an industry and as a field right now, and I’m curious to see how that will shake out. Steve: What are things you look for when you’re talking to someone as a prospective employee, or even in an informational kind of conversation? What are things that stand out to you in people who don’t have research experience but who you think could be successful and do a good job at research? What are you looking for? Akhay: Far and away, the main thing I look for is the gusto and energy someone brings just to who they are. I think so much of our job is actually not research. It is influence. It is driving change. It is inviting yourself to those rooms that we talked about. It is moving those rooms. It is influencing those rooms. And I really look for that. It’s tough, because it’s sometimes a difficult thing to measure and feel, but what I look for is: how much conviction can you bring to a conversation or a room? Because so, so much of our job is entirely dependent on that.
And frankly, maybe this is the bias I have with my background being more in the humanities and the social sciences, but I look for people who are good people-people, you know? People who are good at understanding people, which sort of relates to that first point as well. Being able to tell interesting stories about people. Again, that’s changing a little bit, as I mentioned, with the evolution of our field, but no matter what happens, I think that’s what will set us apart as a field. That’s what also sets us apart from other disciplines like data science: we have to be able to pull from our expertise and knowledge as people experts. And so I look for that as well. And that can come in a lot of different ways. You know, we hear the terms curiosity and empathy thrown around a lot. I think they can be a little overused, but they relate to everything I just mentioned. And I think that sometimes just shines through. So that’s really what I look for: can you move a room? How much conviction and gusto do you bring to the way that you talk about work, or talk about a problem? That’s really important to me. Steve: People who follow this podcast will have heard some interviews in the last couple of months that were previous guests coming back on and talking about what’s different for them, or different in the field. In some cases five years later, in some cases nine years later. So hypothetically, if we were going to have another episode together in nine years, what do you think we would talk about? Akhay: Well, I would love to be back in nine years. I think a lot of what we would talk about would be methodologies of years past that we don’t think about anymore. We would talk about how we exist in the structures of our teams. And so I’ll be specific about the first example.
You know, with the new tooling that we’re about to have and the new process changes that we’re about to have, teams probably aren’t going to be doing usability testing, right? We’ll probably be able to offload a lot of that. And so I think we would talk very specifically about those methods that we don’t use or employ anymore, and the things that we used to have to do that we don’t anymore. A good example: I was reflecting today on how nine years ago I would go back and forth over email scheduling a participant interview, and spend a good chunk of my week literally just going back and forth, without Calendly. Just me, the researcher, scheduling people very manually. And change is incremental, so it’s hard, year over year, to say, oh, things used to be so different. But if you look at it nine years later, it’s like, oh yeah, we just didn’t have tools that could do that for us. And so I would imagine that nine years from now, that will probably be the case: a lot of the things and the processes that we have now will no longer be there. The bigger question is, frankly, we can have a lot of conversations as researchers about what the future of our field is, where this field is going, but I think every discipline is having that same existential moment. This is not unique to us. Data science, product design, everyone is thinking about how their job is going to change. And so nine years from now, yeah, a lot of the jobs will have changed, and a lot of these jobs will probably have to traverse, I think, a lot of different areas of expertise to do meaningful work. And again, reflecting back nine years, I think that’s also true of the early days of that heyday of the “how might we” era, as I call it. I reflect on that.
I think I was told that as a researcher, I was supposed to be a little bit objective. I was supposed to plant seeds and then let other people run with them. And now, nine years later, I tell my team: no, no, no, you plant seeds, and then you water them and you run with them and you bring people along with you. You have to have a point of view. You have to run with it. And nine years from now, I think it will probably be even more extreme. So I’m curious to see how the boundaries of what value we bring to a team or a company will be different. What new jobs don’t even exist yet? Who are we partnering with? What are the types of things that people are doing at companies? There are all these disciplines I work with now, or types of roles I work with now, that weren’t there nine years ago. So yeah, those are some things that come to mind. Steve: Maybe just before we wrap up, it’d be great to hear something about you that’s maybe not about work, to let us learn something about you. Akhay: I always joke that my life is really lovely and it’s really
46. Daniel Escher of Remitly
In this episode of Dollars to Donuts I speak with Daniel Escher, Director of UX and Research at Remitly. We talk about more ways for researchers to add value, business questions over research questions, and the things that researchers worry about. Where I think collective identity can be limiting is when someone thinks of themselves as a researcher and says, “Therefore, that means this is my small box of things that I do and ways that I contribute.” And what I always want to do is push that box to be bigger, right? I’m not at all saying that the box doesn’t exist in any way. But we as researchers can drive far more decision-making, far more strategy, far more hypotheses than I think we realize. I think that we tend to want to hand off work to other people when actually what I encourage my team to do is figure out where are the places where actually a handoff doesn’t make sense, but a handshake makes sense. There’s some there. Or where does hand-holding make sense, where there’s really extended involvement? – Daniel Escher Show Links Steve and Inzovu – Storytelling workshops Formats Unpacked Daniel on LinkedIn Remitly Nazir Harb Michel on LinkedIn Savannah Young on LinkedIn Angelina Erine Theodorou on LinkedIn José G. Soto Márquez on LinkedIn James by Percival Everett Clay Christensen’s Milkshake Marketing (and Jobs to be Done) RITE Method Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. I’m Steve Portigal. In collaboration with my friends at Inzovu, we’ve been running storytelling workshops for clients. Storytelling is an essential human skill for any team. It drives connections, influences decisions, and inspires empathy. In innovative and creative practices, the ability to tell a compelling story is just as crucial as doing the work itself. 
While good teams focus on delivering high-quality work, great teams go beyond and wrap the delivery of their work into stories. A team’s ability to tell well-crafted stories is critical for influence, which is always the key to the outcomes they seek. That’s where our storytelling workshops come in. These workshops empower your team by enhancing their storytelling prowess. It’s valuable for a range of roles within growing product and service organizations, like experience design, research, product management, marketing, and innovation. We run this in a few ways. One is a series of short online sessions across several weeks with individual and group assignments between meetings. We also have an in-person full-day version. Check out more info at www.inzovu.co/workshop/storytelling. And I’ll put that link in the show notes. One resource that participants have found useful and inspirational is Formats Unpacked, which really opens up the idea of what a story can be. An opening monologue is a format. A podcast interview, which we’re getting to in a moment, is a format. An unboxing video is a format. The same interview, one year apart, is a format. You might enjoy checking out the Formats Unpacked site, which dives into many different approaches to presenting stories and sorts out what it is that makes them work. Now let’s get to the episode. Today, I’m speaking with Daniel Escher, Director of UX and Research at Remitly. Daniel, welcome to Dollars to Donuts. It’s really great to have you here on the podcast. Thank you. Daniel Escher: It’s such a pleasure to be here, Steve. Thanks so much. Steve: I’d like to ask you to start by giving a little introduction. Daniel: Yeah, I think the very first thing people are surprised to learn about me is I’m the youngest of six children. I realize that’s pretty uncommon in this day and age. But I come from a big family. I have 14 nieces and nephews across myself and my husband. 
And I grew up in Hawaii, also kind of surrounded by not just my biological family, but very much a sense of community and broader family. And I think I’ve generally taken that throughout the rest of my life, where I try to surround myself with people who are really, really good people, good-hearted people, people from different walks of life. So after growing up in Hawaii, I moved with my family to Washington State, where I currently live. But I was gone for about 10 years, living in New Jersey for a few of those years, and then seven years in Indiana, getting my PhD in sociology at the University of Notre Dame. Now I lead customer research and UX for Remitly. I should actually correct that: I’m one of the people leading UX at Remitly. My official title is Director of UX and Customer Research. Remitly is a company about a dozen years old. We provide cross-border financial services and have offices located around the world, but our headquarters is here in Seattle. Steve: Can you say a little more about the company and what it does and who you serve? Daniel: Yeah, so we serve really anyone who needs financial services across borders. And the focus for us over the last decade or so has been remittances. So people who have moved from one country to another and then are sending money, often back home, to family members and friends. That’s not all of our customers by any means, but that’s definitely the vast majority. So we operate in what we call pairs of countries, or corridors. Sending money from the US to the Philippines is one pair or corridor. Sending money from Canada to Nigeria is another one. So our customers range really widely in terms of demographics, in terms of socioeconomic level, and in terms of the needs that they have for the product. And then over the last couple of years, and especially these days, we’re in the process of developing some new products.
Unfortunately, I can’t say more about what those are, but they are very exciting, and I’m very curious to see us continue to learn in those spaces and then ultimately achieve product-market fit and a great customer experience. Steve: How long have you been working at Remitly? Daniel: Yeah, it’s been almost six years now, which is pretty wild to think about. Steve: And you mentioned both customer research and UX being areas that you’re responsible for. Over the time that you’ve been at Remitly, how has the set of things changed that you’ve been leading or focused on? Daniel: Remitly, I have realized, is a constantly shape-shifting organism. And I think a lot of companies are like that. I didn’t realize that until pretty recently. And I actually think that for us as researchers, many of us as social scientists, the extent to which we can view the place where we work as an organism that is responding and shifting and adjusting can actually help us to be successful. So that’s a quick digression. But to get back to your question, the last six years have looked different in so many ways. When I started out, I was essentially an individual researcher brought in to begin a research function. I hired and built a team, established practices, and then about a year ago moved from reporting into marketing to reporting into product. And that was a function of the needs of the business shifting. When I joined the company, the biggest needs for us were understanding our customer audience and our potential audience. And so we did total market segmentation, and then we needed to develop a brand position, and I was heavily involved in that. So essentially we went from being what looked like a flag store online, because we had one country’s flag, another country’s flag, and then an exchange rate of the currencies, right? Into a much more human brand.
And so we’ve shifted from talking about the function of sending money across borders into more of the outcomes of sending money across borders. And that’s often some sense of connection, some sense of accomplishment, some sense of aspiration. And then a lot of that work in marketing is, it’s not like it’s complete, because obviously customers are always changing. But the bigger opportunity in the last couple of years has been more on the product side, as we think about building new products and trying to expand the offerings that we have. Steve: And when did user experience become part of your, I wanna say remit, but that might be an annoying term to use for the company that we’re talking about, Daniel: I love it. Steve: But I just did it and I’m sorry. Daniel: No, no, we use the term as well. Definitely. I would say from the beginning, experience has been part of that. But especially in the last about year, I’ve become much more focused on it. And I would say that that’s been a fantastic transition for me. So I’ll share more about that in a second. But just to describe the nuts and bolts. So currently I have four researchers reporting to me, a couple of content designers, and then a product designer. And as that group, we cover everything from upper funnel marketing and brand building into landing pages and app store optimization into our app and our send flow and checkout experience and then post-submit experience. And then kind of underlying all of this is a new work stream around voice of customer. But as I look at that whole surface area, what has been so helpful is shifting from being focused exclusively on research to also thinking about content design and product design. That has really forced me to go through the, I’ll call it the operational wall in what I do. So I think as researchers, especially in my first about four years at Remitly, I was really focused on, I do the research, I do it well. And then someone goes and does something with it. 
And in the past year, by having designers report directly to me, I now have to care about that handoff. I have to care about, okay, what then happens with our hypotheses or with our recommendations? Do the designs actually reflect that? Does the overall customer experience actually reflect that? And so honestly, this past year has been a real crash course through that wall or against that wall at times, as I’m trying to figure out in some ways, how to be a bit of a product manager, how to be a bit of a program manager. I remember talking to my team a few months ago. And what I told them is that our default mode is to think about our skillset first and then draw lines around that and use that to determine what we do or don’t do. And instead, what I said to them was, think of yourself as being on a school board or the board of a nonprofit or the board of a for-profit, where when you are at that board meeting, no one cares what your degree is in. No one cares what your skillset is. No one cares what your job title is. All of you want the work to get done. The end. Right? And so, obviously, me as a trained sociologist, I’m bringing in that experience and those credentials and whatnot into the conversation that we’re having or into the thing that we’re doing. But all of those things don’t actually limit me or determine what I’m able to contribute. And in fact, I know that there’s a sense of collective identity and momentum that comes from I am a researcher or I am a designer or I am a product manager. But in actuality, if I think about a researcher, for so many years, I was under this impression that the way I add value as a researcher is by conducting original research. And actually, what I’ve realized is that very few people are interested in that in itself. Instead, what I can bring is my overall skillset and just life background. So I can add value by asking good questions. I can add value by integrating data sources. 
I can add value by synthesizing what business management is saying, what marketing is saying, and what I heard from some agents in customer service. And I can do all of that and drive decision-making without ever conducting a customer interview or running a usability study. And in a similar way, designers can add tremendous value or steer decision-making without ever drawing a wireframe in Figma or presenting a prototype of something. So all of that, in summary, I would say is I think that we forget the power that we have to make outcomes happen. And we don’t need to limit ourselves to what is my job title? What do others expect me to do? Okay. Steve: It’s so exciting to hear you describe, I don’t know, the superset of ways that we bring value. And that pushing pixels or asking questions is not the scope of what any of these disciplines can do, and there’s this larger set of things. But if you expand the definition or the tent or whatever it is, yeah, research is facilitative. Research is working with stakeholders, is working to understand problem spaces, is synthesizing, integrating. That narrative is a really lovely and rich characterization of what researchers can do. Even in your presentation of it and how I hear it myself, it’s still, like in your board example where no one cares what your discipline is. I don’t know, I think I need more help bridging that back to this example because you’re still saying researchers have this larger set. But I’ll just speak for myself, I like that identity. I want to be seen as a researcher. I don’t wanna be seen as a designer or a generic person on the board. I think there are issues of identity that we need to address. And so you also brought up what do people expect from you or what do we think people expect from us versus how to bring more value than that. 
I mean, maybe my question here is, do you think there’s an aspect of identity with the name of our discipline, researcher, designer, that is playing a role here in terms of what we’re comfortable with, how we’re seeking to perform and be valued for that performance? Daniel: Yeah, absolutely. I think that personal identity and collective identity are really powerful, right? And actually, if I think about what drives activists in social movements to continue to show up day after day, it’s some sense of personal identity or collective identity. And so I don’t at all want to dismiss that. I think that that is really powerful. What I see that as though is a launching pad or a springboard into doing really big things. And where I think collective identity can be limiting is when someone thinks of themselves as a researcher and says, therefore, that means this is my small box of things that I do and ways that I contribute, and what I always want to do is push that box to be bigger. Right, I’m not at all saying that the box doesn’t exist in any way. But we as researchers can drive far more decision making, far more strategy, far more hypotheses than I think we realize. I think that we tend to want to hand off work to other people when actually what I encourage my team to do is figure out where are the places where actually a handoff doesn’t make sense, but a handshake makes sense. There’s some partnership there. Or where does hand holding make sense where there’s really extended involvement? So for example, I think of someone on my team, Naz. He’s a linguist by training. He has been heavily involved in conducting research on how we price and how we merchandise pricing. And I realize that maybe some folks are immediately yawning as I talk about that, but it’s actually a really complex, fascinating space to figure out, what should our price actually be to send money across borders? 
And then how do we display that price and communicate that price to people who may not have more than a couple years of formal education all the way to people who have PhDs or JDs, and from people who are sending $15 to people who are sending $15,000 at a time, right? And so I go into this backstory to say that I’ve seen Naz make this transition from this is what the stakeholder needs. And so here’s the research, the end. Over the last two years, he’s shifted from doing that to now he is actually helping to set the pricing strategy in certain places. He is actively working with designers and actually putting other people and teams to work for him. Saying, hey, I need this so that we can learn this thing in market or out of market, and then we’ll be able to make this decision, right? So I hope that that helps. Like Naz still thinks of himself very much as a researcher, right? He’s not a PM, not a designer, not a business analyst or manager. But he’s going far outside of that kind of traditional box of doing the research and then handing it off to operational teams. Steve: If Naz were here and we were to ask him how he made the transition between these two points of thinking about how he brings value, do you have an idea what he might say? Daniel: Yeah. Oh, lots of sweat, lots of effort. I would say that for everyone on my team, myself included, right? I think we make doing research and being a researcher sound too easy most of the time. I’ve seen in myself, over, you know, the last five and a half, six years, a huge swing. I’ve seen, you know, Naz do it, and other people on my team, Angelina, Jose, Savannah. All of the kind of easy advice about research and making an impact is actually really challenging. So for example, right, I hear people give advice about, you know, track your impact. Well, okay, that actually implies that you know what different kinds of impact are, right? 
So that you’re seeing impact in the number of different ways that it can look. And then it implies that you are in the right conversations, in the right meetings. That you are focused on the right things in the first place and not trying to do everything at once, which is a classic, classic error of many researchers out there. And so, you know, just that one piece of advice is actually many, many months, if not even years, of kind of hard work and effort to get right. Similarly, even advice about how to build rapport with people in an interview is actually very challenging. Right? Like it takes dozens of times of practicing across a wide variety of people to be able to do that successfully. So I’m starting to get long-winded, but again, yeah. It’s a tremendous amount of time and effort that goes into this. Steve: I mean, this idea that you’re describing here, the moving from the handoff to the handshake or the handhold, and sort of learning to do that over time. It speaks to a critique I’ve heard of researchers, or of research as a discipline, which is that, yeah, researchers want to sort of write something and have people sort of automatically get what to do with it. And the opportunity for researchers is to do what you’re saying and help the work take shape and help it have life and help the other people that you want to impact work with it. There’s also a story that exists at the same time, and I guess multiple stories can be true when you’re talking about over time, lots of people, lots of organizations, which is that there are researchers who are, I think to some extent, begging to have that kind of impact. I’m not just here to do interviews, I’m not just here to get data, or, often it’s like, I’m not just here to run usability tests after you’ve done things. I can bring more value; research as a process and a discipline can bring more value. 
And those folks are often, I think, stymied because there isn’t, quote, buy-in, or “they won’t let me,” you hear that. And I don’t know what we do with that, but I guess a takeaway for me is that there’s multiple sort of vectors that are happening in a profession. And the one that you’re describing is, again, is not new to me. Researchers have this opportunity and it takes, I think you’re also emphasizing, it takes time to kind of shift your role for yourself. I think that your team, you’re not talking about overcoming internal resistance, you’re talking about them all progressing in, yeah, redrawing the box or not being limited by that small box. Daniel: Yeah, I think that everyone at any company in the world consistently underestimates how long it takes to build relationships, just period. And the reason that I start there is that as soon as we talk about research or design or involvement in the abstract, we have removed relationships from the equation. And actually, almost all of these questions around making an impact, around getting involved in things, come down to personal relationships that we have. To be clear, the sociologist in me is immediately saying, okay, great, Daniel, you’re gonna emphasize the individual. Hey, we also need to talk about the institutional, right? Obviously, right? But stick with me for a second just on the individual story. So if I think about being a researcher in a setting that is challenging, a setting that is maybe even skeptical, a setting that feels stymieing, the way to start to make progress against that, in my experience, is over time, slowly chipping away at all of those perceptions, right? So I spent probably my first four, four and a half years at Remitly listening to what other people wanted me to do. Instead of going after, here’s what I think needs to happen, and I’m going to make it happen. 
So I got all sorts of external advice about set up these, like, self-directed training modules and do this kind of work, not that kind of work, right? And in actuality, like, where I need to start is, what is the business question? Not what is the research question? What is the business question? Okay, the business question is, we are losing this kind of customer in this country. Okay, what can I do to address that question? And I realized that that work is going to come at the cost of requests from teams that want me to do this little thing or that little thing. But in actuality, the biggest questions for a researcher to answer almost always rise above that kind of fray, right? So like, if you think, no matter what the company is, it’s how do we acquire new customers? How do we keep the ones we have? And why are ones that we have churning? Why are they leaving? So the extent to which you can position your work to answer one of those three questions, you are going to be making an impact. And what I see researchers often doing is waiting for permission to get involved, right? Like, they are waiting for someone to say, hey, I want you to be involved in this thing. The problem with that is, one, like, oftentimes the requests that come in are not actually the most important, valuable thing that could happen, right? And then the other problem is that you’re waiting for permission, right? Instead of saying, hey, this is the thing that needs to happen and this is the thing that needs to get done. And so what I often see researchers do is they have some sense for, here are the very important questions, but then here are the requests that I’m getting. So I’m going to try to do everything, right? And that is a recipe for disaster. In actuality, a human being can do maybe two, maybe three things in a quarter, right? Like, set aside all the answering emails, set aside whatever. Like, you can do two or three things in a quarter. 
And so it’s not impressive to say that you’ve run, you know, 50 usability studies or 100 interviews or whatever the thing is. The impressive thing is that you have actually answered one of those big three questions around acquisition, retention, and then, you know, churn or dormancy. Bit of a soapbox there, but I see researchers, you know what? I’m actually going to speak for myself because I don’t want to project onto other people. I have been such a people pleaser for so much of my career. And I have thought that the way that I will get invited into things or the way that I will stand out is by taking some requests, doing them really well, and then being rewarded and recognized for that. And what I’ve learned is that just does not happen, right? People’s time span is really short. Their attention is short. It’s actually in the act of becoming more independent, at least in the U.S. corporate context, that you actually stand out and you demonstrate that you know what truly matters and that you are committed to delivering on it. Steve: I don’t know if it’s when someone joins your team, or through the relationships you have with your team as a leader, but how do you help people make that switch from, and you got me when you said people pleaser and waiting for permission, I think that resonates strongly. How do you help people see, I mean, you’re talking about through the looking glass a little bit, there’s a whole other way of looking at this. It doesn’t feel like it’s on a gradient, it feels like it’s a flip from do what you’re kind of assigned, and that’s how you help people, versus say no to things and choose independently where you think the biggest value is. That doesn’t feel linear to me, that feels like letting go of a previous model. Daniel: Yeah. Steve: And I hear you on things taking time, so I don’t want to sort of reduce this to like an aha moment or anything… Daniel: Yeah. Steve: …but how do you get there, how do you manage other people to get there, how do you? 
Daniel: Yeah, it’s a light switch that I think takes many months, if not even years, to flip. So you’re right that it is kind of a polar experience. But I think in actuality, there’s a fair amount of gradient involved. For candidates that I’m interviewing and for people that I hire, it starts with I’m looking for three aspects to them, and then there’s kind of an underlying aspect just because of Remitly. But the three that are constant are I’m looking for the person to be curious. I’m looking for the person to be humble. And I’m looking for the person to know their value and the value of others. So curiosity, humility, value. So let me unpack each of those. Curiosity, I am looking for people who are constantly wondering, what’s ahead? What’s next? What can I learn more of? Why did this thing happen? What is a new method that I need to learn? What’s a creative way to solve this problem? What is some domain knowledge that I could pick up that would help me in what I need to be doing? So just kind of a tremendous curiosity. Like very few things disappoint me just in general in work. But one of them is when I see someone who has just kind of done the work and then just stopped and has said, like, okay, I checked the box. And research is not about checking the box. Like research is kind of this constantly self-propelling activity forward. And so there’s almost never a sense of termination to it. So that’s curiosity. Humility, I am looking for people who are confident in what they know and what their skill set is and what they can bring, but then also recognize that they still have much more to learn and might be wrong. So humility does not mean submissiveness or shyness or any of those other traits. But it’s really this sense of confidence with the recognition that I have more to learn. I am not perfect. I can and might be wrong for a given thing. And then value is, you know, I know what I bring to the table. And then importantly, I know what others bring to the table. 
So this actually ties back to the discussion we were having about the researcher identity versus being like part of a board, so to speak, right? Where I, as a researcher, I’m not going to try to do everyone else’s job, but also I’m not going to let other people do my job for me as well. So those are three traits. And then especially because of Remitly, we have something like 5,000 pairs of countries where people can send money to and from. We have a wide variety of customers. We’re developing additional products. And so, you know, kind of underlying all of these three is a real sense of positionality, a sense of understanding, like, who am I in the world? Who are other people in the world? I’m really looking for, you know, diverse backgrounds, diverse experiences, language skills, experience conducting cross-cultural research or even like design research, right? Essentially the recognition that people differ. And the way that I show up in the world is not the way that everyone shows up in the world. Steve: How do those traits set someone up to operate in a way that is independent and is not waiting for permission? Daniel: Yeah, so the curiosity aspect is that the person is not afraid to ask what may seem like dumb questions, right? Just that’s one example. So for example, Savannah on my team, she is constantly asking these great questions that make obvious assumptions that a given team has or that forces, you know, a team to connect the dots in a way that they haven’t before. I think humility, what helps with that is that a researcher is not showing up as a know-it-all, right? Which helps the relational aspect to develop. And so I think about like Angelina on my team has formed these really strong partnerships with people in creative marketing. And the reason that she does that is because she is able to bring the knowledge that she has, but then also recognize that she could be wrong, that there’s ways that she can improve. 
And then that makes that team want her around more. So then she’s in additional conversations, she’s part of additional Slack threads or emails, and then that spirals out even further into involvement into bigger projects, involvement earlier in a project. The way that value then fits in is people want a researcher to be around because they know specifically what that researcher offers. Synthesis of data, integration of data, bridging of different parts of a company together, original data collection, analysis, right? All of those good kinds of things. So they want that in the project, they’re clamoring after that. And then the researcher also knows like, “Okay, I have a big picture mindset. I understand what we are trying to accomplish.” And so then I as the researcher, I’m gonna put other people in the spotlight or I’m gonna make sure to bring someone into a conversation when it’s an appropriate time for them to be involved. And of course, everyone appreciates that, right? Like that feeling of, “Oh, hey, I as a product marketer or I as a program manager got invited into this thing because Daniel invited me.” That feels good. There’s then reciprocity afterward. Steve: I’m going to pick up one piece of all the interesting things that you just said. At the beginning, you talked about, you know, asking maybe air quotes, dumb questions, and not being a know-it-all. And it reminds me of when we’re teaching people how to do interviews, there is this thing about asking a question that you already know the answer to or that’s sort of maybe an obvious thing or that the person might think you know the answer to. And, you know, I’ve spent time sort of training people on that, and it’s very nuanced. It’s like it’s the same words, like, I don’t know, “What days of the week do you work?” And there’s a way to ask that question that is dumb, and there’s a way to ask that question that’s, to your point, curious. 
And some of it is about body language, some of it is about inflection. I think some of it is about intent, what’s kind of in your heart when you ask that question, to your point, again, about curiosity. The interview setting is different than the setting with our colleagues. I feel like there’s maybe more pressure on us to be a bit of a know-it-all, to be seen as credible. Is there a best practice around how do you ask those dumb questions? Because I hear you on all the value those questions bring, but there’s a way to do that that brings the success that you’re describing for these folks on your team that are so effective, have these great relationships. There’s a way to do that that’s harmful to you, that’s harmful to your esteem or the respect that you receive, and yet it all falls under the label of asking dumb questions. Can you identify the tactic that makes that a successful dumb question? Daniel: Yeah, that’s a good question, right? And this is actually the challenge of human interactions, is as soon as we try to formalize things into rules, we realize that there are thousands of rules and nuances depending on the audience, the interlocutors, the situation, right? So, just kind of starting at the beginning with what you were saying, the idea of feigned ignorance in an interview setting, absolutely, a very powerful technique. Another one that I’ve seen in interviews or that I’ve used is a tactic of we’re in this together. So some of those interview questions that are really survey questions, how many hours do you work in a week? What’s your job? You can almost put on this air of, “Ugh, all right, here we go. Just bear with me, just a couple minutes of questions. Just gotta get through the form, right? You know how it is.” And then you get responses, the person’s like, “Yeah, I’m gonna contribute to this. I understand how it is. I have my own job where my boss does the same thing to me.” 
“Okay, great.” And then now after two minutes of those survey-type questions, you actually have more rapport than you would have otherwise. But then to the second part of your question, like thinking about the work situation, yeah, I think there’s a couple of different things that come to mind. Like one is doing a little bit of self-deprecation as a tactic, the classic preface of, “Hey, sorry, this may be obvious, but could you explain what is this acronym, or what’s the context for this thing? I think I’m just missing that.” So that’s kind of the most humble way of doing it. Another one, it’s related, but it actually makes you sound better, which is, “I’ve just been in five back-to-back meetings. I’ve been switching context from CS to marketing to new product development today. Remind me, what is the OKR that this addresses?” So now you’ve just kind of given yourself a high five, and then you’ve launched into it. And then a third tactic is the back channel. So you’re not asking the question publicly, but you’re going to someone that you trust, that you have a good relationship with, and are like, “Yeah, what’s this acronym? I’m just totally missing it.” And maybe you’re doing that over Slack in a meeting or you’re doing it afterward. So yeah, you’re absolutely right. There is an element of performativity, even in the “no dumb questions.” But in general, I would say that actually, we worry too much about self-presentation. If I think of most companies that do an annual performance review, let’s say, no one remembers how funny you are on Slack. No one is thinking about the time that you made them feel good on June 26, earlier in the year, whatever. Really, a performance review, which is what researchers are interested in or should be interested in, comes down to what were the three big things you did this year? What was your impact? What did you accomplish for our customers or for the business? 
And I actually think, like you were talking earlier about, how do you help newcomers to a company shift their mindset into thinking about impact and extending their skill set into the operational realm sometimes? And I think that’s actually a really critical answer, is like, “Okay, great. You did 20 different projects this year or you were involved in whatever many different things. But if you had to boil it down, what are the two or three things that you could say, ‘This is what I did in the year.'” And really, if you work backward from that, that really simplifies life and reduces a lot of the noise. It’s like, “You know what? I helped reduce customer churn by X percent.” Or, “I made this decision about a new product happen that otherwise would not have happened.” It’s these very kind of core impacts or core demonstrations of what you did that have 10,000 prior steps that led to that thing. But really, that is the big set of things that you’ve done that you should be aiming for. Steve: So we started a little bit with sort of the moment of asking questions, and you gave some examples about sort of how to frame that, but then I think you’re making this larger point that when you look back, even like on a quarter or an annual basis, those small moments that maybe we have some anxiety about are not the ones that should be reflected on as these, what are we trying to get to? But how do we, in the moment, keep that in mind? Because it’s like I guess you’re talking a little about being present versus looking at it in retrospect. So how does that perspective that, yeah, when we come back to look at this, here’s what the important things are going to be, how does that give us the best confidence to make the right decisions in the moment when they feel uncertain or risky? Daniel: Mm-hmm. Yeah. Yeah. I often tell my team some phrase along the lines of, “Let’s be human for a minute.” And I actually, in this conversation, want to be human for a minute. 
I very much recognize and see in myself and see in others a lot of anxiety in researchers. I see a lot of insecurity. And I use those not in any sort of pejorative way or in any way that indicates some kind of failing. I use those as descriptions of what it means to be a human. I have all sorts of anxiety and all sorts of questions in my mind of, you know, “Am I doing a good job? Am I doing the right thing? If I had done this, you know, X number of years ago, would I be in this place?” Right? I in no way want to dismiss those. I in no way want to suggest that, “Hey, you know, if I just had my act together, all of that would disappear.” It just won’t, right? I actually have admired so much the leaders and authors who actually say, “You know what? Even in my role, even in my position, even after 30 years, I still get nervous before I speak in public or I still cringe when someone’s going to give me feedback.” Right? So I think then the question becomes, “Okay, I can’t get rid of the anxiety or the insecurity. What do I do in the face of it? Do I try to just push it down?” No, that’s not going to be a recipe for success. Do I try to reframe the situation? That’s often successful. Do I have a mantra that I repeat over and over to myself? Sure, yeah, I do that. If someone is giving me hard feedback, I have this phrase, “If you are not making mistakes, you’re not learning.” And all credit to Percival Everett, his new novel, “James,” the slave character James or Jim, says that at one point. I wrote it down on a Post-it, and I’ve been saying it in my head ever since, basically. So I do think that anxiety, insecurity is present and is never going to go away. I do think that as researchers, as designers, as PMs, analysts, right, on and on, that we can look back retrospectively at what we’ve done, but we can also look forward to where we are trying to get to. So by the end of this year, I want to have accomplished this thing and this thing, right? And I have that right in front of me. 
I’m looking at that every day. I’m being tireless in going after those things. That doesn’t mean that there aren’t hard times. That doesn’t mean that there’s not anxiety, that I don’t mess up micro interactions. But I’m constantly going toward one, two, maybe three things to get to by a certain point in time. Steve: I’d like to yes and some of what you’re saying here. Daniel: Love it. Yeah. Steve: And just to go back to our, we’re going big picture… Daniel: Mm-hmm. Steve: …and then specific example of big picture, we’re going back and forth. Daniel: Mm-hmm. Steve: So the asking dumb questions, not knowing something, Daniel: Mm-hmm. Mm-hmm. Steve: …in addition to everything that you’re saying, I guess I’ve had the experience of being around when someone else asks a dumb question, in those ways that you modeled, Daniel: [laughs] Steve: and just felt enormous relief and gratitude for that person doing that… Daniel: Yeah. Steve: …that it’s helpful for me, and that I can see even what it does for the group. Daniel: Yeah. Steve: That it unlocks something and unlocks some assumption. I’ve personally received it and I can see the group benefits from it. Daniel: Yeah. Steve: And so then that teaches me a little lesson that maybe it’s not about permission. I appreciate your take on waiting for permission, but I get permission when I see somebody bring value and then I think, “Oh, they model it for me. I could do that too.” Or when I’ve taken a risk and done something like asked a question… Daniel: Mm-hmm. Yeah, absolutely. Steve: …that I’m not sure if it’s okay for me to ask, and then gotten some approval from someone else, thank you for asking that. I might get a private message afterwards. And if that happens, in that first example… Daniel: Yeah. Steve: …when someone does it and I’m grateful for it, I’ve learned enough that I might try to tell them, either publicly or privately, receiving the reinforcements and modeling the reinforcements, Daniel: Yeah. Yeah. 
Steve: and noticing that a behavior that might be unsafe or risky… Daniel: Yeah. Steve: …does bring value and has positive consequences, starts to give me– and maybe that reframes the anxiety a little bit, to say… Daniel: Yeah. Steve: …”Yeah, if you look at some specific examples… Daniel: Yeah. Steve: …it’s played out really, really well.” And then modeling is kind of a follow-on for that. Daniel: Yeah, I’m so glad you bring this up. This raises two things I’ve learned. One is creating the right ecosystem. And the other is not waiting to lead. So let me unpack both of those. Like, a lot of this conversation has focused on more of the personal, more of the individual realities of day-to-day life at a company or an agency conducting research. But let’s actually talk about the institutional for a bit. Institutional can kind of sound like a sterile, abstract word. And so oftentimes what I actually describe is what I call an ecosystem. So in an ecosystem, we need to have the right conditions in place for everyone to thrive. And so that means having norms about how we conduct meetings. Or that means having checks for bias in promotions or in performance reviews. That means ensuring representation in this group and that group. On and on. We can talk about this in a number of different, very tactical ways. And what I see then is, okay, so we want to create an ecosystem where people are thriving. And how we can help with that is that we can start to create that ecosystem for ourselves. Even if it is my coworker and myself or my team of four people or my team of 10 people or the broader design org of 40 people or on and on. I think that especially starting out as researchers earlier in career, we are waiting for someone to give us the affirmation or to send us the message after the meeting. But actually you can start doing that on day one, right as you are starting your career. You don’t have to wait for things like that to happen. 
Obviously there is — in what I’m describing, I’m describing kind of a bottom-up approach to creating that ecosystem. Obviously there is a complement of top-down, right? Like what the company leadership is saying and modeling and demonstrating in their actions, not just their words. But both things really do have to happen together. And I think actually what you are describing is such a good reminder for me as a director, as a leader in the company. I know that for the rest of the day after we have this conversation, I’m very much going to be thinking, okay, who can I call out into the spotlight and recognize? Or what just happened in that meeting that shouldn’t have happened? What can I do differently in the next meeting to make sure that so-and-so is able to speak and contribute more instead of being sidelined like they were? Steve: So just as we get closer to wrapping up, maybe one thing we haven’t talked about as much: we talked a little about sort of data collecting as one of the roles of research, and you expanded that to pulling multiple sources together. And really, we’ve talked a lot about the relationships that create the context for the work to have impact. One piece we haven’t talked about as much is, I’m going to say, a differentiator of a researcher as opposed to a question asker: sensemaking, you know, what we do with this data. How do you think about that, or how do you practice that with your team? Daniel: I love that question. I think that one of the biggest differentiators for researchers is the knowledge that we bring. And by knowledge, I mean domain knowledge about various social scientific theories and research theories. And then also the know-how of how to put those theories to work, especially when we are doing original data collection. 
What I often see is that researchers come into a corporate setting, let’s say, or an agency setting, and then they actually end up forgetting or very rarely putting to work all of the social scientific theories that exist in the world and that they have learned. And that is actually a tremendous opportunity for us. So if I think about sociology, for example, which is my main training, I can bring in theories about, let’s say, collective identity from social movements to generate hypotheses for how we might create collective identity among a customer base, which then instills a sense of community and belonging and then can propel some sort of action. Or if I think about the corporate setting in which I work, if I think about that in terms of social movements, I can actually apply a theory of, let’s say, political opportunity, where I see that there’s division within leaders or there’s something changing in a market that gives us a chance to accomplish something that would have been impossible to do even just last month. I also think a lot about socioeconomic level and research on social class, race, and gender as I’m conducting research. I think it just sensitizes me to all sorts of different aspects that otherwise I would miss. And then it helps me to make sense of the data, of course. And I’m giving examples from sociology, but certainly psychology, economics, political science, all of that feeds in as well. Where I see a big opportunity for researchers is we often get in this habit of reporting quotes or showing a video clip. And to be clear, those things are very powerful, right? They’re a tool that we can use. But the problem with that is that is only one kind of data. It’s often just as important what the person did not say, or it is important to pay attention to body language or to the fact that off the recording their cell phone rang, which then indicated this thing that can help us design the product, right? 
So I think often as researchers, we are trying to work backward from, okay, what is the slide deck? Like, what is the cool thing that I can put on the slide for the audience? And in doing that, we actually, like, truncate a lot of our intellectual ability and prowess, because we are ignoring or setting off to the side all sorts of other data that exists in the world. Steve: Are you making the link between that experiential data and some of the theories from social science or other disciplines? Daniel: Oh, absolutely. Yeah, definitely. A connection there for sure. And then also there’s a connection in terms of the methods that we use in the first place, right? So interviews are great, or usability tests are great, for that quote or for that clip where the customer can’t find the back button for 40 painful seconds, right? All of that is good, but that’s only a couple of data forms. And so if we actually expand into, okay, I’m not going to limit myself to easy video clips and easy quotes. I’m going to think much bigger about what are the business questions? What do I need to solve? We can then develop really creative methods, deploy those, and then generate really interesting data. And yes, you’re right. It may not be as succinct or kind of easy to understand as a customer quote, but if it’s answering the question, then it’s absolutely what we should be using. Steve: We’ve talked throughout this conversation about the collaboration and the relationships with the rest of the organization and sort of permission and confidence and a lot of themes here. And I’m thinking about, yes, as a sociologist you have all these theoretical frameworks to bring in. How much of that would be known to people outside your research group? Daniel: Very little of it, and that is okay. So let me actually use the example of Jobs to be Done, which is a much more approachable framework that I think many people are familiar with. You can use the Jobs to be Done framework. 
You can make an observation of customers using a milkshake machine in the morning versus in the afternoon, like Christensen does in his book. And you can then generate hypotheses for testing in market. Let’s move the milkshake machine in the afternoon over here. Let’s change the size of the cups. Okay, all of that. Great. You are then presenting that information to stakeholders. Where I in the past have gotten off track, and where many researchers get off track, is that they take the audience through that whole analytical process, through that whole framework. They may not actually need to do so. All of those frameworks, all of those theories are tools for us to sharpen our thinking and to generate hypotheses. We don’t necessarily need the audience to understand them fully. We may not even need to communicate them at all. And in fact, communicating them may actually just be a distraction for the audience. What the audience actually needs is the outcome of all of that work. And that may mean literally hours upon hours of observation and careful thinking reduce down to just a couple of bullet points or a couple of talking points. And that’s okay, right? I think as part of researchers’ insecurity and anxiety, myself included, we want to show all of the hard work that we’ve done. We want to say, like, “Hey, this is difficult. Not anyone can do this.” And so we go into a lot of detail. But in actuality, most of our audience actually trusts that we are experts. They trust that we are doing the hard work. And instead, what they want is for us to communicate that thing. I actually think that we can learn a lot from folks in analytics who are really experienced, where, my goodness, I know that dozens of hours of data cleaning and munging and various statistical testing has gone into one or two bullet points or a slide in a slide deck. And I don’t need to see all of that work. I know that they are the expert. I know that they’ve done the hard work. 
And so I can just go into the conversation from what they’re saying, and then we can together figure out what this means. Thanks for the pushback. I think this is where the audience is a major overlay to what I’m saying, where I was speaking more for an executive audience that has relatively little time, relatively little bandwidth. And so the approach that I’ve seen be successful is starting with, here’s what we learned, here’s what the implications are. And then if needed, in the slide deck appendix or in the document appendix, here are all of the details about why we are confident in this, what we learned, what our process was to get here. I do think, though, as you are working with more operational teams who are really in the nitty gritty with you, like actually creating the marketing campaign or the landing page to test demand for a product or creating the wireframe, whatever, this is where a lot of that detail becomes important. So this is when you’re in a long conversation. It’s probably not a deck or document at that point. It’s probably you with a designer, maybe an analyst and a PM, whoever. And you’re actually having much more of a working session where you’re saying, this is what I’m seeing, here’s why I’m seeing it. You’re answering questions as they come. And actually, one way that we’ve been doing that much more recently is we’ve been having much more quick, iterative share outs with those executional or operational folks, versus maybe the share out that happens once or twice with a more executive audience. So in a recent project, for example, I think we were doing share outs every two or three days. And it wasn’t just a one-sided share out, but it was really a discussion, almost kind of like a workshop of, okay, this is what we learned over the last couple of days. Here’s what we think it means. How are you seeing it? We iterate and then we go from there. 
So I’m kind of describing the RITE method, right? R-I-T-E method. But I think oftentimes that is used more for design and developing, let’s say, wireframes or a prototype. But I think actually the method can apply more broadly beyond that. Steve: Anything that we didn’t cover today that I should have asked you about? Daniel: I love that question. I close most of my interviews that way. I want to say two things that I think are often missing when really any kind of leader gives advice and gives prescriptions. One is that this is all a work in progress. I probably make it sound like things are set, that I’ve had these learnings, I’ve made these adjustments, and this is the way that I operate and the team operates. This is one point in time, honestly. I guarantee if we had this conversation in three months or in six months, that I would have a different lens to put on things or I would emphasize certain things more than others. I might even disagree with some of the things that I’ve just told you. And so I do hope that everyone listening to this understands that this really is how I’m seeing the world now. But going back to one of my core values is humility. I’m very willing to be wrong on this or to change on this. The second thing I’ll say is that a lot of advice, especially I’ll call it out, especially on LinkedIn about research or blog posts even about what to do, what to not do, really misses a lot of nuance. So I’m very much speaking from what I have learned works at Remitly and how I have been able to succeed here. I do think, though, that at a different company, things may be different. And so I do want to encourage folks to use what is useful, toss the rest. That’s totally fine. And I do just have to give a quick meta reflection on myself. This is such a classic researcher way to end a conversation, isn’t it? Like I have all of these thoughts and then I kind of end with, hey, but I could be wrong. Here’s a bunch of qualifiers. 
And I just think that’s such a sign that I truly am a researcher, going back to the earlier part of our conversation. I’m not a product manager. I’m not a CEO. I’m not an analyst. Steve: I do have a follow-up question though for… Daniel: Please. Steve: Your observation of what you see on LinkedIn… Daniel: Yeah. Steve: Why did you agree, we’re not on LinkedIn, we’re a podcast, it’s a different format, but why did you agree to do this, to be on this podcast? Daniel: Yeah, I think the detail, right, there’s so much nuance there. There’s so many things that sound kind of trite or pithy in written form, especially in a forum like LinkedIn. But, you know, as soon as you start talking about them in a discussion, you realize how one thing connects to another or how, you know, the thing that you think is generally applicable is actually only applicable to maybe one kind of setting. And so I figured that this would be a great way to have a more detailed, nuanced conversation about what it means to be doing research in this day and age. Steve: Well, I’m really grateful for you taking the time and having what to me felt like a very nuanced conversation. We covered a lot, but it’s been really interesting and really enjoyable. So I want to thank you for being on the podcast. Daniel: Oh, you’re very welcome, Steve. Thank you so much for the invitation and for your time. Steve: All right, everyone. That’s a wrap for today. You can always find Dollars to Donuts wherever you get your podcasts, as well as at portigal.com/podcast for all of the episodes with show notes and transcripts. Want to do me a solid? Rate and review Dollars to Donuts on Apple Podcasts. Our theme music is by Bruce Todd. The post 46. Daniel Escher of Remitly first appeared on Portigal Consulting.
01:06:01
45. Reggie Murphy of Zendesk (part 2)
Episodio en Dollars to Donuts
This episode of Dollars to Donuts features part 2 of my two-part conversation with Reggie Murphy of Zendesk. We talk about psychological safety at work, Reggie’s career journey, and online career resources for UX researchers. That helps the team be better researchers when they feel like they have a space where, man, I don’t have to be perfect every time. I’m going to definitely strive really hard to do great work and try to be successful. But I have a leader who’s going to have my back if something goes wrong. It works. I want every people leader who’s listening to this to understand that. That you’re not going to get it right every time. But if you set the environment and the intention of being a leader who understands that people will make mistakes, but it’s not that you made the mistake. It’s, okay, how do you learn from it and not do it again? And how we can set up parameters within the team to address that particular mistake if it was something like a research protocol or something. – Reggie Murphy Show Links Steve’s corporate speaking engagements Steve on the Rock and Roll Research podcast Interviewing Users, second edition Reggie on Dollars to Donuts (part 1) Reggie on LinkedIn Zendesk The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth by Amy C. Edmondson Radical Candor: Be a Kick-Ass Boss Without Losing Your Humanity by Kim Scott Digital Body Language: How to Build Trust and Connection, No Matter the Distance Magid I Wish I Knew podcast Laura Cochran on LinkedIn UXR resources Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. I’m Steve Portigal. In this episode, I’ve got part two of my two-part conversation with Reggie Murphy of Zendesk. But before we get to that…hire me to speak at your next in-house event. 
You may see that I speak at conferences and meetups, but most of my talks and presentations are done for in-house audiences. If you’ve got an all-hands or a retreat or a speaker series, book me to deliver a keynote, participate in an Ask-Me-Anything session, or engage in a fireside chat with your leaders. I regularly speak on topics like research skills, the impact of research on business, innovation and product development, optimizing research influence, and more. Don’t hesitate to get in touch if you have a specific topic in mind. Go to portigal.com/services and portigal.com/speaking for more information. I was recently a guest on Matt Valle’s Rock and Roll Research podcast. Matt interviews insights and research pros about research, but he also talks to them about their own creative endeavors that may not be related to research. Here’s a short clip that is about research, but you can check out the whole thing if you want to hear me talk not only about research but also about Rolling Stones fan communities and creative writing. Matt Valle: You’ve got a very interesting perspective because you’re sort of growing up when this discipline was nascent and helping to sort of create it to some extent. At the same time, you’re a consumer and a user like all of us, and you experience good design and bad design. Once upon a time, you know, you were able to fight through all of the difficulties and challenges to get things done with technology. So as you think of where it is today, where research is today and where it’s headed, you know, what does that look like as you look forward? Steve: Right. I mean, I think, you know, right now we’re in this inflection point of difficulties depending on your industry or geography. And it’s an interesting one because I feel like companies believe that research is important. Matt: Right. Steve: but companies don’t necessarily believe consistently that researchers are important. Matt: Right. 
Steve: It goes back to that people who do research thing that, you know, that we talked about before. But I think a lot of things are on a pendulum. And they go back and forth. And if we’re anxious right now about individual prospects or career prospects, I think that’s very valid. And so I’m not trying to negate any of that. But I think what you see over time is that there’s demand, and then the demand softens, then the demand increases, and then it goes back and forth. So I don’t know, I mean, I hope I don’t get proved wrong by this. But I think that the demand for researchers to do research will come back. I think it’s going to come back large. And it’s not that, oh, people who aren’t researchers do poor quality research. I think it’s different. And I think there’s so much need. And again, right now, I think companies are kind of saying, oh, we don’t need it right now. And they’re justifying that economically. When have you ever worked in a situation where the amount of questions that were important for decisions didn’t outpace and exceed the resources available to respond to them? So that hasn’t magically gone away. So I guess I feel medium-term enthusiastic that there’s just a lot to learn and it doesn’t go away. It’s not like, well, we’ve got the answers. Now we don’t need any more research. Even in more mature product companies, I think they continue to be faced with questions of all kinds, like tactical, operational, future, meaning, story, narrative, culture, all this stuff. It’s all across the board. I think there’s lots to be learned that lots of disciplines care about. Strategists care about research, product people care about research, designers care about research. They don’t care about research. They care about the answers. They care about insights, right? They care about what we can help them understand and decide. So I don’t think that’s going away. 
And I think it might keep reshaping itself in terms of what we mean by research and who’s doing research, like what activities and so on. But I guess I just see more, more, more coming. And again, I’m sorry for anyone that wants to slap me for saying that, because that’s not where we are; today feels very much like somewhat of a brick wall. But I think that’s going to change eventually. I don’t know when. That was from the Rock and Roll Research podcast. Check out our whole conversation. And of course please pick up a copy of the second edition of Interviewing Users if you haven’t already. Now, let’s get to the second part of my conversation with Reggie Murphy, the Senior Director of UX Research at Zendesk. Steve: So we’re talking a lot about processes of collaboration, relationships internally, but maybe we could switch the focus a little bit to you and your team, because you’re leading those folks, you’re leading them through change that companies are ready for or not ready for. Do you have an approach or philosophy that you bring as a people leader? Reggie Murphy: Yeah, one of the first things that I learned way back when I started training as a manager is that you can get to where you want to be by helping others get to where they want to be. And I used that philosophy in my mentoring, but I started out as a people leader early in my career. My second job, I was managing folks. It was more of a player-coach role, but having deep empathy for teams and how teams need to be connected, build relationships, and work together collaboratively has always been on my mind. Promoting diversity and being inclusive, all of that has been on my mind throughout my career. But one of the things that recently, I would say maybe over the past five, 10 years or so, that I really try hard as a people leader to instill in my teams is the idea of psychological safety and radical candor. It’s like two things. 
I talk about Amy Edmondson’s book, The Fearless Organization, and Kim Scott’s book, Radical Candor, all the time, having read them several years ago. Because especially in tech, you can work on a lot of different types of teams with a lot of different types of people. And the way to get teams to work better together is to provide an environment and a space where people can feel free to share their opinions and ideas without fear of admonishment or being made fun of or anything like that. And that’s one thing that’s really important to me, and especially in my team here at Zendesk, because we have a different setup than, say, a company that is just located in maybe two countries. We’re a global company. The 15 people that I have on my team, we are located in Melbourne, Australia, Singapore, Poland, Estonia, Canada, and various parts of the US. We’re a globally distributed team, just the 16 of us. So we’re not able to connect in person very much. Last year, we were able to get together everyone who was on my team then. So number one, it’s just hard because of that, just not being able to be in person. So it’s important that we develop connections, team connections, and an environment where people feel like they’re building relationships with each other. And so we try to do some things in our team meetings in order to foster that. It is really hard. We’re not perfect at it, but I think we’re getting better at it than we were, say, two years ago when I joined this team. But we’re working towards growing as a team. And I want the team to be a family. I know that sounds very squishy or melodramatic, I don’t know. But I feel like you’re working with your teammates 80% of the week. So it’s important that you try to feel like a family. And within families, you want to foster the environment where people feel like, “Hey, I can do something wrong. I can fail at something, and it’s not going to count against me.” And that’s what Amy Edmondson talks about in her book. 
She talks about avoidable failure and intelligent failure and the differences. Avoidable failures are things that you can… Hey, if you put in the right protocols and controls, you can avoid having typos in your presentation. Those are avoidable failures. But intelligent failures are like, “Hey, you have the best intentions and you are working towards a goal to be successful on a particular project, but something just goes wrong. Let’s say you were trying to step out and do something different, but it just didn’t work very well. We’re going to applaud you if you fail.” And as a people leader, getting back to your question more directly, it’s important that we establish the conditions and the environment where this is possible. And people feel free to work not harder but smarter, to be innovative, try new things. And if it doesn’t work out, learn from it and move on and do something else great. Long answer to your question, but I really feel strongly about this, that teams that do this well, and I’ve seen them, they work better together. They’re more successful. People feel like they can grow their careers because their family is supportive of them. Their people leader is supportive of them. And teams where I’ve seen work like this and operate like this, I’ve seen people’s careers just take off and go. And that’s what I try to do. So that’s my philosophy. And I’m still working at it. It’s still a work in progress, but I am adamant about it. And I love collecting feedback on how I’m doing so that, “Hey, if I’m off center or off course, I want to know about it.” And that’s another thing that helps establish this environment of psychological safety: that the person who’s leading the team is open to this feedback. And you have to be, to model that behavior so that your team sees it and they’re able to operate in the same way. Steve: What’s something you do so that your team sees that you are open to that feedback? 
Reggie: Whenever I make a mistake, or if I either say something wrong or maybe post the wrong thing on Slack or something, I immediately own it and apologize. If you make a mistake, you just got to own it. You got to face it. And I’m trying to be vulnerable in that way. And I hope that people see that. I’m not a perfect leader, but I’m certainly going to do my best and try really, really hard to push and to do the things I need to do in order to be a positive leader. But if I do something that doesn’t work very well, I’m going to own it. And I think that when people see that, they say, “Hey, okay, yeah, Reg is someone who admits a mistake and he can correct it. The burden is not on me to be perfect all the time.” We had a couple of errors on the team over the past six months. When I say errors, some things just didn’t go well with, say, research sampling or a protocol. Maybe somebody didn’t follow the right protocol for a research project. And there was a consequence where, say, a customer was contacted when they weren’t supposed to be contacted, where they had opted out of wanting to be contacted for a research study. It happens. And so when something like that happens, it’s very important for the people leader to address the problem, but to be open and to help the person who made the mistake understand that, hey, it was not one of those avoidable errors. You were doing the right thing, but it was a mistake. And in these couple of instances, we talked about the mistake in a meeting. And it kind of surprised people that I brought it up. I wanted my team to see that, “Hey, yeah, this was a mistake that was made. Here’s how we learned from it.” So there’s no shame. There’s no disgrace. And I talked to them, of course, one-on-one and said, “Hey, I want to talk about the mistake that you made in the team meeting. 
Is that okay?” And so I think that that helps the team be better researchers, when they feel like they have a space where, man, I don’t have to be perfect every time. I’m going to definitely strive really hard to do great work and try to be successful. But I have a leader who’s going to have my back if something goes wrong. And that’s how I do it, Steve. It works. I want every people leader who’s listening to this to understand that. That you’re not going to get it right every time. But if you set the environment and the intention of being a leader who understands that people will make mistakes, but it’s not that you made the mistake. It’s, okay, how do you learn from it and not do it again? And how we can set up parameters within the team to address that particular mistake if it was something like a research protocol or something. How can we get better as a team so that we all avoid doing that thing that was the error before? Steve: The avoidable error and the intelligent error. Is this right: one outcome might be figuring out a process or some approach that turns the intelligent error into a future avoidable error? Am I even using the terms properly? Reggie: Yeah, I think yes. Steve: So, I’m going to just play back again. So you talked about the connections, the relationships between people. You talked about modeling yourself, making mistakes, vulnerability, taking responsibility. Reggie: Right. Steve: And you talked about, reactively when something happens, having a safe, positive individual and group learning and going forward with it, improving the situation. Did I miss anything or are there others? Reggie: No, that’s exactly right. You’re right. Steve: Well, I’m glad to hear you unpack it, because we use a lot of words like psychological safety and team relationships and trust. But sometimes we just take those as read, like, oh, we know what that means. 
But as a leader, you’re making specific choices that are tactical in nature in order to get to a more mature level of safety and learning and growth among everyone on the team. Reggie: Yeah. And I think in this day and age, when many companies are virtual or hybrid, it’s hard when you’re not in person. It makes human communication harder. And what I’ve learned in working for a really globally distributed company and team is that there is a human on the other side of that Zoom. And it’s utterly important that you get to know them and you try to provide a space for them to feel at home when you’re having a conversation and when the team is having a conversation. It is incredibly hard to do, Steve. And I think our team has grown in this way. I think we try to find little things that help us. We’re still working at it. It’s not perfect. Sometimes things still get lost in translation on Slack. There’s some miscommunication of expectations about things, but I’m hoping that as we continue to grow together, we own it. We own the things that aren’t working. We identify them. We’re not afraid to call it out, raise a hand and say, “Hey, this isn’t working. Can we get together and talk about fixing it?” Tactically, yeah, I do hear psychological safety being thrown around a lot. And so that’s why it’s super important for me as a people leader to make sure that we’re actioning on what it means every day. And that is in one-on-one conversations, that is in team conversations. This is in how we treat each other. One of our core values at Zendesk is we care for each other. I believe that. And the way we do that is to have empathy and all of… That’s another word that’s sort of thrown around. But true empathy is really seeing the world as the other person sees it. One of my graduate school professors talked about that: when you’re conducting qualitative research, it’s seeing the world as the other person sees it. Well, I use it in a different context. 
Developing empathy for your team is seeing the world as they see it. And I’m hoping that we’re doing well in this area. And I just feel very strongly about it as a people leader, that we have to set the tone. We have to set the stage for our teams. And hopefully our teams see that and they’re better off because of it. Steve: You mentioned up front in this part of our conversation that you’ve got people in all these different parts of the world, so presumably coming from different international cultures themselves. Is that a compounding factor besides time zones and lack of being in person? Does culture, in terms of where people are from in the world, impact how you approach creating psychological safety? Reggie: I think at Zendesk, Zendesk is such a global company. Everybody is in a lot of different places. So from a cultural standpoint, when you step into Zendesk, you know you’re working for a global company. So a variety of geographies. So I don’t know if there are a lot of cultural differences that we have to work through, maybe a little bit. But I think when you establish the environment that no matter who you are, where you are, your background, your life experience, what you look like, you can join Zendesk. You can join the UX research team, the product design organization, and you can fit right on in. We’ll accept you with open arms. Yeah, maybe there are some norms or social norms that are different for folks who grew up in Europe versus Asia versus Indonesia or South America or in the States. But I think that when you join Zendesk, you’re joining a global company. And I think the people who we attract to our company come in with that thinking in mind: I’m going to work for a global company. And so you understand what that means. 
But I do think, and this is something interesting, we can kind of go off on a little tangent here, but you do have to install some helpful rules, especially if you’re working virtually, to maybe limit the things that a cultural difference might make concerning, if that makes sense. I think what helps, I remember sometime last year, we did a workshop on something called digital body language. And in the workshop, they talked about what it means to have virtual communication most of the time with the people that you work with. And so in Slack messages, when somebody posts something, we flood it with emoji reactions. We respond with a thumbs up or happy face or how we feel. We let that person know digitally that we heard them. That’s one. Second, one of my senior managers is based in Singapore. Twelve-hour difference right now from me to Singapore. There’s no great time to meet. Somebody has to take the hit. So either it’s 8 p.m. in the evening for me and 8 a.m. for my senior manager, or vice versa. So we accept that and we make it work. The other thing is when we’re working async, and I know my colleagues in APAC, in the Asia Pacific area, Melbourne, Singapore, when they’re Slacking me and I receive that message, whenever it is that they Slack me, I wake up, I get that message, I respond immediately, or I try to respond so that when they wake up, they get it. It’s that kind of thing. It’s very minor, but it means you’re heard, you’re responsive, you’re prompt. That’s having good digital body language. I think those kinds of things, when you do them well, foster the type of culture that helps minimize what your question was before, any cultural differences that might sort of stand in the way of great communication. You have to be intentional about it though. So I try to foster that. It’s not perfect for everyone at the company, but I think we try. That’s the good thing about working at Zendesk. It sounds like I’m doing a commercial for Zendesk, but I love what I do. 
I love where I do it. So I talk about the environment and the culture. I think we really, really try hard to get it right. Steve: You’ve mentioned just throughout our conversation graduate school, some previous employers, some of your training. It’d be great to hear how you found research, a little bit of how you got to where you are today. Reggie: I went to undergraduate school thinking that I would be a news reporter, live on the scene, Reggie Murphy, Channel Nine News. I worked at the campus radio station. I majored in history and minored in communication, thinking that I was going to be a news reporter at some point. Didn’t work out that way. I didn’t know what I wanted to do when I graduated. So I went immediately to a master’s in communication program at the University of Tennessee. And there I learned media. I learned research, though. I mean, in a graduate program, you’re going to learn research methods. And I leaned into that. And so one of my first jobs was working as a salesperson at a radio station. And I didn’t really care about the sales process too much. So that didn’t make me a really good salesperson, but I loved showing the client or potential customer the research that would help them make a decision about whether or not they wanted to buy advertising on our radio station. That’s what I loved the most. So after a couple of years of being unsuccessful as a salesperson, I decided to go back to school, into the doctoral program, and learn more research methods. At that time, I still didn’t know. UX wasn’t even a term then. It was human-computer interaction. I don’t even think HCI was a coined term at that point. It was mostly human factors at that point, showing my age. My seniority, I should say. Steve: There are really old terms, if you want to just feel less bad. When I came in, there were still people talking about the MMI, which was the Man Machine Interface. Reggie: Oh my. Yes, that is back in the day. 
So when I left graduate school in the PhD program, I had a couple of options. I interviewed for several academic jobs. I thought maybe I’d want to teach, but I found a company called Frank Magid Associates. I think they’re just Magid Advisors now, I believe. And this company hired PhD researchers to work for their company. Folks with PhDs to work as research analysts. Loved that job. The first research trip I took in that role was conducting research in Taiwan and Singapore. So three months in, after getting a PhD, I find myself in Taipei doing focus groups with a Mandarin translator. It was an amazing experience to just be dropped into that environment. And so I was hooked. At that time, I was a market researcher. But then at that moment, the internet was becoming the internet. So a lot of our clients were television stations, radio stations, startup internet companies developing websites. And so we began a practice of doing usability testing for these websites. I remember the first one was excite.com. Now I’m really showing my seniority. And I should say the rest is history. I was hooked on this idea of understanding how people were interacting with technology. So then my next job was the company that I referred to earlier in the people leadership role, the Gannett USA Today company. And that’s where I was for 12 years. And during those 12 years, I moved from being a market researcher to, when UX became UX, sort of this amalgamation of market researcher and UX researcher. And the iPhone came out, and we were developing USA Today to be on, well, we had already developed the website. So now we were building iPhone apps and iPad apps at that time. And so I was bitten by the tech industry bug at that moment. But then 2012 happened and I was laid off. So that was a pivotal moment in my career, where I had been working in the media industry primarily for the first 10, 12 years of my career. 
And so after I was laid off, I joined a company which was a boutique enterprise research and design consultancy. Our customers would hire us to come into their company. So let’s say they installed a very big enterprise system like SAP or Oracle or one of these big platforms so that their company could do financial management or recruiting. SAP is a big platform. It may not work very well in certain places depending on how you implement it. And so companies would hire us to come in and understand why their employees weren’t adopting it. And then we’d go in and do contextual inquiry, ethnographic research, to help figure out what the problems were, to help the company redesign these interfaces so that their employees could use them. I was hooked. When that moment happened, I was hooked, because I was doing so much research with people who were working on business tools and business platforms. And I would sit there and watch them painfully work through a system that was not helpful in helping them get their work done. And I developed a lot of empathy around that. That’s why it’s really interesting that I’m working for Zendesk now, because at the time I had a customer that was a utility company, and we were conducting research and I was sitting there watching them. And if you think about a customer service representative, this is why I now have a lot of empathy when I have a customer service issue and I’m calling or I’m messaging on a platform. There is somebody behind there, a human being trying to navigate four or five conversations, find data and information to help you with your concerns. And I watched it with my own eyes in several projects with that company. And it was just amazing to me that software isn’t developed very well for certain industries. Anyway, I was there for about three and a half years. The company was acquired, and then I moved on to Facebook, now Meta. 
And that’s when I think my career really took a turn for the better. I mean, it went up, because I was able to work for a growing company. That was at the time when it was hockey stick growth. We were hiring many people. This is maybe eight years ago, and it was amazing. I worked on Groups, the first iteration of Groups; and Workplace, which was their enterprise tool. I think their competitor was Microsoft Teams at the time. I worked on the camera effects platform. So this is when Facebook was trying to be Snapchat, and they built the me-too features to do the frames and bunny ears and all that stuff. Amazing work. But I was drawn back into enterprise work when I decided to switch to an IC role. So that was an interesting thing. I went from being a people manager and, about halfway into my career at Meta, I transitioned to an IC role and worked on the recruiting platform that they built internally. I was helping with the platform that recruiters use to recruit folks for jobs at Facebook. Spent a couple of years there and then moved on to Vanguard, a financial services company, very popular, at that time a 40-year-old company. Grew a team there, had a lot of success there. It was like a left turn going from social media to the financial services industry. But it was great. I’m glad I now know a whole lot more about 401ks and change of ownership transfers. So I increased my finance acumen by working at Vanguard. But then the pandemic hit and I wanted a remote job, a fully remote job. And so Twitter came along, the company formerly known as, and I had an amazing time there until, well, we all know what happened. But I left before all of that happened. I was a director of a creation and conversations team. And our main focus was primarily the tweeting experience. And that was really fun to work on. We worked on, I think it’s still called Twitter Spaces, the audio feature where you can talk to anyone in the world on Twitter. We worked on that. 
And that was a really cool experience. But I left, we had some changes in the company and I was looking for a new opportunity, and that’s how I found Zendesk. And it’s where I am now. Yeah. Steve: You mentioned that approach called “I Wish I Knew” that came from Twitter. People might also know that as the brand of the public-facing podcast that Twitter ran. Reggie: Right. Yeah. So if you go to that podcast, I am in the first episode of that podcast. It was… Steve: Did you have an involvement in that? Reggie: Yeah, we named that podcast. It was a great experience. I think we did maybe five or six episodes. Yeah, it was great. We were basically telling the story of how research informs decision-making at Twitter. And so I can’t remember all the episodes, but this was three years ago when we produced those episodes. I still go back and listen to them every now and then when I want to relive some of the glory days of Twitter, I would say, when I was there. Steve: I’m not sure if I know of another example of a corporate research team podcast about research. There are obviously other research podcasts, but I haven’t seen a company share what they were doing that way. Reggie: We loved it. It was galvanizing for our team. Everybody wanted to, I wouldn’t say everybody, but we had a lot of hands raised to, “Hey, when can I do an episode? When can I do an episode?” And it was an intentional effort. We really were doing some great things on our research team to influence product strategy and to grow and develop the understanding of how people were navigating the Twitter platform at the time. And I loved being able to tell that story and tell how we were doing it. And so those episodes were in service of doing that. Steve: So just hearing you reflect on your own research career. You and I had a conversation before we had this one. And you used this great phrase I’m going to read back to you: “I don’t know if I tripped or fell into research.” 
Reggie: Yeah, I think it’s — I feel like that’s what happened. I did not know in graduate school that I’d be doing this job that I’m in now. You know, you don’t know what you don’t know when you’re that young. And many people who are in PhD programs, they may work for 10 years and go back. And that’s what it was like in the cohort I was with at the University of Tennessee. Many of my colleagues had already had pretty successful careers doing what they were doing. But I was one of the only ones who was pretty junior in my life at that point in time. So I didn’t know what I didn’t know. And I think just working in the market research industry and being in this business during the time of the development of the internet was really helpful in developing my career as a researcher, but a researcher in tech. And once the discipline of UX was established at some point in the early 2000s, I sort of just, yeah, tripped and fell into it. I was already there. But there was a period of time, Steve, early on in my career where I was trying to learn a new methodology every year. I was challenging myself to learn, you know, something like predictive analytics and data science. And at one point, I went to a conference and saw an eye tracking machine. I was introduced to that. I learned that. And then ethnography came into play. So as a researcher, every year I was trying to learn something new. And then once UX became a thing at some point in the early to mid-2000s, you know, I was there. Yeah, fell into it, loved it, and began just nurturing my career in it. Steve: What are ways in which you are helping, whether it’s Zendesk researchers on your team or the community of researchers that we’re all part of in general, what are ways that you are supporting them? 
Reggie: When I was at Twitter in 2021, we had a program that we ran during Black History Month that year, where we opened our expertise to folks from underrepresented groups across the world to help nurture their careers or interest in research and design. And I had a couple of people who I began to mentor during that month. So we had basically four weeks of connecting with our mentees. And during those conversations, I began to start compiling. I would give advice, and I would find links to articles and tutorials, you know, how-tos, and I would put them in a Word doc. And my mentee and I, each time we met, would go through it, and I’d offer some advice. So that doc started filling up. I already had a Google Sheet that had a bunch of links to design thinking materials from my days of being trained at IDEO. And so I kept those links. One of my colleagues who I worked with at the time had first developed it, shared it with me, and we sort of kept it as a doc. After that mentorship program was over, I said, “I have all these links that were helpful in these conversations. Why not just put them in one doc?” Because it was helpful, and after that period of time when I was doing a lot of mentoring, more people were reaching out that year, because that was the year we were really in the middle of the pandemic. There were a lot of people transitioning from one career to another. They were looking at UX research as an opportunity. We were on Twitter Spaces a lot doing something called Becoming a UX Researcher. So that was something that we started, and each month we’d have an episode where we’d invite people to come and talk about their career and their role as a UX researcher. It was so much fun. I think we did five episodes. We archived those episodes and put them as a link in this Google Sheet. And so, Steve, by the end of that year, that Google Sheet became this mammoth link farm. 
I called it my Craigslist sheet of UX research. That’s basically what it looked like. It wasn’t sexy at all. It wasn’t pretty. And now I had like seven or eight tabs of different things that were linked. Well, let’s fast forward. I began talking to my colleague Laura Cochran, who’s my former colleague and dear friend. We’ve done a lot of work together over the years. And I said, Laura, can you help me build a website where we can merge all these links together and offer it as a UX research career platform, a place where you can go and get information, career resources, tools, information to help you grow your career? And so over the past few months, we’ve been building out this website, and it’s now in soft launch mode. We don’t have a name for it yet, which I want to ask you a question about in a little bit. But it’s just called UXR. And the purpose or the goal of this site is really to help you grow your research career. That’s number one. Just understand what it takes to be a great researcher. Maybe you’re a junior in your career. Maybe some of these materials will help you get to the next level. The second thing we hope this website will do is inspire folks who are thinking about entering the field of UX research. And so we’re trying to collect stories from others who have made the successful leap from, say, finance to UX research or marketing to UX research or something else to UX research. And we want to highlight those stories, what the lessons learned were, how-tos, so that folks who are interested can learn from others who have done it. And then we want to connect people who have shared this information on other platforms. We want to connect them here on this platform so that it’s all together. So if you’re someone who’s sort of junior in your career, we’re hoping that this website helps you grow and develop your skills. 
If you are trying to go from one career field to UX research, it’ll inspire you with stories that you can learn from. And we’re super excited by this. And so, to your original question, mentoring is one thing. And this is sort of the next level of mentoring: all the advice I’ve been giving people who were interested in becoming a UX researcher or growing their career, I’m trying to put in this website. But here’s the interesting thing that we want to try to do on this website: provide curated information from others. So, obviously, there are companies all across the world who conduct research. And we have articles on the site now from, say, User Interviews, dScout, some of the platforms. The Nielsen Norman Group distributes a lot of information. So we have a few articles there. But we want curated articles from UX researchers who have written a blog post or have a video that they’ve shared. I have some friends and colleagues who have written articles about their experience going from one industry to UX research, or maybe there’s a methodology that they’re an expert in that they have shared information about on Medium or Substack, some of those places. We want to curate those articles. So eventually, hopefully over time, the majority of articles on the site are from individual researchers who are sharing about their experience or talking about a methodology that they’re expert in. Because I think people learn better from other people. And I’m excited about this, Steve. It’s very new. We literally soft launched it just the day before we’re recording this podcast. And we’re now in the middle of sharing it with our friends and colleagues, getting some early feedback so that we can, you know, do what you do when you launch a product. You soft launch it, get feedback, make some iterations. 
And hopefully over the next few weeks or so, we’ll find a moment to really launch it in a bigger way. If you can add it to the show notes so that folks can take a look, we have links there where they can give us direct feedback about the site. Or if you have a story that you posted about your journey from one career to UX research, or if you have posted about something else that you’ve done or something that you’re an expert in within the UX research field, you know, we’d love to highlight and showcase your story on the site. So there’s a place where you can share your information with us, and we’ll speak with you and put it on our site. Steve: Is there anything else you think we should cover in our conversation today? Reggie: I think we’ve covered quite a bit today, Steve. I have a question for you. And this is probably a selfish question, but I love the title of your podcast, Dollars to Donuts, and I don’t know if anyone has asked you this before, but I imagine they probably have, because we are searching for a name for the website that I was just describing. And so I’m curious how you came up with the name Dollars to Donuts, or does it have a hidden meaning that people don’t know about? Steve: Well, the phrase “dollars to donuts” is sort of about hunches. It’s an archaic phrase, I guess, but dollars to donuts, if I don’t miss my bet, this is what I think is going to happen. So I think that’s how it was used. Again, it’s an archaic term. So my brute force method was to go to the Wikipedia page of, I think it’s aphorisms, or there’s some category of short phrases. And I think the aphorisms are much longer. And I just sat there and paged through it until, it was at the D, so it was not too far. But I was looking for something to spark, because I didn’t want to call it, I don’t know, Research Leadership Podcast or Steve Portigal’s, like, I didn’t want that sort of obvious name, I wanted a name, right? 
Like, you know, I Wish I Knew, like, that’s an amazing name. And I didn’t know you had that source for it. But I had to kind of make it up. But when I hit it, there’s that moment of discovery where you find something. So the phrase, I think, speaks to the delight of discovery in research. People that have followed me on different platforms might know that I am quite the donut enthusiast. So it had sort of a personal meaning to me. I think about the old trope of focus groups with the M&M’s in the observation room. And that idea of turning junk food into profit, right? You’re kind of going from M&M’s to insights to new products. And I mean, that one’s a bit of a reach, but, like, oh yeah, you go from donuts to, I guess it’s the other way around, right? It’s the dynamic between the food and the money at the end. So that’s sort of my rationalization for it. But I think when I saw it, I was like, oh yeah. I mean, I just started scrolling through pages to try to find a term that I could kind of make a pun out of or co-opt or kind of reuse. Reggie: I am looking forward to going through that exercise to create a name for this website. So I’ll let you know how it goes. When we rename it, I’ll send it to you. Steve: Right, we’re gonna share it in its pre-named version, but the next… Reggie: Yes. It’s a pre-named version. Steve: Well, I just appreciate so much your generosity of time and sharing, and all the great stuff that you’ve done. I think there’s a lot for me here to learn from. It’s just really great to speak with you, Reggie. Thank you. Reggie: It was a pleasure, Steve. Thank you for inviting me on your podcast. I had a great time. Steve: That’s it for today, and as always, I hope you learned something. And of course you can always find Dollars to Donuts on all the places where you find podcasts, as well as at Portigal dot com slash podcast for all of the episodes with show notes and transcripts. 
Our theme music is by Bruce Todd. The post 45. Reggie Murphy of Zendesk (part 2) first appeared on Portigal Consulting.
41:21
44. Reggie Murphy of Zendesk (part 1)
Episodio en Dollars to Donuts
This episode of Dollars to Donuts features part 1 of my two-part conversation with Reggie Murphy of Zendesk. We talk about aligning the work of the research team with stakeholder OKRs and empowering non-researchers to do research. The researcher would go into these meetings and say we’re going to do an “I Wish I Knew” exercise, where we start thinking about what we’re building for our customers, what are the questions outstanding that we still don’t have an answer to. We’d go through that exercise, and then we’d prioritize that list. I can’t tell you how valuable those exercises were and how our stakeholders looked at us and said, “Wow, I did not know that research could add this kind of value to our conversation,” because it really helped them see. You know, that question that we’ve been batting around in these meetings isn’t really the one that’s most important. It’s this one. And to see it all together was a revelation for some of our stakeholders. I can’t tell you how important that was. – Reggie Murphy Show Links Interviewing Users, second edition Portigal Consulting services, including training The Product Manager Podcast: How To Master User Interviews To Build More Lovable Products Reggie on LinkedIn Zendesk The Maze podcast: Scaling research through democratization with Reggie Murphy OKRs Zendesk Relate Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with people who lead research in their organization. I’m Steve Portigal. In this episode, I’ve got part one of my two-part conversation with Reggie Murphy, who is the Senior Director of UX Research at Zendesk. But before we get to that: when I went to revise Interviewing Users, I started with all that I had learned about teaching research over the past 10 years. In fact, I’ve been leading research training workshops for a lot longer, and that was a key source for the first edition way back in 2013. 
Across all this time, helping folks get better at research has been a big part of my consulting practice. Sometimes I’ll teach a workshop as part of a conference, but mostly I run workshops for a company or a team within a company. Often the teams are interested in up-leveling skills, developing a shared language around research, or just being more effective at customer interviews. Regardless of the experience and skill mix in your organization, I can lead a session that will help you learn and grow together. The people that get value from training with me include those with almost no experience with research, people who have responsibility but not a lot of grounding, and folks with a great deal of expertise. I typically find a mix in any group, and learning together in real time with me and with each other, whether it’s in person or remote, is an excellent way to improve your organization’s research practice. Please get in touch to find out more about how I can help you do research better and do better research. I recently spoke about user research with Hannah Clark on the Product Manager podcast. I’ll link to the whole episode, but right now, here’s an excerpt. Hannah Clark: Something I really wanted to make sure that we end off on before I let you go is bias. Because we always hear that interviewers should be as unbiased as possible. So how should we think about bias in a user interview context? Steve: Right, and that word in our society, outside the domain we’re talking about today, it’s a really bad word, right? It connotes discrimination, racism, sort of everything that is screwing up our society. So the word is a bad word, but I think when we talk about bias in interviews, we’re talking about cognitive bias. And one thing is, I’d encourage people to be a little more forgiving of themselves. It’s how our brains work, right? There are reasons why human beings have these biases. Confirmation bias is where you hear what you expected to hear. 
You already had an idea, and so when you hear somebody say that, you’re like, “Yeah, see, I was right.” So that’s not good, right? You want to sort of do better at that. And there are tactics for this, like, “Hey, before we start doing research, let’s all talk about what our assumptions and expectations are, not as hypotheses to test, but just to say them out loud or write them down so that they’re not kind of clenched within our chest, but they’re just things that we can look at and like, ‘Oh yeah, this is a thing that might happen.'” And that sets you up a little bit better to see those biases when they come up and to let go of them, to have things confirmed, but also see something different. But there is a compassion for ourselves that’s necessary. I want to offer just a short story about my own encounter with my own bias. It’s kind of a story about me doing something not great, or at least feeling something not great and kind of overcoming it. It’s a story about going to interview small businesses and going into this agency, and the agency had the name of the founder on the wall. It’s this creative environment, lots of fairly young, hip people kind of riding around. And I’m there to meet the founder, and I don’t know how I would get access to this person. And they come out, and they’re older than I am, and I was younger than I am now. And we go into this conference room, and we start talking, and I’m asking about goals and expectations and planning, all the stuff we want to understand about small businesses. And I realize at some point, this guy’s talking about his short-term goals and his long-term goals. And I realize that I am surprised by his articulation of long-term goals. And I realize that the reason I am surprised by it is that I had judged him. I mean, it’s my own ageism. I had decided this guy is sort of this figurehead founder who is not really involved, and is just there to interview, not do the work. 
And it’s a terrible story that I had come up with about him based on what I brought into the room, my own biases, my own ageism. And so when I realized that, and this is all happening in my head, like, I’m asking questions, he’s giving me information. And I realize, oh, my questions are sort of based on my mental model, which is completely wrong. And it doesn’t always have to be a horrible ism like this. We all have our mental models about people. And if we can hear them being wrong, then we can redirect. We can like, oh, tell me more about your long-term goals. And so as a researcher, yes, I wish I was a person who didn’t participate in isms and was not ageist myself. But we all are, some amount, to some extent. And I think, you know, there’s a very unkind way to have that manifest itself. And there’s just a more normal human level of that. And I don’t know that I can be the arbiter of what those levels are. But I’m not saying this to be proud of my ageism, but just to be kind of full disclosure with everybody. When I had that moment, it was really, really awesome. It was just such a great feeling, it was even joyful: oh, I was wrong. I understand this person in, like, a much deeper way. And I actually got over myself to be able to do that. My goal is to understand this person in a rich way. That’s what’s really exciting about research and what makes me able to go get this stuff and kind of bring it back. And so again, they’re not always this extreme, but you often have to get over yourself a certain amount. So what was happening for me was I could hear myself. I could hear where I was almost clenching, trying to steer his story a certain way. And he was steering in another way. And just to feel that tension between what he wanted to tell me and what I wanted to talk about, which happens in every interview to a certain extent. And some of that’s just topic based. But here it was sort of identity based. 
And so there was insight for me in that about myself, about the topic, about this guy. And I say that at the risk of being judged for my own bias, but with the hope that we can all get better at hearing our own bias in the moment and kind of grappling with it and being intentional about the choices we make in the interview to get to what we’re trying to get to. That’s from the Product Manager podcast. I hope you’ll check out the updated edition of Interviewing Users if you haven’t already. So let’s get to the first half of my conversation with Reggie Murphy, the Senior Director of UX Research at Zendesk. Reggie, welcome to Dollars to Donuts. It’s so excellent to have you here. Reggie Murphy: It’s great to be here, Steve. Thanks for the invite. Steve: I like how we’re talking about here as if we’re in a physical place. We are just on a website talking to each other. Reggie: We are in a virtual place. Steve: Yes, I’m virtual here. First of all, what’s your title, your job role? Reggie: I am Senior Director of UX Research at Zendesk. But for the past couple of months, I have been in an interim role as Head of Design for Zendesk also. So I basically have two jobs right now. Steve: Just two. Reggie: So pretty busy. Steve: For people that don’t know, what is Zendesk? Reggie: Zendesk is a complete customer service solution. So we help companies develop really positive and connected conversations with their customers through a system that allows them to get feedback from their customers, to hear issues and concerns. And our software allows our customers to triage and prioritize incoming tickets. So our main software is a ticketing system. And so our customers and users can share what their concerns are, issues are, and the system will enable the agents, the customer service agents at our customers, to resolve those issues and problems in an efficient way. Steve: So the users, I guess we have users and we have customers. Reggie: Right. 
Steve: Just thinking from a research perspective, who is it that you are learning about, what categories of individuals do you learn about in order to improve the product? Reggie: So our primary customer, if we think about cohorts that we conduct research on, there are two. First, it’s the admins. And that is the people who set up our platforms. They could be, say, the director of CX, customer experience, at an organization. They could be director of customer support. They could also be the chief information officer. So there is a variety of roles of folks who are responsible for purchasing our product and installing it within their organization. And then there is someone that we call an admin, or an administrator, who sets it up so that their organization can use it. So that’s one cohort that we conduct research on to learn how we can best improve the product for them. And then you have the customer service agent. And that’s the title in some companies. And that is the person who is on the front lines talking with the end user. So their customer. So for example, Grubhub is a client, is a customer. So their end user would be someone who, say, just ordered some food and maybe has an issue with their order and they go online and they use, say, the messaging system to alert Grubhub that, hey, I have an issue with my order. I didn’t get the right thing. So that’s the end user. And so the customer service agent at Grubhub is using our system to talk with that end user to help resolve that issue. And so we spend a lot of time researching the admins, how they are setting up the system, and the agents and the interface that they’re using in order to talk with the customers that are using their product. Steve: Do you ever deal with the end users of your clients’ customer systems? Reggie: Not a lot. It’s, you know, we have to make strategic decisions about the work that we do because of the size of our team. We’d love to do more work with the end user. And we’ve done a few projects. 
But by and large, we spend the vast majority of our time speaking with admins and agents because we feel like that’s where we get the most bang for our buck and who we’re really building for. And I believe if we do the right thing and we build a thing right for the admins and the agents, then the end user will have a delightful experience also, hopefully. Yeah. Steve: So you’re talking a little about, I guess, where you have to prioritize and what are the areas that you’re thinking about the most. Can you maybe say a little bit about that in terms of establishing research or building research? Where have you decided to look and where have you decided to focus? Reggie: So when I first joined, and I talk about this in a podcast I recorded for Maze recently, we were a team of 11 researchers. I have two research operations folks and I have two research managers. So a team of 16, including myself, relatively small, considering we partner with 200 plus scrum teams in the organization, in the broader product organization, which is about 1900 people. So relatively small team. So when I first joined Zendesk, we were spread out throughout the organization. We were getting work done, but the majority of the work was primarily tactical. And we were working towards the end of the product development life cycle, sort of in the evaluation phase. Okay, we’ve already built the thing. Now let’s go and talk to customers and see if it works. And great. We needed to do that type of research, but I believe just philosophically that UX research teams, especially one as small as the one I had inherited when I joined, we needed to be working more towards the beginning of the product development life cycle. Because I thought that that’s where, I think that that’s where the most value can be extracted from a research team. Over the past couple of years, we’ve been shifting our work to achieve that goal. And in doing so, the way we’ve done it is we have aligned the work that we do to the company’s OKRs. 
So each year, the company has about six or seven big OKRs. And then those translate into other OKRs at the organization level. And we align to those big OKRs. And so I’ve spent the last couple of years orienting our team around that structure. And it’s made our ability to conduct more strategic research at the beginning of the product development life cycle better, more efficient, and we’re adding more value. We still, of course, do tactical research. We still do usability testing and concept testing. Should we go with concept A or B? We still do that. But we’ve seen better success now that we are oriented in this way. And I believe that we’re going to continue as our team, hopefully we get bigger, but let’s say, assume that we stay the same size for the next year or two. I believe this helps us work on the most important work at the company. Now when we do that, obviously there’s some research that still needs to be done that maybe our team can’t do themselves. And that’s what I talked about in the Maze podcast. And so over the past couple of years, we’ve undergone a pretty extensive effort to democratize some of our research methods. Obviously you can’t, you know, someone who’s not a professional researcher, there are certain types of research that we believe that we can enable them to do. And then there’s other types of research, the more extensive, deep research that we believe that, hey, let us handle it. And so we’ve developed a pretty extensive plan that I’ll talk about in that podcast to help designers or product managers who we may not be able to partner with very closely on their particular area. We’ve enabled them to do very basic testing using the UserTesting.com platform or just basic interviews just to get some feedback or insight for a particular customer problem that they’re trying to understand. So I think we’ve been pretty successful developing that system over the past couple of years so that we have some balance now. 
Our researchers are doing really valuable strategic work, foundational research, some tactical research, and we’ve enabled other functions, not researchers, to do research that we aren’t able to help them with on a day-to-day basis. Steve: I will put a link to that interview in the show notes. This idea of giving other folks tools to do certain kinds of research, democratization, I’ve heard people refer to it as the D-word. It feels like a controversial area. Just having worked on this for a period of time and had success, do you have a hot take on the hot take, I guess, about democratization? Reggie: I’ve read a lot of hot takes on this lately. I do. You and I have been in this business for quite some time. And those of us who have many years of working in different companies and different settings, I’m a little old school, so I believe that, hey, at one point, let me just say at one point of my career, only researchers should do research. I took a hard line on that. But when you work with other functions in the way that I’ve worked across my time at Facebook, well now Meta, and X, formerly Twitter, and some other companies, you learn that there are other functions like some designers and PMs who also have research experience, that if you were able to enable them a little bit and provide them some guidance and the right tools and set the right boundaries and parameters for how they structure a research program, that they can do valid research that they can be confident in. At one point, yeah, I admit it, Steve, it made me cringe that I would allow someone to come in and do a usability test without me or without the team. But I’ve backed away from that because we’re now in a world where we have limited resources, headcount, we have constraints on time, we need to build and ship. And so we have to be flexible and adapt. And I think over the years, I have begun to really understand what that means in terms of delegating research. 
And now I don’t cringe as much because I am working with very capable, very intelligent people who are open and understanding that, “Hey, look, I’m not a researcher, so please help us. We really need assistance here.” And so at Zendesk, we’ve set up a really solid model where we can be confident in the other research work that other people are doing. So I don’t know if that’s a hot take, that’s a meandering answer, but I no longer have that firm hardline stance that I did back in the day because now I have worked long enough with cross-functional stakeholders that just want to do the right thing. They’re not going, you know, maybe there’s some rogues here and there, but I’m working with people who really want to learn and understand customers and they’re willing to listen and learn and be educated by a research team in order to do it and to do it well so that they can be confident in the outcome. Steve: And just to affirm the deep roots of that cringe, the fear of what other people will do, way, way back, I remember working at an agency doing in-context research. And our clients would occasionally ask to show up and sit in. And so we’re not even talking about leading or having any ownership of or any responsibility for, but just being present during. And that was a horrifying concept, that just the mere presence of somebody, they would disrupt this elegant, finely nuanced dynamic. And I think we kind of started to compromise and we would stage this. We talked about re-interviewing somebody or finding a friend of a friend, recruiting someone so that we knew it wouldn’t screw up. In a consulting situation, again, early days of the work, so it wasn’t maybe well understood. Reggie: Right, exactly. Steve: Just the sense that this was fragile and could be ruined, and that would hurt the work and hurt us and hurt the relationship. I think the roots of what you’re talking about go back and back and back. And when we say them now, it’s ridiculous. What I’m saying sounds ridiculous. 
How could you ever prevent people from coming or not want them to come? Now we want the opposite, right? Reggie: I remember some of those days, but I think in the second company I worked for, which was a media company, the Gannett company that owned USA Today, maybe eight or nine years into my career there, we began to do more ethnographic research. This is when I was doing a lot of training with the IDEO company and design thinking was the thing at that moment. And this is why my stance on this softened, because we spent a lot of time in the field with other non-researchers. We would intentionally invite them so that they could see for themselves what customers were saying. We were tired of going out and conducting a piece of research, bringing it back, and people not believing us. You know, because those are questions that come up, “Well, what did they say?” And there was skepticism. And as much as we tried to convey the value of the work and do all the things to make people feel confident, you know, there was always that bit of skepticism. So we began to empower and engage our stakeholders and invite them to in-field assignments. And that’s when I believe that I saw it from a different perspective. I saw some genuine, sincere desire to sit and listen. And we would tell them, we had a sheet and I still have some of these administrative protocol sheets where it’s basically how to be a good stakeholder 101. Here’s what you do. Here’s what you don’t do. Just sit and listen. Don’t say anything. So after a while though, we would allow them to probe, ask questions, because we would educate them on how to do it. Now for those who are still on that hard line today, who would cringe at me even suggesting that we do that, I would say that, yeah, there are probably some cross-functional partners who may not be the type of person who would do very well with this responsibility if you give it to them to do it solo. So then what do you do? 
Well, at Zendesk, we have sort of a consultancy model. So if you’re a PM, product manager, and you have something that you want to, you have a question, you don’t have a research partner. We have a research request process where we triage these requests. And then we set up time, office hours, time with a researcher, and we’ll consult with you to understand what is the problem that you’re trying to explore and understand. And given the level of urgency or what we believe to be the level of understanding that you need, the precision that you need to have to have a really successful outcome, we’ll decide, “I think you can do this on your own.” Here are some tools. You can go into our learning management platform. You can do our certification course and here are the tools, go for it. Let us know if you have any problems. That’s one route. The other route is after that consultation, maybe we don’t feel confident and maybe the person who’s asking for help may not feel as confident. And there we can make an informed decision about maybe we go in the direction of, “Hey, we’ll consult with you along the way. It’s not like you’re flying blind.” That gives us, the research team, confidence and comfort that we’re doing the right thing and we’re not sacrificing integrity or quality of the work, and the outcomes are what we want or what we envision them or expect them to be from the research. Steve: I’m having a small, if not big, aha from that. Some of the discussion around democratization is, yeah, look at the nature of the work, the urgency, the complexity. Sometimes it’s around the method. But I think if I’m hearing properly, you’re also saying, look at the team, the requester, and their capability. Educatability is maybe not the right word, but you want the work to be successful and who’s gonna execute it is a factor there. So you choose how to respond. That’s something that you factor in in choosing what path to recommend. Reggie: Absolutely. 
I have another example to give you. Recently our company had its biggest customer event. It’s called Relate. It was in Las Vegas this year. The product team took a lot of people to the event this year, which hadn’t really been done previously at an event like this where you have 1,500 customers. And the research team, the design team, we were all there and we set up an area where customers could come and talk with us about various product ideas that we were launching at the event, but also to help us look into the future. We called it co-designing the future of Zendesk. And it was just three researchers, well, four researchers, including myself, at the event. We had designers and the remainder were product managers. So in order for us to be successful at these customer conversations, we developed sort of some tools, some guidance that we shared with the organization. Say, “Hey, if you are speaking with a customer, here are some questions that you can ask them and here’s how you can ask them.” And we enabled them to, I believe, have some really fruitful, enriching conversations. And I felt good about that. I felt good about what we learned. And I think as you grow and mature in your career as a people leader, which I am, but even as a UX researcher, you begin to develop sort of this level of, okay, it can’t be all bad. Like there are ways that you can adapt and get the type of outcome that you expect. Is every piece of research that a non-researcher will do going to be perfect? No. Will they make mistakes? Yes. But on balance, I believe that if you set up the right parameters, provide the right guidance and consultation, then you can really get the outcome that you want. I think that we did that at this conference this year. Steve: Did you see some of your non-researcher colleagues being successful in their interactions with customers? Reggie: It was amazing to see. They were engaging, they were listening, probing in different places appropriately. 
Because we were sharing some concepts on a big corkboard wall. And to be a good researcher, and I think I made the comment in the Maze podcast that everybody can be a researcher. I’ve read lots of articles where I say, “Everybody can be a researcher.” I truly believe that any time that you are in front of a customer, no matter who you are at the company, at Zendesk. So let’s just say Zendesk, I’m just talking about Zendesk right now. Any employee, no matter your title, anytime you are in front of a customer, you can be a researcher. You have to be a researcher, really. Because that’s how you learn what that customer is trying to convey to you. And I was so proud of us and the guidance that we gave them. But in watching how they interacted with customers, you would have thought it was a team full of researchers. And I’m not going to take all the credit for it, but because I think we were just working with, I have an incredible team, teammates, designers and product managers at the company. In order to have a really solid conversation with a user or a customer, in our case, by and large, all you have to do is really listen and ask why and how. And I think, you know what I’m talking about, Steve. You ask those questions that can really help move the conversation along so that you can get to the understanding of what that customer’s need may be. Right. Steve: It’s simple and easy and really hard all at the same time. Maybe that’s why as a field we have kind of fraught conversations about this. Everyone is a researcher is, I mean, it’s a good hot take because I think it’s a distressing comment and what do we mean by research? What do we mean by researcher? I think you’ve provided the context, but that little pull quote, and I promise not to just pull that out of context on you. But that whole quote, it distresses people. Reggie: Yeah. Steve: But I think you’re framing it in a really positive way. 
And you’re right, it makes me wonder, and I am certainly guilty of this, collecting stories of when we see things go wrong. I mean, I like those because they’re good learning moments, but Reggie: Absolutely. Steve: I think there is a tendency or risk of focusing on this narrative that this is hard and people can’t do it. But when you just look on the other side of the mirror like I think you’re doing and saying, like, this is hard and people can do it. It unlocks a lot more potential for the outcomes that we all want, the kind of information that we’re looking for. Reggie: If I can say one more thing about this. What I’ve enjoyed about this process of enabling our cross-functional partners to do research, empowering them to do it, is that it makes them better product managers. It makes them better designers because now they have the customer in mind. And I can’t tell you how many organizations or teams I’ve been on where that wasn’t the case. And I’ve had to kind of sort of jump in the middle of a conversation and say, “Hey, wait, has anybody asked the customer what they think about what we’re doing here?” And so I love that the organization that I’m in is thirsty for these opportunities to, number one, talk to customers, but number two, to learn from a team of expert researchers on how to do it. Because in the end, it’s going to make them better. And I feel very strongly about that, that, hey, we’re not where we want to be, but I think our entire organization is on a really nice journey up the maturity curve on becoming more customer first and understanding the customer better. And these opportunities like the Relate event and just other projects that we’re conducting internally, I think are really good ways for our teammates across functions to learn, grow in their own careers and to be better at understanding how to gather information to make more informed decisions about what we are creating, building and shipping for our customers. 
Steve: Maybe I can use that as a segue to go back to something you said quite early on in our conversation. One of the areas where you’ve been pushing to build that maturity, and you talked about shifting where research is focused from kind of evaluative late in the product cycle to more strategic and other things that are taking place earlier in the cycle. And you talked about kind of realigning. And you said a little bit at a high level about how OKRs were kind of the unlock, I guess, to kind of get in there. And because I think you’re describing a very common challenge that maybe less mature organizations face or research leaders face or research teams of one. I’m only allowed to do this work. No one wants this other work from me. I keep recommending it. And it’s sort of that, I think, some people are kind of stuck in these little traps where they can’t change the perception of what value they can bring. And therefore, they can’t get the opportunities to bring that value. So could you unpack a little bit more about how you changed what the opportunity was for research at Zendesk? Reggie: There were a couple of things that I did. First of all, the first six months, I just was assessing what our relationships were like with our key stakeholders. And I came to find out that, you know, while we had some syncs with them, they weren’t necessarily on a cadence. When I say syncs, one-on-one meetings with our design leaders and product leaders and engineering leaders. So, number one, we need to set up a consistent cadence where we are connecting with the leadership of these particular work streams on a consistent basis. Then we would put ourselves, make sure that we are positioned appropriately in the important conversations that are happening earlier in the product development lifecycle. 
When the customer problem is being talked about, or what we think are the customer problems, researchers who are in that moment, and they’re in those meetings early on, are able to help that conversation understand, well, how do we know that this is a problem, number one? And I think it was very important at the very beginning to establish a clear and consistent cadence of syncing with other cross-functional stakeholders. So that was the very first thing, and it was developing that relationship. So I think the relationship that we had, and I might be exaggerating, was we have an idea, it’s pretty baked, research, come on in. Let’s tell you about this idea. Now let’s go set up a research program to test it. I’m not saying all the work was like that, but that’s what I observed. I said, we got to fix that. So that was the first, I think, thing that I really tried to do was develop a closer relationship. Steve: So a very sort of tactical thing, you’re setting up a cadence of one-on-one relationships with the people in these other parts of the organization. And one of the outcomes there is to get into those meetings where a researcher can identify assumptions and challenges and so on. Are those meetings where those conversations are happening, are those literally the one-on-ones you’re setting up? Reggie: Well, they were the workstream leads meetings. So there’s scrum teams all over, and they’re small, medium, and large sizes. But there were meetings where the design leadership, the product manager, the product marketing manager, maybe there was a data science leader. That core team that is working on that thing, I wanted to make sure that not only were we in those meetings at a consistent cadence, but that we were developing one-on-one relationships with everybody in that group. So the relationship is one thing where you’re doing a one-on-one, and you’re just learning about that person, and you’re really understanding what that person needs and wants. 
But from a team level, now you are in the thick of it, and you are in those conversations, like you said, where the customer problem is talked about from the very start. And in addition to that, we installed some protocols or exercises or activities where the researcher will facilitate a conversation around what are the potential customer problems. We call it I Wish I Knew. And this was something that I learned about when I was working at the company formerly known as Twitter. And it was an exercise. It was essentially a brainstorming exercise. And when I first ed Zendesk, it was right around May and June. And then I guess right around September, October, we started annual planning for the next year. And that was a perfect time to install this activity. So we piloted it in a couple of work streams. This was a moment where the researcher could go into these meetings and say, “This meeting, we’re going to do an IWIK exercise, I Wish I Knew exercise, where as we start thinking about annual planning and the products that we were thinking about building for our customers, what are the questions outstanding that we still don’t have an answer to or areas that we still need to explore?” And so we’d go through that exercise, and then we’d prioritize that list that they would come up with. I can’t tell you how valuable those exercises were and how our stakeholders looked at us and said, “Wow, I did not know that research could add this kind of value to our conversation,” because it really helped them see. Because especially once you get into the prioritization work, you look and you go, “That question that we’ve been battling around in these meetings isn’t really the one that’s most important. It’s this one.” And to see it all together was sort of a revelation for some of our stakeholders. 
So when we installed that exercise, it changed the game a bit in terms of our relationship and our position or brand inside the minds of our cross-functional partners, because now we’re providing a value at the very beginning of the product development lifecycle that is helping establish, “Okay, so these are the problems that we need to be worrying about as we think about planning for the next year.” Can’t tell you how important that was. And then, so if I think about the next part of this, it was as the company started looking at their annual plan and defining these big buckets of areas that we wanted to work in. So for example, intelligent conversations is sort of one of them, just throwing that out there. And this is all of our sort of artificial intelligence work. And we think about within that big bucket, there’s like maybe, I don’t know, I’m just picking this up, there’s 15 projects or 15 streams of work that need to happen. And we strategically aligned a researcher to that category of work to work with those teams to figure out, well, where can research lean into the most to have the most impact in that work stream that laddered up to that OKR. And when a researcher is in that position and they’re reporting out research, and at the very first couple of slides, this is the research project that is designed to learn 1, 2, 3, A, B, C that will help inform OKR XYZ. That is how you operate. That is how you become a valuable research team. It’s not perfect yet. I think we still have room to grow and get better at it. But over the first couple of years, I think we’re doing a really good job at setting up a structure where we are able to add that kind of value to our product development organization. Steve: Can I play back some of what I heard? Reggie: Yes. Steve: Because I think this is a universal complaint and concern, right? Not getting access, not getting permission. 
You talked about the individual relationships with people and as part of that, being in those meetings, being in those team meetings. I think it’s on a case-by-case basis, right? How do you get invited to that? But there’s an audacity a little bit of showing up and saying, “Hey, we’re here to be part of this meeting,” then making a contribution. But I think maybe biding your time a little bit. I’m now putting words in your mouth. But building the relationships, being in the meetings, hearing what’s happening, and then choosing a point at which to bring value. And the value that you’re bringing there is facilitative of others. It’s not, “Here’s what we know. Oh, we have the answers.” But just, “Let’s help you talk about what you have to decide and learn.” Reggie: You’re onto something. So yes, we’re doing that. But in addition, we are bringing some answers to that meeting because one of the other things I did in the first year is we established a research and insights library, a research repository. So we aggregated all the UX research that had been done to date at the company, and we put it in our Confluence site, and we set up a nice search function. So literally every research project that we’ve done that we could find is there. And so in those meetings, when we are having conversations about different customer problems that may not have been answered, we may be able to answer some of them because we now can point to this library and we say, “Hey, wait a second. A couple of years ago, we explored this problem at this angle. So the question that you’re asking, we’ve already covered that piece of it, but maybe we can explore it in this way, in a new way.” And so I don’t know if I’m countering what your assertion was earlier, but I’m basically saying yes, we are able to bring some answers to that meeting for the purpose of helping triage the priority or prioritize the list of questions that come out in the brainstorming of the I Wish I Knew brainstorm session. 
Steve: So you are countering it, which was the outcome of me doing, like, “Here’s what I think I heard” question for you. And then I guess the thing that I took away from you sort of describing that presentation where a researcher is part of a work stream and says, “We learned these things. It affects these projects.” I think you said, like, in the first couple of slides, we talk about things like, oh, speaking the language of business. Sometimes people talk about you got to know, like, the business model of the company. But I think you’re giving a very actionable definition of what it means to speak the language of business, which is to know what initiatives or what projects or what OKRs to be able to frame the information you’re bringing in terms of what things it ties to that are already what people are concerned about. So doing that translation, so, like, why this is important. Reggie: Absolutely, because it’s going to ladder up and inform this OKR. Now what I want to mention is this is how we set up our structure to help the researchers align directly with valuable work and strategic work that they can do. But in addition to that, we are also pushing ourselves to even look beyond that specific work and look horizontally. And that has really galvanized our team. For example, I mean, adoption. If you’re at a B2B company, adoption is a huge priority. So not only are we doing these discrete projects that are laddering up to the OKRs, we’re also stepping back and looking across the organization and saying, okay, well, out of all the projects that we’re working on, what are we learning about customer adoption in all of them? And how can we now inform how the company is thinking about how we help our customers adopt our product and extend it and upgrade, add more? Like what do they need? And so I’m so excited by this work that we’ve recently done around adoption. And we’re seeing a lot of our cross-functional partners leaning in on it and asking us for it. 
And we’re trying to do a little bit more. The reason why I’m mentioning this is that I want to set up the structure of how we’re maturing as a team: yes, we’re aligned to these OKRs, and it’s actually freeing us up to do the strategic work, but also to pause, step back a little, and look even higher than that. I just can’t wait to see what we do for the rest of this year and next.

Steve: What’s the mechanism for different researchers working on different projects? They have different, whatever, documents, or it’s in their heads. How do you collectively find those overarching issues and pull them together?

Reggie: You know, with the adoption work, it kind of organically happened. We do have check-ins every quarter where our research team gets together and does what we call research readouts for that quarter. This is where a researcher will share all the projects that they’re working on and what they intend to learn. Maybe some of the projects are already in flight, so they’ll give a status update. The purpose of those meetings is to find opportunities where we can collaborate on projects that are similar, and organically, that’s when these opportunities surface. I think that’s what happened with the adoption work. There were several projects around adoption. And oftentimes, even with a team as small as ours, you may not know everything that your research colleagues are working on. That’s why we do these quarterly syncs where they read out what they’re working on. It’s super-duper helpful for us, and that’s what happened. It just sort of organically happened.
But I think we’re trying to be more intentional about finding opportunities like that: out of all the streams of research we’re conducting, what are we learning that is a global insight, meaty enough that we can highlight it as something the company needs to lean in on and care about? That, to me, is where the value of a UX research team is, whether it’s small, medium, or large. If you’re doing that on a consistent basis, senior leadership and executive leadership pay attention. I’ve seen it. And I think with this insights work, we’re starting to gain traction and awareness that we’re doing this kind of work. I love it. And I encourage anyone who’s listening to this podcast to start thinking in this way, because you can get so buried in the day-to-day research. You really want to add value to that work stream, and you really want to do well and be successful. But you also have to look ahead, and you have to look beyond it a little bit. When you do, you can find opportunities to tackle issues that you may not see in just one particular, discrete project.

Steve: And that’s the way it is, at least for part one. Stay tuned to this donut channel for part two with Reggie coming soon. A reminder that you can always find Dollars to Donuts at all the podcasty places. Plus, you can visit portigal.com/podcast for all of the episodes with show notes and transcripts. Our theme music is by Bruce Todd. The post 44. Reggie Murphy of Zendesk (part 1) first appeared on Portigal Consulting.
44:36
43. Leanne Waldal returns
Episodio en Dollars to Donuts
In this episode of Dollars to Donuts I catch up with Leanne Waldal, five years after she first appeared on the podcast. She’s now a Principal in User Experience at ADP.

A couple of years ago, I realized I know things. We all know things, but sometimes we go through life thinking there’s always something more for us to know, or we don’t know as much as others. A couple of years ago I was like, oh, I know some stuff. I could share it. If I think of myself at 23, 24 years old, I had people who were my age now who were telling me things that I listened to and got advice from. I’m that person now. I can be the person who gives people advice or says, I don’t actually know everything, but here are some things I learned over the years that might help you. It makes me feel good to do that. It boosts my confidence. It helps me feel like I can actually do something that’s not just my craft or not just my job for a paycheck, but I actually have something to offer. And that’s a great feeling. – Leanne Waldal

Show Links: Interviewing Users, second edition; Beyond The Surface: Navigating The Depths Of User Research With Steve Portigal (Greenbook Podcast); Leanne Waldal on Dollars to Donuts, 2019; Leanne on LinkedIn; ADP; Badass: Making Users Awesome by Kathy Sierra; Dropbox; Orbiting the Giant Hairball: A Corporate Fool’s Guide to Surviving with Grace by Gordon MacKenzie; Organic Online; Howard Rheingold; NASDAQ crash, 2000; San Francisco 2004 same-sex weddings. Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. I’m Steve Portigal. In this episode I catch up with Leanne Waldal, five years after she was first on Dollars to Donuts. But before we get to that: I updated my classic book, Interviewing Users, to incorporate what I’ve learned in 10 more years of being a researcher and in teaching other people.
After the book came out, I spoke about user research with Karen Lynch on the Greenbook podcast. Here’s a short clip.

Karen Lynch: I mean, there are even some frameworks for creating a knowledge management platform for yourself, you know, how to have a database of your own research. So excellent applications for a smaller shop that might not have access to platforms and tools. But here’s a way you can create your own knowledge hub. You did a very good job, a solid job, an important job of also providing forms. You gave structure to the field is what you did. Here are some forms you can look at for debriefing your interviews after you conduct them. I don’t want to call them templates, but here’s the framework you can use for creating your discussion guide. Here are some tools you can use to synthesize your data. So you’ve given some very tangible tools in this book for anybody who is really trying, soup to nuts, to go off on their own for the first time, or to get to know the field that maybe they’ve been hired into. Really practical, tangible things that researchers can borrow from. I mean, again, having been in the field for a long time, some of this was a part of my practice. But I’m like, “You know what? That’s a great debrief form.” That one really stood out to me. For example, the Interview Debrief Form, where it’s not just take notes on what you just experienced; it’s prompting your brain to think through what it means. So kudos to you on that. Is that just a practice of yours? Or did something stimulate the thought that you might want to include those?

Steve: Well, you know, there’s this interesting part of research where it’s collaborative and facilitative. It’s not just what I, as the researcher, or me, as part of the research team, learn. It’s the people that we’re working with.
And so I have an obligation to them. Boy, that sounds more moralistic than I mean. I can do a better job if I can help them learn something and take something away. And if I hear what they’re taking away, especially since I’m not the domain expert: I work as a consultant, so I come into an area that somebody else inhabits. They’re always going to see things in the research that I won’t see. It’s really helpful for me to understand what they didn’t hear that person say. If there’s a gap in what they took away, then I know I need to emphasize it, because there’s a takeaway that’s obvious to me that isn’t to them, and that needs to be surfaced as I share back. I can get that out of a debrief. And when I hear what they heard and what surprises them, I understand how they’re framing the world and what’s relevant information. I’m getting this indirect feedback. You know, it’s not my natural way of being to have a template for an activity. I’d rather just chat. Sometimes that suits me well, and sometimes I need to put a little more structure in it. So I think having something formal like a template or a tool reminds me that this is an important part of the process. I need to make time for it and mental space for it, and I need to tell the people I’m collaborating with that they should leave time for it. And guess what? This is a serious activity. I’m not just trying to get coffee with you and say, “Hey, what did you think?” I’ve got a document here. So there’s a little bit of theater, and I don’t mean that in an unkind way. But there’s a formality to it that reminds me to take it seriously and shows my collaborators that I value what they have to say, and that I’ve got some format for that.

That’s from the Greenbook podcast.
I’ll put the whole episode in the show notes. And I hope you’ll check out this updated edition of Interviewing Users, and share it with all of the people in your network. You can help me and help the book by writing even a tiny review on Amazon. So, let’s get to my conversation with Leanne. She’s now a Principal in User Experience at ADP.

Steve: Well, Leanne, thank you for coming back to Dollars to Donuts after, I guess, five years...

Leanne Waldal: Good to be here.

Steve: ...or so. It’s nice to talk to you again. It’s nice to talk to you again for this podcast. That introduction presumes that you and I have not spoken in the intervening five years...

Leanne: We have.

Steve: ...but we have.

Leanne: Yes, indeed.

Steve: We have. The secret is out. Let’s talk about some of the things that you’ve been up to professionally in the last five or so years. We can start anywhere and go anywhere, but I’d like to hear what some of your experiences have been and what you’re thinking about nowadays with research.

Leanne: Yeah, and the wonderful thing about the last five years is that it was about four years ago that the pandemic started. When we spoke about five years ago, it was probably six months before the pandemic started, and it changed research. So I went and worked for a company and, using ethnographic research, helped them understand what was going on with the markets they were trying to sell into, to help their teams really understand what they needed to do to better sell into those markets. And then I joined an agency. At the beginning of 2020, I was selling a project to a company with my teammates in the San Francisco office. And we had planned out, like many of us in early 2020, all the things that we were going to do in this project over the course of 2020. And then of course, the pandemic happened and we all went home.
And so we had to shift the type of research we were doing, from going out and talking to people in person and seeing them in person, to doing everything over video, and also running some surveys. One interesting thing that happened in the summer of 2020 is that humanity got tired of answering surveys. I was doing mixed-method research that summer: interviews with people about their experiences, and then I also wanted to measure those experiences at scale with a survey, to find out, if I talk to 20 people and take these things, how do a thousand people respond to these experiences? It was a really easy target, millennials in California, and it took like 35 days to get a thousand responses. That was fascinating to me, because it was really easy for us at that time to get people to talk to us on video. But I talked to the provider we were using, and they said, people are tired of answering surveys. This pandemic is wearing people out. We all have personal things we’re dealing with, and we’re also terrified the world’s about to end. And so in 2021, before the vaccines came out, we were doing some work with a startup at the agency I was at. I realized that some of their early Kickstarter backers were in the San Francisco area. And I said, this is about motorcycles. Motorcycles are used outside. We could actually meet people in person. That’s the thing I miss the most. I’m an extrovert. For all the people who are introverts, the pandemic was, to a certain extent, just wonderful for them. Lots of people I know who are introverts were happy to work from home and happy to be at home. Me, I miss people. And so I started up ethnographic research again in 2021 by meeting motorcyclists in a park. Got some great pictures of all of us in masks, standing about eight feet apart from each other, handing out water bottles and shouting across the grass to each other.
When I left the agency, I went to work for a startup and again introduced ethnographic research to them, because they had a mobile app but hadn’t really watched people use it out in the real world. I started just in the San Francisco Bay Area, running a pilot: let’s go get some people using this and see how they use it. Then I got the funding to go visit some customers and some consumers and some other people. So I was tromping around in the snow in Minnesota, in like negative degrees, following somebody who was using our app, also tromping through the snow, which was super fun. And when that startup laid me off, I did some consulting for a large company, also doing ethnographic research, going into the offices of the people who use their products to see the pain points. I also started doing something I hadn’t done in a while, which was really fun: looking qualitatively at the pain points that were happening, and then going to the people who have the numbers, from revenue and from usage analytics, to figure out how much money is being lost because of the experience, or how much money could be gained by cleaning up some of the experience. And then I joined ADP in June of last year as a Principal in User Experience. I’m leading all sorts of research, including, recently, some ethnographic research again with one of our clients, which is super fun. Teaching people, teaching people, and teaching people.

Steve: I think we like to forget things that happened during the pandemic because, as you said, people were worried that everything was going to end. So that slow survey response, or slow and low survey response, do you know, has that rebounded in the time since?

Leanne: It definitely has.
So by the time I was at that startup, we did some ethnographic research and some interview research, and then we had, you know, six key things that we wanted to understand from a certain consumer population. I think we got all the responses we needed in like 11 days. So from what I’ve seen in that sort of consumer research, and in mapping survey research to qualitative research, we’ve gone back to getting people to respond quickly. But if you remember the summer of 2020, we were terrified. Even if you’re offered a gift card, why would you spend time answering a 10-question survey when you’re trying to take care of your kids, dogs, and yourself?

Steve: We were terrified, and we also felt... again, it’s hard to remember with any clarity, but there was this sense of trying to preserve some aspects of normalcy, like family and work and so on. You could see why people would want to run a survey, because that’s part of their job and they want to continue feeling normal. But it’s an interesting observation that people didn’t want to respond to surveys because of everything that was going on.

Leanne: Yeah. And as researchers, we lost some of our ability to understand people’s experiences because of the pandemic and being locked down. We were limited to things like diary studies, surveys, and interviews.

Steve: I remember we had a lot of complicated feelings about a lot of things, but watching that period where research was starting to pivot to remote, I felt like there was a certain gleefulness in being able to guide that. I don’t want to make it a Coke versus Pepsi thing, but it seems like there is a remote versus in-person belief system, or adherence, and when in-person became impossible, it was hard not to see guidance and best practices on remote being offered by the folks who could offer it, because it was everywhere.

Leanne: Oh, of course.
It was amazing at that time for people who didn’t know how to do remote. The agency I was at prided itself on doing contextual inquiry all the time, so we were teaching people how to do things remotely. I always like to mix things up and tell people you can’t do all remote or all contextual or all quant. You need to mix it up to understand a human experience.

Steve: But that doesn’t work with our desire to make a single declarative “this is the best way to do it.” Leanne, what do you mean, what do you mean mix it up?

Leanne: Oh, we all combine complexities in our experiences and how we do things. I find it really hard to stick in just one camp.

Steve: And I think it’s interesting, you talk about being an extrovert. If I understood, yes, extroverts suffered from lockdown, but as a researcher who’s also extroverted, you were, I think, creative in trying to find ways to do some portion of context, to get yourself out there and get to where people were.

Leanne: Yeah, I felt really lucky to be working for this agency in San Francisco where we had an office where the windows opened. So we were coming back, fully masked with all the windows open, standing apart from each other, just to see each other. Do something on a whiteboard together, sit in a room and talk together. Because one thing about the difference between researching someone remotely and seeing them in person is that you can’t see the Post-it notes around the monitor. We all have Post-it notes around our monitors. I’ve got stuff here today to remind me of what to do. And if you have a workplace, you’ve also decorated your space. So, you know, we’re decorator crabs, basically. We bring in pictures of our family, and someone brings us back a token from some trip and we put it on our desk.
And you know, the team all goes out and does something fun together, and we put a picture on our bulletin board, and you can’t see that. And recently I noticed something interesting, which I hadn’t thought of before. With remote, you and I just look at each other on video. If I shared my screen with you, you can only see one screen at a time. Well, a lot of people work with two or three screens now. So how do you, in a remote world, see how they’re comparing things across two screens? I could share one screen with you and then another, but you couldn’t actually see my experience of comparing this Word doc with that Word doc. So that fascinated me. I was like, oh, amazing. The world has also changed in that monitors are cheap and we can have multiple now.

Steve: I would have to ask whether or not you had multiple monitors, and ask to see them.

Leanne: Remotely, you might just assume your screen is your screen. Think about if you’re doing research with engineers. You and I are both of a certain age, so you remember when we only had one monitor and it was a big thing; there’s no way you could have two of them on a desk. Well, now if you go look at any software developer who’s working, they’ve got two or three monitors, curved monitors. We just surround ourselves with all this stuff to look at. And you can only see that if you go see people in person. We don’t yet have a way, with Zoom or any other video technology, to see across all of those monitors at once. So I think that’s an interesting point of innovation for these video companies.

Steve: And we’re still talking about research where the thing we would be looking at takes place on a screen.

Leanne: So I want to be able to see what you’re doing. Also, you know, with motorcyclists, you can’t see their motorcycle.
If I talk to you on video, maybe you can bring your phone out to show me your motorcycle, but I can’t get that giddy experience you have talking about it in person.

Steve: Sorry, did you say the giddy experience?

Leanne: Yeah!

Steve: Who’s giddy in that example?

Leanne: The motorcyclists. Motorcyclists feel very, very strongly about their motorcycle. And I also noticed, because in the consulting gig I did before where I’m working now, I was doing some remote interviews, and where I’m working now, I do remote interviews and also go out and talk with people. I’m noticing a distinct difference between how vulnerable someone will be with me, and how much they’ll share with me, in person versus on a screen. And that, if you think about it, just makes human sense. We can sense each other better and share things better when we’re in person than when we’re on a screen.

Steve: When you and I did the previous episode of this podcast, I came to your office and we sat in a room holding microphones to do it.

Leanne: Yeah, exactly. Yeah.

Steve: Technically, I guess we could have done that again, but I think we rely on this technology. We’re using remote screen-based recording, blah, blah, blah, that has become the default, even when it’s not the only option.

Leanne: Yeah, and I think it’s fine. This gives us opportunities to talk to people we could never talk to. But I think we have to make sure we remember the importance of doing things in person. Humans sense things off of each other, and it’s important to make sure that’s in the mix with everything we’re doing.

Steve: And off the top you referred to teaching, and you talked about going with your colleagues and teammates and clients and so on into the field. We have a number of different parties that have a different experience.

Leanne: Mm-hmm.

Steve: So there’s the motorcyclist who’s giddy, there’s us, the researcher, and, I mean, I’ll just say...

Leanne: Mm-hmm.
Steve: I need to be in the room with that giddy person. So I’m missing something over remote. But then there are also the people that you’re taking out into the field, because you want to give them a sense of their customers and their users. What’s been your experience over these few years in trying to do that for your clients and stakeholders?

Leanne: Well, for that sort of work, bringing marketers, salespeople, engineers, product managers, designers, et cetera, out into the field to see consumers or customers, B2B or B2C or B2B2C, it never fails to surprise them. I’ve been doing this for a long time, and it’s the same thing over and over. And I love that teaching moment of bringing someone out to an experience and just saying, I’m going to run the camera. If you don’t know what to say, I have no problem keeping the conversation going: tell me more, show me more, what about that? You get to just observe, or you can participate. It’s all up to you. There are no rules, really. And I’ve never had someone say that they weren’t surprised by something that they saw or heard or observed. Usually, the thing people are most surprised by is the environments that people are in. Because if you only see me in this room, like right now, I’m in a phone booth in a WeWork. I probably look like I’m in a sauna. You have no idea what’s going on around me. So to see the vast experience of someone’s house or someone’s business or someone’s office, or following them along while they do something out in the world, usually the people I’m with are as surprised by that as they are by the things in the product or the service or whatever it is we’re trying to understand the experience of. Well, for B2B, when I’ve worked in B2B...

Steve: What is it that these folks are learning from these experiences?

Leanne: I’m trying to help people understand the humanity of the people who are using the product.
That it’s not just someone using a credit card to pay for a subscription, and it’s not just the buyer and their team at work using your product in their work. It’s not just a job. There’s a human behind all that who has needs, who has friends at work, colleagues, people that influence them or don’t, and other products they’re using around your products that you can’t see from your usage analytics. So I’m trying to get them to see the giddiness of the motorcyclist. Not just “I am this demographic and I’ve got this much money to buy this kind of product,” but look at how they feel about it. That’s what brand campaigns and marketing campaigns do. They get into our feelings. And so I try to get the product side of a company to understand that piece: you can get more engagement in your product, you can get people to use it more, if you get them to feel good about it. It’s like Kathy Sierra’s book, “Make Your Users Badass,” or something. I probably just mangled that. Something like that. You want your users and your customers to talk about you at a cocktail party after two drinks. You want them to remember you so well. And then there’s that Maya Angelou quote: “I don’t remember what you said or did. I remember how I felt.” That’s the human. In companies, marketers know that. Brand people know that. I try to help product and design people remember that, because they know it too. We just don’t remember it when we’re focused so much on usage analytics. Usage analytics and revenue analytics are important, but they aren’t important in a vacuum. You have to also make people feel good and feel smart when they’re using your product. Yeah.

Steve: Have you seen any shift in the appetite for the kind of understanding that you’re enabling people to gain?

Leanne: Research and design appetite has gone up and down in the tech industry over the past 20, 30 years that I’ve been in it. Right now, there are a lot of people out there saying nobody has appetite for us.
I’m like, well, maybe they have an appetite for a certain type of work that we do. They don’t have an appetite for all the work that we do. And that appetite ebbs and flows, just like the appetite for anything else. It’s just humans and trends and businesses and business decisions. My advice to everybody has always been: you can create an appetite for what you provide. It’s a sales technique. Understand what somebody needs, and meet them where they are. Instead of saying here are the five things I do, here’s the checklist, sort of like McDonald’s, here are the things you can order from me, oh, you don’t want any of these? Instead say: what are you looking for? What are you trying to achieve? What are you doing? How can I help?

Steve: I want to clarify, or I’m trying to think about this as a question and not a statement, but I might just go with a statement. There’s a difference. When you say find out what people need and then help them achieve it, that doesn’t mean that if they don’t ask for contextual research, you don’t do contextual research. And maybe this is where the McDonald’s thing breaks down for me, or needs another metaphor layered on top of the metaphor. Because if you ask what they need, no one needs research; they need the information or the decision or whatever they’re going to do about it. And so you have a lot of ways to get to that outcome. But if you say to them, hey, I do research, do you want research? No, I don’t.

Leanne: Yeah, exactly. Or also, waiting for someone to ask you for a certain kind of research. When someone who doesn’t know the breadth and the depth of what you can do with research asks you for something, they might not know what they need or want. They’re only asking based on their own knowledge.
So I always find it’s better to figure out what kind of research to do, or how to prioritize it, or where to be, by hanging out with people and finding out: what are you working on? What’s coming up? What do we know? What do we not know? Here are these analytics; do we know why they look this way? Well, we could find that out this way, or we could do it that way. And whichever of those things I start proposing, based on what I heard them say they’re working on or trying to achieve, sparks a conversation. And then research is desired and wanted and valued and invited to the table. But I see a lot of researchers, and I coach researchers on this, going in with: I’m going to run a survey on this, or we’re going to do some interviews, or I’m going to do this unmoderated. Nobody knows what unmoderated research is outside of the research community. So stay away from the methodology. Stay away from “I’m going to do this,” and stick with “I’m your partner and I want to help you out.”

Steve: The labels that are always anchored in my mind are proactive and reactive, and you’re talking about being proactive. I get a picture, as you talk, that there’s a relationship, and there’s something over time. I hang out with them and see what they’re thinking about and what they’re talking about. This is sales, but it’s not a sales call; it’s a relationship-based sale.

Leanne: Yeah. And that’s why, as you and I have talked before, I don’t want to go back to running a consulting firm. I don’t want to be a consultant unless I have to, to get revenue in my life. I enjoy being part of a team, to nurture relationships and get collaborations going and work with people so we can do something together, because I don’t like being a lone wolf. I don’t like being a single point of failure. I want to do things together.
Steve: In some of the conversations we’ve had where you’ve made these comments before about the relationships, I get the sense, and you can correct me here, that those relationships aren’t handed to you. You’re seeking them out.

Leanne: At the startup I last worked for, after I got laid off, customers of that startup were still texting me up to three or four months after I left the company, because they didn’t know I’d left. Every time they texted me, I’d have to let them know. But I’d built such a good relationship with them that it went beyond just “I want to know how you use this product.” It was “How did you start this business? Tell me about your family and this little town you live in. Where’s the best place to get lunch?” You know, I would get invited over for dinner with their families. And you want that sort of relationship, which is very much like a sales tactic, because that’s what people who sell want. They want to get under your skin a little bit to understand you better, because that helps them sell you something that you need. And that’s all about relationships and collaboration.

Steve: What things, if any, are different in the relationships and collaboration that you want to nurture with users and customers, versus the colleagues you work with, whom you also want to collaborate with?

Leanne: Well, you have a different goal when you’re a salesperson. You’re trying to hit a number. You’re trying to get someone to buy something. My goal with anybody is: I want to help you out. I want to see if there’s something you need that I can help you with. I want to see if the types of skills I have, the types of things I can do, the craft I have, can help you. And if it can’t, I’m not going to waste my time doing something that’s not going to help you.
I don’t know that all researchers or research teams see themselves as a service organization, but we basically are a service organization. We’re providing a service to and with people who hopefully will use and are interested in using the insights and the learnings they can get with us, from us, to build product, to design things, to market something, to sell something. Steve: Hearing you talk is really refreshing for me because I think it’s easy for us, or for me at least personally, to get sucked into all of this as adversarial. I don’t know, we hear a lot of stories from each other, you know, in this work, we tell a lot of adversarial stories, persuading, getting permission, convincing. Leanne: Do you pay attention to the subreddits on research? Because it’s full of that. Yeah. I just watch it and read it. Steve: And sales, I mean, being a consultant, you know, I have lots of peers where we talk about sales, and the more — even though I’ve been doing this for a long time, I still keep learning and relearning that sales is not a persuasion adversarial kind of work. It is a — it’s all the things that you’re talking about, it’s relationships and how can I be of value to you? And that way of being is — I think you live that way, and so you work that way. Leanne: Well, I learned it from running the consulting company. And then when I was at Dropbox and they were creating a sales team, I sort of like did a swap with the sales team. I was like, you teach me how to sell things because I actually want to know this more. Like I see, I see value in knowing these techniques. And, you know, you teach me that and give me access to people you’re selling to, and I’ll teach you about them. And that was a really valuable collaboration then. Because Dropbox was just getting into the enterprise. We were just starting to sell to larger customers. 
And I was working with all of the new enterprise managers to basically learn what they did, but then also learn who they were selling to, and then feed that back to them to say, you know, I think you could sell this customer this, or this customer needs this. Steve: What kind of things did you learn for yourself about sales from that? Leanne: Well, that was when sort of a chandelier went off. And I realized that the techniques we use in research to understand someone’s experience are the exact same things salespeople do. So a lot of researchers will be like, nobody does research like we do. Product managers don’t do it. They ask leading questions. Salespeople are just trying to sell. And I’m like, hmm, I’m starting to see that differently. This was back then, about 10 years ago, when I was like, oh, salespeople are actually doing a lot of research, and product managers are actually doing a lot of research. Why does it matter how you ask the question? If what you’re trying to do is discover something or understand something, as long as you get to the end goal, the path you took to get there doesn’t really matter. Steve: And did you learn anything about — again, you’re using sales in this very elevated way to describe how you work with colleagues, for example. Is this the point at which you started to develop those skills further? Leanne: Yeah, that was the job where I grew up. I had run my own consulting firm for 17 years, from when I was in my mid to late 20s until I was in my early 40s. And when I took my first job after that long consulting period, at Dropbox, I was like, oh, I’m going to learn how to be an adult inside of a corporation now. It doesn’t matter what age you are when you learn how to do that. So I went through that learning curve there. Steve: The idea of growing up is really — that’s a big one. Leanne: Well, I don’t think we all ever grow up. We just grow up in certain ways. It’s like, oh, now I know how to behave professionally.
I didn’t know how to behave professionally before. Steve: I mean, you ran a consultancy for 17 years, so you at least were able to survive in that time. Leanne: But you know this. It’s a different sort of professionalism to service clients and manage clients and sell things than to learn how to behave and how to manage the politics and the relationships inside of a corporation. There’s this really great book called Orbiting the Giant Hairball, which I go back to every once in a while, because it’s all about the humanity in these large corporations or in these mid-sized startups, and how people get along and work together and make decisions and collaborate and get things done. It’s very different from running your own thing. Steve: You mentioned just offhand, I think, that you talk to researchers and people are reaching out to you for advice, and I’d just love to hear what that’s like, and what’s coming up for people right now that you’re talking to them about. Leanne: So what’s coming up, particularly for researchers early in their career who maybe started doing research in 2020, is that they’ve never done contextual inquiry. They’ve never done ethnographic research. They’re working at a company where they don’t have a manager or a leader who’s encouraged them to do that. They started remote. A lot of these people went to college remote, graduated in 2020 or 2021, got a job as an early career researcher, and they’ve just never done this. And so I basically start with the basics, like here’s how you observe someone. The other thing I noticed, and I also talked with some people recently about what they’re seeing of people who are earlier in their career, who went to school during the pandemic and now are coming out, is that it’s the same thing that those of us who are older went through in the pandemic. We lost a certain amount of social ability.
And most of us, or a lot of us, got that back in the last year or two. We started meeting people in person again. We started going to dinner parties. We started going out to bars, went to concerts. Some of us went back to offices and figured out how to be in an office again. But there’s a lot of fear of, “I don’t know how to do this, and I’m scared to do this, and I’ve never done this before,” and a lot of it I attribute to the pandemic. And I think those of us who do mentoring and coaching need to be aware of that and teach people how to do that. I know someone who is teaching people how to dress to go to an office, how to wash your hair, how to have conversations with people. And that’s something that’s really specific to this point in time that didn’t exist before. So say, like, 2016, what was I mentoring and coaching people about? It was more relationship coaching. Like, how do you get along with a product manager who disagrees with your research results? Or how do you have influence? Or how do you learn a new skill? Or you’re a junior researcher, you want to become a senior researcher, what do you need to do to show that so that you can get that promotion? I’m not hearing that so much anymore. I’m hearing more around, well, one, how do I get a job? But then also, how do I do these things that nobody ever taught me to do and I never had a chance to do before, and I want to try to do them now? Some of these people I’m coaching are the only researcher in a company. So they don’t even have someone to manage them or say, this is how we did it in the past, 2019 and earlier. So yeah, I just like to help people out that way. I sort of feel like the things I can do right now are improve relationships among different teams, and help out people who are trying to grow up in their career. Steve: What else is coming up in these mentorship conversations you’re having? Leanne: Presentation skills.
So a lot of people come to me and say, I’ve been giving presentations over Zoom for years, and now I’m being asked to give a presentation in person. And at first I thought, how do you not know how to give a presentation in person? Then I’m like, oh, you’ve never worked in an office where you had to stand up in a room in front of a table of eight people with a pointer. You know how we used to plug a USB thing into our laptops, and then we’d have a clicker and the presentation would show up on screen? Well, people are being asked to do that again. But that’s another soft skill that nobody’s taught them. And one thing that is an advantage of presenting over video is you can have notes. Nobody can see that you’ve got your PowerPoint in presenter view with your notes, or that you’ve got a notebook, like, oh, this is what I say on this slide. When you’re in a room, you’re on stage. And I think a lot of us who used to be on stage forgot how to be on stage. But some people have never been on stage, and now they’re being asked to be on stage. And they didn’t get that practice in college, because they always presented their papers and everything over video or in a small classroom. I’m really glad that my daughter started college in person and is in college in person, so she’s getting that social growth that you need before you start to turn into a young adult trying to get yourself into the workforce. Steve: It’s a fascinating observation that maybe is obvious to everyone, but I had never really thought of this. There’s a significant cohort of people in the workforce who don’t have a pre-pandemic norm to return to around research, presenting, business travel, any of those kinds of things. Leanne: Yeah. Just like professional skills, how to get along with people. Yeah.
Steve: It’s interesting that at least the people that you’re in contact with have some awareness that they have a gap. Seems like the first step to addressing it is knowing that it’s missing, that there’s a thing that is expected of you. Leanne: Yeah. Yeah, there are a couple people I mentor who come to me in panics, like, I have to do this, and I’ve never done it before. And I’m like, oh, it’ll be okay. Steve: Yeah, what do you tell someone who hasn’t done a business trip before, who hasn’t worked in an office? What’s the granularity of advice here? Leanne: That even me, in my early 50s, I still have new things I have to do every once in a while. And we can all do new things and hard things, and you will actually be okay. And then we can get down to brass tacks and go through the tactics. Like, how do you pack? Or how do you memorize things or practice? A lot of people who give presentations over video don’t practice first, because you’ve got all these supports around you to do it. You’re wearing your sweatpants and you put on a button down shirt, but you’ve got sweatpants on. You’ve got all your Post-it notes nearby of what to say. And so without all of those supports, what are you going to do? Well, you have to practice more. You have to think ahead. You have to plan, make lists if that’s your thing. And I think it’s really surprised people that they need to do that. Steve: What’s the guidance, this is decontextualized, I guess, but what’s the guidance for a person who’s new to in-person research? Leanne: Well, I say, you know, go find someone to bring with you who’s done it before, so you and everybody else aren’t brand new to it all. You don’t want everyone in the group that’s going out to do in-person research to be new to it. See if you can find someone who’s done it before. Or find someone from marketing or sales who’s done something similar.
Who’s at least gone to visit customers or has gone to conferences and met with customers. Your first pancake, the first time you do anything, will always be a little rough. So it’s always good to have someone there who, you know, can advise you a little, or steer you in a slightly different direction when they see you going astray. And if you don’t have that, then just be gentle with yourself. Whatever you’re doing is the best you could possibly do. Steve: That’s the lesson for everything, right? Just be gentle with yourself. What do you get out of mentorship? Leanne: Well, a couple of years ago, I realized I know things. And, you know, we all know things, but sometimes we go through life thinking there’s always something more for us to know, or that we don’t know as much as others. And it was a couple of years ago when I was like, oh, I know some stuff. I could share it. Maybe it would provide some value to people. Like, if I think of myself at 23, 24 years old, I had people who were my age now who were telling me things that I listened to and got advice from. And it just sort of popped into my head. I was like, oh, I’m that person now. I can be the person who gives people advice or says, you know, I don’t actually know everything, but here are some things I learned over the years that might help you. And it makes me feel good to do that. It boosts my confidence. It helps me feel like, oh, I can actually do something that’s not just my craft or not just my job for a paycheck, but I actually have something to offer. And that’s a great feeling. And then you’re sort of surprised, like, oh, I actually know how to do this. Steve: I don’t know if this happens for you this way, but sometimes I don’t know what I know until I’m in a situation where I’m asked to help somebody out. Leanne: Yeah.
And other people will say to me, like, well, Leanne, we see you as an expert and everything. And I’m like, yeah, but I don’t see myself that way. And for me, it meant I was taking it on in a way that felt comfortable for me. I don’t need to be or want to be the expert who, you know, gets on stage everywhere or has the big title or anything like that. Steve: But to choose to be a mentor is to partially take on the role of the expert. The theme of some of your earlier points about creating good collaborations around research was understanding what somebody needs and how you can be helpful to them, and that you like to be helpful. And that seems to manifest in mentorship as well. Leanne: I do like to help people. Yeah. And being a helpful person is something that’s often attributed more to women than to men. When I was younger, I used to sort of push away things that were, oh, that’s stereotypically female. Now I’m like, okay, so what if I’m doing something that’s stereotypically female? I like it. It makes me feel good. Makes me feel strong. People appreciate it. And so I’m going to own it. But there are those of us who are Gen X feminists who grew up in a time when you had to sort of reject things that were stereotypically female. So I’m starting to embrace more of it now. Steve: So Gen X at 25 and Gen X at 50 are approaching life differently? Leanne: Yes, I was bald at 25 with an attitude. Steve: There are so many good follow-up questions to the statement, I was bald at 25. Leanne: I’d come out of the closet probably, I don’t know, two or three years before, and declared myself to the world as a dyke. So I shaved my head and I wore everything rainbows. And yeah, it was fun. I look at that part of myself and I’m like, oh, that was awesome. I don’t need to do that now.
Steve: The part of you that we’re talking about that likes to help, how did that part of you manifest when you shaved your head and wore rainbows? Leanne: Oh, it didn’t manifest itself at all then. I was angry at the world. Yeah. It was not an easy world to come out in 25, 30 years ago. I was, let’s see, 25. Steve: When did you start your consultancy? Leanne: I was working for AT&T at that time. AT&T Wireless, because I’d worked for a startup cell phone company. I was in Seattle. I moved to San Francisco in 1996, and that improved my life immensely. So I worked for a little startup called Organic Online in 1996. It’s now a huge company, but at that time it was like 30 people. I left there and worked for a startup that was started by Howard Rheingold, if anybody remembers him. And they ran out of money, so I got laid off. And then people were asking me to do contract work for them. At this time I wasn’t doing research at all. I was doing QA and server performance load testing. So I was doing contract work and someone said, oh, you’re getting so much contract work, you could start a company. So I started a consulting company in the fall of 1997, at the age of 26, when I was very brazen and thought I could do anything. And I sold, just sold projects to all sorts of startups and tech companies. In 1997, ’98, ’99, the money was falling off trees. I started selling research when a company asked if we could do it and I didn’t know what it was. So I went and asked some friends, and someone told me, and I was like, oh, well, let’s figure this out. In the late ’90s in the web and tech industry, if you basically said you knew how to do something, someone would pay you to do it, and then you just found other people and figured out how to do it. Sort of fake it till you make it.
Steve: Is there a point professionally where the seeds of what you’re talking about now emerged? Because I think you’re describing a way of being that’s about finding out how to help people and doing that. Can you identify some of the seeds of that? When you were angry and wearing rainbows, that was not present there, but where did it start to emerge in how you worked? Leanne: Oh, yeah. When the NASDAQ crashed. Do you remember the NASDAQ crash? The NASDAQ crashed in the spring of 2000, which was actually a bigger deal for those of us in the San Francisco Bay Area than the terrorist attack in 2001, because it had a bigger effect on our life here on the West Coast. I had to lay off a considerable number of people at my consulting company. And I had never done that before. I didn’t know how to do that. I was a terrible manager. People who worked for me then thought I was great, but I look back at that, compared to me now, having gone through management training and everything, and I didn’t know what I was doing. And I wished at that time that I could have done more for the people who had relied on me for a source of income to pay their rent and pay their bills, that I could have supported them better, when we just lost tons of clients and revenue streams all at once after the NASDAQ crashed. Yeah, it probably started then, because then the company turned mostly into a research company. We still did QA probably until about 2010 or so, but we were primarily doing research projects and I was primarily hiring people who were in research. And I also got married and had a kid, and that changes your life. 2004 was when Gavin Newsom made marriage legal for 45 days in San Francisco, and my wife was pregnant.
And so we got married when she was pregnant, told her dad he should bring a shotgun to city hall because it was a shotgun wedding, bought a house, got a letter from the California Supreme Court saying our marriage wasn’t valid, so we had to run down and get domestically partnered before the kid was born. And for all of us, it’s usually things like that that happen during our life that help us gain a little more empathy for other people’s experiences. It’s just the act of being human and growing older and having experiences that makes you understand other people’s experiences. And, oh, everybody holds pain, and everybody holds things that they won’t tell you about, and we’re all here to help each other. Steve: And just hearing you talk about it, even thinking about the layoffs, how we handle endings, I’m reminded of something that you and I talked about, not on this podcast, but just years ago. I was catching up with you during a period of time where, I guess, you were leaving a job and you were thinking about how you wanted to leave everything. Would you mind describing some of what you did and how you thought about it? Leanne: Sure. I had been laid off because a company was reorganizing and the organization that I was overseeing no longer existed. So there was actually no place for my role anymore. And I was given time. It wasn’t like the startup that laid me off a little over a year ago, where it was just, you’re done today, turn your laptop in. It was, here’s the package, and you’ve got this many days, and you can still come into the office but you don’t have to do work anymore. Being told that you don’t have to do anything anymore, I was like, lots of people depend on me here. I have lots of relationships I need to pass on to someone else. So with the initial angry or hurt response to being laid off, I was like, okay, I can have this angry or hurt response, because I love this job. But I also understand the business decision behind it.
And I also understand that there are people who will need things from me before I leave. When I met with one of my colleagues on the last day, I said, here’s the budget for the team, and I documented all the things that we were doing. Here are all the people that we had relationships with. I already sent out email intros and made sure everyone had them. What they said to me was, they’d never seen someone leave that way before. And I said, well, why would you want to burn bridges? The tech community is really small. Just like when I’ve been harassed or mistreated at a company, I haven’t been the person to want to sue the company, because this tech community is so small and I don’t want to be that person. I want to take other ways to file a complaint or make something known or make sure that things are documented. And it’s the same thing with leaving. When I’ve made a choice to leave a company, I’ve done the same thing. I make sure everything’s documented, everybody has a relationship handoff, and everyone’s going to be taken care of. So when I’ve left companies and given, like, 60 days notice, I’ve made sure that whoever was reporting to me knew who they were going to report to next. We had closed off our relationship. I told them how to get in touch with me outside of the company, you know, if they want any coaching or mentoring in the future. Because as long as I’m working, I want to make sure I maintain good relationships with everybody. I think that’s the most important part of work: relationships and colleagues. Steve: And I think when we talked about it, you said, these people, you may manage them in another job, or they may manage you in another job, or they may interview you for another job, or you may interview them for another job. Leanne: Yeah, exactly. Yeah.
I mean, I have had people who worked for me in the past who, let’s just say, misbehaved, and then applied for a job on my team, you know, at another company I was at. And it’s just like that saying: I don’t remember what you said or what you did, I remember how you made me feel. I look at their resume in a stack and I’m like, no, you did something that really harmed my agency, or you did something that was really professionally inappropriate. I can’t have you working for me anymore. After that company laid me off, at the next company I was at, I was working with the sales team and they wanted to sell to the company that had laid me off. And I said, sure, I know people there. I can help set up something. So we set up this whole lunch. We were going to lunch at a restaurant in San Francisco, and waiting in line to be seated was the person who had laid me off. And, you know, we gave each other a hug, and I was like, you know, we’re okay. There doesn’t need to be an adversarial relationship when business decisions are made and when we all live in the same town and we’re all in the same tech community. But people do set things up that way sometimes. And I think the harm it does is to yourself. I tell people, you have a choice of what you’re going to do with the feelings you have right now. You’re feeling frustrated. You’re feeling overwhelmed. You’re feeling disrespected, undervalued, whatever. Okay. Those are feelings. What are you going to do about it? You have control over your actions and your next steps and what you say and what you do. This is no longer about research. This is about being a human in a workplace. Steve: Yeah, which is the core of research. Leanne: Yeah. Well, it’s also how to show value in research: to be that human who’s professional and can manage situations and keep a certain sort of emotional regularity.
Steve: You know, the successful researcher or leader or manager who does a great transition when they’re laid off is also the person who is good at understanding what people’s needs are and proposing ways to help them accomplish them, so that they collaborate and research together. It is the same set of values and life skills that you’re talking about. I think we went someplace really interesting, and I like what you said: these are human skills, and I do think this is about research even though it’s about being a human. Leanne: Well, that’s what I coach people who are looking for a job to do: use your research skills. What is the hiring manager looking for? Does that fit you? Because there are all sorts of researchers out there right now who are hurting because they can’t find work. I think the way that we find value again and find ourselves jobs again is to use those skills we have as researchers to understand what people want. Steve: Is there anything in today’s conversation that we didn’t get to that you’d want to bring up? Leanne: No, that’s been great, Steve. I like the way it just sort of wandered. This has been fun. Steve: Thanks, Leanne, for being just so wide-reaching in what you have to share and for digging into a lot of related aspects. It’s really very interesting and inspirational for me personally to have this conversation with you. Thank you. Leanne: Oh, you’re welcome. And thank you, Steve. Always a joy to talk with you. Steve: Okay donut friends, thanks a whole heap for listening to this episode! Don’t forget, you can always find Dollars to Donuts where all the podcasts are, or visit Portigal dot com slash podcast for all of the episodes with show notes and transcripts. Our theme music is by Bruce Todd. The post 43. Leanne Waldal returns first appeared on Portigal Consulting.
50:13
42. Celeste Ridlen of Robinhood
Episode in Dollars to Donuts
For this episode of Dollars to Donuts I had a wonderful conversation with Celeste Ridlen, the Head of Research at Robinhood. This is a fundamental leadership-y thing where no two people are going to do that same leadership role the same way. You’re never going to do them the same way as somebody else. And that’s actually a really good thing because the situation may call for exactly what you can offer. But because of that, if you’re looking to other people to decide whether or not you’re going to be suited to doing that role, it’s kind of like thinking about whether or not you should be a writer based on whether or not you can write exactly like Mary Shelley. – Celeste Ridlen Show Links Interviewing Users, second edition Steve on the CX Chronicles Celeste on LinkedIn Robinhood Cognitive Psychology, Florida State University Dr. Michael Kaschak Dr. Roy Baumeister Dr. Dianne Tice San Jose State University Human Factors and Ergonomics Master’s Degree Program Oracle Symantec Corporation Time by Pink Floyd Airbnb Mary Shelley Grateful Dead Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. I’m Steve Portigal. Did you know there’s a new, significantly updated second edition of Interviewing Users? Of course you did. I hope you’ve checked it out and are recommending it to everyone you know. Shortly after the book came out, I had a conversation with Adrian Brady-Cesana for his CX Chronicles podcast. The link to the whole episode is in the show notes, but here’s a quick excerpt where we talk about domain expertise in research.
Adrian Brady-Cesana: I’d love for you to share a couple examples or stories of some of the things that you’ve seen working with your clients and in your business: some of the commonalities that you saw again and again with clients that really had a solid handle on how they built up their team, built out the different roles, and really stratified how their team was going to be taking care of their customers. Steve: I think there’s a lot of pressure on people doing research right now to carry it yourself all the way through. And I think this is such collaborative work. I’ve seen more success when there is some collaboration, and collaboration is a big, big term. But one thing that your question makes me think of is complexity. As research as a practice has grown, it’s finding its way into many more complex domains, like installing and maintaining and configuring servers and network devices, not even just servers, but the whole infrastructure. I worked years ago on credit default swap trading, and you might or might not have heard that phrase, but boy, just dig and dig and dig. It doesn’t make any sense until you’ve really been involved. And, you know, for me as a consultant, but even my clients who are on teams, they’re not necessarily domain experts. And so this really interesting challenge comes up, whether you’re a researcher or someone else in the organization that’s out talking to customers: trying to navigate that balance of, how much do I need to understand about this?
And so for me, one thing I’ve seen to be really successful, and it goes back to the collaboration thing, is when you pair up someone who’s great at research, whose stance is, okay, I don’t know about this, I want you to explain it to me, and someone who is great at the domain, whose job isn’t to ask questions. Their job is to hear what doesn’t make sense about the technology or about the deployment or about the process. And that collaboration is really, really sharp, I think, and has a great effect when you’re talking to customers and users. I think sometimes we’re nervous because we want to be seen as credible, especially if it’s an actual customer, right? We ask for their time. We want to go talk to them. You’re going to send some idiot that doesn’t know what they’re talking about? That isn’t necessarily the reaction you’ll get, but I think it’s sometimes the reaction that we fear. And so it can be a really great triangle between a user or a customer who’s a practitioner of something very complex, a person from the producer or, you know, maker side of it, the company side, who knows the domain, and someone who knows how to listen and ask questions and follow up and sort of facilitate this. When I see researchers getting immersed into a domain, they do build up some competency, but some of these things are decades of specificity and really kind of elusive stuff. So just to go back to your question, I think it’s teams where there’s bandwidth for collaboration, where you can bring in people with different perspectives, different domain and process expertise, to create a great interview for the customer that you’re talking to. It’s a good experience to talk to a researcher and a domain expert, because you can watch who they make eye contact with as they figure it out. I’ve had people even tell me, oh, okay, you’re the question asker, and you’re the person that knows, you’re the engineer.
Like people can figure that out. And nobody’s pretending to be anything that they aren’t. And it really, I think, can be very harmonious, but you have to create the bandwidth for that kind of collaboration on the team, so everybody can work together to get the insights that we want to get from the people we’re building for. Again, that was from the CX Chronicles podcast. Now let’s get to my conversation with Celeste Ridlen. She’s the head of research at Robinhood. Celeste, thank you so much for being on Dollars to Donuts. It’s really great to have you here. Celeste Ridlen: It’s awesome to be here. Thanks for inviting me. Steve: Can we start with an introduction from you, and build the rest of the conversation off of that? Celeste: Yeah. What would you like to know in my introduction? I exist. Steve: You exist. Celeste: My name is Celeste. I’m the head of research at Robinhood. I’ve been doing research for 15 years now. My background is in human factors and ergonomics. I live in San Francisco, long walks on the beach, that kind of stuff. Steve: How did you discover the field of human factors? Celeste: I was in a cognitive psych lab, like working in a cognitive psych lab at Florida State University, and I was trying to think about my next step. I like to joke that with a psychology degree and an English degree, you basically are qualified to be a mall security guard. I was looking at grad school and I decided that I was going to talk to Dr. Kashak, who was running the lab at the time, about my interests, because I had tried on neuroscience and I didn’t want to hurt animals and place electrodes on rats’ brains, so I cast aside neuroscience and then social psychology. I worked with Baumeister and Tice and that was interesting, but also so vague and not applicable, and much love to social psychology, but just so vague. I was in Dr.
Kashak’s lab and I was asking him about what his advice would be for what I should study if I was most interested in cognitive psych. He asked me if I liked technology, if I liked industrial engineering, like making things. We had this great conversation about human factors, which is essentially cognitive ergonomics. It’s a field that started blooming right around the time we started industrializing weapons, which is a weird historical fact that I got excited about. That’s how I got into that. Steve: How did you find a program for yourself? This was grad school. You did choose grad school. Celeste: I did. Well, no, there wasn’t any, I mean, now there’s HCI and stuff like that, but at the time there weren’t programs dedicated to that yet. I looked around, there were a few programs. One of them was at Georgia Tech. I don’t even remember where they all were, but they were approximations of what I was looking for. Some of them were called human factors, some of them weren’t. The one I ended up going with was at San Jose State, and yes, right up there in terms of name and stature with Georgia Tech. But I chose San Jose State specifically because they had an applied, terminal master’s degree program. There was an applied emphasis, where they had relationships with tech companies in the Bay Area and NASA, and they were working directly with these companies to get students into exactly the jobs that I was excited about. That’s why I chose it. I knew I didn’t want to be an academic. I had an end goal. I didn’t want things to be vague. I wanted to see the fruits of my labor immediately. Then I, sight unseen, came out to California, and have never looked back. Steve: Was there a first applied job or project or something that came to you through that program? Celeste: Yeah, a lot of people had stuff come directly from school. I happened to sit next to somebody in one of my classes. I mean, I don’t know if it was like a stats class or, I don’t remember which one it was.
But there was somebody who was already working at Oracle, and she let me know about a job opening. And I wasn’t even done with grad school, and they took a chance on me. It was a contract position. In retrospect, it was probably like a low lift, but it was my first sort of foray into things. And so, no, it wasn’t directly because of the program itself, but it was the people inside of it. Like you literally never know who you’re sitting next to. So it was lucky for me. Steve: What kind of things were you doing in that first Oracle job? Celeste: Oh, God. One time I transcribed an offsite, like a field visit that I did not attend. So I had headphones on, and I had to hand transcribe 20 plus hours of people spewing acronyms that I didn’t know or understand. So that was fun. I did a lot of like participant recruiting. This was a joy, but definitely a labor of love there. I did a lot of like synthesis and conducting studies, but I did a lot of the stuff that people either have automated now or would not consider in like an entry level research job now, which is character building, let’s call it. Steve: Without sort of going through your resume step by step, I don’t know, you can assess what the next marker is, like, what’s another role that came after that, that was significant for you? Celeste: Yeah. So I hopped from that to a full time position at Symantec about a year later. I don’t even think Symantec is a company anymore, which at the time was like 25, 30 years old. And so there’s sort of like two camps in tech, or at least there were at the time. These old, slow moving companies with long, long product cycles, like Oracle, Symantec, it’s like there’s a lot to an implementation. And so you’ve got like a five year release cycle, like it’s really, really long.
And then you had new companies that were very young, right, like on the edge of being what I would call startupy, like where everybody’s doing everything. And so I jumped from Symantec, which was like, it was a great job. They also took a chance on me. And I shifted into working in sort of like the newer building phase of tech where like there were almost no researchers. And we had to sort of build a process and perspective and relationships as the company’s like trying to grow like crazy. So Twitter was a really, really pivotal point in my career where I got really, really excited. It was like a dream job. I always wanted to work on Twitter. Like it was, you know, this 2013 Twitter, that means something different now, for sure. But I was so excited. And it was amazing to be surrounded by people that had that much passion and energy. And I, you know, I was part of that bricklaying process, right? Like I had a boss. And I loved it so much that I got invited to do another bricklaying at Airbnb a few years later. I was at Airbnb for a very long time. We went from 20 researchers to like, I don’t know, over 100 at our peak. So that was a lot of bricks. Steve: Can you explain the bricklaying metaphor? Celeste: Yeah, what? That doesn’t make sense to you? Okay, so what I mean by that is you have this sort of like nascent or non-existent research team, like research function, let’s call it like a discipline. Maybe there’s one person, maybe there’s five people, but everyone’s just kind of like still being very reactive because the company doesn’t know how to work with research yet and doesn’t really know what part of the culture it fits in, where it fits, so on and so forth, like how to engage, that kind of stuff. So what I mean by bricklaying is like there’s process, right? So how do you recruit participants? What are sort of like the safety issues with your research participant agreement or your NDA?
What are the safety issues with the way that you reach out to your participants? So I was like building a lot of programmatic structure on top of then hiring people, trying to identify and prioritize research questions, all the things, like all the how-to’s, all the — so I was interviewing like a million people every week. I was participating in like a crap ton of interviews for the company and then also of the users. So it was just nonstop assessing things, basically, but figuring out, like, do we have a crit, as an example, as a research team? Do we do like a weekly critique? What does that culture look like? Is it required? Are we forcing you to do that? Those kinds of things are the bricklaying. Like they’re all part of the bricklaying. It’s not just like hiring and setting up process, but it’s also like what kind of culture do we want to have and what’s mandatory, what’s optional, and like what needs to be grassroots versus top-down, that sort of thing. It’s fun. It’s really, really fun. Steve: I don’t know if this is part of the bricklaying metaphor, but what about things like how much we know or don’t know about people that we’re building for? Celeste: Yeah, that’s a big one. I mean, when you first start — let’s keep going with the bricklaying metaphor — when you first start, like you could put a brick anywhere, so to speak, and you would have impact, right? Because you didn’t know anything and now you know something. But that gets like more complicated, and more nuance is necessary, when you’re sort of saturating on a particular topic. There is a moment to either move on or rethink it. And so, yeah, that’s definitely a part of it. Steve: Right. Now you got me thinking of the Tetris metaphor, where the more bricks you put down, the more difficulty you have placing those future bricks. There’s more scrutiny and impact on that choice. Celeste: Yeah. Yeah. I’m very good at Tetris.
I don’t know if I’m good at bricklaying, but this is the third company I’ve moved to where I felt like that was a part of what my role entailed. Because it’s not just research, too. It’s the company at large. You’re defining process and practice as a group of people that’s maybe under a thousand growing to like three thousand, five thousand people very fast — like the only way to address any of that is to just figure out how to scale yourself and figure out what’s important and what isn’t, because you have to make some decisions very quickly there. Steve: I do want to move on to Robinhood in a moment, but I have a question that I want to go back to. So what’s your title at Robinhood? Celeste: I am the head of research at Robinhood. Steve: And hearing you talk, I’m inferring that bricklaying is not just the purview of people who have leadership titles. You’re describing Twitter and Airbnb as roles where you didn’t have a leadership title, I think, but bricklaying was part of what you were doing. It was the context for your work. Celeste: Yeah, you and I could go on forever on a much belabored subject about the difference between management and leadership. I was a manager at Airbnb, but you’re right that the bricklaying metaphor is not specific to somebody with a leadership title or somebody who is a people manager. It’s something everybody has to build together. And it depends a lot on the chemistry of both the research team and the broader company you’re at. So it never looks the same way twice. Steve: Well, I’d love to hear, if it’s possible, you know, different companies, different roles, but being an individual contributor, being a people manager, being a leader, again, the companies are different. So maybe the comparison isn’t quite right. But would you be able to characterize either ways that one could or ways that you have been involved in bricklaying? I love the metaphor.
Just these three different contexts for bricklaying, you’re coming into it with what seem like different responsibilities or titles. Celeste: Totally. Steve: What is it like? How do you compare and contrast across the three? Celeste: I think in all three cases, the similarities were around, no one’s going to tell you that you absolutely need to be doing that, unless you’re like, you know, the head of something where, kind of objectively, everything is your job, right, to some degree. So I think in that case, it’s a little bit easier to believe that everything is within your purview. But in other roles, I came in and there were opportunities, there were problems everywhere, and you can either decide that they’re not your problem, which is an approach, or you decide that you’d like to solve it. Nobody’s going to probably yell at you for solving it. So you take it on. So things like rewriting interview questions, like that’s something small that I did. Creating processes and how-to docs was something I did like every five minutes at Twitter. And there was like a running joke that no one was going to know how to do anything after I quit, because there was no one to write how-to docs anymore. I’m sure they wrote them plenty. But I think it’s just about deciding if you have a perspective or a skill that can be lent to that particular problem or like opportunity, and then just doing it. I mean, there’s nuance there, right? You have to ask questions and make sure you’re not hurting anybody’s feelings, someone else isn’t working on it, that kind of thing. But a lot of times in those environments, the best thing you can do is just say, “Hey, if no one has a problem, I’m going to do this. Good? Good? Yes? Anyone want to weigh in? No? Awesome. I’m going to take this on.” And then everyone is pretty thankful if they remember that you did it at all. Which is to say, you shouldn’t be doing it for credit anyway.
Steve: Over the course of your history, have you worked in environments that weren’t in the bricklaying mode? It seems like the first two you described were that. Celeste: Yes. Yeah. So both Symantec and Oracle had very well-funded and staffed research disciplines. They weren’t massive. I think at Symantec there were still only like six of us, which is not enormous for a 20,000-person company at the time. But there was already a way of doing things within the broader UX team, within the company. We had a process, we had practices, we had tooling. No one was starving for those sorts of things. And so everything was kind of plug and play. They were great jobs, but the environments themselves are different. You learn different things. You get reps doing the work there, a lot more than doing the work and the stuff around the work, if that makes sense. Steve: If I’m applying for a job in research, let’s just imagine that’s the thing I’m doing. It’s like, ask your doctor if Flonase is right for you. Ask what kind of environment is right for whom? Having seen both and participated in both of these, if there are only two, and I’m sure there’s nuance here. Celeste: There’s a lot. Steve: Yeah. If places where it’s more plug and play versus places where there’s a lot of problems to be solved and opportunity to solve them, what’s your advice for people going into this? I guess, to me, I hear two parts. How do I assess what the situation is? And then how do I self-assess about what works for me? Celeste: I think it seems to me like there are a lot of ways to do all of those things, but I’m going to tell you a couple of things. One is how to assess or how to self-assess. But then the second or third, depending on how you just numbered those in your head, would be people that I’ve seen succeed in either of those situations, of which there are probably shades, it’s a spectrum or something like that, because they might be different too.
So when I think about assessing, when you’re interviewing for a job, I think the thing that I would look for is who’s funding the research team and why do they think that a research team should exist? Those are things that you would think you get the same answer every time, but you extremely do not. Sometimes people want a research team or have a research team because it’s a box to check. This is what we do to make products. This seems like a thing past a certain point of company maturity that we would definitely want to have. So I’m not entirely sure, but I’m going to, here you go. I’m going to just check that box and it’s going to be great. And you’re going to be my researcher. So there’s stuff like that that can kind of give you a glimpse into the research maturity of the organization or whatever you want to call it. Another question I like to ask, so around funding: do you have the resources? Do you have the tooling you feel like you need to do your job? If you needed a new tool, what would it take to get it? Is it a procurement question? Do you have a budget? Do you have to go talk to somebody else about budget? Those are other things. And then asking questions even about headcount. So like, how did this role open? Why does someone think it’s important? Why did someone decide to do this over hiring another engineer? Sometimes those yield interesting answers too. And you can kind of tell where in its research maturity a company is based on that. What I see in terms of success for one sort of a person over another, I hate putting it like that because I think you can be a different sort of person at different places in your career too. But if you’re interested in sort of the bricklaying, let’s call it, I’ve seen people say they want to do that. And then they jump in, they realize like, oh shit, nobody writes anything down here. There is leadership lacking in cross-functional ways. And I can either take the reins or just get really upset.
And you never know how you’re going to react in that situation until you find yourself in it. So I’ve seen people who were like, yes, I want that. I want that. I’m excited about that. And then they get there and they’re just very uncomfortable, very frustrated. And what they actually wanted was the excitement of moving quickly, but they didn’t understand all of the things that surround that, that you need to take along with the moving quickly and the excitement. Steve: What led you to Robinhood and the role that you have? Celeste: Yeah. So this is a complicated question. So Twitter, Airbnb, and Robinhood, while they are pretty disparate in terms of topic, like your customers, things like that, they’re not in the same vertical. The similarities they have, besides the company phase and stage, are that they’re all very, very mission driven, like very strongly mission driven. And once you’ve worked at a company like that, I should just speak for myself here because maybe this is not exciting to everyone else, but once I worked at Twitter, I was like, I can never have anything but this ever again. Because when people really believe in it, when people really believe in the work you’re doing and believe it’s for a purpose that isn’t just like contributing to the capitalist abyss, it’s more motivating and more exciting to me to get up every day and focus on a mission. It’s what I return to when I’m trying to prioritize things, when I’m feeling like I don’t know what to do. It’s a nice way to hold the center. So on top of that, when I was leaving my last job, I interviewed, you know, there are many mission driven companies. And so like I interviewed for some head of research roles actually, and I was even surprised to be offered some. I was leading the largest team at Airbnb when I left. It was called Hosting, which was Homes, which is the biggest part of the business, Homes, Community. We had an Olympics team, and then Experiences was also a part of it.
I just felt like I was collecting Pokemon at that point. Like they just kept giving me stuff. But so I basically decided that I didn’t want to be a head of research. This is why this is a funny story. I opted for Robinhood for the mission, and also because I was excited to lead research in like a lateral move across a few teams, reporting to the head of research at the time. It was okay with me because I had all these like ideas about what being a head of research would mean, which is why I didn’t want to be one. So like I won’t be as close to the work anymore, or I’m going to spend all my days like writing and rewriting career frameworks, or I have to be super front and center. That’s another one that like comes up for me all the time. I’m not, I’m chatty, but I’m like fairly introverted. And I don’t personally love being like the star of the show. So I was also kind of looking at people that I had reported to over the years, or known, that then went on to be heads of research. And I just couldn’t see myself doing what they were doing, or at least what I thought they were doing. And anyway, this is all about me joining Robinhood. But basically, through a series of twists and turns in my first few months, I found myself in the role I was avoiding, which is fun. But looking back, I think my biggest gap in thinking at the time was that I forgot, or I didn’t know, that leadership roles are what you make them. So this isn’t very researchy, or even like UX-y. I think this is just like a fundamental leadership-y thing, where no two people are going to do the same leadership role, whether it’s the head of research, the CEO, the COO, whatever, because those are all equivalent roles, right, the same way. Like you’re never going to do them the same way as somebody else. And that’s actually a really good thing because the situation may call for exactly what you can offer.
But because of that, if you’re looking to other people to decide whether or not you’re going to be suited to doing that role, it’s kind of like thinking about whether or not you should be a writer based on whether or not you can write exactly like, I don’t know, Mary Shelley. I love that that’s the first one I thought of. But she did it her way and she wrote Frankenstein, right? And then you’re going to do it your way and maybe not write Frankenstein. And just because you can’t write Frankenstein doesn’t immediately invalidate whatever it is you are going to write. So that was a really long answer for why are you doing this job? But it was like a personal growth moment where the very thing I was avoiding, I had to confront. And I learned a lot about it. Steve: Maybe there’s some blurring between what we think we can’t do and what we think or we know that we don’t want to do. Celeste: That’s beautiful. Thank you. Oh, yeah. I agree. Steve: Because I think you’re talking about, like, we all compare ourselves to people, whether we compare ourselves to Mary Shelley or someone who’s been a head of research. But you’re calling out, just to reflect back and kind of make sure I understand, you’re kind of calling out, well, you didn’t want to do the job because you didn’t want to sort of spend your energy and time and focus doing the things you saw other people doing. But the aha is that there’s a way for you to be in a leadership role that isn’t those things. Celeste: That’s exactly it. The things that you’re good at are the things you should be doing as a head of research or as literally any other leadership role. And you should be surrounding yourself with people that are good at the things that you’re bad at, because together you’re going to make beautiful music or write Frankenstein, you know, whatever. Right. Whatever you want to extend there. I really like that that’s the first person I thought of. Steve: That’s a good improv moment.
Don’t think about it. Just say it. I want to ask about “mission driven.” I think you’re saying, you know, once you sort of had a taste of that at Twitter, that became important to you. It was. And you wanted to. I guess I want to ask, is there a distinction between mission driven as just kind of a cultural quality and like the specific mission? Which of the bits of it are the ones that are calling to you so strongly? Celeste: No shade to the broader corporate lifestyle, or maybe much shade. I don’t know. But I think every company has like a mission and a vision and like values, and there’s lots of pomp and circumstance around those things. And I think the distinction is in the discussions that they’re integrating into every day. So at Oracle and Symantec, as much as I loved the people there and I respect and admire the work that they were doing, we weren’t talking about the broader, like, what are all of us collectively across all these products trying to do together? What is our ultimate goal beyond making money? Like what’s important to us? What is the legacy we want to leave as a company? And yeah, it exists somewhere, written down in some wiki, corporate wiki. But it just wasn’t a part of the day to day conversation. Whereas at Twitter, we were obsessed with being the global town square. We were obsessed with enabling people to discuss and connect and, like, communicate. And it was really exciting and invigorating to be working side by side with people that were like, I don’t know how we’re going to do this, but we’re going to do it and it’s going to be amazing. And then at Airbnb, belong anywhere isn’t just like an advertising tagline. Like everybody is talking about the ways that we’re going to make people feel more or less like they belong based on design choices, strategic directions. Like it’s infused in everything. And it’s the same with Robinhood; ours is democratize finance for all.
We talk about it literally every week. It is a constant in every meeting. The language emerges when we’re weighing trade-offs and thinking about it. And it sounds a little culty at its worst, but I like to think that at its best it’s a force for good. I think you are what you measure. And so when you’re running a lot of experimentation and you’re looking at all these metrics that maybe build up to something that you didn’t actually want to aim towards, but just made sense in the individual examples, you can return to the mission as, like, okay, but are these metrics democratizing finance or are they doing this other thing over here? Have we lost sight of it? Whereas at Oracle, at Symantec, I felt like that was not necessarily the lighthouse. Steve: Let me throw a different metaphor in here because we’re doing so well with them. Celeste: We’re killing it with the metaphors. Steve: I wouldn’t go to a, well, I guess we’d have to turn on a time machine, but I wouldn’t go to a Grateful Dead concert for absolutely anything. But if I was going to go to one, going with a friend of mine that loves music and loves the Dead would be the way to do it, to be in that experience with someone who is into it. I’m using that as an analogy for the mission-driven thing. If the Dead, and whatever the Dead is about, is the mission, I don’t actually care about it. It’s not my mission. Belong anywhere might or might not be my mission. But if you’re going to go to a concert, or if you’re going to work in an environment, one where there’s that kind of passion and commitment and attention to detail, and thoroughly building out every aspect of decisions being made based on that, you’re kind of highlighting how powerful that is and how rewarding that is. I guess I’m asking, is that still true if the mission is one that you are ambivalent about, say, versus 100% bought into? That’s what I’m probing on here. Celeste: Yeah. First of all, I love the metaphor because you’re right.
You could go to a show, and if you went to the show with your best friend and they were crazy about the Dead, you would have a completely different experience, because energy is a very human thing. Energy is infectious that way. The passion, the enthusiasm, you can’t help but kind of like, “Oh man, I was going to say ‘Catch a whiff.’ That’s a little too on the nose with the Grateful Dead reference.” But we’re going to do this all day. Yes, I actually do think that can still be true if you are fairly ambivalent about the mission. I don’t think I woke up in 2021 and was like, “You know, what really needs to happen in the world is finance needs to be democratized for everyone, for all.” I don’t think that I woke up feeling that way. I definitely don’t think that I looked at Twitter’s mission and was like, “Man, I have never been more like — it is my life’s calling to be a part of the global town square.” I think the difference is that the mission has to resonate at least a little bit. You have to be like, “Yeah, of course. I mean, yes. Do I agree with the idea of democratizing finance for all? Absolutely. I love that.” I didn’t know I wanted to do that until I started looking into it. But it doesn’t make it any less important to me, especially if I’m surrounded by people that are all agreeing and believing in the same thing. So it helps, but I don’t think — it’s not like you have to be born with that passion in mind in order for it to be infectious. I was looking for a less disgusting word, but that’s what we’re going with. Steve: When we’re talking today, start of spring 2024, how long have you been at Robinhood? Celeste: Oh boy. How long have I been at Robinhood? I’ve been at Robinhood for two years, and I’m sighing deeply because every year is getting shorter in my life. Just like Pink Floyd promised. I’ve been here for two years. I just celebrated my two-year anniversary in the beginning of December, so a lot has happened during that time. I started in December of 2021.
A few months after that, the head of research left. My job changed every two to three months for probably over a year, which isn’t inherently weird, except that 2022 happened, which meant that the crypto markets and the stock markets, everything was going down in 2022. There was a huge wave of layoffs across tech. We also had to lay people off. All of that is super difficult as an employee, but also as a leader of a team. It’s really, really tough, especially when you spend some time thinking that this is not what you wanted to do. So when I look back at the two years, it feels like more than two years, but it also feels like I just blinked and two years passed. So there’s a lot of cognitive dissonance in that length of time. Steve: What’s the cognitive dissonance? Celeste: It feels like a lot and like a little at the same time. Steve: Yeah. So over those two years, what are some bricks that you’ve laid? Celeste: So the team predates me by a lot. It’s not like I built the team and built all these processes. The team predates me by a lot. It was originally put together by my predecessors. I deeply appreciate everybody who did that, for what and who they left behind. Those have been gifts. But the bricklaying in this case was that Robinhood had just IPO’d over the summer and was sort of nestling into the reality that everything changes when you IPO. Airbnb IPO’d while I was there, Twitter IPO’d while I was there. Everything changes. There’s a lot more you need to do. Because your funding looks different, there’s a lot more that you need to be held accountable to. And so a lot of process changes when that happens. The research team had also been through a lot. There were some big, dramatic leadership and team changes during the entire year of 2021, and I showed up in December. So a lot of my bricklaying, I wouldn’t say it was like building a team from the ground up, but it was sort of like there was a lot of healing that had to happen as a group of people.
When I started, I was hearing a lot about people not trusting each other, or feeling like, if so-and-so got promoted, why didn’t I? I don’t even think their work is good. There wasn’t really very much teaminess. So a lot of the bricklaying was rebuilding cohesion and trust, thinking about things differently, re-evaluating tooling, because when you’re laying people off, everybody is also reassessing the budget and the tools that you need or don’t need. So a lot of really tricky stuff like that happening at the same time. And then kind of re-evaluating research’s relationship with the company, which was a pretty tall order, but it’s been fun. Steve: What are things that you can do to build teaminess? Celeste: Not sure I have a good answer to this, because I’m going to give you that delicious research answer: it depends. Steve: Yes! Celeste: You’re welcome. Steve: You’re going to ring a bell right now. Celeste: Yeah. Someone somewhere is furious at me for this, but it depends on the situation. We had layoffs at Airbnb when I was still there, and it was devastating, and we had to sort of rebuild our emotional baseline also. And it was tough to find all the loose ends and figure out what work still needed to be done, what work had fallen off and that was okay, who we were without the people that we had lost, and things like that. It’s really, it’s hard to be laid off and it’s very hard to lay off. It’s just like a no-one-wins situation. So in that, in like the Airbnb case, the teaminess came from just, actually it might be the same as Robinhood, being consistent. Everybody showing up and being exposed to each other and just talking about what was hard, what wasn’t working, do people have ideas, here’s what I’m doing, what are you doing, but being really active and pushing for like pretty regular, consistent contact, and trying to foster moments of recognition when something was going well. There was a lot of stuff like that.
I don’t know if I have a silver bullet answer though, because it depends so much on the way that things are bad or need healing. We definitely went from people not being willing to help each other because they were worried about not getting credit, to people helping each other without thinking twice about getting credit, and those were signals to me that, like, nature was healing. We’re on the right track. Steve: I know you’re apologizing a little bit for the answer not being a full list of three things. You didn’t say, oh, we had an offsite or we had a cake. But I really am struck by the fact that you’re talking about intentional ways of being that are maybe smallish, but that are sustained over time. That sounds much harder and much less obvious to come up with or to execute. And that seems like, well, if you want to make change, it does depend, but there’s a set of tools that you’re drawing from, and a set of principles that is, I don’t know, subtler than a fix-it kind of approach. Celeste: Fixing it, especially with something as fragile as people’s chemistry and sentiments, feels like a fool’s errand. Like coming in and being like, let me show you how to do things. I’m going to fix these feelings. I don’t think you’re going to get anywhere if you approach it that way. Someone has probably done that, but I don’t think that’s possible in the toolkit that I have. Maybe Mary Shelley’s done it. Steve: She was a great management and leadership consultant in her coaching business. Celeste: Yeah, good call. What comes to mind is that there were two things that were similar about both circumstances. One was sort of the community that I’m talking about, which, like, you can’t inauthentically build community. You have to push it through connection. You have to get people to connect with each other. And sometimes it’s awkward, and it’s not mandatory fun. That doesn’t work. 
But if you can give people real reasons to show up and be there for each other and be honest with each other, it would be hard for that not to move in a positive direction. I think human beings just crave connection and community. Even the introverted ones, it turns out. But the second thing is, this is a businessy answer, but just hear me out: transparency. Here’s what I know. Here’s what I don’t know. Here’s what I’m doing. Things like that matter, especially during layoffs or moments where the team has really lost a lot of trust in each other, in the situation around them, in the circumstances. They know that you’re doing everything you can to contribute to a shared understanding. Again, it’s hard not to move in a positive direction if people are like, you’re being as honest as you can with me. And that opens the opportunities for, I’m having a hard time with this and I didn’t want to say anything until I felt like you were also showing up and spilling your guts about what’s happening. So yeah. Steve: You also mentioned re-evaluating research’s relationship with the company. What is that about? Celeste: The company loves research. I don’t think the company will ever stop loving research. Some of that, again, comes from my predecessors, whom I love dearly, and their approach to it. I got really lucky. And it was always going to have a place because our co-founders started doing research themselves. When they started the business, they focused a lot on listening to people, on observing people do things, on very researchy things. So the research team was always going to be a core function at Robinhood. I didn’t invent that. I’m not going to create that. But when I started, they had grown a ton very quickly, like the entire company did. And every kind of PM, GM, executive, marketer up and down the chain was and still is asking about research. We need to do it on every last thing. 
We have an unholy amount of opportunities to be user-centered in what we’re doing. And the tone for that is consistently being set at the top, which is delightful. And I don’t feel like I’ll ever see that quite in the same way in my career. So it just feels like I’ve struck gold. They’re always asking about our customers and their perspective. They know that our co-founders are doing the same. Everybody’s asking about this. But when no one ever questions if research should be involved, and if everyone is insisting that research study everything all the time, you’re faced with a different set of problems. Your relationship to the organization is different because researchers, and I’m sure you’ve seen this too, are so happy to be included and consulted, because we’re so unused to that, that our relationship ends up being, yes, I’m going to say yes to everything because I’m so excited to be in this position and be consulted and be listened to. And so that’s going to take anybody who starts at Robinhood by complete surprise. Everyone’s always like, “You said we had a seat at the table, but holy smokes, we really have a seat at the table.” Not prioritizing, not saying no. You end up spreading yourself really thin. You end up studying things you really don’t need to study because the leverage isn’t there. It’s not some sort of force multiplier all the time. Not everything needs research. Actually, there’s a general manager that I work with that loves to use this metaphor. Speaking of our metaphors, he likes to say, “If I’m opening up an ice cream shop, do I really need to do research on whether I should offer vanilla? I just know that vanilla should be on the menu. So do we really need to do research on that, and why?” My response to that, of course, is: in the vanilla case, sure, but do we know that where you’re opening the ice cream shop is actually a place where people are interested in vanilla? 
Because you’re assuming a really narrow, maybe it’s Americans only, but whatever, there’s a group of people that like vanilla generally, universally, but very generally. Is that target market where you’re opening up your shop? On top of that, is it a fancy vanilla? Are we using elaborate beans from some rare island or whatever? Or is it just straight-up vanilla, no frills? What is resonating? What’s needed based on the context of the situation? But also I kind of agree with him that maybe we don’t need to do a study about vanilla at all. Maybe we need to understand everything around the vanilla. And that to me is, I think, the relationship with research that is ongoing and needs to change. If you can’t, as a product manager or some other leader, have a perspective on something without research, I feel a little uncomfortable with that. You should have a point of view. I want information to inform it. But if I don’t, because it’s a fairly inconsequential thing, I think you should still be able to make that choice anyway, so that I can work on the stuff, my team can work on the stuff, that’s the most important based on the things we’re good at. Not every question is going to be able to be answered by research. So that’s what I mean by the relationship. And it’s nobody’s fault that this is the state of affairs. It’s just what happens when the pendulum swings really hard in the other direction. Steve: Right. You talked a little bit about saying no, but in the vanilla example, it’s almost like, hey, here are seven more questions that are harder to parse out, riskier if your assumptions are wrong, and that create more context: the vanilla question begs a bunch of larger questions. And so that’s not saying no, though. It’s not, no, we’re not going to do the vanilla thing. It’s like, but hey, how about… Celeste: There you go. That’s what I mean. It’s not that I think we should be disengaging from every part of the conversation. 
It’s just that we need to be asking the right questions. And sometimes the questions that we’re being asked, that we say yes to, are still vanilla ice cream questions, just the baseline yes-or-no vanilla without the nuance, the subtext, all the stuff around the things. And again, it’s just because research is such a part of the product building and marketing culture at Robinhood that people come to you with very strong requests about doing vanilla ice cream research. And we just have to keep making sure that we’re working on the things that are the most important from a business, UX, and timing perspective, to make sure that we’re not just answering lower-leverage questions. Steve: What’s the approach to changing that relationship in the way that you’re articulating? Celeste: I spent a lot of time with product leaders and GMs and things like that, talking about what the most important decisions they need to make and questions they have are, and then kind of going over… I’ve been made fun of before because I’m like, that’s an experimental question. Like straight up, just run an experiment. It’s faster and more reliable. You’ll have more confidence if you run an experiment. People laugh at me because they’re like, well, wait, don’t you collect data a different way? Yeah. I mean, just because I’m a hammer doesn’t mean everything looks like a nail to me. So I spend a lot of time doing that, because it does need to come from the top. But then also making sure that people have the tools to be able to feel like they can say no, because a lot of researchers feel like it’s going to damage the relationship between them and their product teams if they’re not able to do everything that’s being asked of them, which is disheartening. So we work a lot on that. We talk a lot about what a higher-leverage question looks like, or how to meet the short-term moment of what the team needs while making it into a longer-term study also. 
So I don’t mean in length of time, but, I’m sure you’ve heard this a bunch, Steve. You know how there’s this debate within research about whether strategic or tactical is a better use of our time? I find that interesting because I don’t think they are separate. I think that you can take what people would refer to as very highly tactical research, like, “Do people understand this? Whoa!” You can make it strategic by asking questions inside of it and putting together a broader narrative. If you’re telling me, “I’m too senior to do usability testing,” or whatever it is, and I’ve heard that multiple times, which is unholy, it tells me more about how you’re thinking about structuring your research, how you’re thinking about insights and what value you bring to the team. It tells me a lot more about that than it does about your seniority or anything else. And so it’s not about doing less usability research, it’s about doing the research that really needs to be done. And so we talk about how to prioritize, we talk a lot about this over that, and we’re just pretty open about what’s important and why. Back to your point about transparency, I think. I’ve been wrong before. Steve: Yeah? Not in this conversation. Celeste: Probably several times in this conversation, but never about Mary Shelley. You’re welcome for that. Steve: I’m thinking about this relationship between researchers on your team and any particular product team and so on, where, yeah, what you said, there might be a near-term question, but there’s a way to do that research that also… Either you refactor that question into a higher-value question, or, and you didn’t say this, but I am going to say “yes, and”: you do kind of a yes-and of the short-term implication and the longer-term implication. But just this whole exploration of, what is research being asked for? What does research provide? 
It just makes me think about questions that come up, like, do we give recommendations? What are the kinds of outputs of research? And I know it depends. Of course it depends. But does this area provoke anything for you, a point of view, or something you’re thinking about? Celeste: On the topic of recommendations, I’ve seen at some companies the idea that if people aren’t taking your recommendations, you have not had impact, which I find charming. I personally would not want to work somewhere like that. I’d be really concerned if someone was taking 100% of my team’s advice all the time. We’re looking through a really specific lens. There are all these other inputs to consider. It’s really great when the research is informing a decision, but sometimes the decision is not going to go with what the research is recommending. And I don’t think that’s a failure on the researcher’s part, as long as everybody’s aware of the trade-offs and everyone understands the insights. And by insights, what I mean is what the data itself translates to for us as a product, as a business, whatever the case may be. It’s okay to have wrong recommendations, but it would be weird if you were taking my advice all the time, because it would mean that you’re really only considering one input or weighing it heavier than others. So that feels weird. But I do think, to your point about it being a bit of a controversial topic, I do think we should be giving recommendations. Not because we’re all geniuses, although maybe that’s true, but because we are the most informed about what the data means, what we can say and what we can’t say based on what we did, thinking about things like intellectual honesty and rigor and all of that. I don’t know why we would ever think that we were not suited to making some sort of recommendation. 
Because other people are going to take the information that you give and then just make their own recommendation, but that’s filtered through an entirely different lens with way less of the broader context that you’ve collected. It’s okay if your recommendation is bad, and you should probably shape it in a way that makes it shelf-stable, right? If you say, “Move the button to the right,” that’s not shelf-stable, because at some point someone will probably look at your research years down the line and go, “What in the hell were they talking about when they made that recommendation?” But people preferring design A over design B is just sharing data. People preferring design A over design B because the information they were looking for was a lot easier to find, that’s more shelf-stable, right? You understand why, you understand the context of the insight itself, and even if I don’t do exactly what you say, maybe I end up going with design B anyway, maybe I can make design B better based on the way that you frame that. So there’s a lot in there. Steve: I want to pick out what a recommendation is, because people prefer design A over design B, that’s data. People prefer design A over design B because they can find the information. Okay, that’s an insight. So then the recommendation would be, go with design A. Celeste: Right. Yeah, but you have to have that other piece in there for it to make any sense at all. Steve: We should go with design A because it helps people to find the information they’re looking for. Celeste: Words are hard, Steve. Steve: I don’t know, just as a counter-argument a little bit, and I think you’re on board with this, the risk of being wrong there is you only know what you know about that. So design B makes us more money, design B we can implement faster, design B is compliant with something, design B is consistent with what we’re doing in three other platforms. Celeste: Absolutely. 
Steve: So I think that’s an example of them not going with design A, and that being okay, because you’ve given them the information that you have to make that recommendation. Celeste: Yes. So I would argue that if you’re not incorporating things like which design would make us more money, if you’re not incorporating those pieces of information into your process, you’re leaving value on the table. You should be helping the person who needs to make the decision make it with all that context in there. Anything you can help them with. You have an amazing sense of synthesis and distillation, and not everybody’s fantastic at that. So if you can pull that in, you probably should. But yes, you’re right that even if your recommendation is wrong, it’s not always that you’re completely incorrect. It’s, not right now, because we would prefer to make more money with design B, or whatever it is. But you’ve planted this seed, and I understand that what people need is not what this is providing, and so we can get successive approximations closer to it. Steve: Right. I think the recommendation discussion sometimes leaves me cold because it loses all the context that you’re providing. It’s the recommendation and the insight in the larger context of the set of things that might be factors for this team. And I think you’re saying you should try to know all of them. Celeste: Why not? Steve: I’ll just add there’s always going to be something that you won’t know. Celeste: Of course. Steve: I don’t know, it reminds me of trying to write like Mary Shelley. If you do creative writing and you do workshops, you get feedback that says, here’s what the problem is and here’s what the solution is. And I think it can be really helpful when you’re the writer to choose not to take that, to ignore the advice in an informed way. Well, that’s not what my objective is. That’s not my strategy. We need short-term wins here, whatever. 
And I think we’re agreeing strongly here. It’s just that some of the language gets oversimplified when it gets, you know, boiled down to a social media post. Celeste: Yeah, I think this particular topic is one that is not great for a LinkedIn rant. And yet, it’s all we see. Steve: But great for a podcast episode, right? Because we can dig in. Celeste: Sure. Steve: You brought up another one that I take umbrage at, which is the, you know, we’re not having impact if it doesn’t get taken up. And I liked what you said. I’ve also heard people say, my research doesn’t have value if no one takes action. It’s even more binary, right? Impact is sort of higher level than value. My work is worthless if someone doesn’t do the thing that I told them to do. And I feel so sad, because that’s researchers saying that, as if we don’t have enough people telling us we’re without value. As a profession we’ve sort of taken on some of that, you know, willingly devaluing ourselves proactively. I’m going to devalue it for you. It’s depressing. And that makes me sad. Celeste: Yeah, I agree. Also, I’m sure that the folks saying this are perfectly smart people. But to me, when I hear that kind of stuff, it feels like it lacks curiosity. So you’re saying that unless there’s something I can see in front of me, or unless there’s a direct action being taken, I have failed. When I think if you had a little more curiosity about it, you could notice little hints like, well, these people are changing their language. They’re using the language of the people that we’re studying instead of whatever business bullshit we’ve come up with. And that’s impact. Or, we’ve changed the roadmap, we’ve just decided not to go this route, we’ve saved three months of engineering. So it’s almost that inaction is action. There’s lots of stuff like that. 
And I think if you are open to seeing it, it makes itself known to you. But if you are not curious about how your research is sort of creeping through an organization, you’re going to end up with it flying right over your head. And then you assume you have no value, or whatever it is you said. It’s depressing. Steve: You know, the same way that you talked about building teaminess seems very analogous to the way you’re talking about both creating and sort of realizing the value: it’s smaller signals over a longer period of time. I’m not sure if the constancy thing applies here or not. Celeste: Maybe. Steve: But it’s not a binary-outcome kind of thing. It’s taking a slower approach and looking for those more subtle signals. Celeste: Yeah, you’re reminding me, I’ve had this conversation a few times after becoming a manager, where people are like, we just shouldn’t do research in this area because they’re not valuing it the way they should. Often the subtext is, they’re not taking my advice, or they’re doing things in a way that I fundamentally disagree with. You probably know this better than anybody, but this is not a field for the faint of heart. This is a full-on eternal marathon. It took me three years at Airbnb and many different approaches to convince people that non-drip pricing was the right call. It cost us a lot of money and we had to find ways to offset that. And every time I got knocked down about it, I could have chosen to never bring it up again. But it’s chipping away. It’s finding advocates. It’s finding other people that agree or believe it. If I can convince one person each time of something that I believe in, I’m doing great. That’s impact. And so you’re right that there are little subtleties, but it’s also just a long game. All of this is a long game. You’re lucky if you have a short-term win. You’re super lucky. Steve: Anything else to add today? 
Celeste: No, I mean, thanks for inviting me to do this. It’s very flattering to talk about my thoughts and feelings on things we mutually like for however long this just was. It was very nice. Thank you. Steve: Yeah, thank you. You shared a lot of interesting perspectives and good history. So yeah, thanks a lot for taking the time. Celeste: This was awesome. Steve: Over and out, good buddies. That’s all for this episode. Thanks for listening. Recommend Dollars to Donuts to your peers. You can find this podcast in all of the usual places. A review on Apple Podcasts helps others find it. Go to portigal.com/podcast to find all the episodes, including show notes and transcripts. Our theme music is by Bruce Todd. Celeste: Did you say a favorite puppet? Steve: Favorite puppet. Celeste: What came up for you just now? Steve: You know so many things about so many things and I was just trying to open up the possibility space. Celeste: Favorit
01:02:54
41. Carol Rossi returns
Episode in Dollars to Donuts
In this episode of Dollars to Donuts Carol Rossi returns to update us on the last 9 years. She’s now a consultant who focuses on research leadership. I’m happy making the contributions that I’m making, even though it’s hard to directly measure impact. I’m hearing from people that they’re finding value from the work that we’re doing together. I want to leave people with this idea that the work that we’ve done together is valuable to them, whether it’s tomorrow, or two years from now they see value in it in some way that they couldn’t have anticipated. I’m trying to be as clear as I can about focusing in the areas where I think I can make the best contribution and have the most impact. And keep reexamining, how do I feel about the work that I’m doing? And what am I getting back from people? – Carol Rossi Show Links Interviewing Users, second edition Tent Talks Featuring: Steve Portigal Amazon Reviews for Interviewing Users Elevate Your Team’s Impact with Storytelling Carol Rossi on Dollars to Donuts, 2015 Carol on LinkedIn Carol’s website Edmunds NerdWallet Prioritizing Research for Impact on Maven GeoCities When the Healthiest Person You Know Gets Lung Cancer Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. I’m Steve Portigal. In this episode, I catch up with Carol Rossi, nine years after she was first on Dollars to Donuts. There’s a bigger and better new edition of my classic book, Interviewing Users. As part of the launch of the book, I spoke with Russ Unger for his Tent Talks speaker series. Here’s a little clip. Russ Unger: What’s your approach to ensuring that the data gathered from interviews is effectively communicated and incorporated into the design process? Steve: The first part of that, I think, is that you have to do something. You have to make sense of what you gather. 
Some of this kind of goes to the maturity of any individual practice. I think the less experienced folks are, the more they want to just take what they remember about what was said and type it up. And that verb is, that’s stenography maybe, or collation is as far as you get. You put these pieces together. And then you’re just taking requests or gathering complaints. You might as well use a survey for that. I think it’s the iceberg model, right? Some of it is above the surface, but a lot of it is below the surface. Below the surface means going back to what was said and looking at it and making inferences. What wasn’t said? How was it said? What was said at the beginning and what was said at the end? And that’s just within one interview. What did person A say? What did person B say? And there’s a whole new chapter about this. It’s the analysis and synthesis process. And some folks say that the ratio should be two to one. For every hour of the data that you gather, you should spend two hours analyzing and synthesizing. And I think in a less evolved practice, it’s the inverse. You might spend half an hour for every hour, or even less. The caveat here is not every research question merits this. If we are looking for, I don’t know, choice preference between something and something else, we might be really clear about what that is. We come back and say, do this. But for anything where we want to understand why, or understand opportunities, or understand motivation, a new space you want to go into, characterize a customer that we haven’t worked with before, it really is worthwhile to go and do this analysis and synthesis. How do we have impact? We have to have something impactful to say. I just want to say that. Some other factors that I think can make or break it: working collaboratively with stakeholders, the folks that you want to inform, influence, take action, before you do the research. 
And so having an understanding of what business challenges are or business goals, like what are we trying to do as a company? And then formulating really good research questions. What are we going to learn in order to inform that? And then choosing methods and approaches that can answer that. And not doing that in a vacuum. And this has the effect of switching your role from being reactive to proactive. I think it’s hard to have an impact with reactive work. Those requests that come in are often late. They’re often based on a shallow assumption about what kind of value research can provide. And so you are going to give a thumbs up, thumbs down in some direction. So your role as a provider of these kinds of insights is diminished. If you can be proactive, which means maybe understanding a roadmap, or what decisions are being made, or who else is going to do what, and proposing research on your own roadmap that is intentional and is ahead of time, you leave space, of course, for things that come up, fire drills and so on. But trying to work in a proactive, collaborative way, aligning on goals, and then putting the effort in to make sense of it changes the whole conversation about what you’ve learned. You get to that point of sharing with somebody. That’s part of a larger Tent Talk. You can check out the whole show, and definitely buy your postal carrier and barista their very own copy of the second edition of Interviewing Users. If you want to help me out, write a very short review of Interviewing Users on Amazon. Over the last couple of years, I’ve been partnering with Inzovu to run training workshops about storytelling. Storytelling is an essential human skill that powers how teams work together with each other and with their colleagues. I’ll put a link in the show notes with more info about what I’ve been up to with Inzovu. And if storytelling is something you’d like to build up in your organization, reach out to Inzovu or to me. Okay, let’s go to my conversation with Carol. 
She’s a consultant with a focus on research leadership. Carol, welcome back to Dollars to Donuts after nine years since we last talked. It’s great to talk to you again. Carol Rossi: Yeah, thanks, Steve. I can’t believe it’s been nine years. Steve: Time does fly. Let’s talk about those nine years. You know, what’s been the shift in your evolution in your professional world since then? Carol: When we last talked on the show, I was at Edmunds and I was leading the UX research team there. And I had been there at that point, I guess, four years. I had started the team there and then ended up staying at Edmunds until 2017. And then I took a moment, because I’d been there for quite a long time, and took a moment to kind of ask myself what I wanted to do next. I call it my gap year. So I did some consulting, some contract work as well as consulting, helping people think about how to set up a team. And then in 2018, I went to NerdWallet and that involved a move. So I was in LA for the bulk of my career. In 2018, I moved to San Francisco for the job at NerdWallet. And that was an established team that I led for about four years. And I mean, we can go into detail about any of this stuff, but basically I left NerdWallet in 2022 and started a consultancy where I’m now focused on helping companies, helping leaders know how to get the most impact from research. Steve: Can we talk about NerdWallet a little bit and then talk about your consulting work now? Carol: Yeah, sure. Steve: So it was an established team. Is that right? Carol: Yeah, it was. There were three people on the team, and there was actually an open headcount when I joined. We ended up doubling the size of that team. So we still remained a relatively small team, but we did get some additional people. 
I think some of the work that I’m really proud of there is that we went from having these researchers doing sort of very siloed work, where even though they were all researchers, they were hardly working with each other, and then developed that team to the point where we had a lot more strategic impact. We started a voice of customer program. Two of the people on the team became managers during the time that I was there. So they saw a fair amount of professional growth. And when I left, again, there was this voice of customer program established, as well as a program to train designers and PMs and content managers to do some of their own research. On the market research side, they were doing some brand work. We were doing some explorations about how that played out in product. So there were more things that are more sort of horizontal activities we were doing, and also empowering some people to collect their own insights, as well as deepening the impact of our team. Steve: When you talk about coming in and the researchers that were there were siloed, my mind starts to go to that embedded word and what that means. But I think you’re talking about siloed in a grander scheme of things. But I don’t know, what does siloed look like then? Carol: I think it’s a really good distinction. The difference between siloed and embedded to me is that embedded can be and is a very valuable way to participate in a product development team. And we ended up with this sort of hybrid model, I would call it. Because at the time that I left, the team was reporting ultimately to me, but they were dedicated to specific focus areas. So we had one person working on the logged-in experience, and that involved maybe three pods. We were calling them pods, but squads, whatever, product trio, whatever language we use to talk about the combination of the PM, the designer, the content strategist, and some number of engineers. 
So we’d have one researcher per, let’s say, three of those pods, but they were all within a focus area. So one was dedicated to the logged-in experience. We had, for example, a couple of people working on what we call the guest experience, or shopping. So, I should say NerdWallet is a company that provides advice and products to consumers who might be looking for financial products. Consumers might be looking for a credit card or a mortgage or a personal loan or whatever. So you can either go and read some articles and then get linked to some potential credit cards for you based on what you’re interested in and your credit score and those kinds of things. Or you can download the app and get tailored advice based on your specific situation. Those were, at the time, separate areas of the company in terms of the way the development was divided up. So I think embedded to me is there’s a very healthy relationship with those pods, where the researcher is either dedicated to one or maybe crosses over a couple of those areas, of those pods. But siloed to me is people are working on something so exclusively that maybe there isn’t a lot of conversation across. And I think what you lose in that kind of model is the opportunity to take advantage of research that might be going on in an adjacent area, or even a very different area that has relevance to what you’re doing. And you can have a lot more efficiency across the research function if you’re not re-doing work, you know. Or people are learning techniques from each other, you know. Or people are partnering so that there’s some broader impact across these different focus areas. Because to the consumer, to the ultimate user, the customer, they’re not seeing it, right, as these sort of separate areas. They’re seeing it as one experience. And sometimes in order to do product development, you have to divide things up. 
So how do we keep the flow and the things that need to be similar across the experience, and have it make sense, by looking for those areas of similarity or continuity or whatever the word is that you want to use there. Some of the things that we did worked really well. So first of all, I should just be really clear: because it was a manageable team, small enough, we could do things like have team time every week, where researchers had a dedicated time, I think it was an hour or something, where they could talk about some of the stuff they were doing, present problems to each other, learn from each other. Have time to be able to say, I’m doing this thing, I think there might be some relationship to what you did last year, or what so-and-so did who’s not even here anymore, and what can we talk about there. So with a small enough team, you can definitely have people embedded or partially embedded within specific areas so they’re having maximum impact in those areas, but still have conversation across. So I think that’s one thing that we did. Another thing we did was have a kind of loose repository. We weren’t using a really fancy tool. We literally just had a wiki where all of the research that was done was available. So people could go in and see what had been done and see if there was something relevant to them. And that could be product managers, designers, anybody could go in and look at that. And then they’d usually come back and ask us questions. Hey, I saw this thing, I wonder how that can be relevant to our team. So I think there are a few things that you can do. Steve: You mentioned that you put in programs to teach other folks who are not career researchers to do research. What did that look like? How did that work?
Carol: The way that I’ve seen that work well is to create a three-part workshop series. So we start with these three workshops and then we do ongoing coaching. So it’s not just a matter of taking a training session. The first workshop is really setting up the research for success. And that’s really about planning a study. There we talk about starting with the business objective. People will often start with a research question. Like, we need to know X. Okay, well, why do you need to know X? There’s some business reason why you need to know it. So what’s the thing you need to know? Why do you need to know it? What decisions will be made as a result of that? And then what’s the best way to get that answer? Obviously, in what timeframe do you need to know it, and those things as well. But starting with that framework gives people an appreciation for the fact that we don’t just run a study because we have a question. We put context around it. Even if it’s a lean study. And I’m using the language of run a study, but the language that some people are using is having conversations with customers, or collecting insights, whatever language people are using. It’s the same thinking. And in that first workshop, we talk a lot about reducing bias, making sure we’re not asking leading questions, or in the way that we’re writing a task or a prompt that we’re going to put up on an unmoderated tool for a participant to engage with. We talk a lot about how to do that in a way that those are going to be effective. And by the end of the first workshop, everybody has a lightweight research plan. I give a template. So there’s a template that has all those elements in it. And there are a lot of tips and tools and sample questions and sample tasks.
So it’s pretty plug and play, but the foundational understanding is there in terms of not introducing bias and some of those other elements. The second workshop is literally: run a study. When I was at Edmunds and we were doing in-person research, we would recruit a bunch of participants to come in, and we’d have designers, PMs, engineers running their own interviews, and we’d sit in and give feedback often. Now what we do is all unmoderated. These workshops are all online now, remote. So it’s an unmoderated tool, and they set it up. They set up their study in the tool, and then we wait for the results to come in, the videos or whatever. And then the third workshop is, in researcher language, synthesis. And that’s the, how do you go from all this data that you just got to actionable insights? So we talk about the data. We talk about findings that come from there. We talk about which insights are really most important. And then we talk about prioritizing those insights according to the business objective, back to the business objective, back to the decisions that need to be made. What are the most important of all of those insights? ’Cause you might get a lot of stuff out of even a lean study. What are the things you need to take action on? And then we talk about taking action. Again, there’s a template for summarizing the findings of their study, and there’s a table that shows: what was the insight? Okay, it was high priority, so we’re going to take action. We’re going to do this thing. Who is going to do this thing? It’s assigned to a team. Who’s the point person? Maybe it’s the PM on the team. Maybe it’s the designer. By what date is this thing going to be done? So everybody on the team has now agreed, beyond the person that ran the study. They go back to the team, have the conversation. Everybody has agreed: here’s what we’re going to do as a result.
And then that table goes into their summary. And then there’s a way to go back, if the person that’s running this study is not one of the people on that team (in this case, they probably are, because they’re the designer or the PM or whatever), and see what was actually done. When was it done? What impact was gained by that study? And you can then add the impact to your impact tracker. Then there’s the coaching that happens after the training. And that’s really vital to help people sear in the knowledge from the training and get support as they go along and ask questions. So sometimes the designer will go to the researcher who led the training and ask, will you take a look at my plan, ’cause I’m going off the template a bit and I wanna make sure this makes sense. Or they have a question about synthesis, because they get something that they didn’t anticipate and they wanna talk through how to do it. Or they need help figuring out how to message something to somebody that wasn’t on the team but needs to get these insights. So there are things that come up, and sometimes it’s feedback on something that they’re doing. Like when I was at Edmunds and we were doing live interviews, we’d actually have a conversation after each set of interviews with the people that were running them: how did you think that went? And if we saw something that maybe they could benefit from, we would share that with them. So I think that’s a really important part of it and something I incorporate into the workshop that I do. It’s not just the training, it’s the follow-up coaching as well. Steve: I think there’s a lot of hand-wringing off and on over the years about the risks and the consequences of, what did you call them? Non-career researchers, that’s a great term. Carol: People who do research, I think, is what people are saying now.
Steve: You know, we talk about the consequences of these kinds of programs that allow non-career researchers, or people who do research, if we empower them, as we’re kind of sort of the gatekeepers of the skills and the knowledge to do research, which may not even be an accurate framing anyway, ’cause people are doing research anyway. Carol: Yeah. Steve: There’s sometimes some hand-wringing about unintended consequences, or intended consequences. I don’t know, with these programs at these different organizations, were there longer-term kinds of changes that you noticed? Carol: Yeah, I think it’s a good question. And I’ll just say, I don’t have that argument anymore with people. I have stopped trying to defend how this can work and how I’ve seen it work well, because the fact is, I’m just really realistic. First of all, I’ve seen it work well in this way that we talked about, where there’s this training set and then this coaching activity, and there’s a conversation. It’s an ongoing conversation. And one of the things that I’ve seen come out of that that’s been really beneficial is that people who have gone through this program, when we’ve been in-house, tend to have a better sense of how to work with research and a better appreciation for the research that the career researchers are doing. And that partnership is richer. I have seen it go awry. I’ve seen people go through a few workshops, refuse the coaching, and then do things like put an app in front of consumers and say, “Do you like it?” So it’s not without risk. I totally get that. At the same time, I’ve stopped having that discussion with the researchers that are worried about the field being diluted. And I’ve stopped using the word democratize, ’cause we’re not democratizing. We’re helping people do stuff that, frankly, they’re already doing. So why wouldn’t we help them do it better?
And I see it now in the consultancy. If I say that I’m focused on helping leaders in companies that maybe don’t have a research leader, or maybe they’ve got one or two researchers, or no researchers, and they’ve got all of these other people out having conversations with customers, why wouldn’t I want to help them do that in a way that’s gonna be more effective, where they’ll get good data? Because we all know that if you just go out and put an app in front of somebody and say, “Do you like it?” you’re not gonna get good data. It used to be garbage in, garbage out, right? That language still applies decades later. So yes, there are risks. I know what the risks are. I think I named one of them anyway. People just go, “Well, why can’t I do persona research?” Or whatever, probably not the best example, but just helping them realize: there are things that you need to know, and I get that you need to know those things, and you’re probably not gonna get what you’re looking for with this method. And so having those conversations. It doesn’t mean that once I leave, they’re not gonna try to do that anyway. I can’t control that. Even if I’m in the company, I can’t control that. So I think the risk of people who do research, or non-career researchers, doing this without any guidance is greater than the risk of them thinking they can do something that they really need a career researcher for. And this is not unrelated to, I mean, it’s a bit of a tangent, but it’s not unrelated to the thing that we see where companies think they want research and they hire someone. I was seeing more of this before the big layoffs happened, starting at the end of 2022, I guess. I was seeing more first-researcher roles that were a player/coach, kind of lead manager, which I think is great. That’s what I would advise clients to do: if you’re gonna get one person, make sure they’re at that level.
But I do still see companies hiring more junior people. And I know what they’re thinking. They’re thinking, we need someone to do some research. So they’ll get someone who’s very smart and very well-trained in their research chops, but they may be at a senior researcher level, or maybe more junior than that. And then they’re overwhelmed. They don’t have a sense of the landscape or how to manage in that kind of an environment. They aren’t getting mentorship in their research work. And then at the company there can kind of be a, well, that didn’t really work out, so we don’t need research. Instead of research being seen as a concept or a practice, it kind of gets associated with a person. And then they go, we don’t need any researchers. We just need to do this ourselves. And I’ve seen that a bit. I mean, some of the people that come to me for coaching are people in that situation, because they’re researchers that are not getting mentorship, and they’ve kind of been thrown into this situation where they just don’t have the experience to be able to manage all the pieces that go with it, because it’s not just about running studies. And I totally get the excitement about being the first researcher, you know, when someone wants you to play that role. I mean, there’s a lot of trust that goes into that. I also know people that are sort of senior researcher level, and I’m just throwing these out, it just depends on the person, but who would say, when they’re in a career search and we’re talking about their career search, I don’t want to be the first person, ’cause I know what’s involved in that.
So, you know, I get why someone would take that job, even if they maybe have like a couple of years of experience, ’cause it’s exciting. And I also hear from people who are probably qualified, who have been working for six or seven years, and again, those numbers are just, who knows, it depends on the person, and they’re like, I don’t want to do that, ’cause I know how hard it is. Steve: We sort of shifted in this conversation a little bit to talking about your consultancy. What did you start and why? Carol: Towards the end of my time at NerdWallet, I had been getting calls from coworkers asking for help to set up a research program. Like, how do I get started if I want to set up research? And I was just having these conversations and realizing that I was really excited about this topic. That beginning point is really exciting to me, right? So when I left NerdWallet, I started looking at open roles at the time. And they were this, like I was saying, player/coach kind of role, right? And so you’re doing some of the bigger research while you’re setting up operations, while you’re setting up a roadmap, while you’re setting up all the infrastructure and everything. And I had already done that. I had done it a couple times. So I realized I wasn’t excited about doing that again. And what I was excited about was the leadership components of that. And so the coaching or advising, and we can talk about what I think the differences are there, but, you know, the training, helping people become more self-sufficient, helping leaders feel like they’re stronger at supporting a research practice, whether they have researchers or not. Again, like we were saying earlier, helping designers, PMs, et cetera, feel confident that they can collect insights. If they’re going to do it anyway, we may as well help them do it well.
So those are the pieces that I realized I was more interested in. And also just having conversations with people about the importance of operations, and thinking about research ops from the beginning, or the middle, wherever you are, and how that can be such a force multiplier. Such a way to move forward more quickly by spending some time on infrastructure, tools, templates. Having some kind of process, having some way for people to capture the insights that they’re collecting and share them, however that looks. Things that are going to help you do things better and faster later. So those were the pieces that I was really interested in. And I decided to just go out on my own. I was out on my own for a while, through most of the 2000s, but that looked more like contract research work at that point. And I was doing that in parallel with other work that I was doing that was not tech. But at this time I was like, I’m going to go all in on this consulting model and see what happens. And that was towards the end of 2022. Steve: Since you teased us with coaching versus advising, I’m going to ask you to take the bait. What do you think the difference is? Carol: I mean, I think, and this isn’t genius, I think this is the way that a lot of people distinguish those. But let me start with advising. Advising is more like I’m working with the head of design, or the head of product, or someone on that team that’s in a leadership role, to help them see for themselves how research can have more impact, again, whether they have researchers or not. So advising has much more of a, we’re in a conversation and I’m giving them ideas or tips. Coaching is more of a, I’m working with, well, I don’t do the big sort of life coaching or big-picture career coaching.
Like, should I do this anymore, necessarily? Because I’m not trained as a coach where I would do that life coaching kind of thing. It’s more like, somebody is in an ops role and wants to shift to a research role, and they have all the training to do that, but people aren’t seeing them as a researcher. What do they need to do with their portfolio, their resume? How do they need to talk about the work? Somebody gets laid off, it’s a surprise, and they’re trying to prepare for their next role. Somebody is, like I said, in a role where they’re the only researcher and they’re not getting mentorship. They got feedback on a specific thing and they don’t really know how to work on it, and their manager isn’t really helping them figure it out. It’s a very specific engagement around a topic where we can say, here’s the end goal, and here are the steps that you can go through to get to that end goal, and what are the milestones that we can look at along the way, even if it’s just four weeks or six weeks. It’s a very specific set of things that we’re doing to get somebody to a particular place. Whereas advising, there’s also a set arrangement, a number of sessions or whatever, but it’s more like me tossing out advice or ideas, maybe more than I would in a coaching model. Steve: I’m going to use a word you haven’t used, but when you talk about coaching, I think a little about facilitation. Whereas in the advising, you have a best practice or an idea or suggestion. In the coaching, you’re kind of working along the path to get this person to articulate specific goals, that kind of thing. Carol: It’s kind of like they are going to do the work to get to a certain place, and I am helping facilitate that. And the way that I work with people in coaching, there’s actually a worksheet that we use.
And the worksheet starts with, again, I should distinguish, I’m not doing the big-picture, what is my life about, but I do start with: what’s your mission statement as a researcher? And what is your broader goal over the next few years? And then what are you trying to get to in the next few months, whatever that timeframe is? And it’s a worksheet where it’s literally, what steps are you going to take to get there? How are you going to know that you’ve achieved that step? So what milestones are we looking for? What does success look like? When are we going to say you’re done with that step and maybe addressing a different step? And so it’s not super linear like that, but it really is a template. And I found that that worked really well. I actually developed the template when I was at NerdWallet, because I found it worked really well for the team to help them think through either the broader, like, I want to get to be a manager, how do I do that, kind of thing. Or the very specific: they got feedback on a performance review about something, and over the next few months they want to work on it. And so it’s a really simple template and approach, but that’s how I keep the coaching engagements to a particular goal that people are going for. Steve: So coaching engagements, advising engagements, what are the other ways in which you’re working for whomever? Carol: So I do workshops. I have one workshop that’s really targeted to researchers, or, I mean, it could be anybody, but mostly the people who come are lead researchers or managers or senior researchers or designers. It could be PMs as well. That’s Prioritizing Research for Impact. And you know, there’s a lot of conversation about impact. It’s really the thing that we have had to make sure that we’re measuring, right? And what is impact?
We can talk about that in a minute, but the workshop is about how to think through how you’re going to get to impact. It’s not just run the studies that you want. It’s not just run the studies that somebody’s telling you they want. It’s: what’s the business objective that we’re trying to achieve? What decisions are going to be made if we have this information from a particular study? What do we already know about this? And then we go through a framework based on clarity, risk, and cost. So what do we already know? That’s clarity. What do we still need to know? What’s the risk of going forward without more research, or any research? And what’s the cost of doing the research? What’s the cost of developing this? And there’s a worksheet, really a spreadsheet, that we toss all of this information into, and we have the conversation about each of these possible research projects. And then at the end, you can see what’s high priority, what’s medium, what’s low priority. And then we also talk about who you involve in this prioritization process. How do you involve partners? And then when we get to the end, who’s the ultimate decision maker for research? Sometimes people come into the workshop and realize that the person who’s making the ultimate decisions may not be the person who should be, really. So that’s a conversation to have. And then after the decisions have been made, what are some best practices to convey prioritization decisions? Transparency. Share the work, show people how you got to that decision. Hopefully they were either involved in the conversation up front, or someone on their team was who has helped them understand the process. And so nobody is super surprised at the end, ideally.
And then sharing out the results: literally share the worksheet with everybody that needs to have it, so they can see what decisions were made, which projects were prioritized against what other projects. And then for each of them, if it’s low priority and you’re not going to move forward, how do you communicate that? If it’s high priority, how do you communicate that? And then we end up with a lot of things that are sort of medium, like we need to do something, but we don’t need to do a fresh study. And so maybe a researcher is going to go sift through what we already know, and that will save the team time by not doing fresh research, because we already know a lot about it. So we have high clarity, but it is high risk to move forward without doing anything else, and the cost to do this research, meaning go through this stuff, is pretty low relative to the cost of going through development and getting it wrong, which is pretty high. So we pull those levers. In the workshop, we go through this for three research projects, so people can actually do it. By the end of the workshop, they’ve prioritized three projects, and then they can take that back to their organization and use that tool, the worksheet. It’s on Maven, which is actually a really good platform for all kinds of workshops and leadership. There are workshops on AI now, there’s all kinds of stuff in there. So that’s the one that’s about prioritizing research for impact. I also have one that I literally call See Maximum Impact from Customer Conversations. And that one is about creating a game plan for, you can call it your research program, you can call it your customer insights practice, however you describe the thing you’re trying to do by having customer conversations. That one’s really tailored to leadership.
So the people that come are usually head of product, head of design, product ops, UX leaders, whatever, it’s a leadership role. It could be somebody who’s starting a research team who’s a researcher and hasn’t done this before, the kind of player/coach person we were talking about. But the idea is at the end of that workshop, we have a game plan. We do a gap analysis: what’s the current state of research (I’m just going to call it research, shorthand), and what’s the ultimate desired state. And then let’s make a very specific three-month plan to get there. And we look at infrastructure, meaning tools, processes, training, whatever’s going on there, the operational pieces. We look at staff; that could mean you have a researcher, it could mean people doing research, it could mean there’s some operations person on another team that’s helping you recruit, could be anything. And then organically in the conversation, we start to talk about the research roadmap, because people will come in and they’ll go, well, the most important thing we need to know is X. It’s not a workshop to lay out your whole research roadmap, but those pieces come in: the thing we ultimately need to know is this, and right now we need to know this other piece. So yeah, that’s also on Maven. I’ve run it internally within a company for a handful of leaders, and I’ve also started running it on Maven. The third workshop that I have right now is the training for designers, PMs, content strategists, whoever, to do their own research. And that’s the thing that we talked about earlier: three parts, planning a study, executing a study, synthesizing to get to actionable insights, and then some coaching. That one I’ve been running within companies, and I’m going to put it on Maven soon. I can still run it within a company, but it’s in the process of also becoming available on Maven.
Steve: The one for leaders, the title has customer conversations in it, not research. Carol: What I’m finding, and I’m not the only one, I’ve been in conversation with a lot of people who are finding this. I mean, in this conversation, we’re talking about research, I’m using that language. But my target audience really is head of product, head of design. And there can be, and in the last year and a half it’s become even more of, a challenge with the word research in that audience. Sometimes people think it means it’s going to be big, it’s going to be expensive, it’s going to take a lot of time. And yeah, sometimes it might be big, expensive, and take a lot of time, if what you need to know is foundationally something really important to your business, right? Something you don’t know that’s going to make it or break it, kind of thing. But I think a lot of what people need is not necessarily that. And I don’t want those leaders to think that having conversations with customers needs to be big, expensive, and take a lot of time. Of course, they’re doing research, you know. But if you look on my website right now, the word “research” does not appear until you scroll below the fold. And so I’m experimenting with ways to talk about the offerings that go beyond the word research. Unfortunately, I used to be much more of a purist, many years ago, earlier in my career: well, people need to know that research can be lean. Yeah, people are going to figure out that research can be lean because we’re going to do it. I don’t need to be preaching about it, and I don’t need to be stuck on using that language. I think one of the things that’s held us up in the past as a field is that we’ve been too attached to language, process, ideas that aren’t necessarily current anymore.
And so I’m like, call it whatever you want. We’re going to do this thing, and I think it’s going to help your business, and I’m not attached to the word. Steve: When you started talking about developing this business for yourself, you kind of hinged on what was exciting to you. And I’m wondering, now that you’re up and going, do you find it’s a different experience for yourself when you are doing this, say, through Maven, and it’s for the public, for lack of a better term, versus working with an organization and kind of going into that organization? Are there any differences for you between those different kinds of venues? Carol: You know, I have this really deep background in teaching. And so for me, leading workshops is really fun and really exciting, and working within an organization can also be fun and exciting. They’re just different, and I enjoy both. I like bringing people together from different organizations and seeing the kinds of experiences they bring into the workshop, and they get a lot of benefit out of that conversation. I mean, this is the feedback that I get: not only was it valuable to get the material and the worksheets and whatever insights I’m bringing and facilitation, but the experience that other people are bringing in, when we do this publicly, is really valuable. And frankly, sometimes I have to really rein it in, because they can just start going on and trying to help each other solve things, which is great. And I love it when, at the end, people say, let’s connect on LinkedIn and keep the conversation going. I’m actually about to set up a way for people to keep the conversation going across cohorts. So that’s something that I’m going to be doing later this year as well, because there’s so much benefit that people find from checking in, you know?
So yeah, it’s different, and they’re both interesting to me for very different reasons. Steve: You had offered to give a little more definition about what impact meant. So I want to loop back to that. Carol: I’ve been looking at and following what other research leaders are saying about this too. And I think that one thing that we seem to all agree on is that impact goes beyond what I call product impact. So, you know, it’s pretty obvious that impact means we do some research, we come up with some insights, we take the most important of those, and we do something to change the existing product, or we move into an area that’s new, and we see some kind of impact that we can measure in terms of lift in engagement or revenue or customer satisfaction or whatever the thing is that we’re measuring, from a business perspective. That’s one kind of impact, but there are other types. And so I think there are three things. One, product impact, like I just described. Two, organizational impact. And that’s stuff like what we were talking about earlier: seeing teams understand better how to work with career researchers by going through the process of learning how to do some research for themselves. I would call that organizational impact. And three, operational impact, which is stuff like efficiency. This, again, relates back to something I said earlier, but we try to prioritize the most important and most impactful research. We look at something where we already have a lot of information. We have a lot of clarity about this problem, but maybe this team doesn’t know it. So for example, a real example: there was a new team spun up around a very important initiative.
So the product manager, the designer, and the content strategist were all new, but there was a researcher who had been doing research in that area. And so the product manager, designer, and content strategist thought they needed to do a six-week sprint to uncover where they needed to go with this very important thing. And the researcher knew that there was a lot of information already. The researcher spent something like half a day going through all the information that they had, sat down with this trio, and shared the information with them in an hour. The researcher spent like four hours; we can calculate the cost of that time. It saved this trio the first three weeks of the sprint; we can calculate the cost of the time that they would have spent, and do the math and say, we spent X dollars and we saved Y dollars here. And they were able to go straight to concept testing because there was all this foundational work that had already been done. So that’s an example of operational efficiency. And there are some people talking about this, but I don’t know that we’ve spent as much time on those calculations as a field as I think we could.

Steve: Are there impacts that are not measurable, or not easily measurable, but still kind of make your list?

Carol: I’m sure there are. I think I’ve pulled my list down to three. I mean, if you look at some of the things that people have been writing about, there are these much more detailed models. I think it goes back to: what are you going to do with this impact? If we want to be able to go back to the leadership team, or we want to be able to put what our impact was on an OKR spreadsheet at the end of a quarter, we need to make it digestible by other teams and leaders. And so, having studied this for a while, I feel that everything kind of rolls up to one of those three areas.
So I haven’t found something that doesn’t roll up to those three areas, let me put it that way. And I think that if we keep it simple like that, we’re much more likely to be able to say, we saved X dollars by not doing a bunch of extra research on this project. And that’s something that we can talk about very clearly. I think the other thing related to this is that it can be hard to measure impact, period. And we know that. So if you’re a researcher who’s a shared resource across multiple teams, you work on one thing, you go off to work with another team, how are you going to know what team A did a month later unless someone comes back and tells you? You may have to go back and ask, hey, what happened from that study? So you know how to describe the impact that you’re having. But we need to be making the effort to try and find out. I mean, it’s hard. As a consultant, it’s hard for me to know what the ultimate impact is of these workshops and the coaching and the advising unless people tell me. And I also know quite well from my teaching experience, sometimes people learn a thing and it’s not until a while later that it actually kicks in for them. So I think that when we’re talking about training that doesn’t have a direct relationship to work that’s happening right now, yeah, it’s hard for me to even know what impact I’m having. But I think it’s really, really important for us to continually try to make sure we can get as much feedback as we can about that. So, obviously, none of us knows the future, and we can’t talk about the future unless we talk about how we got where we are and where we are now, right? So I actually want to back up to the difference between nine years ago and now, because I think it’s relevant to this. When you first invited me to have this redo conversation, one of the prompts was what’s changed, and my first thought was: everything.
And then I went back and listened to the original conversation from nine years ago, and I realized, oh, more than everything has changed. And nine years is a long time, so we would expect that there would be shifts. But aside from the obvious, like the pandemic, remote work, that kind of stuff, just listening back and thinking about the way I talked about the work then, the way we all were talking about the work then, and the way we talk about the work now: we’ve been talking about impact. I haven’t used the words qualitative research or design thinking. And the last conversation was all about that, because that’s where we were at that point in the industry, and so that was what was making the work successful then. If we look even further back, at the history of the internet, right, I was at GeoCities in 1998. We were making it up as we went along. I remember reading the IPO paperwork, and it said, we have no idea how we’re going to make money from this thing. And that was normal. And then we had the boom and we had the bust. And then through the 2000s, everybody was talking about design thinking, into maybe the late 2010s. Now it’s all about impact. So the way that we characterize the work has really shifted. I think for me also, when I think about the future, being at this point in my career, I start asking myself, what is my legacy? Which sounds really fancy. It’s not like I think I’m capital-L Legacy, like I’m a celebrity or something, but I think we all kind of go, I’ve been doing this for a long time. What am I going to leave this field? What am I contributing, and what impact do I want to have now as I go along? And then what am I going to be leaving whenever I decide to stop this? So I kind of look at all of that and I go, where are we now? What does the future look like? Obviously AI. I mean, we don’t need to say much more about that.
We need to figure out how we use AI tools, and that’s changing every single day. How do we use those tools to help the work that we’re doing now? I mean, when people ask me, what do I need to do? I actually had a call like this yesterday; the person got laid off. What do I need to be thinking about, and what do I need to do to position myself for my next role? It’s like, you need to be studying AI tools, and if you haven’t already done that, jump in. Right? So that’s one really obvious thing. I think another thing that we’re seeing now that’s not going to go away, that’s going to be in the future, is this idea of people who do research, right? Non-career researchers collecting some of their own insights. We can’t stick our heads in the sand and say, make it go away. It’s not going away. It’s here, it’s been here for a while, and we need to figure out how to jump on that. We need to be mixed methods researchers. You know, it’s funny, because when I started, I came out of human factors school and that was very quantitatively focused. And then when I started working, I started at an era when the work was very qualitatively focused. And now we’re shifting back towards generalists. So I think everybody needs to be some kind of mixed methods researcher. And I think most people are going to end up being sort of T-shaped, where you’re very strong in some areas more than others, but I don’t think we can go out anymore and say, I only do ethnographic, deep qualitative research and I don’t know anything about writing a survey. I just don’t know that that’s going to be possible moving forward. And another area that I think is really important for us, for people who haven’t already been doing this, because some of us have been doing it for a while, is triangulating insights across different sources.
So knowing how to dive a bit into analytics data, understanding something about behavioral science if you don’t already, making friends with the people who run customer support so you know what they’re hearing. Do you have a market research function? All of these other insights functions that I personally have seen work really well when we’re totally working together in a very collaborative way. If you’re not in an environment where the culture is that collaborative, then at a minimum knowing what they’re doing, but having some way to look at things across multiple types of insights functions. So this is a bit of a personal aside, but it’s very relevant to this question. I went public in January with the fact that I had lung cancer late last year, and I decided to go public with it because I thought it might be valuable to people. And in terms of what that did for me personally, I went through a full examination of my whole life and career. Oh, my God, do I want to keep doing this? And what I realized is, I’m happy making the contributions that I’m making, even though it’s hard to directly measure impact. I’m hearing from people that they’re finding value from the work that we’re doing together. And so that’s what I want to leave the world with. I want to leave people with this idea that the work that we’ve done together is valuable to them, whether it’s tomorrow, when they’re taking the prioritization worksheet back to their company, or we have this coaching conversation and two years from now they see value in it in some way that they couldn’t have anticipated. So I think that’s really vague and broad. But, you know, I’m trying to be as clear as I can about focusing in the areas where I think I can make the best contribution and have the most impact.
And what zaps my energy, what gives me energy, like you were talking about earlier: I really like teaching these public workshops, as well as doing the work internally. So I’m going to keep doing the public workshops. Yeah, just keep reexamining: how do I feel about the work that I’m doing, and what am I getting back from people?

Steve: I think you’re saying that going public with your medical situation prompted people to reach out to you in a way that highlighted for you the impact of the work that you’re doing. Is that correct?

Carol: Yeah, it was one of the things that did that. I mean, also just literally in terms of the impact of that article. Many people have told me, oh, I went and got a checkup because I realized I hadn’t been taking care of my health. Oh, I smoked many years ago, I should go check that out. Or, I hugged my child more closely, I called my parents. The human elements of it, as well as the physical health elements, were really rewarding. And I don’t know what I expected, but for some reason I didn’t necessarily expect all of that.

Steve: Well, yeah, you have no template, no prior sense of what the response to that is going to be.

Carol: No template. I mean, just to throw this out there, as another example: I didn’t write this in the article, but I didn’t even know how to tell the clients that I was working with. At first I said, I’m going on sabbatical, and people thought I was taking a fancy vacation. And then I said, well, I’m taking a medical leave, and then they worried a lot and started Slacking me: are you okay? What’s going on? How are you? What do you even say when you need to take two months off, or whatever it was, if you don’t want to disclose? Because I wasn’t ready to disclose that. So I don’t even have a template for that, is what I’m saying.
Steve: Yeah, and now you’ve had that experience, so you’ve learned from it.

Carol: And hopefully helped other people.

Steve: We’ve been talking in and around impact at various levels, and yet with this article, the examples you just gave from your writing, maybe it’s outcomes, not impact. I don’t know, I don’t want to jargonize it, but these are the kinds of things that happened as a result of your action, that you found meaningful and that people reported back that they found meaningful. I don’t want to take a personal experience and try to force-map it into something professional, but I guess I’m just seeing echoes throughout our conversation. Someone saying “I hugged my kid” is very interesting. That was the action they took, that was something they shared with you, and that was something that had meaning for you as a result. When we started off talking about founding your consultancy and determining what you wanted to do, what kind of offerings you had, I was struck by the fact that you used the filter of what excited you. Now we’ve been talking about changes and even looking ahead, from the present moment to “future.” So maybe try to tie those things together: are there things about the near future, the distant future, whatever time horizon we have for future, with the work that you’re doing, that excite you?

Carol: The thing that I’m excited about for this year is to do more of the public workshops. And so, as I think I mentioned, I’m going to roll out the lean research for designers and PMs workshop to be public. I’ve got some other ideas that I’m working on. Some of the pain points that I hear from customers are about finding the right participants for research, which is a lot easier in B2C than it is in B2B, but there are some things that we can talk about. That’s going to be a workshop. I’m also thinking about having more conversation around knowing: when do you do this yourself?
And when do you hire a career researcher? What are the operations that you need to put in place to have your conversations with customers be effective? There are topics like that that I’m exploring for either short workshops or longer ones, because those are things that I’m hearing about, and I like that public forum. So I’m excited to be rolling those out later this year.

Steve: Carol, it’s really great to have this chance nine years later to talk about what’s changed, which is more than everything, and the work that you have done and are continuing to do.

Carol: Yeah, thanks so much for including me.

Steve: Thank you for taking the time. It’s great to chat with you.

Carol: It’s been really fun.

Steve: That’s it for today. I really appreciate you listening. Find Dollars to Donuts where podcasts are podcasted, or visit portigal.com/podcast for all of the episodes, complete with show notes and transcripts. Our theme music is by Bruce Todd. The post 41. Carol Rossi returns first appeared on Portigal Consulting.
40. Gregg Bernstein returns
In this episode of Dollars to Donuts I welcome Gregg Bernstein back for a follow-up episode. He’s now Director of Research at Hearst Magazines. The thing that I always come back to is that there is no one way to do research. And I also think there’s no one way to do research leadership. So often when I post a video or write something, it’s a knee-jerk reaction to something somebody else might have said that I feel like is going to discourage folks or paint this industry in a negative light. I don’t want to sound like a Pollyanna, but I love this field. I think it’s invaluable. I think more companies should have a research function. And so anything that I write is usually meant to show that there’s opportunity, there is value in this work. – Gregg Bernstein

Show Links Interviewing Users, second edition Steve Portigal on the Understanding Users podcast Gregg Bernstein on Dollars to Donuts, 2015 Gregg on LinkedIn Mailchimp Vox Media Condé Nast Hearst Magazines Research Practice: Perspectives from UX researchers in a changing field Sian Townsend on LinkedIn Nicole Fenton How to Make Sense of Any Mess by Abby Covert Gregg’s site Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. Today, I’m chatting with Gregg Bernstein, nine years after he first appeared on episode one of Dollars to Donuts. For context, here’s a tiny clip from that episode.

Gregg Bernstein: And I’m a little disappointed that you didn’t start this interview off by saying this is two Jews talking about customer research.

Steve: But before that, did you know that there’s a new edition of my classic book Interviewing Users? The modern-day book tour seems to be, in fact, the podcast tour. And so recently I chatted with Mike Green for his Understanding Users podcast. Here’s part of my conversation with Mike.
Mike Green: You mentioned the world of work and how it’s changed, and the one thing that we haven’t touched on so far is obviously the pandemic and the COVID years, if I can call them that. I’m interested to get your sense of how that impacted research as a discipline. Speaking for myself, obviously the work has continued, and it’s continued at pace. But I can’t remember the last time I sat in somebody’s office or place of work, or even their home, and actually interviewed them face to face. Which, you know, in some ways speeds up research. You can get more done remotely. People are perhaps more relaxed if they’re sitting in their own homes on Zoom. But there’s a loss, I think. As a researcher, I find that not being in the context of the individual, surrounded by what’s on their walls and what’s around them and the movement of the environment, makes it harder in some ways to get the insights. But I’m interested to know your perspective on how the pandemic changed, for good or ill, what we do.

Steve: I mean, 100 percent to everything that you just said about loss. That’s the word that I use. I don’t know that it’s permanent. I think the world of work is continuing to change as we’re sitting here; it’s the beginning of the year as we’re talking, and I don’t know that I’ve seen 800 RTO articles, return to office articles, but it seems like there’s a constant discussion about that. And it’s interesting, because for sure the pandemic changed work. But it also triggered lots of bigger and more uncomfortable discussions about power: bosses and property owners that have a stake in how work takes place and where it takes place, and worker power pushing back on that. And depending on where you live and what industry you’re in, you’re going to see that more or less. So I’m saying that remote research is being affected by these much larger shifts that I don’t have any sort of brilliance on.
But I think the work continues to be in the middle of that. So I have not sat with someone in their place of doing whatever it is that they’re doing and interviewed them. And I just said at the beginning of our conversation, embrace how other people see the world. Well, that’s the way to do it, right? You let go of your thing and go to their thing. And it is harder. For me, they’re clients; for other people, it’s their colleagues. It’s harder for us as researchers to facilitate that “oh!” kind of reaction that we’re going for. We want people to know that their assumptions are wrong. And you can get these really jaw-on-the-floor moments that we work to facilitate. We work to create those, to uncover those narratives and have our teammates let go of their biases and their assumptions and their aspirations. And that’s hard to do without taking people out. Fieldwork was not only what we got to do; it meant that we could connect with people. We could see stuff that we didn’t know we wanted to ask about. We could be uncomfortable. We could be forced, as researchers. And then we could create, I think, effective experiences for other people, to also make the work transformative. And that’s a big fancy word, but we’re all changed by doing this. Yeah, I really miss doing that. You know, I have peers who are, like, oh, someone today was talking about some overseas trip they were doing to do field work. I don’t even have to go to some exotic environment, different than my own. I just would like to sit in an office or walk around a firehouse or something like that. So I think these things are going to continue to change. But there are two fronts, is what I’m trying to say here, right? What do we experience in the field, but also what do we experience with our collaboration and facilitation with the people we work with? And I think this also happens after the field work.
If everything that we do takes place in a remote workspace, and more often asynchronously, we’re also having fewer of those moments. I mean, I can think of times when I’ve had clients and colleagues off site, spending several days in a room going through this stuff and trying to make sense of it, and just having life-changing insights come up. And that is so grandiose, my language. I mean, when someone comes up with something that riffs off of something that someone else says, and you can just sort of feel a bunch of ideas come into alignment: it’s a really powerful intellectual, creative moment. And I haven’t had that for a while, since I’ve been working where my participants are in a Zoom room and my colleagues are, you know, before and after the research. And so, personally, I’ve struggled with the work feeling a little more transactional. And I think that is sort of coincident with other pressures on the work of research. So I don’t know, I’m throwing everything into a big, hairy, ugly ball of smooshed stuff. When you bring up remote and the pandemic, it’s like, oh yeah, there are all these things that are kind of connected to that, and I don’t know how to tease them apart in a sensible way. I think I’m being buffeted by those forces the way everybody else is. But yeah, I miss it. I think my bottom line is exactly what you said: there’s a loss there. And I hope we can evolve to a point where being in the field is a necessary part of what researchers do, what the team does, so we can have those experiences, which are so inspirational.

Again, that was me on Mike Green’s Understanding Users podcast. Check out the whole episode, and of course, pick up a copy or two of the second edition of Interviewing Users. To learn about my consulting work and the training that I offer to companies, visit portigal.com/services.
Now, let’s go to my recent conversation with Gregg. He’s the Director of Research at Hearst Magazines.

Gregg: This is Gregg Bernstein, and you’re listening to Dollars to Donuts.

Steve: Could we get a “two Jews talk about research”? You think you could do that?

Gregg: Again, me from nine years ago, not the sharpest tool in the shed when it came to naming things. But sure, you’re listening to two middle-aged Jews talk about research. Everyone’s favorite podcast. [laughter]

Steve: All right, well, what a way to begin. Thank you for that. When we talked nine years ago, you were working at Mailchimp, and I think you were maybe the second or third person I interviewed for this podcast, but you were the first episode that was published. So it’s really cool to have you back and talk about what’s changed for you and what kinds of things you’ve learned. Thank you. Do you want to talk about some of the different places that you’ve worked, and maybe compare and contrast what the work was like and what you’ve seen in that intervening time?

Gregg: Yeah, first of all, Steve, it’s a pleasure to be back on your podcast.

Steve: Great. Thank you.

Gregg: So when you and I first spoke nine years ago, I was the research manager at Mailchimp, and it was my first time as a research manager. I don’t think I realized at the time just how unique the Mailchimp situation was for a researcher. What I mean by that is, I had an almost unlimited budget to hire videographers to film our customers and make short films. We would create these persona artifacts that we would hang up around our office, so everybody would learn. We had a CEO who was a designer before he was CEO, who understood the value of designing for people and knowing who those people are. So he supported research. He wanted us to make the best designs, which meant knowing our customers. And I was spoiled rotten. And I realized in subsequent jobs that that was not how most research roles are.
And when I left Mailchimp, I joined Vox Media. And we went from being really precious about the deliverable of the research at Mailchimp to being scrappier at Vox. And I don’t mean that negatively. It’s not that we weren’t careful about how we did research in either organization. We were thorough. We made sure we spoke to the right people. We asked good questions. We did solid research. But the difference was, at Vox, I spent much less time on a project. If I spent a month on a project at Mailchimp, at Vox I would spend a week on it, because we had a very long list of projects that needed research. We had a quick-paced cadence of work. And so I learned very quickly that I had to work faster. I didn’t have to spend as much time creating these amazing artifacts, as long as I was answering the fundamental questions and putting them in Slack, or even a very poorly formatted Google Doc. As long as people were learning from the research, that was great. That was the gold standard. Did we learn from this? Did we make good decisions from it? If yes, move on. So that was a huge change in how I thought about research. And it also made me a better research manager, or leader, because I realized budget is not commensurate with quality. You can do amazing research fast, scrappy, on a budget. You don’t need those unlimited resources. And that’s not to say I wouldn’t love a bucket of money at my disposal; if I had to choose, I would take the high budget all the time. But I had to quickly learn how to get by with less: less time, less money. And you know what? It was a great experience, a great learning opportunity, and something that made me a better researcher.

Steve: What are the circumstances in which putting that effort into the deliverable is necessary or appropriate? I think you’re listing the times when it’s not.
There’s a big demand, and people are willing to consume it in the form that it comes and act on it.

Gregg: Yeah, that’s a great distinction you’re making. At Mailchimp, I think it was necessary to put so much effort into the presentation of materials because it was a young company that was growing fast. And so, yes, we wanted people to learn from the research, but we also wanted people to understand who our users are. So if you’re an engineering manager, if you work in accounting, you still need to know who we’re serving every day. What is the reason we’re coming to work? And I think that knowledge was maybe not distributed evenly. And so putting films together, creating posters, making everything so public, and investing in it sent a signal: you need to know who we are working for. Your job depends on it, directly or indirectly. And I think for that time in the company’s history, it was absolutely the right call. And just like at Vox when I joined: moving fast, banging out study after study, saying, this is what we need to know, okay, now we know this, let’s build a thing and move on. That was the right approach for where Vox was when I was working there.

Steve: So that’s a little about Vox and the change in culture, and already a big impact on your approach, one suited to how you all worked and the people you needed to have impact with. What was the next major role where your practice evolved yet again?

Gregg: I’m going to stay with Vox, because I spent four years there. My first two years, I was working on a team that was creating tools for all of our writers and editors: a content management system. And that was very similar to the work I was doing at Mailchimp, which was software for creating content and publishing it. At Mailchimp it was newsletters. At Vox it was news content. Or food content if it was for Eater.
Or tech content for The Verge, to name a few of the brands we worked on. But at the heart of it, it was: how do I understand the editorial process, and how can we make a better set of tools for publishing content, whether it’s an article or a map or a video or a podcast? And that was the first time I had worked on an internal team. So recruiting was no longer difficult. I could just get in Slack and talk to anybody in the company and say, hey, I’d like to talk to you about how you write articles. There was very little difficulty in finding participants and setting that up. That was the first two years of my time there. In the second two years, the mandate for my role changed from understanding how we create content to: how do people discover and consume content? And, again, this was in a remote-based organization that was a little scrappier. So I really had to think about how we build out a process for getting feedback from the hundreds of millions of visitors to our various websites. How do I work with not just my product organization but our editorial organization to understand what information would be valuable to them? How do I get support from executives to do this research, to make sure that once it’s done, they will have an appetite for it and learn from it? And so it was the first time I had to create demand and awareness and opportunities where none existed. Because it already existed within my product organization to build the content management system. But doing research that would support discovery and consumption ended up supporting marketing and sales. Because if we knew more about our audiences and what they valued and what they came for, we could put ads on our pages that aligned with who was coming to our sites. And that’s not to say we didn’t have demographic data, but we didn’t really have an understanding of why somebody comes to The Verge.
What is the next action that they’re going to take after they come to The Verge? And how can we make a better experience for them? So if they’re researching headphones and they’re looking for product reviews, if we know that that’s what they’re coming for, and we know that they spend a certain amount of money after the fact, we can sell ads against that. And we have a better understanding of, okay, people trust us for our product reviews; we should probably think about a better product review experience. None of this really was designed. We didn’t have a designed research process. And so for the first time, I was having to chart a new path, with the support of my manager and my colleagues, but I kind of had to figure out how to make this happen within a large organization, and figure out which people I needed to talk to, who I needed to get support from, who I needed buy-in from. And it was a fantastic learning experience, because there was friction. Not a lot of friction, but I had to convince some people of why we were doing this. I had to figure out, if somebody was resistant, how can I get them to support this? And then, how can I ask questions that will lead to insights that don’t just benefit me, but other parts of the organization? And so I feel like that was the moment when I really understood how to — I don’t want to say lead a research function, but how to get support and buy-in for research activities where maybe that didn’t exist before. And that’s what set me up for future research leadership opportunities. I feel like that’s when the training wheels came off and I understood the bigger job of being a research leader.

Steve: Support sometimes comes up as people being blocked from doing research.
And so I think you’re talking about how you had your own manager, your own team that’s doing your research, but you’re trying to make connections and help people see and engage so that the work that you do is valuable and they’re gonna act on it. Am I getting it right? Gregg: Yeah, you’re getting it totally right. So one example is on the website Eater, which is about restaurants and food culture, there is something called a map — well, it is a map, but there’s a product name for it, which is escaping me now. But you might have the 20 hottest restaurants in New York City. Or, you know, the 10 best restaurants that you should go to in Minnesota, or Minneapolis, to be more specific. And a project might be, okay, let’s make the process of building maps better. But at the same time, let’s look externally at how people actually use these maps to understand how we can improve the experience. So we’re trying to make a better editorial experience as well as a better user experience. So part of that is understanding, well, why does somebody use one of these maps in the first place? What is their goal? And to do that, we might need to get support from the editorial staff at Eater, which means working with their editor-in-chief or, you know, one of the editorial directors and saying, hey, we want to put a banner on Eater that says, help us improve Eater for everyone. We need to get their buy-in so they’re not going to their website and wondering why is there a banner on the top of my page. We’re taking people away from the articles that we’re publishing and pushing them to a survey or a screener to participate in an interview or usability study, so we need to get their support. But to do that, we also need to offer some sort of carrot. Like, we want to talk to people about their Eater maps experience. But while we’re talking to them, is there anything that you’re curious about? If you had an Eater reader sitting next to you, what would be on your mind? 
So I’m trying to throw in questions that will help my editorial colleagues, but I’m also focusing on what I need to know to improve the map experience for my team. So that’s where I need to get their buy-in and their support. And it means clearly explaining what we’re trying to do, but also saying this is also an opportunity for you to learn about your audience. And this way I’ve got their buy-in. There are no surprises when they see some sort of banner or call to action to participate in research. And they know that they’re going to learn something. And at the same time, our product organization is going to learn something. Did that make sense? Steve: That’s a great clarification. Do you have any examples of overcoming hesitancy or uncertainty in those folks that you needed support from? Gregg: The hesitancy is usually just around, let me understand what this is going to look like. So showing an example of this is what a banner might look like on your website for a mobile or a desktop user. Because there’s this fear that maybe there’s going to be a screen takeover that says, don’t read this article, click here and take a survey, and we’re not trying to create a bad experience. So it’s, I guess, demystifying the research process and showing, hey, this is what we’re going for. This is the goal of the study. This is what it’s actually going to look like on your website. And we’ll work with you on the language we use, you know, help us improve Eater or make Maps better for everyone. Thinking further, sometimes we would do these big studies of our audiences where we would need the editor-in-chief of one of our sites to write a call to action. Like, hi, I’m Nilay Patel, I’m the editor-in-chief of The Verge. We’re doing an annual survey to help us improve our site, not just the coverage that we’re writing, but also the experience of visiting our website. And we need your help. 
So if you read our content or listen to our podcasts, help us out. So it’s always a matter of over-explaining and saying this is exactly what we’re going for. This is an opportunity for you to learn as well. Let’s work together. And this way we’re all going to learn something. And worst case, we get a bunch of responses that maybe aren’t exactly what we wanted to hear, but we’re still going to learn from real humans who read our content or listen to our content or watch our content. Steve: When you started describing these last two years, you said the training wheels came off, and you ended with something I didn’t really pick up on at the time: something to the effect that this was really where you learned about design research leadership. Does that take us into the next role? Gregg: I think it does, because I joined Condé Nast as their research lead. Condé Nast is another publishing company. And the job I interviewed for was to be research lead for just their subscription brands, which are brands like The New Yorker, Bon Appetit, Wired. But shortly after I joined, in talking to my boss, the vice president of product design, we realized that I was the highest ranking researcher. And so if we were to have a holistic research process for the entire product design organization, we couldn’t just focus on subscriptions and subscription products. We needed to have research across the board, across all of our brands and divisions. And I was able to articulate that this is what the role should be. You need to have somebody who is looking at subscription products, but also the other parts of the company, like commerce, which means selling products through product reviews, which is something that Vogue does. Or some of the other fashion magazines, where you’re not just selling a subscription to a magazine. 
The magazine makes money by reviewing products and saying, like, here are the 10 best bags to wear or backpacks or high heels or computers. The company makes money through that type of content. And so going back to what I was saying, I was able to articulate that we should have research in subscriptions, but also in commerce. But also, having one research leader in charge of all research means that we can instill quality control. We can make sure that the researchers are collaborating, so that a researcher who’s looking at commerce and a researcher looking at subscriptions aren’t working in a vacuum. We’re not a siloed organization. We’re one research team that can collaborate or maybe move people around as needed based on what are the most important questions of the day. So I was able to see how getting buy-in, putting processes in place, managing a team, how I could take what I had done at Vox and apply it to Condé Nast and create a larger role for me at Condé Nast that was really necessary to make sure that the research was holistic and that the researchers were collaborating and that insights from one part of the organization were making it to other parts. Steve: You’re describing this point at which you go from having pockets of research, for example, to building a leadership role where there’s a person responsible for taking care of and ensuring all those qualities of research that you described. And that sounds like a point of evolution, a point of transition in the overall organization’s research maturity. Gregg: It was, because it was a point where I was able to work with my manager to look at the entire organization, all the content that we publish, and see where the gaps were in our knowledge. So I had a researcher who was embedded with The New Yorker. I had a researcher embedded with Vogue. That left something like 25 other magazines where there wasn’t research, at least not dedicated research. 
And so I was able to make the case that we really should hire somebody to look at all of commerce. How do we sell products? What would make for a better commerce experience for our users? I was able to make the case that we should have somebody who is looking at our subscription brands like Wired and Bon Appetit. And then make the case that we should have somebody who’s just looking at the member journey, because when you have so many different titles and so many different ways people are selling content, it takes some effort to know what’s going to resonate with somebody, whether they’re thinking of buying content for themselves or buying a gift for somebody else. Do they want a digital subscription? Do they want to actually receive something in the mail? So working with my manager and with the other design leaders, it became clear exactly where we needed to have resources in order to make sure we’re learning and supporting the designers and the product managers and the engineers to build the right product for the right people. So it was an inflection point. And it was personally great because I got to hire some really awesome researchers to fill those roles. Steve: What are some of the ingredients or elements that you are uncovering and articulating when you are making the case for those kinds of structural changes or role changes? What does that include? Gregg: I mean, first there’s pointing out that maybe we have a number of designers and engineers and product people working on a product with little to no contact with the humans who use that product. So just pointing that out and saying there’s an imbalance here in staffing. Or pointing out that a lot of designers and product managers are asking for research but not getting it because there isn’t the headcount or enough hours in the day to support those efforts. So those are usually the two places to start. There’s a demand, or there’s an imbalance and a vacuum of support. I also have used interns as a way to gauge demand for research. 
So if I bring in a summer intern and I put them on a project with a team and then the intern goes away, the team will suddenly realize that void in their life where a researcher used to be. So then you can make the case, hey, this team got used to working with a researcher. It’s really not ideal for them to go back to trying to do research on their own. If we were to open headcount, this is where a researcher should sit, as a backfill for the intern that we lost. That’s something I’ve done at Mailchimp, and I’ve done it at Vox. It’s a good tactic to test the waters and build demand for hiring a permanent researcher. Steve: Yeah, that’s kind of brilliant. It’s almost like a prototyping process. Gregg: I’ve also seen it where the intern did a great job, but maybe after they left, there wasn’t as much demand as we might have guessed. And while I always love to make the case that I want to hire more people, sometimes that proves that maybe that wasn’t the right place to hire. So it is like a prototyping process. Steve: I’m curious if you have any perspective on Condé Nast culture in terms of how work was being done, about how you were engaging with different stakeholders, or anything about research that is a compare and contrast with the first two companies we talked about. Gregg: I think what I can say about Condé Nast is it was the largest company I had worked for at that point in my career. So culture, I realized, is not set for an organization. Culture is maybe at the team level. So that was, I don’t want to say a shock, but it was very different where you realize that other teams have very different ways of working, of communicating, of supporting each other. And so I feel like I was able to instill a really strong culture for my research team. I feel like, you know, among my design manager peers, we had a really nice relationship, but I would not want to generalize the culture based on just the people I was working with. 
It was large and, you know, your mileage might vary depending on who you spoke to on any given day. I’m trying to be diplomatic, Steve. Steve: I like hearing how you’re unpacking it because, yeah, culture is this big label we kind of stamp on things. This organization is this culture, these types of people have this culture. But it is more local than global. Gregg: I would say that I was able to create this very supportive, warm, amazing culture. And partly it was that we would get together, you know, every three to six months. So you would get to have human contact, you know, real-life contact with people. But somehow, and I don’t even know how, if I could replicate it, I would. Even remotely, there was such a feeling of, these folks have my back and I have theirs, and I would do anything for these people. And that made its way into how we hired. Like, I don’t want to bring somebody in who is going to ruin the feeling of this organization. So let’s be really rigorous in how we hire. Not that we weren’t rigorous elsewhere, but it takes a special set of skills to communicate warmth and empathy remotely in Slack messaging, over a Zoom. And that’s something that I really cherished and something that I’ve been really mindful of ever since. Thinking about culture is a good way to transition to where I am now. I joined Hearst Magazines, yet another publishing company, in January of 2023. So I’ve been there for a year and two months. And what stuck out immediately, going back to the interview process, is the warmth of every single person I’ve spoken to. Since I’ve joined, it’s such a warm organization, which is a massive organization. Hearst is huge. It’s 130-some-odd years old. But from our legal team to our president to our executive leadership, everyone seems to care. And that’s what stood out to me from the moment I started speaking to the people at Hearst, through the people I’m still meeting. I mean, it’s a giant organization. I’m still meeting new people a year and two months into this job. 
But culturally, I feel it’s the closest I’ve felt to what I had at Vox, where I know that the team cares about each other. They care about the work. They’re invested in making a great employee experience, and they really want to make a great user experience. And when you can find people who care, not just about the work, but the people they’re doing the work with, it’s special. And I have a great set of colleagues, and I just feel like I’m in a great spot, which means, you know, in like two months, people will listen to this podcast and realize we had layoffs or something and I’m no longer there, now that I’ve jinxed it. I’m kidding. I’m kidding. It’s a great place, but it is weird, because it is such a huge organization with so many tentacles, and I’m not quite sure how the company has managed to achieve it, but it’s a pretty special place. Steve: As you talk about culture, I hear this attribute of, my words, not yours, being welcoming to humans. And I think a theme of this podcast and part of this conversation is culture that is welcoming to research. And I like what you’re getting at, that people care about each other and they care about the product and the experience that they’re making. I’m paraphrasing you badly here. Gregg: I think that comes from company leadership, and I realize this isn’t going to be the same across the board, but having a president of the company who says, “I really want us to know our users.” And to have the highest ranking executive in the company say that, it gets buy-in, and it makes everyone realize this is important, and it just makes embracing research that much easier to achieve. And so the reason my job opened at Hearst was our president saying we need to know our users, and the company investing in a research function. It also means that everybody I work with is curious to know how to incorporate research into their processes. 
And so for the last year, process is what I have been focusing on, because we have the mandate, we have the buy-in, okay, now we need to put the pieces in place. And for me, the pressure is on to deliver, because it’s different than other jobs. Research had existed at Condé Nast. Research was something that we were doing at Mailchimp. At Hearst, there wasn’t research at scale when I joined, which meant we had the mandate, but we didn’t have the ability to do it or do it well. So I’ve spent the last year in many meetings with our legal team just to put a process in place to get consent from people who visit our websites to engage them in research activities. I’ve been having a lot of meetings with our tech team on where PII will be stored, which of our products we should use to even do research that will be secure, where recordings will not end up on somebody’s hard drive that they shouldn’t end up on, or in a cloud service where maybe it’s not locked down to our preferences. So this has also been a learning experience for me, because I have spent so much time doing operations work just to make research possible. And it’s also been a little bit stressful, because everybody wants research and I’m constantly having to say, let’s hold up, because we don’t have all the pieces in place yet. We can’t put an intercept on our website. We can’t email a user, because we shouldn’t have their PII in our individual Outlook or Gmail accounts. We need to use the right tools to engage with them, tools that are secure, where we’re not going to be leaking email addresses and phone numbers in the wrong places. So let’s really get buttoned up and dial this in so that we are protecting our participants, but we’re also protecting the company. And we’re not putting the entire notion of research at this company at risk because we’re making mistakes. 
Steve: When you started on this journey of yours to build at this scale, did you do so with the expectation that operations was going to be a key order of business for you? Gregg: I did not. Maybe this was me coming in with a little too much confidence. I thought that because I had created consent forms in the past, I could just email my legal team and say, hey, we’re going to start doing research. I’m going to put an intercept or a call to action on our websites. Here’s the consent form I created in Google Forms. And immediately my legal team said, timeout, why don’t we talk through this? And it was a setback, because I thought we were ready to go my second week on the job. And it turned out that we were many, many, many months away from actually being able to do anything at all, outside of maybe a platform where we’re not using our own users. We used platforms like UserTesting, where we could research with a panel of random people. But as far as engaging with our known users, that took a lot of logistics. But it was also a great learning experience. And I have some amazing legal colleagues who were really helpful in pointing out ways that things could go wrong and working with me to come up with a process that we’re all happy with, to some degree or, you know, for the most part. Steve: Is there anything about your industry or the culture that, even though you had this support and this collaboration, might have led to the scale of the effort that you’re describing to get there? Gregg: I don’t know if it’s media or just legacy enterprise, you know, historic organizations. Because Vox was a media company, but it was a new media company. It started in the digital age. There were never print magazines. And so it very much operated like a startup where, you know, if there was budget and I could get my manager’s approval, we would just buy a product with a credit card. Here at Hearst, that is not how things work. 
Like, we’re not just going to click through an agreement, agree to some random SaaS company’s terms, and, you know, suddenly we’re using their product. Everything has to be examined and negotiated and approved. So things move slowly. I think that might just be because it is such an old organization that wants to be around for another 100, 200 years. So the mindset is, let’s be slow but sure. You know, it’s better to take our time rather than get sued for a million dollars because we violated somebody’s privacy or we used a product that we shouldn’t have been using. So I think that’s the whole idea of the continuity thing. Steve: Well, I love hearing you describe slow in a way that is deliberative and collaborative. There’s sort of an archetype of, oh, I wanted to get this thing done, but I couldn’t get anyone to help me, or legal dragged their heels. But, and maybe you’re being diplomatic, the perspective I’m getting from you is that it’s not resistance to overcome. It’s the natural, culturally appropriate way to do things, which does take time. Another company might be faster, another company might be slower, but for different reasons, you know, more passive resistance. And here you’ve got slow, careful support, which is an interesting way to have it be. Gregg: Yeah. It’s never, no, we’re not going to do that. It’s, yes, we could do that, but let’s think through every step of this process to make sure that we’re not overlooking something fundamental. So this will sound maybe tedious to people listening to this podcast. I apologize in advance. But think about a generic news website. You go to an article. Let’s say you’re looking at a recipe on your favorite cooking website, and there’s a call to action: we want to improve our recipes. If you have three minutes to spare to answer three questions, click here to take a survey. Okay. 
What survey tool are we going to use that we have an enterprise agreement with, where we know that all of the data is collected in a way that’s secure? Because we have a license that we negotiated where we know exactly where the data is stored and who can access it and who has liability if there is some type of data leak. Okay. So there’s the survey tool. Maybe we want people who took the survey to opt into a follow-up interview. So we can add that question: are you interested in joining our recipe panel? If so, click here. And, you know, it takes you to a page where you can add your name and your email address. Okay. Where is that going to be stored? Who’s going to have access to it? And I realize these are not new challenges. But my simple ask of, we want to do a study, led my legal team to work with me to say, okay, then what happens, then what happens, then what happens? Because in previous organizations, I would just be scrappy and say, yeah, they’ll fill out a Google form, it’ll go to a spreadsheet, and then I’ll email them and I’ll send them a link to my Calendly and they’ll schedule a time. And now it’s, no, we don’t have Calendly here. You can’t use that. So what else could we use? We don’t use Gmail or Google Calendar, but we do use this other product. What are other ways that we could create an inbox and create a link to a calendar? We don’t have an enterprise license with Zoom, but we have this other thing. So it’s really just looking at the menu of possibilities and picking the least bad options, or ideally the better options, the ones that would create a better user experience. But making sure that from initial contact to when we promise to expunge data, no stone is left unturned and we can account for every step of that process and know exactly what’s happening. 
And again, I know other people do this all the time, but for me, it was a learning experience to go from the scrappy, let’s-just-throw-money-at-this way of doing it to, okay, we really need to be buttoned up, because a lawsuit is the worst possible outcome here. And I don’t want that to happen. I don’t want the company to lose money. I don’t want to ruin a reputation. So let’s make sure that whatever we’re doing is rock solid and durable, so that once we put it in place, anybody can do it and everybody can do research going forward. Steve: In your year and two months, what’s the progress indicator for you in building these processes, these kinds of operations and infrastructure? Gregg: I won’t claim that we have it perfect yet, because it still takes a while to get an intercept on our sites, just because there are a lot of people to go through and there’s some engineering lift. It’s not just flip a switch. So I would say that doing it is not easy, but there is now a process that we can follow to do that type of research. I think the better marker of success is that I’ve been able to open headcount, because even with the technical ability to do research, research is still not anyone’s primary responsibility except for me and my team. Product managers are managing product. They don’t always have time to do research, nor give it 100% of their brain. Same with product designers. But because there is such demand for research, I was able to open headcount. And I think that’s the real sign that we’re making progress. Everyone wants to make better decisions, and the company has put the money into hiring humans to help us make better decisions. Steve: I love that. Let’s switch topics a little, as you’ve brought us up to date, including where you’ve been successful and where it’s taking you in this current organization. Let’s talk about your book, Research Practice: Perspectives from UX Researchers in a Changing Field. Gregg: Yay. 
Gregg: That is my book, Steve. Yes, I published this book in January of 2021, so we’re now three years out from it somehow. I suddenly had a lot of time on my hands during the first year of the pandemic to publish this. But this was a book that — I’m assuming this happens to you — what I found is I would write a blog post or I would give a talk, but the things people always wanted to ask me about were: How do I get a job as a researcher coming from academia, from being a psychologist, from being a marketer? How do I make myself attractive for a research job? What do I need to do? I’m a team of one. I don’t know how to make the case that I shouldn’t be a team of one anymore. What do I do? I’m so lonely. I also don’t have any mentorship. Or, I’m a new manager. Help! Nobody knows what to tell me on how to actually be a manager of a research team. How do I make the case for more headcount? How do I manage a team? How do I hire? Those were the questions that came in constantly. And I had my stock answers that I would give. I had articles I would point people to. I would try to anticipate what people were going to ask and write blog posts about it. But the questions kept coming. So I thought, what if I created a book that just talked about what a career in UX research might look like? And I realized very quickly that I am not the person to write the all-encompassing guide to a UX research career. Because at that point, I had been a designer who transitioned into UX research. I had worked at Mailchimp. And I had worked at Vox Media. That is a small sample size to talk about all the places a UX research career might go. A very smart friend of mine named Sian Townsend suggested: why not ask other people to contribute their own perspectives and open source this? Which was such a great idea. 
So the book that I had started to write about what a career in UX research might look like became a collaborative effort to get multiple perspectives on a UX research career journey: from getting the job, to the challenges of the job, to where you might go next. And it was a fun project to work on. I wrote my own essays. I solicited essays from many other research leaders. I worked with an amazing editor named Nicole Fenton. Nicole edited Abby Covert’s book on information architecture, which is one of my favorite books on product and research and just thinking about information. So I sought out Nicole. Nicole helped make this book fantastic, in my opinion. And it was a really good exercise in publishing content, project management, and working with a host of other research leaders to create something that would be a good and evergreen artifact for the research community. Steve: When you’re in that role and you get these different perspectives, are there situations where you don’t agree with the guidance that’s coming in one of these essays? Gregg: No. It might not be what I would do personally, but I wasn’t trained as a researcher, first of all. So if somebody is going to talk about a rigorous quantitative research approach, then who am I to say that’s not the way I would do it? And that was the whole point of the book: you’re going to get conflicting opinions. You’re going to get different perspectives. And I guess maybe the biggest takeaway is there’s no one way to do UX research, which is also the name of a blog post I wrote last summer, because we were seeing a lot of comments on the state of the research industry. But the more I talked to research leaders, including in talking to them about this book, there really isn’t one way to do research, nor one path for UX researchers. So I wanted this book to have differing opinions and perspectives, whether I agreed with all of them or not. 
Steve: You talk about evergreen, and yet Changing Field is in the title. So what does the book look like to you now, three years after it’s out? Maybe that’s what you’re getting at: you’re having these conversations with people later than when you wrote the book, about what the world is now. Gregg: When I think about how I would tackle the book today, or what I think would be different, or what I’m hearing from other researchers I speak to, there is such a drive to prove the value of research and make sure that it’s worth the economic investment in hiring a research team or a research person. And I don’t love that mindset that we have to always be proving our worth. But I think that’s a theme that comes up in the conversations I have, and if I were to publish the book again, I would expect there to be more content about that. I also think that the financial realities of today mean that people are working leaner. They don’t have as big a research budget as maybe they once had. We all saw there have been layoffs over the last two years. Teams are smaller. They’re sacrificing headcount or being forced to sacrifice headcount. So teams are also seeing that they have to get by with less: fewer people, fewer tools. I think we have more constraints and more expectations to make the investment worth it. Steve: You made the point that the book is lots of people’s perspectives, but you are continuing to share your own expertise and guidance; I see you on all the online things with blog posts and videos and so on. What are you focusing on in the questions that you’re trying to answer yourself? Gregg: I think the thing that I always come back to is that there is no one way to do research. And I also think there’s no one way to do research leadership. 
So often when I post a video or write something, it’s a knee-jerk reaction to something somebody else might have said that I feel is going to discourage folks or paint this industry in a negative light. I don’t know if that’s the right way to phrase it. I don’t want to sound like a Pollyanna, but I love this field. I think it’s invaluable. I think more companies should have a research function. And so anything that I write is usually meant to show that there’s opportunity, there is value in this work, and to make sure that the folks who are curious about UX research aren’t being sold a jaded or maybe geographically focused perspective. I think I just want to provide balance. Whether that’s coming through or not, I’m not sure, but that’s usually what prompts me: I see something and I think, I don’t know if that quite captures it. I don’t know if that’s the whole story. I wonder if I have something I can say to add a different perspective. Steve: So in the notes to this episode, we can point people to the book, but where are you writing or creating other kinds of information and guidance for people? Gregg: If you go to my website, gregg.io, that is my blog. I was on a roll last year. It was funny: as I was doing all the heavy lifting of putting processes in place, I was so motivated to create content and share what I was doing. So I had this streak last year of a lot of blog posts. I’ve kind of tailed off as I’ve gotten busier. But you can go to my website, and I post content there. There’s also a newsletter you can sign up for, which just takes the blog posts and sends them to your inbox. I also post on LinkedIn from time to time. Those are pretty much the main places right now. I’ve created a number of videos for the Learners app, but that too has kind of tailed off as my day job has demanded more of my time. Steve: And do you think another book is in your future? 
Gregg: I don’t know if a book is in my future, but I do think that there is an update of sorts that should happen. Like I said, there have been layoffs. People are tightening their belts and spending less on research. So I’d be curious to talk to research leaders. Although you’re already doing that, so maybe I’ll just feed you some questions to ask people. But I do think there should be an update of sorts. I just don’t know if the book is the right vehicle for that. Steve: So you want to be like a Daily Show correspondent, right? Gregg: I would take that job in addition to my current job. Steve: And by Daily Show correspondent, I meant for this podcast. Gregg: Exactly. Just put me on spot assignments and let me help, Steve. Not that you need it. Steve: Okay. All right, all right. The syndicated media network that is being born right here, everybody. This is the moment. And Gregg, what shall we brand this larger effort? Gregg: I’ve already been thinking about the larger extended universe. So the book is called Research Practice. My newsletter is Research Practicing. So maybe it’s Research Perfecting. Maybe it’s Research Repractice. Again, I’m terrible with names. So let’s not tie me to any of these terrible names I just threw out. Steve: All right, well, Gregg is giggling politely, and so that might be the sign that we’re kind of coming to the end of our conversation. Any last thoughts for this time together, Gregg, or anything to kind of throw in there? Gregg: No, Steve. I just want to say, you had me as a guest nine years ago, which at the time I thought it doesn’t get better than this. And I’m fortunate to have developed a relationship with you where you provided so much good advice and a sounding board. And so to come back nine years later and do another episode with you, I’m thankful and I’m excited. So thank you for having me. Steve: Thank you very much for taking the time. 
It was really great to get your perspective, and I think people are going to learn a lot from hearing you today. So thank you. Gregg: Thanks, Steve. Steve: Yes, there we go. Thanks for listening. Tell your friends, tell your enemies about Dollars to Donuts. Give us a review on Apple Podcasts or any place that reviews podcasts. Find Dollars to Donuts in all the places that have all the things. Or visit portigal.com/podcast for all of the episodes, complete with show notes and transcripts. Our theme music is by Bruce Todd. The post 40. Gregg Bernstein returns first appeared on Portigal Consulting.
58:10
39. Mani Pande of Cisco Meraki
Episode on Dollars to Donuts
This episode of Dollars to Donuts features my interview with Mani Pande, Director and Head of Research at Cisco Meraki. We used to do these immersion events where we would bring everybody who worked on, who was our stakeholder, to come and listen and talk to our customers. And we would do these focus groups; they were like a whole day event. There were folks from marketing and ops team who ran some of these focus groups. And when we got feedback about the immersion, it was very clear that everybody realized that when researchers are not doing the moderation, the kind of data that you get is not good. And the conversations were not that interesting. They didn’t feel that it was a good use of their time. So I think you can have your stakeholders experience it, that it’s not that easy to do moderation. – Mani Pande Show Links Interviewing Users, 2nd Edition UI Breakfast Podcast. Episode 280: Interviewing Techniques with Steve Portigal Mani on LinkedIn Cisco Meraki the Double Diamond CSAT On-Prem SuccessFactors How To Influence Without Authority In The Workplace structural equation modeling latent class analysis FigJam Miro Institute for the Future Comscore Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. In case you don’t already know, I recently released a second edition of my classic book, Interviewing Users. This new edition is the product of 10 more years of me working as a researcher and teaching other people. It’s bigger and better. It’s got two new chapters, a lot of updated content, new examples, new guest essays, and more. As part of releasing this new edition, I’ve been on a number of podcasts myself, including a conversation with Jane Portman that was part of her podcast, UI Breakfast. Here’s a quick excerpt from that conversation. 
Jane Portman: As you’re training other researchers, you’re training experienced researchers, I’m thinking, what do you feel is common knowledge that they’re mastering well and that we’re all like good at and what things are surprisingly difficult? Steve: You know, I think especially for people that are, they have a little bit of experience, but they’re kind of starting to blossom a little bit. One of the things they often need the most is confidence. And so people will often describe a scenario that they were in. People are messy and people are unpredictable and so all these things happen and so they go in with the best of intentions and plans and then things change a little bit. Somebody mentions their divorce, they don’t know what to do. And so I feel like my job isn’t to tell people that they’ve screwed up and that’s not how you do it. I think my job is to tell people that the thing that you encountered is very common. It’s a thing that a lot of researchers struggle with. I try to handle it this way, but there are situations when I handle it this way. Like I think I have a lot of like specific guidance and best practices, but all of those come with a lot of subjectivity and that it’s sort of the nature of the work to be a little confused or uncertain and to have to try things. And by the way, there’s no right choice. When someone mentions their divorce, you know, not even me versus you, like me versus me. The next time I would do that interview, I would handle it differently. There’s a moment and if the divorce was brought up in minute three versus minute thirty, like it would play out differently. We’re not algorithms, we are, I think improvisation is a big part of it. 
So I think I want to help junior researchers feel okay about that there’s no one right way to handle this and that their way of, the fact that they felt confused and uncertain in a situation and they made this kind of, here’s how they addressed it, it’s, I rarely tell them like, well that’s the worst thing you could have done. It’s usually, they usually are doing their best. That confidence to make a different choice is kind of what I want to help somebody with. You know, more experienced interviewers, I think I like, I like working with them because I think we can have a better, richer conversation about what are all these choices and what are the differences between them and you know, I love being in a workshop where I’ve got people with different experience levels because then I might give some guidance and then we can have different people suggest, well you know, here’s what I’ve done. Everyone can learn from each other and sometimes we can debate and I don’t mean that in like a right and wrong way but I think someone with experience, we can have a really interesting conversation where we look at one scenario and four or five different ways to handle it and we might disagree on what is sort of the optimal way. I think what that surfaces is that as individuals, as interviewers, we’re all wired differently and we all have different instincts and different personalities and you can get away with things that I can’t because of my age or my gender or my energy and I can get away with things that you can’t and getting away is maybe there’s the wrong framing but there’s just so many interesting choices and it’s I love hearing about other people’s things and thinking like, oh could I have that amount of friendliness or that amount of stillness or that amount of curiosity or that amount of empathy, you know, could I present those things in different amounts? 
You know, that’s part of being an expert interviewer is you have your own personality and your own strengths and you can exercise different facets of that as the situation requires. I think, you know, a more experienced interviewer has more adaptability to that, has their core, you know, but also can put on different authentic, true to themselves faces and energies to kind of different kinds of situations that are going to come up in these interviews. Again, that was me speaking with Jane Portman on UI Breakfast. You can check out the whole episode and you should totally buy a copy of this new edition of Interviewing Users. You can also check out portigal.com/services to read more about how I work with teams and companies. But now, let’s get to my conversation with Mani Pande. She’s the Director and Head of Research at Cisco Meraki. Well Mani, thanks so much for coming on the podcast. It’s great to get the chance to chat with you today. Mani Pande: Thank you, Steve, for having me on the podcast. I’ve been listening to your podcast for several years, so it’s great to be a guest on it. Steve: Excellent. Do you want to start us off with kind of a little introduction to you and then we can build a conversation from there? Mani: Sure. So my name is Mani Pande, and currently I lead the UXR team for Cisco Meraki. And, you know, Cisco is a really big company, and I am part of the networking division. And within the networking division, like, there are two types of primary products, I would say. Meraki, which is their SaaS offering, and then enterprise networking, which has primarily been their on-prem offering. And my team works across both Meraki and enterprise networking. So it’s a pretty big team, and it’s a big part of Cisco’s business because for Cisco, networking is still bread and butter. 
So there’s a lot of interesting work that the team does, and a lot of the work that they do does impact the products that we ship and also has a — hopefully has a lot of positive impact on Cisco’s bottom line. Steve: Do you have any examples you can share about situations where research impacted something that the product was doing? Mani: Yeah, so one of the — my manager is a big believer in the double diamond approach, you know, starting from doing foundational work and then moving on to doing — you know, once you have the design, testing it, doing concept testing, doing usability testing, and once you have shipped it, you are also — you also have some kind of metrics that you have used to define success and trying to measure those. So there are several examples that — where the team worked throughout the double diamond process and was able to make an impact not only just in defining, you know, what kind of product do we want to ship, but once we had some concepts, they helped define what are some of the hypotheses that we want to test, did a lot of concept validation, did a lot of usability testing because we had a lot of designs that we wanted to test, so pressure tested them in front of our customers. And then, you know, once we had shipped the product, like we wanted to see that, you know, our customers are happy with it. Do they really like it? Was it worth the effort? So, you know, doing some kind of like customer satisfaction surveys as part of getting continuous feedback from customers. And also, you know, even when we did those customer CSAT surveys, like we got a lot of open-ended comments, and we got some feedback from customers such as, “Oh, you know, this product, like this is not working or that is not working.” So those were good early signals of what we needed to change before it escalated into an issue. So there are several projects that we have worked on. At Meraki, we always call our projects by planet names. So there’s a project called Jupiter. 
There’s a project called Aurora, which is all around, you know, providing more visibility for on-prem devices to show up on the Meraki dashboard so you can see them as well as manage them through the SaaS product. Steve: And so is it the same, roughly the same, group of researchers that are working through the stages of the double diamond like you described? Mani: Yes, I would say like it’s like, for example, for one of the projects that I mentioned, Aurora, it was the same researcher who worked through the whole process and obviously in very, very close collaboration with the designer. So they were a tight-knit team that worked throughout the double diamond process. So, you know, yes, they are. Steve: Is that an example of a researcher being embedded? I know that’s kind of a buzzword. Is the researcher embedded with that team in that case? Mani: And I know a lot of people, especially research leaders, have a lot of opinions about whether you should have an embedded model versus not an embedded model. In our case, I think just because of the complexity of the domain, having an embedded model is extremely important. I, you know, I have worked across, like I’ve worked at Wikipedia, I’ve worked at Lyft, I have some B2B experience, I used to work at SuccessFactors where we made HR software, I worked at Samsung, I worked as a consultant where, you know, you just need to know a little bit to be effective. But I have never, ever worked in such a technical domain, which is networking. Like every day at work, sometimes I feel like, okay, I know a little bit, but then there are days where I feel like I don’t know a thing. So for us, I think the embedded model is extremely important because you need to have a little bit of domain expertise to be able to do research a little more intelligently and more meaningfully. And another thing like I feel, this is my perspective that an embedded model works better. 
And, you know, even when I worked at Lyft, which I would say in terms of complexity is nothing compared to enterprise networking, we still had an embedded model because what an embedded model enables is relationships, which are harder to form if you’re not in an embedded model. Like for researchers, one of the things that, you know, we do is that we lead or we bring about change without authority. So for that, I feel like having relationships is extremely important. In fact, you know, like there’s this article that I came across recently from Harvard Business Review, which, you know, they had listed like the three things that you need to do to lead without authority. One of them was relationships. Like having relationships with your PM partners, with your design counterpart, engineering, data science, it’s extremely important to be able to bring about change, to be able to show that, you know, what you are hearing from your customers matters, to be able to change hearts and minds. Because I sometimes feel that we are in the business of changing hearts and minds. You know, a lot of people, like I have worked with a lot of PM partners, they have very, you know, some of them have very strong opinions. So to be able to bring about change, I feel that having strong relationships is extremely important. So I am a big believer in the embedded model. Steve: Are there other things that they have to do or that you encourage them to do to build the relationships in the way that you’re talking about? Mani: There are various things like I have done all my career and then I also encourage my team to do. So one of the things is I feel like to have a good relationship, you need to bring, and this is more important if you’re an IC and you’re doing the research yourself, is to bring your design partners, to bring your stakeholders along for the ride when you are conducting research. So that’s one thing that I always encourage my teams is invite people to come to your research sessions. 
Make sure that they are involved in helping you come up with the insights. Like obviously the researchers are going to do the heavy lifting. Like you don’t expect the PMs or the engineers or the data science to do the heavy lifting. But do a workshop with them and ask them, like what did you hear from some of the interviews that you attended? Like what resonated with you? And also the other thing that comes out with that is that you have less resistance towards the end. People are less likely to challenge you. So it ensures that everyone is kind of on the same page from the beginning. So I feel that it’s also good for relationship building. So that’s one thing that I always tell the ICs to do. And myself as a research leader, like I obviously try and build relationships with whoever my counterparts are. And what I’ve always done is like build relationships, especially with, you know, PMs like who are like the head of the product that your team supports. So for example, when I was at Lyft, like I worked on the Driver app. So I used to meet with our head of product management for Driver at least once a quarter to make sure that I also had a good relationship with them. Another thing that I learned through that was working with them, it was easier to figure out what we should do long term because a lot of product managers are only thinking about, you know, what they have to deliver within the quarter or maximum the six months. Like if you build relationships with the leadership, like it enables your team to work on more long term projects. So that was just a learning that I had. And I always try and do that is like have a good relationship with the head of product, like people, you know, two or three levels above me. I mean, they have a lot of ideas. Steve: How does the relationship support the longer term conversation? Mani: Like they are thinking more about, you know, like where the business needs to be. They’re not so focused on the product roadmap. 
They are not so much thinking about, you know, this is the feature that I need to ship tomorrow. And also, you know, you can get them to say yes to something that you feel that the team should be working on, but there might be a little bit of pushback from the product team. So if you get their blessings, you know, you can be working on projects. Like, you know, as researchers, we have a lot of opinions. And I always tell people like you have to have a point of view. You are spending so much time with the customers. You’re talking to them. Like if you don’t have a point of view on what research we need to do or what matters to our customers, then you’re probably not doing a good job. So like let’s say if you have a point of view of some research that needs to be done, but it’s much more long term. You know, you’re probably not going to see the impact, or nobody is thinking about it in terms of their roadmap for the next quarter or the half. Like it’s easier to get a buy-in from the executive. And to be able to get that buy-in, like, you have to have a relationship with them. That’s what I have experienced and it has worked for me in the past. That’s what I do, but I also encourage like my ICs to do it. Steve: So it’s you as the leader are having the relationship and the buy-in for the longer term pieces. This is not things that your ICs and researchers are focused on. This is your role as the leader. Mani: But you know, a lot of people feel intimidated to go and meet the VP. So I encourage them to do it. And in fact, you know, like one of the other things when I used to do this was, like if I had this conversation, I would invite the ICs who were relevant for that conversation to be part of that conversation. So they didn’t feel intimidated to be having that conversation and they could also participate in that conversation and hopefully, going forward, can do it themselves without me being there. 
Steve: So when you talk about relationships, you’re creating them yourself. You’re encouraging others to do that. And then you’re, I guess, enabling or facilitating relationships between other people. You’re talking about a number of different fronts for building these relationships: the workshops, inviting people to come to sessions, having these sort of, I guess, planning meetings or discussion meetings. Mani: Yeah, ultimately my role as a research leader is to help my team, to enable my team. That’s how I think about it. That is one of my important goals. I would say not the only goal. So whatever I can do to enable that, I always try and facilitate that. Steve: One thing that I’ve heard a lot and that I experienced myself is that the kind of relationship building you’re talking about, whether that’s workshops, participating in interviews, or just kind of meeting, is harder or it’s at least different when work is remote. And I wonder, have you seen changes in how you’re doing this relationship building or how you’re helping others to do it over the past few years? Mani: Yeah, I mean you can think of it as a glass half full or half empty. That’s how I think about it. I think it’s still possible. In today’s world, like we have so many more tools. Like I would say this would have been so much harder 10 years back. Like doing a remote workshop is so much easier. Like you can use FigJam, you can use Miro. Like even our stakeholders, like everybody knows how to use those tools. And there are so many templates that you can leverage. Actually, you know, in some ways it’s easier to do it remotely versus do it in person. But I agree like, you know, in person there’s a lot more energy to it. There is something about being in person at the same time. Like the vibe is very different. But with the tools that we have, I think it’s just so much easier to do that. The last two jobs I have started, I’ve started them remotely. Steve: That’s a good reframe on my question. 
I think glass half full, glass half empty is a lovely way to look at it. Mani: And honestly, like if I compare them to the last two jobs that I had, like, did I feel any difference? I would say it’s a little harder. It takes a little longer, right? But it’s not impossible. And you can get to that same level of relationship over time. The one thing that I have done, though, is, like, I’ve also had this privilege: you know, when I worked at Lyft or at Meraki, I live in the Bay Area. The offices are in the Bay Area. So when I go to the office, I think of those days as days of relationship building. So I go and I meet a lot of people. And at home it’s obviously a little more focused work or you could be in bigger meetings. And I think how I spend my time in the office has also changed a little bit in the last four years. Steve: Right, we might not have explicitly done relationship building before, maybe not even had that as an intention, going into the office to build relationships. Mani: Yeah. So not only just with stakeholders, I think it’s also important for the team. Like when I was at Lyft, our team, you know, like when we went to the office, we all went to the office. So it became more like a team day. Not very productive, I would say, but it was good for, you know, team bonding and it was good for us in terms of, you know, coming to know each other and just building our relationship and figuring out what kind of a team we are together. Steve: You mentioned this article about change without authority and you said that they listed three things and one of them was relationships. Putting you on the spot, do you remember what the other two were? Mani: I do actually because I did a blog post about it. So I’ve given it a lot of thought. The other one was expertise. 
I think that’s also an important one for researchers because in the last, you know, few years, there’s all this, everybody talks about DIY research, you know, you can farm out research, everybody can do research, and there’s also this perception, you know, how hard is it to do research? Like, you only have to talk to people, right? Like, you and I are talking. We could be doing research right now. So I think it’s important for us as researchers to show our stakeholders that, you know, research is not as easy as it appears. It’s very easy for us to do bad research and get bad insights from that. So one of the things that I did at one of my previous jobs was we used to do these immersion events where we would bring, you know, everybody who worked on, who was our stakeholder, to come and listen and talk to our customers. And we would do these focus groups; they were like a whole day event. And in the beginning when we started that, there were folks from marketing and ops team who ran some of these focus groups. And when we got feedback about the immersion, it was very clear that everybody realized that when researchers are not doing the moderation, the kind of data that you get is not good. And the conversations were not that interesting. They didn’t feel that it was a good use of their time. So I think you can have your stakeholders experience it, that it’s not that easy to do moderation. In fact, I feel like when I was an IC and, at times, like, I would do four to five interviews in a day, I would be mentally exhausted. You know, doing moderation is one of the hardest things to do because you’re multitasking at another level. Like you’re trying to, I mean, I’m a little old school, I would take notes, I would try and listen to what the person is saying and figure out like, do I follow my interview script? Do I change it? So just having people experience that is important. So I think show and tell could be one way of showing that, you know, research is hard. 
And then when it comes to quantitative research, like, you know, writing surveys, I think that’s a pretty specialized skill. I have seen people, you know, think that they can write surveys, but there’s a lot that goes into it. You can get absolutely wrong responses if your question is not well designed, if your scale is not well designed. So for quant research, like I have a little bit of a strong opinion on that, that I don’t think it should ever be DIY. It shouldn’t ever be left to unskilled stakeholders to write a survey. So that’s one thing, like, you know, just showing your expertise is one thing that you can do to lead without authority. The other one they mentioned was about business and organizational understanding. That is the third one, and that is very important. You know, as I said, I have a long career, I’ve worked for 20 years after grad school. I have often seen researchers resist or not want to have a good understanding of how the company makes revenue, earns profit. They’re a little wary of that part of the business. I think it’s also important for us to understand, like, how does the company make profit? Like, for example, if you work for a B2B company, like, I would say that you should have a relationship with the sales team to understand how do they sell, what do they sell, like, what is some of the feedback that they get from customers. So I think having a little bit of understanding of revenue and profitability is important. Like, for example, like, if you work for a company that’s in the gig economy, which has, you know, a marketplace, like it could be Lyft, Uber or DoorDash, where that marketplace is the crux of the business. That’s how the company makes money. So understanding the dynamics of the marketplace is an important thing that a researcher should do. Let’s say if you are a researcher at Uber and you’re working on the Rider app, you should still have an understanding of the marketplace. 
Usually, you know, the marketplace is kind of on its own, but irrespective of whether you work on the Driver app or the Rider app, you should have an understanding of how, you know, Riders and Drivers are matched, because ultimately that’s where the secret sauce happens and that’s where the company makes money. I think a lot of us come from an academic background and maybe a little bit of a purist background. So maybe, and this is just a hypothesis, that is where it comes from. I mean, I would say to a certain extent, even I had it, like, early in my career. It took a while for me to realize that understanding revenue, profitability is important. One thing that I would say, like, most researchers do agree on is understanding business priorities and, you know, doing research that aligns with business priorities. I have never seen resistance to that, but when it comes to revenue and profitability, I have seen researchers, like, have a little bit of resistance to that. Like, for example, I have some friends who work at Facebook and Google, you know, how they make money is ads, but many of them did not want to work on the ads team. Steve: That seems different to me, just thinking about myself. That seems different to me than understanding like, how does the system match riders and drivers? When you say that, that kind of sparks my researcher curiosity. Like I think we would want to know that because what’s behind that secret wall and how do the gears all mesh? But again, my own bias here: when you say working on ads, I kind of get that feeling too. Like I just think like, ugh, ads. And so to me that seems different. And I don’t know, I’m not trying to pin you down on something because you’re talking kind of subjectively and we each have our impressions. But I think this idea that, to your point about changing without authority, that there are these important things to understand. 
I guess there’s a difference between wanting to work on Facebook ads as your project and having enough of an understanding of the revenue model, which impacts everything you would ever do research on at Facebook I think. Mani: So I think both. What I’m trying to get at is understanding the money part of the business is important. Like, and the money part of the business for different companies is different. Steve: Yeah. Mani: For a gig economy, it’s the marketplace. For Google and Facebook, it’s ads. So having some understanding of how your company makes money is important for researchers. Steve: And not to flog a dead horse here, but I feel like companies like Google and Meta or Facebook, it seems like their culture is such that how the company makes money is sort of kept in a separate box and we’re going to come here and work on whatever the latest amazing thing that’s going to change the world, Internet through balloons, you know, some amazing project. And so, I’m speaking out of my hat here, but I have some empathy for people that don’t want to think about the money because they’re not being sold that as an employee. I don’t know if that’s true. I’m hypothesizing that the company culture kind of keeps those things in separate buckets. But if you work at Lyft or Uber or DoorDash or Meraki, there’s a product and a service that’s much more essential to the conversation they’re having. Again, this is not my direct experience. I’m just kind of — I’m just giving you my biased interpretation of what you’re describing. Mani: I think, at just the basic level, I’m sure, like, you know, they keep it under wraps to a certain extent, but just having a basic understanding and not having an aversion to it is important. 
Steve: So when you talk about these factors, right, the understanding the business, so there’s some specificity, the expertise and the relationships, and you talked about Cisco and Meraki being just the level of complexity, the sort of technological and I guess industry specific stuff. Does that complexity become a compounding factor or something in trying to achieve those levels of those three factors that you’ve brought up? Mani: I think especially for something this technical, having some domain expertise matters. And that goes back to the first one that I was talking about, which is expertise. Like having some domain expertise becomes important because if you want to have a meaningful conversation, you know, if you’re doing an interview, like you need to have some basic level of understanding to have a meaningful conversation. I know in research we say that, you know, no question is stupid, but if you have zero understanding, you’ll only have stupid questions, at least in this kind of a domain. So I think having some understanding is important, and that’s what I tell all my researchers. Like after I joined Meraki, I have this cheat sheet I call “Mani’s Cheat Sheet” about networking, and every time I hear something that I don’t know, I get a version from Google and then I get a ChatGPT version, like a please-explain-it-to-a-middle-schooler version, which actually is the version that works for me. And that cheat sheet is now like 20, 30 pages long. It just keeps on increasing. I’m not going to be an expert on networking. I don’t want to be, but I just want to know a little bit to be able to have meaningful conversations and to also be able to provide feedback, you know, to my team. Like sometimes, you know, when we have an important presentation, like I work with them to figure out like what are the insights, like what are some of our action items, but if I have no understanding of the domain, I can’t do that. 
Steve: Can you talk about, from the period of time that you came to Meraki, what that progress has been and where research has gone from then to where it is now? Mani: So Meraki has seen incredible growth in design as well as research in, I would say, the last one, two years, which is the opposite of where, you know, design as a field and UXR as a field are going at other companies. It’s also because, you know, as I was saying, the company has really adopted the double diamond approach. So they see research as being an integral part of how you build products. So that’s one of the reasons why we have seen this big, massive growth in the last one year. And the reason why I came here was, you know, as I was telling you earlier, Cisco had these two products, the SaaS product and the on-prem, and the strategy now is to, you know, convert, which is, you know, have some of the on-prem products be able to be managed in the cloud. So they brought these two teams together, the design team for the on-prem side and the design team from Meraki. So that’s how I got hired as the head of UXR, because then, you know, the team grew because you had two different teams that merged, and the complexity of the business and the complexity of the problems that the team was expected to answer grew because now you had two different businesses that you had to support. So that’s how I got hired, and, you know, our team has grown quite a lot even in the last one year. Like, just in March, three people joined the team. And it’s also because, you know, everyone agrees that you need to do research if you want to build better products. So there is this hunger for research, and that’s the other reason why we’ve been able to grow despite, you know, the market going in the other direction. Steve: So you came into this newly formed, newly merged organization that already had an appetite for, a belief in, and a commitment to research. 
Mani: Yeah, and that appetite has been growing. I would say it’s been a steady state because there are more and more teams that we are working with, because Cisco networking is huge. As I was saying earlier, it’s their bread and butter. It’s the major chunk of their revenue. So there are more and more teams that we are working with, and that’s one of the reasons why I’ve been able to hire researchers to support teams that did not do research previously but want to do research now. Steve: Are there things that you are doing in your role that are behind that? In addition to there being these other teams, do you think you’re responsible for increasing the demand in any way? Mani: I would say it’s a team effort. I won’t say that it’s just me by myself. But obviously, you know, as a research leader, or I would say as a research professional, we have to evangelize research as much as we can all the time. I can talk about my time at SuccessFactors. You know, when I joined SuccessFactors, they already had an IPO. Then they were bought by SAP in 2012 for like $3.5 billion. So the company was pretty successful, but they never had research at that time. And what I had to do in the beginning was, you know, just evangelize research and make sure that it became part of how we build products. So in the beginning, I did a lot of evaluative research to show impact, and then, you know, slowly as our team grew, we — I mean, we continued to do evaluative research, obviously, but we did a lot of generative research as part of new product launches. Steve: So you talk about that situation, they were bought into research enough that they hired you, but it doesn’t sound like the hunger that you’re talking about now was there at that time. Mani: And also it’s like, you know, we’re talking 2013. I know the field has changed quite a lot in that time. But yes, the hunger wasn’t there. So my job was to create a hunger for research. 
And it was very different tactics. Like, I did a lot of usability research because that gets you — it’s a quick hit. That’s how I would describe usability research at times. If you are looking for quick impact, you can get it pretty fast. And you can find people who are on your side, who become evangelists of research, pretty easily. So that’s what I did a lot when, you know, I had to build a team there, because there was no research team. I wouldn’t say that — I mean, there was no research maturity. Steve: I hear relationships and expertise at least in that story. Mani: Yes, because I remember when I joined, there was some usability testing that had been done. But whoever did it, they did not know how to define usability tasks. So it was an unmoderated test, and during testing, people were very confused about what was expected from them. And it was very clear that you needed somebody who knew the basics of usability testing to have set up the test. Steve: Just switching topics slightly, you’ve kind of brought us back in time to some different roles that you’ve had, and I wonder if we could go further back. Maybe could you talk about how you entered the field of UX research? Mani: I did not think I was going to become a UX researcher, honestly. Like, I have a PhD in sociology. I thought I was going to become a college professor. That was my goal. But my husband is an engineer. So we moved to the Bay Area. And I looked for teaching jobs in the Bay Area. There were none. So I got a job at Institute for the Future. I don’t know if you know about them, but they are a forecasting research company. So they do a lot of tech research. And I worked there for five years. And I did not even think that the kind of research that I was doing was, you know, user experience research, like product strategy research. But that’s the kind of work that I did there. 
One of the biggest clients that we had was Nokia. And I did a lot of ethnographic research, you know, traveling to people’s homes in India, the U.S., and Brazil, and trying to understand, you know, how do they use mobile phones? Because this was like 2006, 2008. It was still a new phenomenon. And that was to help define Nokia’s strategy, especially for these markets, for the next one to three years. And from that, I kind of transitioned into UXR. And in fact, you know, when I finished my PhD, I did not even know that there was an HCI field. I completed my PhD in 2004. So it was still pretty early days, at least from my perspective, for the field. Steve: When you found yourself doing global ethnographic fieldwork for Nokia, did you see any difference between how you were working in that context and the skills and approach that you had developed in your PhD? Mani: There are cultural differences that you have to be aware of. One of the biggest things is you should be ready for anything. There are a lot of unknowns. I remember, like, we were doing this project for Nokia. I was working with our clients, and we were supposed to go and interview someone, and the person didn’t show up. So there are a lot of those things that you’ve got to be ready for, that you did not prepare for. Like, very rarely has it happened to me that I did ethnographic research in the U.S. and we had no-shows. But it’s happened in India so many times. It’s happened in Brazil. So you’ve got to have these contingency plans and go with the flow. And I think those are some of the things that you have to be aware of. And also, you know, especially when I worked in Brazil, I obviously don’t speak the language. For me, obviously, India is easier because I have lived in India. I know, you know, what the culture is, what are the things that you do. 
Like, for example, if you go to anyone’s house in India, they are going to offer you tea, coffee, something to eat, and it’s polite to eat it. And it’s impolite if you don’t eat it. And when I went to Brazil, I had to work with translators, which I had never done before. So one of my big learnings was that it takes you twice as long to do the same interview because they are translating it for you. So I think there are a lot of cultural nuances that you have to be familiar with when you do research outside of, you know, the U.S., and maybe to a certain extent even Europe. Steve: And did you see differences at that point in your career where you’re coming from an academic environment to a commercial environment? Were there points of transition for you as you moved from one to the other? Mani: Yes. In terms of, like, I remember when I wrote my PhD dissertation, I took a year to do the analysis after I had finished my qualitative interviews. And on this Nokia one that I was talking about, we were sharing research insights from every interview every day. So I think that is the big transition that you have to make if you come from academia: the pace. And also thinking about, you know, how does this impact the business? How does this impact the product? And also working with so many stakeholders. Those are some of the big changes. Like, when I was in academia, the only time I worked with others was when my advisor provided me feedback for my dissertation. And when I published, I had these three reviewers who gave me feedback on some of the articles that I published. But, yeah, you know, the way you work with stakeholders is so different. And also, you know, we’re doing a lot of quick and dirty work, which obviously you don’t do in academia. If you do that, it won’t get published. 
Steve: So after this Institute for the Future experience, was research your career path at that point? Mani: Yeah, and then I joined Wikipedia, which was actually a very amazing experience in terms of what I was able to do, because — I mean, I have never met anybody who doesn’t like Wikipedia. You know, as I said, I have worked in different industries. More often than not, we meet customers who say, I don’t like this about your product, I don’t like that about your product. But for Wikipedia, it was very different. So one of the things that I did when I joined Wikipedia was the first ever survey of Wikipedia editors. As you can probably guess from their mission, unlike most companies, they barely do any tracking. Like, they don’t use, for example, cookies. So we used to rely on ComScore to even get the numbers for active readers of Wikipedia, because they were not using cookies. So they did not know at that time what percentage of editors are women, because Wikipedia, even today, has a gender problem. So I did the first ever survey of Wikipedia editors, and the answer at that time was 9%. Only 9% of the people editing Wikipedia were women. And as a result of that, there are fewer articles about women on Wikipedia versus men. So there’s a big gender bias in Wikipedia. So that is some of the work that I did when I joined Wikipedia. And I also worked on the redesign of the mobile app. So I again did ethnographic research in the U.S., India, and Brazil for the redesign of the mobile app for Wikipedia. But all that work was very fulfilling just because of the amount of impact that you could have on users. Because I don’t remember the numbers now, but at that time there were 250 million active readers on Wikipedia every month, and there used to be like 7 to 8 billion page views. 
Steve: As a researcher over your career, and maybe as a leader now, how do you think about different kinds of methods or different kinds of approaches to getting the data that you want to bring to your team? Mani: I mean, I’m trained as a mixed methods researcher, and when you think about mixed methods, I like to believe it’s on a continuum. There are some people who do small surveys, only doing descriptive statistics, and think that they are mixed methods researchers. There are some people who do regression analysis, who will do, I don’t know, structural equation modeling, latent class analysis, who are also mixed methods researchers. So it’s a little bit of a continuum, even for qualitative, right? Like, you can be an ethnographer. You can be somebody who really believes in participant observation, or you can just do qualitative one-on-one interviews. With that said, I do both because I just happen to be trained that way. When I did my PhD, I specialized in research methods, which meant that I had to prove that I could do both quant and qual work. So I did a lot of statistics, and I took a lot of qualitative courses. And in fact, you know, my dissertation was a qualitative dissertation, despite, you know, my having a very deep background in stats. So I do both. But with that said, I would say you should be methodologically agnostic. Depending on what’s the problem that you’re trying to solve, you should figure out the right research method. So I think that is what is more important. And that’s why I feel like having a mixed methods background is good, because then you just have more methods in your toolkit. So you can figure out, like, you know, for this problem, probably a survey is a good solution. Or if I want to find the answer to this problem, maybe doing ethnographic research, qualitative interviews, is the right thing to do. And, you know, you can use all these methods throughout the process. 
Like, you can do a survey, you know, along with some foundational research. I can give you an example. When I worked at Wikipedia, I’m very proud of a feature that we released. I’ll talk a little bit about that. So when I worked at Wikipedia, I interviewed this person in Salvador, which is, you know, in the northeast of Brazil, in the state of Bahia. And he was a Wikipedia editor. And obviously, if you’re a Wikipedia editor, you’re a big reader too. And at that time, he told me that, you know, he wanted to read Portuguese Wikipedia, but he felt that it was not mature enough. Some of the articles did not have the kind of detail that he was looking for. So he would often flip to English Wikipedia. So I thought that was an interesting thing that he mentioned. So after we did the ethnographic research, we followed it up with a survey, and we asked people, like, you know, how many language Wikipedias do you read? And, you know, we got data. Like, if you read English, what else do you read? So obviously it was clear that, you know, English is the primary one. But more often than not, people read other language Wikipedias too. Especially outside of, you know, the U.S., I would say even in Europe. Like, German Wikipedia is very big, but the German readers are also reading English Wikipedia. And that was also very clear in the survey data that we got. So we introduced, and it’s even there today in the app, what we call the inter-language wikilinks, which made it really easy to toggle between different language Wikipedias. So, for example, if you search for an article, it tells you that, you know, this article is also available in this other language. You can set your primary language, you can set your secondary languages. And all this came out through this, you know, ethnographic research that was done, followed up with a survey that gave us more confidence that, you know, this would be a useful feature. 
And it still exists like 10, 12 years after we did that research. Steve: That’s a great story, and one detail that excites me is that small data point from one method. You find new questions in research. You found something, and then you did some more research to kind of understand that phenomenon that you didn’t know about. Mani: Yeah. Steve: That’s such a great example of that. The ethnography was not sufficient to make something new, but it did point to a whole new effort to understand something, which then led to that feature. Mani: Yeah, but ethnography is what gave us that insight, which I don’t think I would have got from a survey, right? The survey just gave us more confidence that, yes, that’s worth building. Steve: Yeah, that’s a good clarification. We’ve been talking about mixed methods and about different kinds of tools kind of feeding together. You talked early on about relationships, so it just makes me think about some of the different roles that we have, with data science being a big part of what so many companies are doing. How do you see the relationship between UXRs, mixed methods and otherwise, and data science? Mani: I think there’s a lot of synergy between what UXRs do and what data scientists do, because ultimately we are both researchers. It’s just that we are looking at different types of data. But if we bring it together, I strongly feel that we can have a much more holistic understanding of our users. And, you know, in my career, especially at Wikipedia and Lyft, and actually even at Meraki, I was lucky to work very closely with data scientists. So there are various ways we can work with data scientists. For example, if you’re doing segmentation, often we do segmentation of our users just based on attitudinal data. But you can get the behavioral data from data scientists and bring that together to just have a better segmentation model of your users. 
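The segmentation point here, combining attitudinal survey data with behavioral data from a data science team, comes down to joining the two sources on a shared user ID. A minimal sketch follows; every field name, score, and user ID is invented for illustration and is not any company's actual schema:

```python
# Toy join of attitudinal (survey) and behavioral (product analytics)
# records into one per-user feature table, as input to segmentation.
# All field names and values are invented.

attitudinal = {
    "u1": {"satisfaction": 5, "would_recommend": True},
    "u2": {"satisfaction": 2, "would_recommend": False},
}
behavioral = {
    "u1": {"sessions_30d": 4, "features_used": 2},
    "u2": {"sessions_30d": 22, "features_used": 9},
}

def combine(att, beh):
    """Merge the two sources on user ID; keep users present in both."""
    return {uid: {**att[uid], **beh[uid]} for uid in att.keys() & beh.keys()}

features = combine(attitudinal, behavioral)
# u2 is a heavy user who reports low satisfaction: a segment that
# neither data source would surface on its own.
print(features["u2"]["sessions_30d"], features["u2"]["satisfaction"])  # 22 2
```

The inner join (users present in both sources) is one design choice among several; in practice a segmentation model would also have to handle users who answered the survey but have no usage data, and vice versa.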
Various companies have dashboards to, you know, look at how their customers are doing, and tracking. Many of them, especially in B2C, tend to be more behavioral based versus attitudinal. So I think there’s an opportunity there to have dashboards that not only tell you what the users are doing, but also how they are feeling. I think that’s an important story to tell. So that’s one thing one can do. Then there’s also an opportunity to work with data science during A/B experiments. Like, in my past companies, I have worked with data scientists, when they have an A/B experiment, to, you know, do a survey along with the experiment to see if there are any differences between the control and the experimental group on not only what they’re doing, but how they are feeling. And the one thing that I learned is that there’s a little bit of lag. Like, if people are unhappy, it takes a little while before they stop doing the thing. So, you know, doing that survey gives you an early pulse that, okay, maybe this experiment is not working as well as it should. Maybe we should tweak something so that the users are happier. So those are some of the opportunities that I see working with data scientists. Steve: Is there anything else today in our conversation that I should have asked you about or you want to make sure that we talk about? Mani: I can’t think of anything else right now. I’m sure I will think of something later. Steve: Well, Mani, it’s been really lovely to speak with you and learn from you. I want to thank you again for taking the time to chat and share all your stories and experiences. It’s been great. Mani: And thank you, Steve, for having me on the podcast. Steve: There we go. Thanks for listening. Tell your friends. Tell your enemies about Dollars to Donuts. Give us a review on Apple Podcasts or any place that reviews podcasts. 
Find Dollars to Donuts in all the places that have all the things or visit portigal.com/podcast for all of the episodes complete with show notes and transcripts. Our theme music is by Bruce Todd. The post 39. Mani Pande of Cisco Meraki first appeared on Portigal Consulting.
54:20
38. Vanessa Whatley of Twilio
Episodio en Dollars to Donuts
This episode of Dollars to Donuts features my interview with Vanessa Whatley, UX Director – Research & Documentation at Twilio. For many years, I had anxiety and regret around not starting my career in the field that I’m in sooner, because I felt very very lost stumbling through all of the different fields and roles, and only in hindsight do the dots connect. I’m better at what I do now because I learned the lessons in all of the different jobs. Even something like being an executive assistant, I was able to sit in on more senior leadership meetings, and I really early picked up on short attention spans: How do you get your point across concisely? What do they care about? And I think that made me a better researcher right away, even as I was still learning the practice, because it taught me something about communication…I think all of those little pieces along the way just shaped how I interact with people and I think has made me better at what I do today. Maybe just know that it’s all connected somehow. – Vanessa Whatley Show Links Interviewing Users Workshop at Advancing Research 2024 Interviewing Users, 2nd Edition Steve on Darren Hood’s World of UX Vanessa on LinkedIn Twilio Segment Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. As part of the Advancing Research Conference, I’m teaching a full day in-person workshop on user research. It’s March 27th, 2024 in New York City. This is a rare opportunity to learn about interviewing users from me in person. You’ll also have the chance to engage with other researchers at different levels of experience in an interactive environment. I’ll put the link in the show notes for the Advancing Research Conference with more info and information about how to register. If you know someone who would benefit from this workshop, then please pass this info along. 
The newest version of my workshop makes use of the writing and rewriting I did on the very recent second edition of Interviewing Users, which you should absolutely buy several copies of. Shortly after the book came out, I had a conversation with Darren Hood for his World of UX podcast. We got into the intricacies of asking questions. I’ll link to the whole episode, but I’m going to include an excerpt right here. Darren Hood: And the topic of chapter six, the title is the intricacies of asking questions. And I love this because, when I’m teaching people about research, it’s probably one of the top topics outside of the classroom. When I’m talking to people about research, this is the topic for me that comes up the most. There’s one particular thing that you mentioned, and I’m going to read another excerpt from the book. And there’s a heading here, “There’s Power in Your Silence.” Oh my God, how many times have I talked to people about this? Steve says, “After you ask a question, be silent. This is tricky because you are speaking with someone you’ve never spoken to before. You’re learning about their conversational rhythm, how receptive they are to your questions, and what cues they give when thinking about an answer. These tiny moments, from part of a second to several seconds, are nerve-wracking.” And I love that because it’s one of the things that I see: people will ask a question, and it’s funny to watch people grind their teeth when the participant is silent. To watch people hem and haw; the researcher wants to help the participant, and things get out of hand sometimes. I have seen people jump practically across the table to try to guide somebody because they just couldn’t stand the silence. But the title says it, the subheading there, or the heading in the chapter, “There’s Power in Your Silence.” And so I’ll hand it over to you to elaborate on this topic. Steve: Yeah, and I think you described some of the phenomenon pretty well. 
And there was a moment in this conversation, I think, because even though we’re recording audio, we are on video and we’re looking at each other and nodding and doing all the, as best we can over video, kind of feedback. There’s a point at which you said, “Oh, I see the gears in your head are turning, Steve, and I’m going to turn it over to you.” And I think, as a trained interviewer, as an experienced podcaster, I was like, you learn what that is. And there have been moments where I’m asking a question and I’ll just stop. I don’t need to finish my question. The person is ready to talk. And I’m going to ask a follow-up question. I’m going to ask dozens of follow-up questions. So if the thing that they want to say is not exactly what I want to ask them about, it’s better for the whole dynamic to have them go, and then me follow on and follow on and follow on, as opposed to, like, “No, no, no, wait, let me make sure you understand exactly what my question is so the information that you give me perfectly conforms to the parameters which I am articulating.” Like, that’s not how research works. It’s this sloppy interface between people that kind of goes back and forth. And so you have to understand your role is to kind of draw that out of them. And so that hem and haw thing, or even trying to help them, as you say, I think is really important, because if you can’t allow for that silence, the anti-pattern or the bad behavior that comes out is asking these run-on questions. And the run-on questions are deadly. And in those run-on questions, people start suggesting. So you’re going to ask what could be an open-ended question, like, what kind of microphone do you have for your video calls? But the run-on question is: what kind of microphone do you have? Is that a USB mic, or is that a Shure microphone, or is that part of your headset? Like, you start suggesting possible answers. Darren: Yep, yep. 
Steve: And the motivation for doing that, I think you have to pay attention to yourself. It feels like you can kid yourself that you’re being helpful when you do that. I’m being helpful. I’m just showing them what examples are. But in fact, it’s because you are, and I shouldn’t say you, I should say we, I’m in this all the time. It’s uncomfortable to stop and just say, what kind of microphone is that? For all the reasons in that quote that you described. Like, I don’t know what’s going to happen. I’m going to lose face. I’m going to be seen as an idiot. That person’s not tracking with me. My boss is going to watch this video. There’s all this risk in that moment. It’s kind of like a little abyss that you’re peering into. But if you start suggesting things, it messes up the power dynamic. It says, for one thing, that the participant is required to listen to the interviewer going on and on and on. Darren: Right. Steve: And it also starts to say, over time, that their answers should be multiple choice. So you, participant, should be giving answers within the format that I have outlined. So you might think that’s ridiculous, Steve and Darren. If it’s none of those mics, the person’s just going to say, no, this is just an old karaoke mic that I brought up from the basement. They’re going to give you an answer that’s outside that list. And the first time they will, and maybe the second time they will, but eventually you are training them as to how to do a good job, which they want to do. They want to do a good job for you. And so they’re not being squelched from sharing their truth about microphones. They’re just trying to get through this interview and do a good job. So the more you teach them indirectly what a good job looks like, in other words, one of the following, then the more you risk not hearing from them and not kind of getting stuff. 
And we don’t realize the power we have over people, despite being kind and self-deprecating and telling them at the beginning, I just want to hear from you. Just tell me your truth. And, you know, there’s no wrong answers. You can do all that. It doesn’t matter: once you start training them what good looks like, then that’s what you’re going to get. So I think it’s hard. And the quote you read kind of explains why it’s hard, and we trick ourselves that we’re helping. So that makes it harder. The risk in this, I think, is significant because it accrues to changing the dynamic in the interview and changing what it is you’re going to hear. Darren: Yes, yes, absolutely. Steve: Again, that was Darren Hood’s World of UX podcast. Now let’s get to my conversation with Vanessa Whatley. She’s the UX director of research and documentation at Twilio. Vanessa, thanks so much for being on Dollars to Donuts. Vanessa Whatley: Thank you for having me. Steve: It’s really great to have you here. So I’m going to just do my cliched opening. The only thing that’s really effective, I think, is just to throw it over to you right away and ask you to give a bit of an introduction to yourself. Vanessa: Sure. My name is Vanessa Whatley. I lead our research, service design, and documentation team at Twilio. So my title is UX director, but I have all the UX functions outside of design, is how I like to explain it. Steve: And what’s Twilio, for those of us like me that don’t know? Vanessa: Yeah, great question. Usually people think it’s Trulia, like the real estate company. So we’re not that. Twilio is a communications API company, and specifically the portion that my team works on is Segment. So that was a product that was acquired by Twilio a few years back, and we are essentially a customer data platform. Steve: What companies or people or roles use Segment, and what do they do with it? 
Vanessa: Yeah, we are B2B, so lots of large as well as small SMB companies use Segment to essentially get a better picture of their data. So a lot of times companies are collecting data in a lot of different tools, a lot of different systems, and then it’s siloed and they have a hard time reconciling it. So if you are, you know, Steve on the mobile app and Steve on the website, and then you’re interacting by replying to an email, you might look like three different people. And so Segment really helps bring all of the data sources together, unify it into a single profile, and then from there it lets companies interact with you in, I guess, a more intelligent way, because they know who you are as a single person rather than three different IDs. Steve: Can you briefly put Segment in the context of sort of the larger set of things that Twilio as a company does? Vanessa: Yeah, so you can think about Segment as kind of the data layer. That can be your foundation: you understand who your customer is really well, what they’ve done with you in the past, or even predict what they might do with you in the future. And then from the data piece, a lot of companies want to actually activate on that data. So they might want to send you a text message or send you an email, and so the rest of the Twilio portfolio has more of the communications APIs, so that you can go ahead and use that data to actually engage your customer. Another good customer example could be someone like a DoorDash, where DoorDash is trying to connect a restaurant, a driver, and the person who ordered the food. Instead of them building everything from scratch in their app, the communication piece of me being able to text a driver without having to give my own number, and the driver seeing my number and vice versa, you can use an API so that they can communicate without DoorDash as an app having to build all of that natively into their platform. 
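The single-profile idea Vanessa describes earlier in this answer, the same person showing up as three different IDs across mobile app, website, and email, can be sketched as grouping event records that share any identifier. This is only a rough illustration; the field names and the matching rule are invented, not Segment's actual model:

```python
# Toy identity resolution: merge event records that share any
# identifier (device ID, email, etc.) into one unified profile.
# Field names and the matching rule are illustrative only.

events = [
    {"source": "mobile_app", "ids": {"device": "d-123"}, "action": "viewed_item"},
    {"source": "website", "ids": {"device": "d-123", "email": "steve@example.com"}, "action": "signed_up"},
    {"source": "email", "ids": {"email": "steve@example.com"}, "action": "replied"},
]

def unify(events):
    """Group events whose identifier sets overlap (transitively)."""
    profiles = []  # each profile: {"ids": set of (kind, value), "actions": [...]}
    for ev in events:
        id_pairs = set(ev["ids"].items())
        # find existing profiles that share any identifier with this event
        matched = [p for p in profiles if p["ids"] & id_pairs]
        merged = {"ids": id_pairs, "actions": [ev["action"]]}
        for p in matched:
            merged["ids"] |= p["ids"]
            merged["actions"] = p["actions"] + merged["actions"]
            profiles.remove(p)
        profiles.append(merged)
    return profiles

profiles = unify(events)
# The three touchpoints collapse into a single profile.
print(len(profiles), profiles[0]["actions"])
```

Because the website event carries both the device ID and the email, it bridges the other two events; without that shared-identifier overlap, "Steve" would remain three separate profiles, which is exactly the problem being described.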
Steve: So if you’re a company like Twilio, just thinking about it from a research point of view, you’ve got those kinds of users that you’re describing, in that DoorDash example, the end user and the driver, the food purchaser and the driver. But you also have the — I guess it’s some kind of IT or development team that’s using Twilio tools to build that, so that their end users or their drivers can all communicate. From a research point of view, where are you focused, if at all, on any of that? Vanessa: Yeah, so on the Segment side, like I said, we’re more so the data piece that is powering a lot of different things. And so you’re absolutely correct. A lot of our customers end up actually being either the data team or engineering team within a company, because they’re the ones that are essentially most likely collecting the data, manipulating the data so that there are protocols and that it’s actionable. And then ideally it goes all the way through to a business use case. So that might be a product manager or marketer then making decisions on that data set and deciding, okay, we want to run a marketing campaign, or we want to analyze this cohort or this audience. Steve: I want to go back to the beginning, and you were describing sort of the structure a little bit, or the areas of the organization that you’re focused on. And I’m sorry, you had a great catchphrase, which I should have written down because now I’ve forgotten it. Can you go back to that? Vanessa: Oh, I was saying everything in UX outside of design. Yes. Steve: Okay. Vanessa: So there is a different person that manages our full design org, but content design, service design, technical writing, and research all sit within my team. Steve: Can you say a little bit about the research team? Vanessa: Yeah, so we currently have a team of about five, and the makeup of that team has changed a lot over time. So we’ve grown, and we’ve had layoffs. 
We unfortunately lost a few people on the team, but overall I think as the size of the team changed, our operating model has kind of shifted along with that. So we started out being a little bit more embedded and really aligning each researcher to a specific area or product area features. And then I would say last year we really decided to go all in on more foundational work and take a little bit less of our demand from product, to try to answer larger strategic questions. And now we probably sit somewhere in the middle, where we do a mix of product work as well as foundational work. Steve: I was expecting that embedded and centralized would be the sort of contrasting terms, but as you’re kind of relaying it, I kind of hear you. I think you’re contrasting embedded and foundational. And maybe you could explain what those endpoints look like as you’re kind of moving between them. Vanessa: Yeah, I think you’re right. I never really thought about the fact that usually people say centralized. I don’t love centralized research orgs. I’ve worked in that manner before, but I’m a little hesitant to call it that with how we’re currently structured, because oftentimes, at least in my experience (I’m sure there’s many ways to do centralized), that often means it’s a little bit more like an intake request type of situation, where you act more like an internal consultant almost and you can cover a lot of different breadth. And what I try to do with my team, and why I call it foundational, is a lot of times the researcher might have still had a focus area that they are stronger in, or just cover an entire flow, like a journey, and sit within that part of the product. Or maybe they specialize in a set of personas, but they’re not necessarily bouncing around to any project that comes up and we’re managing bandwidth that way.
It’s a little bit more driven by where we think product strategy is going to go and then trying to still align people to spaces that they can gain deeper knowledge in just because of the complexity of our space too. It’s really, really hard to bounce around and be the expert in the marketing persona, but then also the data engineer. And yeah, I think that’s why I use those two even though they’re not actually polar opposites per se. Steve: So centralized might mean an intake process, which is challenging if that means that anyone gets assigned to anything. I’m kind of steamrolling over the nuance that you were depicting to kind of check and see. Because I feel like when you’re talking, there’s sort of a couple of aspects. One is like what projects are we going to do? And the other is who’s going to do them? Vanessa: Correct. Steve: So I don’t know. Are you — does the idea of intake, is that in itself limiting or something that you would try to avoid? Vanessa: I think it’s a mix. We definitely talk to all of our stakeholders and try to understand what feature level work and even what foundational work do they want us to kind of produce or participate in, collaborate on. But I actually sit down with my team every quarter and we end up doing anywhere from like 90 minutes to two, three hours of brainstorming where I really encourage them. What are the gaps that you see? What are kind of like big strategic questions or areas where you feel like there hasn’t been enough emphasis or we’re not connecting the dots properly because I do think the risk of operating at the feature level means everything’s a little bit more siloed. And as we know, customers experience things in a series of steps or flows or have an entire journey they need to navigate. And so I just try to position my team so that it can really think at that level. 
And I find that sometimes when we do more of the intake model from design and from product, they tend to focus on their scope and their area of focus, which might sit more at the feature level than cross-product. Steve: Right. So it’s the proactive versus reactive aspect. So people that ask for help from research, that aren’t — that don’t know about research as much as you do or your team does, are going to ask for the problems to be solved that they think research can help with. But if your team brainstorms, here’s what we’re seeing, here’s where the gaps are, here’s how we can get ahead of what’s going on, then — so now I think the more we talk, the more I understand why you characterize that as foundational. That it’s not reactive, feature level. And you haven’t said this, but I feel like when those questions come, sometimes they come late, or when you do that intake model, there’s other ways that you could have helped if you were, like you said, reaching out to those stakeholders and talking about what they’re doing and how you can help them. And you’re saying that you’re kind of in the middle now; if embedded and foundational are sort of endpoints, you’re kind of in the middle right now. Vanessa: Correct. Because I think at the end of the day what I’m trying to balance for is impact. And so there are some, I guess, areas of the product or even feature level things that we know are very critical for us to get done this year. They’re highly complex. They need someone that is thinking in that space day in and day out. And so there are often one or two researchers that are embedded in those spaces. And then there’s broader strategic questions that maybe you’re not getting directly from the PMs. Maybe that’s coming even from senior leadership, where they’re thinking about the landscape and where do we need to go, and broader, less scoped, less defined questions.
And so some of that will be, I guess, covered by us, or we’ll just create bandwidth for it so that we can operate at those different altitudes. And then we’ve also launched a bunch of internal programs so that the feature level work that does need to get done can still be supported. So we have office hours. We have rolling research. We encourage designers to do their own research, or PMs to do their own research. I know that is often a hot topic in the industry. I think we try our best to make sure all of the things that would benefit from a researcher’s attention actually get that attention. But then we also try not to gatekeep, because due to the size of our team, we’re just not able to get to everything. Steve: I want to follow up on that, but I want to just go back — the more you’re describing, I’m having another reflection, I guess, because you started off saying that you started off in one mode, and then you kind of shifted to another, and then you made another shift. And, you know, as people try to ask, like, what’s the right model, to hear how over a fairly short period of time you’ve iterated or evolved, it makes me think that there’s so many it-depends: it depends on what model to have, it depends on your company, depends on your team. But also that these are things that change, and there’s no reason to pick one and stick with it, but to adapt, as it sounds like you have, to changing conditions. And, right, you know, in another X amount of time, you might go back to fully embedded because the company’s here, or other factors like your team or other changes in the strategy might lead you to choose a different model. Vanessa: A hundred percent. I think those are some of the deciding variables. Team size. So at one point the team grew to 12 researchers, which is why we were able to embed, because we had enough to go around almost.
And then when we reduced in size, that meant, okay, we need to find the highest value areas. And then I think, yeah, company strategy: thinking about whether your product is in a place where it’s more stable, or if we’re in a place where we have a lot of pressure to innovate. All of those things matter. And I think I also took into consideration how we can work to still make it sustainable for the researchers too. Because of course we could have played the volume game and tried to crank out three to five projects every quarter. But I’ve really been emphasizing let’s choose quality over quantity. And we need time for deep work and for thinking, even if that means you’re cutting down to one to two projects for the quarter and you’re spending much more time synthesizing across past work and doing more foundational or longitudinal work. And luckily we’ve been very much supported by our cross-functional partners and leadership, because I know for some companies it’s like, no, everything needs to be tested before it ships. There’s many different reasons why you might get blocked from something like that. I think we’ve been able to answer questions and show areas where we should pivot because of how we choose to work and the types of insights and the clarity that that provides, and we’re not really being forced to play the volume game. Steve: Is there anything you can point to that helped you establish that footing where there is that support? Vanessa: I think a project I talk about internally often happened about a year and a half ago, where it was my first time to really decide to go rogue a little bit and just grab two people on the team and say, this is the area that I think we should investigate and publish research around. Because I think with research sometimes there’s an ask, or there’s like a push model: no one asked for this, but we’re telling them anyways.
And so I think about a year and a half ago was the first time I did that for a larger scale project, and we just found different avenues to communicate the information. Essentially, the more we got it in front of the relevant stakeholders and leadership, I think the more the problem became clear and people were bought in. And then over time we also saw, in one particular area, the quantitative data starting to support that story too, because it was a newly launched product. And so we were almost ahead of the trend in being able to point out, here’s some of the challenges we’re going to encounter. And I think just having a few case studies like that helped us prove the value and kind of earn the respect, get the traction, for having that flexibility. Of course that could have backfired. It could have not been received well, but I think in that particular instance, that to me was almost my personal proof point for we should keep going down this path, and it also helped me gain confidence that this is the right path to take the team down. Steve: The path in this case is referring to what kind of work and how you’re helping the company. Vanessa: Exactly. The path is essentially choosing our own topics to go investigate even if no one’s asking for the work. And I mean, there’s still ways, like, we don’t go away for three to five months and then come back and, ta-da, we have something cool to show you. There’s still ways to gain buy-in along the way, so we are doing our due diligence by crafting a proposal, shopping it around, seeing who at the company could be a good stakeholder to actually implement some of the changes that we’re suggesting. So I’m not saying just go rogue and hope it works out, but I guess it’s more so, again, the proactive versus reactive model: really taking ownership and saying we actually have things that we think are really important to go after, and then advocating for that and pursuing it.
Steve: And in that first example, the first rogue project a year and a half ago, I heard you talk about, you know, finding this area and choosing to spend resources and people’s time to go do it, but then you, I think, are also highlighting communicating that to some group of people. Is there a way to sort of compare the proportion of effort — and I don’t know if it cleanly breaks this way — between doing that research and communicating that research? Vanessa: Yeah, I don’t know if there’s a split. I will say for research that other people ask for, the effort and energy to advocate for it is way less. Like, if you’re being pulled into something, then people are organically just going to show up more, be interested in the findings. So I think there’s a lot more effort and design up front that goes into really making sure that what you do learn is strategically still a place that we can go. Because I think that’s the other thing you want to make sure of, right? You don’t want to pursue a project and either everyone already knows the information, or people didn’t know the information but were already locked into whatever plan when it comes to the product. So I think the up-front work is really important and takes a lot more energy, as well as the communication of findings, because now you have to create your own forums and audience, whereas other work is just very organic. Like, yeah, you go to the product manager who owns this feature. Steve: I’m hearing in your answer, and now the second time you’ve kind of explained this, that my question was flawed. My question was kind of about research and then communication, but you’re really emphasizing, even when it’s a study — or especially when it’s a study — of your own sort of discovery or advocacy, there’s that upfront due diligence. There’s doing the research and there’s the communicating.
But you’re doing all those three pieces, and I guess just to repeat what you’re saying, when it’s a rogue-style project, the upfront and the after part are significantly more effort. Vanessa: Yeah, I would say so. And I’ve encouraged my team to do this across any project, but I think we also do more work to design the artifact ahead of time, to think about how this information could be presented, or even just have something tangible for someone to react to before we double down. Because, again, for foundational work it might be a much higher n than, like, testing a little feature, and so we don’t want to be 15-plus interviews in and realize, oh, this is not going to work out, or people already know this. Steve: So what kinds of things — you’re talking about what kind of output you create as a result of the research? What kinds of things might your team create as output? Vanessa: Correct. Yeah, I think a lot of the traditional artifacts, so we do create a lot of decks. Beyond decks, we work in Figma a lot, so we try to prototype different styles of outputs. So sometimes it might get really visual and we’re trying to bring in more graphs and charts, or if we’re doing persona work, designing templates and stuff ahead of time to think about what data we want to collect along the way. I mentioned we also have a service designer on the team, so sometimes that is a full-blown journey map where we’re bringing in all of the layers. We’re bringing in the product touchpoints and the external guides and people touchpoints, when they’re talking to a salesperson or an AE, an account executive. So I think we try to remain pretty open, sometimes try to get creative in terms of, you know, what the medium should be. Should it be pre-recorded? Should it be video? But I think, yeah, we really try to do some content design ahead of time to also use that as part of the conversations when we shop around a project.
Steve: So to clarify, this is happening in the upfront portion of the project, so you’re thinking about what the output might be in order to have these conversations with people about this research, which you have identified as important, but they haven’t asked for. So what are you showing them? These are sort of — these are samples of what a deliverable might contain, but there’s no — there’s nothing in it because you haven’t done the research yet. Vanessa: Yeah, either there’s nothing in it and it’s essentially like a template just to show like placeholders of the type of information we can show. But sometimes again we’re not starting from scratch. Like the reason we’re initiating this project is because we kind of have like sprinkled evidence popping up across the team, but we’ve never deliberately investigated this area. And so part of it is almost like an early synthesis of like we have a hunch because we saw this, we saw this. And so sometimes the story just emerges from that already where at high level we can kind of come with an outline of like, okay, these are the group of people. We think they’re experiencing this problem. So of course that’s not like how you do research all of the time, but I think it could be very effective to do that some of the time, especially in an environment where it’s like people are looking to make decisions. If our research is just like, oh, that was cool and people move on, then we didn’t really do our job that well. Steve: So it’s interesting. It’s almost like a — I don’t know, like a trailer for a movie or something. Like, here’s the decision you’re going to be able to make. Here’s where we are today. If we do this, then we can fill in these gaps. And so you’re not selling them research. You’re selling them the thing that they care about, the decision they need to make. Vanessa: Correct. 
Steve: Way back when, I said I wanted to go back to something, and then we got into this interesting thread. You said that part of what your team does is, you know, help other cross-functional partners and folks do their own research. And I think you said this is kind of a hot topic right now. But yeah, what’s been effective for you? What have you seen work well? Vanessa: That’s a good question, because sometimes it still scares me too. I think the things that have worked: especially stakeholders that have previously either had experience with research or have been close to previous work that we’ve done tend to already have a better sense of how to go about it. We’ve created templates just around, here’s how to write a study guide. And the ideal version of that would be they take a stab at it and then bring it to something like an office hours, or talk to a researcher they know, so that we can at least coach them a little bit on, are these the right questions, and is that the right set of people that you need to talk to? How do you recruit properly? Things like that. And so I think if all of the setup goes well, I sometimes have less concern about the actual sessions themselves, if they’re able to just follow a script. I think the analysis piece then again becomes a little bit risky in terms of how people analyze, synthesize information. And I think we could probably do more there internally to guide people through that process, because it’s just, yeah, challenging if you haven’t experienced it often enough. Or I’ve seen people overgeneralize, or grab that quote that supports exactly what they want to push or need to do and kind of ignore the rest. So that’s why my feelings towards it are definitely mixed. But at the same time, I have seen it also be very useful when it’s something we just can’t support or take on, and they do have kind of the proper resources and guidance to get to some of those answers themselves.
Steve: If size and availability of your team weren’t an issue, what would you do to help, you know, people who do research be effective with analysis and synthesis? Vanessa: Oh, good question. I think it would probably be in the form of a workshop or just live debriefing. Multiple jobs ago, we used to do a lot of that in groups, like in person with sticky notes, and really go deep on that front. I think now that everyone’s remote, we try to do that more in digital tools. But I think having more guidance around how to approach that, or even providing them with a framework or a plan on what notes do you want to capture, how do you properly set that up and capture them, especially the more participants you have. I think that’s where I’ve seen lots of people waste lots of time, because they didn’t have a plan. They just went through, captured all the data, and then they’re kind of in a now-what situation where it’s like, do I rewatch 20 videos? I’m like, please don’t do that. And so, I don’t know, I’ve joked with my team this week that I’ve equated research to party planning, but it’s like, the more you plan up front on all the things that are going to be happening and go well, and have a schedule around it and have a plan for how to get to the end, I think the smoother it goes. Steve: I really like party planning as a kind of a framing. And so yeah, analysis and synthesis is part of the party plan. Vanessa: Yep. Steve: I think it’s interesting to hear you, and this is not meant to be presented as a disagreement or not, just a reflection, that when you think about how to help people who are not familiar with analysis and synthesis move forward with that, you’re talking about the, I don’t know, the tools and tactics of managing that data. That’s I think what I heard you kind of emphasize. I like that, because those are things that can be described and be enacted.
But there is this part of analysis and synthesis that, it scares me to say, even though I do train people to do this, still scares me, because it feels like it’s creative. I’m even sheepishly using that word in speaking to you, but it feels like it’s creative and a little bit magic and a little bit hard to describe. Vanessa: 100%, which is why I still say it’s also the scariest part for me to let people go on that part of the journey. So, I don’t know. When I was first entering research, I think a lot of things helped structure my thinking, at least, around the difference between hearing something verbatim versus having your own interpretation of it, and kind of going back and trying to be a little bit more rigorous around what did you hear, what does it mean, and how do you navigate that? But I love hearing you talk about it being a creative process, because I think, yes, once all of the analysis is done and the data is on the page, understanding what’s important, what story you want to tell, and how to put it all back together is, I think, an extremely creative process, because regurgitating everything you heard is not going to work. You have to make it compelling and you have to find a point. And I actually think that’s the thing that most junior folks struggle with: they want to share everything they learned. And, like, what are your top three things? Because in reality, we can only action probably one to two, if anything. So I’m a huge fan of pushing for prescriptive findings and having ideas around what should happen next. Steve: Can you explain prescriptive findings? Vanessa: Yeah, I think so. Rather than having a recommendation like, XYZ needs to be clearer on this page, or, users were confused by X, which is a little bit more just describing what happened and where the problem lies, it’s telling the rest of the team, here’s how I think we should solve it. Like, are you saying you should write something differently?
Are you saying a human needs to help them? Are we going to introduce AI because that makes it better for them? Like, I think there’s so many different ways to kind of get into solutioning. And I mean, maybe one could argue that by just presenting the problem, you’re leaving that solutioning piece open. And in that case, I would say follow up with a workshop and get a room full of people together to do that in a more deliberate way. But if you’re limiting yourself to a readout and you have reasonable confidence that you actually know the next step forward, I ask people to just say it rather than hold back and try to be more neutral. Steve: Yeah. Vanessa: Yeah, that’s kind of where I land on being prescriptive. Curious if that resonates with you. Steve: Yeah. Vanessa: [Laughter] Steve: I mean, I think that, you know, you’re kind of getting at what are some of the hot topics in research, and I think do-researchers-give-recommendations is a hot topic. At least for me, it feels like something I’m sensitive about. But you’re providing some nuance here, and you’re kind of giving some of the it-depends. And I think I’m hearing you saying, like, if it’s clear what the thing is, like what the solution is, right, then yes, there’s no reason to hold back on that. If I were to think about my own practice, I agree with you. People are confused by X is just a description. And again, we’re speaking in generalities here, but I think what I try to do is say, people are confused by X because they understand this word to mean this and you’re using it in a way that means that. Vanessa: Yeah. Steve: So, you know, you can go further and say, if you want people to understand this, you know, the language has to line up. And I think that’s very different than, change the label on this button from A to B so people know how to use X. I tend to not do that.
And this is also because I’m a consultant; I don’t have the same relationship with the product team that your folks do. Vanessa: Yeah. Steve: And I might be more likely to give them the whole thing, to decompose it a lot so that they understand, and maybe what to do is obvious. But I think I don’t ever know enough, I shouldn’t say ever, I often don’t know enough about what all the possible solutions are or what the roots of that are. And I want to sort of hand it to them. You know that thing about telling somebody what to decide but making it feel like they’re the ones that are deciding it? Vanessa: Yeah. Steve: You know, that’s easier for a consultant, or that’s maybe more appropriate for an outside person than for an in-house person. Vanessa: I love that, though. I think I’ve actually done a mix. Like, back when I was still doing a lot of research myself, I would probably initially land at kind of the fidelity that you’re talking about, of being very specific about what’s wrong and how they’re misunderstanding it, and even the prevalence of the problem, and just really making it concrete of, where are people getting hung up? And then I would just use a little idea, you know, bubble icon, and then separate that and be like, “What idea is this?” So that I’m still getting it in there, but they can at least anchor on, “Okay, this is the finding, and then I’m taking it one step further, but they’re not so tightly coupled.” Because I think if I would jump straight into, “Hey, you need to do this, this, this, and this,” and they don’t have the context behind it, one, it loses credibility, and then two, if I was off, now they don’t have the anchoring problem to actually address it differently. Steve: I really like putting the recommendation or the suggestion in a separate area, the idea bulb, or the light bulb kind of call out.
Because, you know, I think sometimes what I’m trying to activate is to get them thinking about that transition between research and action. And, you know, I love your example of the workshop, but we can’t always get the workshop, so can we at least put this forward as a for-instance: we’re not saying this is the only way, but for instance, this is a way that you could use the tools of design or copy or whatever to solve this problem. I mean, I had an experience a year or so ago where, you know, I put some for-instances into something for a client, where I really wanted them to riff on what was possible, because I just didn’t know what was possible. And I was really torn, because, you know, as a researcher, if somebody builds something based on what you’ve recommended, that’s a tremendous win. So I was really proud of that, but also, it wasn’t a recommendation to go do X; it was, start thinking about solutions in this area. So I had this sort of mixed reaction, you know, in that experience. Vanessa: That’s understandable, but I agree. I think that’s always counted as a success, like, “Hey, my idea made it into the product.” And ideally, due to our background and our proximity to customers, it’s a pretty legitimate idea. It’s not like we just pulled it out of thin air. So I like that. Steve: Maybe we’ll switch topics a little bit. You were just talking now about some ways that you practiced in previous roles. You know, now you’re in a leadership position and thinking about, right, driving, advocating for the practice. But if we could rewind, however far back we want to go, can you talk a little bit about what was your path to get to the role that you’re in now? Vanessa: Yeah, my path was definitely not a conventional one, but I think we are in a field where that is often the case.
So I think my first real entry into, I guess, being in a tech environment was after I quit my personal training job. So I had an anthropology degree from Berkeley as my undergrad degree, and I did not know what to do with that. I would go to job interviews, and they barely knew what that meant. Someone asked me if that means studying dinosaurs, and I was like, “Not quite. It’s actually the study of people and cultures.” So I struggled quite a bit after college and kind of had this passion for fitness on the side, was a personal trainer, and did not love the working hours and how that schedule tends to play out, because you have to train when everyone else is not working. And so I found a temp agency that had this office manager job at a startup called BrightRoll, and within a few months of me being hired, it was acquired by Yahoo. So that was my first exposure to talking to people that were working in product and UX and marketing, because I had ambitions of other things but I just did not know what I wanted to do. And then fast forward, I actually worked at another company that got acquired by Capital One, and I was still kind of in an office manager role and then shifted to an administrative role. So I ended up becoming the executive assistant to a product VP, and that was really where I figured out that UX is what I wanted to do, because I essentially told them I wanted to learn the business. I wanted to be close to the work that was happening, and they had a very heavy design thinking culture. So I was participating in trainings and design sprints and home visits and just really being immersed in UX, and decided that research felt a little bit more aligned with my undergrad experience in anthropology. And so about a year after doing that, still at Capital One, I was able to connect with a few folks in UX and get a research operations job. I wanted to jump straight to research, but the head of the department said you need a master’s degree for that.
So I ended up doing a lot of research operations, which gave me a lot of proximity to research, while pursuing my master’s at the same time. So I was, you know, doing a lot of the recruiting, writing screeners, managing our tools, procuring new tools. So really it was all of the research operations. I almost felt like a research assistant in many ways while I was getting my master’s in human factors and information design, and then from there, once I did have my degree, I formally became a researcher at Capital One. So there I was working a little bit on the mobile app and then on the website, capitalone.com. So that was a really fun time in my career, and then after, when I ended up at Google, I switched very quickly from being an IC to first managing a few contractors, to managing a qual team, to also managing folks that were quant researchers, to now my role kind of expanding even beyond research. So it all kind of fell into place very, very quickly as I navigated my career, and I didn’t spend as many years doing IC work as I expected. A lot of the time when I was at Google, I was kind of in a hybrid role, so I was still conducting my own research, managing a few vendors that were doing consulting projects for us, and then had a team. Steve: How did you learn the managing part? You kind of moved into that, and that’s a different skill set than what you got your master’s degree in, I’m assuming. How did you learn that? Vanessa: Yeah, a very, very different skill set. But I think I gravitated towards that. Even when I was younger, I think leadership came naturally to me. In high school I was president of this club or captain of the volleyball team, so I always kind of gravitated towards that, and I was a tutor in high school for many, many years.
So I think there was always that spirit of helping people, coaching, collaborating that came naturally to me, and then on top of that, the on-the-job training of just understanding how to navigate different situations. I mean, I don’t think I could have ever imagined the situations you find yourself in, because of course there’s performance-related stuff, there’s stakeholder-related stuff, politics, and then there’s personal stuff, and it all kind of comes together. And then there’s the actual looking at your team as a whole: are we actually operating and performing in a way that is helpful to the individual but also to the company? So I feel like there are a lot of different variables to juggle, but I would say I picked it up from observing my former managers, realizing what wasn’t working for me, and then workshops, trainings, and on-the-job experience. Steve: How do you distinguish between management and leadership? We’re sort of using both those words in this conversation. I wonder, do you have a definition or an explanation for either? Vanessa: I mean, I guess I tend to agree with the ones you typically hear, where it’s like anyone can lead or be a leader, and then management is described more as a distinct set of responsibilities. So right now I’m using them pretty interchangeably, but I know there are formal distinctions on how that works. Steve: When you and I talked in anticipation of having this conversation, one thing that you brought up was both an inclusive research practice and things like inclusive hiring. I guess inclusive is a big term here, but it seemed like you had had some experience with that and some perspective on that, not just at Twilio but I think throughout your career, and I’d love to hear you illuminate some of what you’ve done and how you approach it. Vanessa: Yeah. I’m glad we’re touching on this topic.
I think inclusive research, inclusive workplace, all the things, has always been really important to me. Unfortunately, in 2024 we still see lots of forms of discrimination, whether intentional or not. I think it’s a big conversation again now that all of the companies are thinking about AI and thinking about, okay, how are these data sets built, where are they coming from, how are models being trained, and a lot of it is always looking back. So throughout my career I have been very mindful of the folks that are marginalized, whether it’s in the workplace or as consumers of certain products. Sometimes things are not designed with women in mind, or people of color in mind, or certain disabilities are not thought about, which often leads to problems for, again, the people these products are supposed to be built for. But then also in the workplace I have seen these things manifest, and I think these things go hand in hand, as people come from different backgrounds or different walks of life. There have been so many studies published now on how having a more diverse team actually leads to more creativity and better solutions, because you’re able to see things from different angles, and bringing all those perspectives together creates a better outcome. So yeah, that’s just been something that has been important to me throughout my career, that I have been mindful of with any of the teams that I have been on: that we take that to heart and also design our research to reflect that. Steve: So when you say design your research, that makes me think about sampling, but that may be a very small view on what you’re talking about. How does it show up when designing a project? Vanessa: Yeah, I definitely think sampling is a big part of it. I think where tech companies are located, it’s easy to do convenience sampling and just say, you know, we will do something here in a lab.
Luckily now remote research is more popular, but back in the day a lot of physical labs were being used to, you know, test an app or show a new site that you just don’t want out there on the internet. That often meant you were sampling a geographic area where the income might be higher, or there’s a certain distribution, whether it’s a gender ratio or a race ratio, that might not reflect the full population. I think being very U.S.-centric is often a thing with companies and the research that we do, and there are a lot of additional hurdles sometimes to being inclusive. Like if you want to do a study that is conducted in people’s native language and you’re an international company, there go many dollars for recruiting and translation and the scale that it now takes, rather than doing something that’s local in your area. So that’s what I mean by the design, but it can also mean, you know, all the way down to who is actually conducting the research. Again, if it has to be a native speaker, or if it’s better with sensitive populations who might have mistrust when it comes to testing medical products or things like that, it all then requires a different level of planning and consideration in order to make the participants feel comfortable and to collect that data in a way that’s respectful. And yeah, I mean, I think I could go on and on. That’s what comes to mind for me and what I have pushed for, and I think for the most part there hasn’t been any pushback on whether it’s the right thing to do per se, but it really comes down to the timing constraints and the financial constraints that push a lot of companies to say, you know what? Not right now. So I have been a pretty strong advocate for, when it makes sense, let’s take that time and broaden our scope in order to make sure it works for the broader set of people, not just the convenient sample.
Steve: And you made a point about how diverse teams are more creative, and then even thinking about who does the research. Vanessa: Yeah, definitely. I think right now I’m in a context where we’re actually B2B, and so I would say some of those factors have been a little bit less prominent for us, and we tend to, you know, go for the people who are using the product and try to expand our sample to that. But I’m sure that’s your experience too with B2B: often you just have a way smaller population than in consumer, where you often have hundreds of thousands of people you can recruit from. In B2B you might have a list of 5,000, and by the time you add a few criteria, now you have like 30 people that qualify. So that makes it a little tougher, but in the past, when I’ve worked in SMB spaces or consumer-type products, that has been more of a consideration. And I think that’s also where vendors have sometimes come in for us, where maybe we’re not the appropriate set of people to have this conversation, or we might miss some of the nuance that exists in this conversation. When I was at Google we definitely would get outside help to round out some of those conversations, or share the interviews with a native Spanish speaker if we didn’t have that on the team. Steve: Does this come into how you approach hiring?
Vanessa: That’s a good question, because I think with hiring I’ve not deliberately been in a situation where I’m saying we need this race or this gender or this language, because I think that can get very tricky and also potentially lead to discrimination in other ways. So hiring has largely been merit-based, but the part of hiring that I still find problematic is that often people talk about pipeline issues and not being able to source candidates from different backgrounds, and I often find that there is actually something we can do about that. I think that’s where I’ve focused more of my efforts: if I’m seeing candidates come through that are a little bit more narrow in, let’s say, education, they’re all coming from a set of Ivy League schools, or they’re located in a specific part of the country, then I have worked with recruiting to say, hey, can we reach different hiring pools? Do you have access to different communities where you can plug these jobs, to diversify the pipeline, so we can make sure that at least candidates that have different levels of education or different backgrounds can have their resumes reviewed, be interviewed, things like that. Steve: In the time that you’ve spent in your career so far, have we made progress on these issues? How do you contrast what you see in 2024? Like you said, discrimination still has not gone away. Vanessa: Yeah. It’s hard to say. I’m not sure if we’ve made progress or not. I think during the pandemic, when George Floyd was murdered, there was definitely a heightened interest from companies to think about a lot of these topics.
So one way that worked out is that a lot of companies, including Google (and I was actually participating in some of the hiring initiatives), thought about product-inclusion-specific roles, and really carved out space to say: we have people in the company who are trained and qualified to deal with maybe more sensitive populations, and who are also actively putting programs together to teach the broader company how to make sure that your product, especially when it comes to language or anything AI-related or anything visual, is not discriminating. Like, oh, the photos can’t pick up on darker skin, or the system can’t understand people with a certain dialect. How can we get that education out there? So I have seen a push in that. There were more consulting firms, more job postings that were specifically around product inclusion and bringing that academic knowledge into the tech field. But as far as broader hiring trends, I don’t know the data well enough across the full population, but I don’t know if I have seen an uptick per se. Steve: So maybe one last area, to loop back to something else that we were talking about. You described going from personal trainer to a temp job that gave you exposure to, you know, things that led you, accidentally, into research. In some ways it seems, I don’t know, sort of circumstantial or opportunistic or something. It’s hard to plan for creating the conditions for that. Is there any lesson or advice for people that are listening, or that you come across in your travels, about bridging from one world into the other? To me they seem like very different worlds. For you, I think it was a certain amount of happenstance and having the right lens on it.
But does this lead to any guidance or advice for other people, from your experience? Vanessa: I think for many years I had anxiety and regret around not starting my career in the field that I’m in sooner, because I felt very, very lost stumbling through all of the different fields and roles, and only in hindsight do the dots connect. As you were saying, they seem very, very different, but I think I’m better at what I do now because I learned lessons in all of the different jobs. Even something like, again, being a personal trainer or executive assistant. Being an executive assistant, I was able to sit in on more senior leadership meetings, and really early I picked up on short attention spans: how do you get your point across concisely, what do they care about? I think that made me a better researcher right away, even as I was still learning the practice, because it taught me something about communication. Then even reaching back to personal training, I think that made me a better manager, because I like to think of myself very much as a peer to my team, where we are thought partners. They come to me with things, I come to them with things, and because they are all so driven, we don’t really have a lot of issues where I just have to enforce things or tell anyone to do anything. But circling back to the personal training piece, I think that really put me in a headspace of, okay, someone has a goal and they’re struggling with this, or they need support through this. How do you tap into the human psychology of: they want to get from A to B, we want to make it as sustainable as possible, do something that hopefully they enjoy enough that they can do it on their own? And so really just getting into that mindset of, how do I collaborate with another person and find common ways to address a problem and align with them, so it doesn’t feel like I’m pushing you or forcing you; we’re going towards this goal together.
I think all of those little pieces along the way shaped how I interact with people and have made me better at what I do today. Maybe just know that it’s all connected somehow. Steve: Yeah. I want to normalize part of what you said, at least: yeah, I also had regrets for not starting my career earlier. I felt like I was late. But that was long enough ago that many people I know didn’t know me then, or wouldn’t know that about me. You know, I had a complicated, indirect path to doing what I do now. And I think you’re right that we’re sort of all products of all of our experiences. That’s one of the things I like about research, but I think you’re right to extend it towards management and everything: there are a lot of different paths, and there are a lot of different ways of being. And, yeah, if I wasn’t who I was, I wouldn’t have gone on those paths, and I wouldn’t be able to be all the weird mixes of things that make me whatever kind of researcher or leader or, you know, all the things that you’re bringing up. I’ve always enjoyed that about our field, that we all come from different places. Even over the generations I think there’s still an interesting mix of that, and it means, like, oh, the next researcher that you meet has done something that you didn’t know about, and you get to work with them, and, you know, there’s something about personal training that shows up in an analysis or in a planning meeting or something, and you get all these great stories from people. So I enjoy that. But yeah, I started off by talking about myself here: I know what that regret is like, that I didn’t come into it out of the gate early on and had to play catch-up in a lot of ways too. And maybe that’s a great place to wrap up our conversation.
Thank you so much for taking the time, sharing your own experiences and your own perspectives on the work that you’ve been doing all the way along, and yeah, it’s great to get to chat with you. Thank you. Vanessa: Thank you so much for having me. It’s really an honor. And when I was telling folks on my team that Dollars to Donuts was coming back, everyone was really excited, because a lot of us have your books and have listened to you and learned from you, so I really appreciate being here. Steve: All right, well, great. I hope they enjoy listening to you talk about all the great work that you’re doing. Vanessa: Thanks. Steve: Thank you. All right, that’s it for today. I really appreciate you listening to this episode. If you like Dollars to Donuts, recommend it to a friend or colleague, or post about it on the social medias. You can find Dollars to Donuts in most of the pla
01:13:08
37. Nizar Saqqar of Snowflake
Episode in Dollars to Donuts
This episode of Dollars to Donuts features my interview with Nizar Saqqar, the Head of Research at Snowflake. For a domain that takes a lot of pride in empathy and in how we can represent the end user, there’s a component that sometimes gets overshadowed, which is the empathy with cross-functional partners. In every domain, product, design, research, there are people that are better at their job than others. I do believe that everybody comes from a good place. Everybody’s trying to do their best work. And if we have some empathy for what their constraints are, what they’re going through, what their success criteria is, how they’re being measured and what pressures they’re under, it makes it much, much easier for them to want to seek the help of a researcher to say, “Help me get out of this. Let’s work together and let me use research for those goals that are shared.” – Nizar Saqqar Show Links Interviewing Users, second edition Nizar on LinkedIn Snowflake How to use MaxDiff analysis You Might Not Like What Jon Stewart Has to Tell You Old Man Yells at Cloud Noam Segal returns Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. Today’s guest, Nizar Saqqar, actually brings this up in our conversation, but I’m going to remind you myself that there is a new edition of my classic book, Interviewing Users. It’s now available 10 years after the first edition came out. Of course I’m biased, but I highly recommend it. And hey, it makes a great gift for the researcher or research-adjacent person in your life, or persons. If you haven’t got the second edition yet, you can use the offer code “donuts”, that’s D-O-N-U-T-S, for a limited time to get a 10% discount from Rosenfeld Media. But now, let’s get to my conversation with Nizar Saqqar. He’s the head of research at Snowflake. Well, Nizar, thank you for coming on the podcast.
It’s great to get to chat with you. Nizar Saqqar: Thank you for having me. I’m really excited for it. Steve: Let’s start with an introduction from you. You want to say a little bit about your role, your context, anything to kind of get us rolling? And we’ll go from there. Nizar: Absolutely. I’m Nizar. I lead research at Snowflake. I’ve been here for about three years. It’s been a pretty exciting adventure. When I started, I was the first researcher at a company that had been around for 10 years, and I was really doubling down on showcasing the impact of research and why we need to scale, and we’ve been scaling nonstop in today’s environment, which has been a pretty exciting challenge. It comes with the fun of it, but comes with the challenges as well, and I think more to come. Steve: To take a step back and try to simplify it as much as I can, what kind of company is Snowflake? Nizar: Snowflake enables organizations to store huge amounts of data from many sources in one place. So it empowers organizations to make the most out of that data. And as we’ve scaled the company, we’re continuing to push the envelope on platform-level offerings that try to enable native app developers to do the development of data applications. Steve: What are some examples or vertical scenarios that we might know about? Nizar: So the simplification of it is: get your data in one place, make the most out of it. Snowflake will help companies do that as efficiently as possible with as many use cases as we can. It’s definitely not in the day-to-day conversation. Steve: What are some examples of what Snowflake is? Nizar: It’s not a B2C product, but at the core of it, it really starts with the data warehousing: the data engineer brings in all of the data from many different places, many different sources, into one place for storage, and then makes it usable for other users, maybe like the data analyst or the data scientist who makes something happen out of it as an outcome.
And as I mentioned earlier, building the native app development framework, it’s been exciting to see all of the, think of them as the more classical kind of software developers, that are now coming into our ecosystem to get closer to the data. So it’s a pretty complex ecosystem. We also have a marketplace that then introduces the dynamic of a provider and a consumer, and the business decision makers who are coming in for that transaction. So it’s a pretty intense ecosystem that magically all connects into just making the most out of your data. Steve: So when you came in as a researcher, what did you observe about how this company was thinking about its users, or thinking about what it knew or didn’t know? Do you remember that early process? Nizar: Yeah, and to be honest, that process started before I even started. Even in the interview process, I really wanted to be sure that the company is thinking a lot about their users. They’re thinking a lot about how research can integrate. They’re thinking about challenging some of maybe the perceptions of what research can bring to the table, and just having some of these tough conversations even before. And I will say that where we are definitely lucky is that Snowflake does put doing what we can to make it as great of a product for our users as, like, a core value of the company. The interesting thing with that is that it brings in a lot of data points from a lot of places. You have a lot of perspectives from the sales team, directly from the product team. Then you have all of the metrics and dashboards that you’re following. So you actually get a lot of data. You get a lot of points that might actually make it harder for the product teams to action on or prioritize.
So as I started, I kind of wanted to first take a moment to better understand the domain, really find my footing, know what’s going on, build the right relationships, and start with something that’s very low-hanging fruit, saying, “Hey, let me just build credibility. Let me just come in and say I can add value very quickly, and then scale that up.” It’s been interesting to see how the role has continued to evolve since that day one. It really started off with: we’ve been here for 10 years, and now this guy is here. And it’s really evolved into: research is just a critical component of how we think about product development. But it’s taken many phases that we’ve had to adapt through as we continue to go: starting off with the very tactical, then zooming out into something that perhaps is more strategic, then shifting focus into our hiring strategy and our hiring rubrics and how we interview, going all the way into what we define as success criteria, performance evaluation, and how we integrate research into the overall product process. And it just doesn’t stop. So the role itself has been changing over the past three years, and I expect it to continue to do that. Steve: Can you give an example of a tactical, sort of quick win that you would approach coming in, in those early days? Nizar: Yeah, absolutely. And I think for me it really starts with: what is something that is tactical enough, close enough to wanting to launch, where there’s enough resourcing, but there’s some level of disagreement in the organization about how to proceed? And seeing if there’s an appetite to make research kind of be a tiebreaker of sorts, or really find the right balance between the two. I found that it’s very rare that option A or option B wins outright; there are always components of each that resonate, and when you bring them together, you find something that really resonates with the actual user flows.
And if I remember correctly, one of my very, very, very early contributions was around something as simple as a concept evaluation. And I think those are the methods that are just going back to the basics, that some people take for granted, but you’re just coming in and you’re saying, “Let’s just test it and see what’s resonating and what’s not,” and coming up with some actionable steps that align multiple teams that might have dependencies on each other, to find the solution that may not make everyone happy, but at least everyone is aligned that, “Okay, this seems to align on a path forward from a user lens.” So then it just continues to evolve. I was talking recently about how, coming off the bat, I just saw that there was an overwhelmed product manager who said, “Hey, I have 20 features that I’m asked to ship.” And my role there was to come in and say, “Let me help you do a MaxDiff survey to just make a case for some things that you should actually deprioritize, so you can make progress towards some of the top features that you want to run through.” And I think that was part of the evolution of, “Okay, we could use research for many different use cases and in different areas where we could integrate with the product roadmap.” Steve: So I think it’s super interesting that you used the interview process, I guess, to understand the context that you’d be coming into. I’m wondering, and I’m going to ask it like a binary, obviously it’s not, but, you know, how much of a mandate were you given, versus how much were you trying to figure out what the needs were and, you know, make recommendations appropriately? That’s a terribly leading question. Nizar: It’s not, though. It’s kind of interesting, because when someone opens a position, when somebody asks for a headcount, for the most part they have an idea of what they’re looking for. They have an idea of what they think the success criteria is.
In my case, I was hired by a head of design who had an idea that I could help elevate the design team. That was kind of the primary premise. And in the interview process, that comes out, and it’s a really exciting thing. A head of design is really excited to be like, “Now I finally get to design with research.” I started digging into the appetite of, “But how do we expand beyond the design?” If we’re to look at the pie and say, “Instead of making that piece more efficient, how do we just make the pie bigger? How do we get the design team holistically more involved and have that impact from earlier stages as well?” So look beyond the design research component and don’t worry about it; I’ll make sure you have a good story for where your organization is growing in terms of impact beyond the pixels. And I think that was a really good back and forth that early on showcased that there’s a lot of action and appetite for, “Hey, if you can define something outside of what I have in mind that you perceive could be even more value-adding for the organization, that’s what we’re optimizing for.” And I think that was a good start of saying, “Okay, I won’t be in a situation where somebody comes in and says, ‘This is what you need to do.’” I do hear a lot of stories of, “The researcher comes in and all you can do is one-week sprints, just do a study every week, and it’s non-negotiable.” And it was pretty important to gauge that: “Hey, can we just align on value for users and value for the company and value for the team as the criteria, and let me do what I need to do without a specific framework for how I should be operating within those objectives?” Steve: I love the phrase “impact beyond the pixels.” That’s a pull quote, or that should be the title of your next talk. So that sets you up then to find that overwhelmed PM.
And if I understand it correctly, you’re kind of saying to them, hey, this is the situation you’re in, here’s an approach that would help you. That’s kind of where my mandate-versus-discovery question comes from. It sounds like you are finding opportunities or finding places to have impact where that PM is not going to ask you, hey, can you do a MaxDiff survey? You’re coming in, seeing the situation, saying, here’s a way that research can unblock you. Nizar: This was a fascinating story altogether, and there’s some more context behind it, which is kind of funny when you look at it. That PM was super excited before I started, messaged me before I started, started telling everybody that me joining was going to be a game changer, was the friendliest PM I’d ever met when I started. And back then, my manager was saying, “Hey, we think this is the most ambiguous thing. We need to redefine a roadmap. You need to put a really found– I think this is a really foundational research problem.” So when I went to the PM and told him, “Hey, we can work together on this, and I’m actually excited to team up,” he actually said, “No, I’m not interested.” And I told him, “Let’s take a step back and let’s speak about why you’re not interested, what’s just on your mind. Let’s not talk about the research. Let’s talk about the problem you’re solving.” And his take was: from my experience with research in the past, a lot of the time it does take a lot of attention to keep up with all that’s happening, be part of the interviews, and then you come back with a lot of insights that I frankly don’t have resources to do anything with. So if you come to me and you say, “Here’s 10 things you need to build,” I’m just going to put it at the end of the JIRA board as items 21 through 30, and I’m not going to get to them.
So the key learning for me back then was: okay, everybody perceives my role and how I can solve this problem very differently, and I really need to set some shared language and shared expectations of why I’m here. So that’s when I was like, “Hey, how about we do this? I’m just going to go into your JIRA board. I’m just going to steal the things that you have there. I don’t need you to be involved, and let’s make a case for why you don’t need to pursue all of these features at once. Let me do the heavy lifting. We’ll team up on it.” And then going back to my manager and saying, “I don’t think I need a multi-month, huge effort to start. Let’s just help him get out of the weeds for a bit,” and just aligning the expectations over what we can do with the research. And in this case, it was a core example of research really used to de-scope, to de-prioritize, to say that not everything is equally important. At a high level, when you take single, one-off stories, they all come up as high needs, but are they all at the same level of importance when you look at our users’ needs and the business value that they bring? And that’s essentially what came out of that MaxDiff: a lot of these are way below when you compare them to what’s really bubbling up to the top. And how do I make the case? Not to say that everything isn’t there for a reason, but let’s make a case that, with the limited engineering resources that we have, we can drive the most value if we really focus most of it on those very specific areas and get those to a place where our end users are really happy with the experience that we’re offering. And that was a different mindset. That was a different principle for that PM, where it’s like, I didn’t know that we could do that. I didn’t know we could do research that helps me tell a story to executives of why I shouldn’t be doing work, or why I should say no to some of the work that’s coming up.
And then that led to me really wanting to define some of the language used around why research exists and why research is at that company. The wording that I tend to use, which may not apply for everyone, is around driving the allocation of limited resources into the most impactful efforts for our users or organizations. And if we can have that be a shared language mandate for what research is optimizing for, it takes away some of the misconceptions here and there. And with it come some of the tactical things, like changing title names from UX researcher to researcher, or changing some of the ways that we present decks or reports or internal documentation. So there are some tactical things that come with it, but at the core of it is really linking it to the intersection of user value and organizational value. Steve: So, to get somebody unstuck. When we get so overwhelmed, we can’t even see our way out of something: I don’t have time to do your solution, I’m just, you know, treading water here. So I love that aspect of the story, that you found an approach that also is about limited resources and that took that person where they were at. And I don’t hear you complaining about a stakeholder that wouldn’t commit to the project; you found an approach because you had a 10,000-foot view and could kind of see how you could add value. And still, I think you were fairly new to the organization at that point. Is that right? Nizar: I almost had no idea what was going on. I needed to rely on them to make sure that the items in my MaxDiff actually made sense. As I was mentioning earlier, I come from a B2C company. I come from a place where this ecosystem was extremely new to me. So of course there was some collaboration there, but I tried to keep it as lightweight as possible, making sure that I had the right pieces without overwhelming them. And I love what you’re saying. I love the way you’re describing it.
For a domain that takes a lot of pride in empathy and in how we can represent the end user, there’s a component that sometimes gets overshadowed, which is empathy with cross-functional partners. In every domain, product, design, research, there are people who are better at their jobs than others. Still, for the most part, I do believe that everybody comes from a good place. Everybody’s trying to do their best work. And if we have some empathy for what their constraints are, what they’re going through, what their success criteria are, to be honest, how they’re being measured and what pressures they’re under, it makes it much, much easier for them to want to seek the help of a researcher, to say, “Help me get out of this. Let’s work together and let me use research for those goals that are shared.” And at the end of the day, it is still user-driven. It’s still based on the data that we’re getting, and we’re able to drive direction. Finding ways to go with the flow while still having a strong perspective on what’s best for the users, rather than feeling that the role of research is always to be on the opposing end of cross-functional partners, can be a really powerful tool. And in these cases, all it leads to is that intersection of product impact and user impact, which I think is the end goal. Steve: It is a great story that this person was enthusiastic for you, reaching out, and excited about research. And when you think about what opposition sometimes is, I think it’s easy to demonize someone and say, well, they don’t get it. They don’t believe in research. They don’t like me. Whatever kind of escalation. And here you started with the best possible out-of-the-gate dynamic with this other person. They were a fan. They were welcoming you and they couldn’t wait. And still they had a concern. And so by identifying that and coming up with the right approach that suited all those constraints, you got to have with them the kind of impact that you’re looking to have.
Nizar: And it kind of makes sense. I mean, if you really think about it, the definition of what a designer does, the definition of what an engineer does, for the most part, is pretty material. You finish your effort, you pass it on. Eventually, the thing that that designer or engineer touched is the product that you end up using. What that means is that for areas like research, there’s more fluidity in the perception of what you’re here for. That fluidity can be a great thing and can be an awful thing, because at the end of the day, it opens up a lot of opportunity to set the expectation of why the researcher is here. But it takes a lot of work to get people to align, because they’re also basing it on their past experiences, basing it on their biases, on whatever good experiences they had, but also whatever bad experiences they’ve had with research. And the fact that the work may not map precisely onto what the end user sees just opens a lot of these gaps and unknowns. Plugging those holes and making sure that the narrative is clear around why the researcher is coming in, to me, I see it as an opportunity. It’s not always the most fun process to cover some of these holes and make sure that there’s no gap in the perception of why the researcher is here. Steve: We started with the foundational, the tactical stuff that you’re doing, but maybe we can look at the whole arc of creating that more evolved understanding in the organization, across all these folks, about what research is here to do. Nizar: Building up on that first story that I was telling, I made my success criteria be less about the research output and more about how the research is being used, how the research is actually integrated directly into the roadmaps, and some of that lives on today with the team. You’ll see a lot more emphasis on how often your research is referenced in a cross-functional document.
Less so on what the quality of your report is, for example, as some of the success criteria that we have. But taking it back to that initial journey, there’s a disadvantage and an advantage to being the only researcher back then. The disadvantage is that it’s overwhelming. There’s a lot to cover. The advantage is that it gives you the ability to say, “I’m not going to do it all. I can’t do it all.” And you get to pick and choose a bit in terms of where you foresee the most opportunity. At the same time, you look at where some of the path of least resistance could exist, too. So if there was a huge problem where the resistance is pretty significant, the question I need to ask myself is, is this where I want to continue proceeding? Should I continue butting heads to be included there, or should I go find the place that is a bit more welcoming to changing their processes and their approach, and just use that as a case study? Once that case study lands, how are we showcasing that case study? I’ve never been a fan of visibility for the sake of visibility, but especially in the earlier days of research, there’s a lot of advantage in visibility as a case study of how research could work to empower those around you. And that became key. And basically what happened right off the bat is we started hearing the sentence, “Well, I want research. Where’s my research? Why don’t I have a researcher?” And the demand for research started coming more organically from the cross-functional teams. So it wasn’t on me to necessarily say, “I need people. I need to grow the team. I need to do this.” It wasn’t an ask from the research department to grow. It was an ask from cross-functional partners who had seen how much more effective and how much more efficient they could be with the appropriate level of research support.
And that just creates more of that shared language, the shared narrative of what the organization is looking to do with research and how it works closely with it, but also what the success criteria are for the research team. And as we started to scale, it became more and more important to set pretty stable goalposts to gauge what success looks like and what our objectives are, and to be very intentional about not falling into the trap of making research the end goal, where you’re out of the loop of what decisions are actually being made and you’re trying to do a one-size-fits-all approach to research that says, as you get more senior, your research gets more complex, which I don’t believe is the best definition of researcher seniority, but really anchoring it in how we’re able to continue driving the product roadmap forward. Even then, there’s a lot of back and forth that goes into it. Steve: When you started to get these requests from people, we want research, where’s my research, the kinds of things that folks were hoping for or asking for, did that line up with what you would want to, or hope to, support them with? Nizar: When you’re starting the team from scratch, the default is actually, hey, since you only have one researcher, there are only two of you, do you need to do some intake form to take everybody’s input and then try to cover as much as possible? And I put my foot down that I don’t think this works. I don’t think that’s the most effective way to do it, and I don’t think hiring a researcher starting off with a service model that says, hey, you’re not part of the team, you’re an outsider who will do research and come back, will be the most effective way to drive meaningful change. So I approached it from a point of view of, let’s continue that as a proof of concept.
Let me hire a researcher and embed them directly in one of our most critical teams, one that has significant strategic significance for the company as a whole, but also has a lot of open questions and a lot of things where we can benefit from a researcher, and that’s all they’re working on. And of course, you get the pushback of, well, what about these other teams? And my take is, you’ve been operating without that research support for 10 years. You can wait a bit more, and let’s continue gauging how things go there. And that starts it off by setting up the researcher for success and empowering them as a core member of the team, challenging the notion that they’re there to take requests or answer questions, and having them be able to actively predict where there will be blockers and how they can get their research ahead, maybe three months ahead, six months ahead, to actually be ready for the decisions when the time to make the call comes. So it’s essentially making sure that research is proactive rather than reactive. And that model worked. That model worked great with that team. We hired a phenomenal researcher. To this day, you’re always excited about the first hire being a phenomenal person on the team. And we started replicating that model with different teams, for areas that are also strategic to the company and have a lot of ambiguity, and that became the framing in terms of unblocking, creating alignment, efficiency, and how we could continue to scale from there. Steve: At the risk of oversimplifying, I guess I’m hearing in your answer where I was going wrong in my question. It’s the difference between “this team needs a researcher” and “this team needs research.” I was starting with “this team needs research,” and you’re putting a researcher in there, and they are figuring out the questions, being proactive. That’s very different from that service-model intake-form thing.
Nizar: Yeah, correct. And generally I think teams that start off being user-centered at times think they’re doing research. There are a lot of types of research, right? So sometimes you’re talking to, for us, you know, we’re a B2B company that has great relationships with our customers. So you think, hey, I’m talking to their sales engineer, or somebody called me for a meeting. I’m doing some level of research. I kind of have a take of, you know, I’m not here to gatekeep: go do your thing. But at the same time, I’m not here to democratize. I’m not here to empower as many people to do research as possible. My role is to be a cross-functional stakeholder, and I will jump in on the problems that we need to solve together, and let me find ways to deal with them. So I think in this specific case, there was always an acknowledgement of, hey, there’s stuff that we don’t know, and we need some form of research. The definition of what research is and how it’s going to be incorporated is the thing that needed to be tightened up a bit more, and then integrating the researcher with the framing of: this is a cross-functional partner, not a source of research, if that makes sense. I started changing the language around the expectations, even in when they’re invited, even in when people go to them, even in what topics they’re covering in their one-on-ones. So it’s less about, here are some questions that I want, and more about, hey, I’m struggling with this thing, and we talk through it. So they all tie in together. To my point earlier, there’s a little bit of the path of least resistance when you’re starting and you can pick one team. You can call it a lucky privilege of saying, okay, there’s a team that could be ready. I’m seeing conditions that are priming a researcher to be successful here. Let’s go with that model, with that team, and continue scaling from there.
Steve: I mean, that reminds me of your interview process: you’re looking for those conditions, to understand what they are before you start. And now, as you grow your team, you keep looking for those conditions within different parts of the organization, to see where research could go next and have the most impact. Again, you’re really focused on the impact on the product and the experience and the company. Nizar: That’s a good summary, and it feeds into our interview process. We try our best to make our interview process as applicant-friendly as possible, where it’s not convoluted, but at the same time, it covers a lot. So a key part of it is who joins the team and what their approach is as well. We do tend to see that there’s a specific type of researcher that tends to do best, and usually we look at researchers who have the depth and soundness of research methodology as a core expectation. But then they layer on top of it the user-centric process and thinking: when do you integrate at different stages of product development? Then you start seeing the business sense of really wanting to be integrated deeply with the team and solving the problem at heart rather than solving the open question. And then cross-functional collaboration as a core area. I do think that every researcher needs to fully understand what resources the team is working with, whether it be engineering, design, or any other blockers, to be able to come forward with the most effectively sized recommendation.
And then we always have that overarching umbrella of leadership and teamwork, really looking for people who have a growth mindset, who are looking to help others succeed, who don’t necessarily see it as their world and their thing, but are collectively looking for everybody to succeed together, which I think has been pretty key as we’ve scaled the culture of our team into a team that’s pretty collaborative, a team that’s looking to help each other, and a team where people aren’t competing. There’s no incentive for people on the team to compete. There’s actually an incentive for them to make each other better and learn from each other. So that’s been an exciting part of scaling from a cultural perspective within the research team. Steve: I want to ask you to clarify. You used the phrase “the problem at heart” versus “the open question.” Nizar: Absolutely. Steve: Can you explain what that looks like? What does that mean for any particular problem? Nizar: One thing I’ve become really sensitive to, maybe too much so, is when I see a research plan that says our objective is to answer these five or six questions, and my take is that’s actually a step removed from what you’re going to do with the answers that you’re going to get. So I like to start with: what’s the perceived outcome? What’s the perceived objective? What are you looking to learn, and why, in terms of what’s actionable? And then take that a step backwards and say, okay, to get to that effectively, let’s now go into what questions we need to ask. And based on that, what method is the most efficient and appropriate for what we’re trying to accomplish? What I’ve seen a lot in the past is that even your stakeholders think they’re asking you the right questions. How many people have been asked, can you create personas? Can you tell me the different types of users?
And a researcher goes off, does this for a month or two, and then they come back and nobody knows how to use them. And to me, that’s the problem that I’m trying to avoid as much as possible, and just saying, okay, you want personas. What are you going to do with them? What’s the decision you’re trying to make? And often coming to the conclusion that you don’t need that at all. What you need is something much more simplified. Or we could actually do a pulse check to start getting you some signal on the answers that you’re looking for, that will help with that decision-making process. And then we can decide when to iterate, or if it’s necessary to iterate. With open questions, I find that there’s sometimes a danger of over-scoping research efforts relative to what you’re trying to do with them. It’s that outcome where you show up with a deck that has 100 slides, but the team can only act on the first two. And so my question becomes, was this the best use of the researcher’s time, versus trying to focus on those first two slides and then connecting that to a longer-term program that we can create follow-ups on as we continue to learn throughout the process? So in a way, it’s forced efficiency and early hypotheses of how we connect to the impact before even starting to prioritize the effort. Steve: I want to follow up on something else that you said. You were describing a lot of the qualities that you’re looking for, the mindsets and the kinds of abilities. How do applicants demonstrate those in your process? Nizar: I could talk about that for a long, long time. So I generally don’t believe that these buckets are a pass or a fail. I don’t think it’s “are you good at user-centered thinking or not.” How I see it is that everything sits on a continuum.
And what I’m trying to optimize for is, for the level of seniority that this role will require specifically, am I seeing enough ability to handle different situations effectively that they’re going to be in a position to know what they need to do regardless of what’s thrown at them. So, for example, we hear this a lot with the breaking apart of tactical versus foundational research, where it’s like, I do this and not the other. And my question becomes, why? Why create that separation between this form of research and the other, if you’re able to tell a story around your ability to come in at multiple different stages of product development and say, I can help you across every stage and I know exactly how to do it. I can help you across every sort of limitation that you have, and I know how to do it. I can help you address multiple different types of issues that we’re facing, whether they need some qualitative research, or something that’s more quantified, or something that’s quick and dirty, or something that just needs a brainstorming workshop, and I’m able to be flexible in where I integrate with the team. I know that was a long-winded answer that went all over the place, but the reason it’s hard to describe is that I really don’t think research is a good-or-bad, yes-or-no kind of domain in general. What I’m really trying to optimize for, as much as possible, is: does the research applicant have the breadth to be able to tackle as many problems as possible? To me, that’s a much better predictor of seniority and success than somebody coming and saying, “I did this multi-country, 12-month research project that was really complex logistically,” which is impressive in its own way. It’s great, but for me, it’s not what we’re trying to optimize for in general.
Steve: And so you’re looking at past experiences that the applicant can describe to you, or those kinds of clues to the breadth. Nizar: We look both at past experiences and at some hypotheticals as well. So we do have some scenario-based questions where we try to gauge the thought process. I tell people that there’s no right or wrong answer. You’re just going to get a hypothetical, and I want to hear how you think about it. And I want to hear what different considerations you take into account when you’re making up your mind about the best approach and what you’re going to do. How often are you coming in and having the hard conversations about what needs to be done, versus steering the conversation in a completely different way, versus just saying, “You know what? This isn’t worth the back and forth. Let me just do something quick and move forward.” So at the end of it, when we’re combining the hypotheticals with the past experiences, I’m really looking for effectiveness and efficiency under the umbrella of strong and sound research. Steve: Those are words that are sometimes seen as at odds with each other, but I think you’re talking about how they’re in support of each other: that effective and efficient doesn’t mean that you’re not sound, doesn’t mean that you’re not, like you said, solving the problem at heart versus the open question. That seems like a key mindset that you’re bringing to this. Nizar: A hundred percent. And I hear that sentiment every now and then. I hear the sentiment of, “Oh, if you go too scrappy, you’re doing really terrible research.” Or, “It either has to be great research, or it’s terrible research that is fast.” And I don’t agree with that mindset or that framing. For me, it really depends on what you’re trying to learn, what your objectives are. If you’re trying to do something that is an extremely small pulse check, for example, you don’t need to boil the ocean.
I still remember, earlier in my career, I joined a team and they just had no idea. They knew nothing about their users. Absolutely nothing. And I was asking them, “Do you have any hypotheses? Do you have any open questions? Do you have anything there?” And they were like, “We just don’t know. We just know that nobody’s using this feature. That’s all we know. We look at our dashboards, we look at our metrics, we have a target addressable market of millions, and we have tens of users using it. So we don’t know why.” And I just came in and I said, “Look, the best use of time right now for me is just to do some sort of small, single pulse-check survey. One question, pretty much trying to understand the state of everything. Just for me to have context to get started, just to give me some perspective. Am I planning to use this in roadmapping? Probably not, but I need some form of context from end users to be able to tell me, okay, I have an idea of what’s happening there, I have an idea of the value add, I have an idea of why they’re churning, I have an idea of why maybe they’re not seeing some value. Let it be scrappy.” And you get the pushback of, “Well, this is qualitative. You need to do in-depth interviews for that.” I’m like, “No, I don’t. I really don’t. I don’t need to invest 40, 80 hours just to get an idea of what’s going on if I can do this in 24 hours, and then take that as an entry point into something that’s more detailed, more rigorous.” So for me, it just goes back to linking the amount of effort to the projected outcome and really finding the thing that works for next steps. And in this case, we did end up actually needing to go very in-depth, with foundational interviews and a full design sprint, and then going to concept evaluations and stack ranking. It ended up being a really complex process over maybe the course of a year that really turned around a product that wasn’t used, a product that had millions of users.
But at the start of it, I did not have the luxury of saying, “I just need to go away and do in-depth interviews,” because the research domain says that qualitative is not allowed in a survey. So sometimes I think breaking the rules in our domain is very okay, as long as you know why you’re breaking the rules and what you’re going to do with the insights that you have. Steve: This research effort that you’re scoping at any point may not be, or probably isn’t, the only time you’re ever going to learn anything. And as I take that away from this, I can feel some of my own anxiety just ebbing away. Of course, right? If you think of research as a longer-term thing: what’s the question we need to ask now? What’s the right amount of effort for right now? Okay, everybody, we’re not going to get everything. We’re not going to boil the ocean, as you said. There’s more to do, but here’s where we are right now. That good-versus-bad research framing says we’re only going to do it once, and it’s this monolith that’s either going to answer everything or not answer anything. And these gray areas you’re describing are a gentle reframe for me, I think, about where I sometimes feel anxious about trying to tamp down the commitment or the investment. Nizar: As long as we have the right data point for the right decision that’s being made. If we’re coming in and saying, “We need to invest all of our engineering team based on this one single customer-satisfaction open-ended box,” I get an anxiety attack. I get it. But sometimes that’s not the decision that you’re making. Sometimes, with the pros and cons that you weigh, the cons of having something that’s scrappy and fast are justified when you look at the pros of being able to get ahead and then establish a research roadmap that actually gets you ahead of the product team. So that’s the consideration for me.
And to your point earlier, product development is iterative, and I think people forget that sometimes. People forget that even if you launch something, that team is still there, and that team will still continue to want to optimize it in some shape or form. So if anything, I feel researchers should take some comfort in that and say, “Okay, if I miss the boat now, how do I get ahead for the next iteration, for the next thing that’s happening, and have some things ready in time?” And acknowledge that the same way product development is iterative, even the most foundational research efforts are things you’ll end up having to iterate on in some capacity. I haven’t seen a world where a research deck is still relevant years later and nobody has ever touched that topic again. Of course, you want to minimize how often we redo work that’s already been done, but everything changes. Once you start having a user base, the kind of data that you have is different. In this case, when you have tens of users and you go into the thousands, the kind of feedback you’re starting to hear is already different. The usage data that you’re starting to get is different. You’re able to use telemetry a little bit more than when you had nobody. You start being able to triangulate in a way that you just weren’t able to earlier. So iterations are good, and that doesn’t mean don’t do really deep foundational generative efforts. There’s just a time and a place to say, “This is my time to get scrappy,” and, “This is my time to dive deeper into the topic.” Steve: If I didn’t think about it too deeply, I might have this reflex that says, well, when we know nothing, that’s when we have to learn everything, that the foundational work comes at the beginning.
But you’ve got a number of examples where you’re coming in and seeing a big gap and saying, no, this is not the boil-the-ocean time. It’s the quick win, or the thing that we can act on, or the scrappy thing. And no one believes that A is B. No one believes that the scrappy, quick thing is, in fact, going to answer all the questions. But you’re helping take action within the constraints that are there. Nizar: I’d say, caveat it. Tell people there are limitations to what I’m doing. We are aware of that. Every research method we use has limitations, and I’ve yet to come across any research study that has solved everything, or that is now claiming we have learned everything about our entire user base or our entire feature area. And that’s the reason researchers continue to be in the same role or on the same team for years. There’s always a lot to uncover. And a lot of the time, it’s really weighing the cons of coming in early and saying, “You know what? I just started. I’ve been here for a week. Let me go disappear for three months or so.” And I get it. There are ways that you can incorporate your cross-functional partners. But for the most part, especially as somebody is building credibility, starting with some data is much more effective than starting with nothing but some form of “here are some next steps,” where often the next steps are research. When I did that pulse survey early on, the next steps were research. Now I needed to go and actually do in-depth interviews to learn more, but at least I had some litmus test of: what am I talking about? What is my script going to have? So I wasn’t finding myself in a situation where I’m interviewing, even if it’s 10 end users, and I’d be like, “Can you tell me anything? I don’t know where to start. The team doesn’t know where to start.” I had something.
And for me, the value of that effort, even if it just fed into the definition of a research template for the next steps, was value-adding, and that saved me a lot of time. Steve: I’m going to switch topics a little bit here and go back to something you said before. Maybe you can unpack this or just clarify it. I think you were saying that, for people on the research team, you look at the number of references or citations of research work in the work product of other cross-functional teams. I’m saying this really poorly, but did I capture that at all? Nizar: It’s a good summary, and I’ll also give it an asterisk and say, “Not as the only signal, but it’s an effective signal.” Steve: Yeah. So I have a bias against that. And that’s, of course, coming from someone who doesn’t work inside an organization, so my bias is maybe just hypothetical, or just from conversations. And maybe that asterisk is really, really important. I agree it’s a signal. I do worry about researchers either getting external pressure or pressuring themselves that this thing, which is essentially out of their control, whether somebody else does something or not, is a measure of their worth. There are lots of reasons why people don’t do things and don’t listen to things. And I think you’re talking so much about how to prevent that from happening, right? The right work at the right time, with the right collaboration, with the right understanding, and all that stuff being scaled appropriately. But, you know, I’ve spent my career giving people stuff that we agreed was going to be important, and then seeing all kinds of things happen and not happen. And at a certain point, there’s a certain amount of surrender, right? Like, I’m going to give you everything that we agreed you need, and maybe more, but I can’t control what happens. So I don’t know.
I don’t want to frame this as a debate or anything like that. But I’m open to you telling me that I’m wrong, that I’m framing it wrong. I’m just curious how you think about this. It’s not the only signal, but how do you think about how to use that signal, or how we should all think about that signal of what somebody else does? Nizar: The conversation is super valid. That’s where the asterisk comes in. And if anything, I always love the counter-perspectives here as well. The reason that I added the asterisk is exactly what you’re saying: you can’t really control what somebody else does. Where it ends up putting some emphasis is encouraging research teams to be very strategic in terms of where they’re prioritizing their time and how they have ownership over the product direction as well. But that doesn’t fall only on the researcher. A lot of the conversations that have to take place have to happen at the leadership level. If we’re to say that the researcher is also to be held accountable for what’s going on there, what’s the collaboration model, and where are they coming in? Are they left out of being able to have that, or is the expectation set that there’s some form of path for them to do it? And there are also multiple ways, from my perspective, to showcase that. I think when you look at the referencing, it’s as direct as it gets usually, but even that can be optimized for. Sometimes you have to take it the other way. Don’t optimize for making sure that your research is in docs. That’s not what we’re optimizing for either. But what are the ways in which, as a research team, we can, for better or worse, continue driving the narrative of the value that we bring, connecting the dots to the different decisions that have been made because of the research, and the leadership role that each researcher is taking in actually guiding the product roadmap?
How could we make sure that we are being very intentional about collecting the evidence and documenting it and being sure that we’re telling our story in a way that does a service to the team? And often you’ll find researchers in environments where that’s just really hard with their teams. That’s just not how their stakeholders are wired. And when that happens, my question becomes, what’s the role of leadership, and me in a lot of cases, in streamlining that, but also what are other ways that are effective in gauging the success of that researcher that do not rely on that being the only mechanism? And that’s where that asterisk plays a huge role. And yeah, absolutely. There are some full quarters. We do performance reviews quarterly, which is pretty intense, and sometimes you don’t get to finish things that are meaningful in a quarter. So we want to give people the benefit of the doubt as well into how the research efforts bleed into the quarters after. But there are some quarters where you’re deep in the research, the team itself doesn’t even have a document, and there’s no way to say, “Hey, this is what’s happening.” But we look for other ways to continue connecting the dots there. But for me, one thing that I do genuinely care about, and maybe it’s just from previous experiences in the past of seeing where research can get thrown under the bus sometimes, I think for the past many years, I’ve been very intentional about just telling the story of the ROI of the researcher themselves. So not the ROI of research, which I think sometimes gets confused with the ROI of the researcher. I find that often, and of course it depends environment to environment, company to company, but I find that often people don’t debate the value of research. They sometimes debate the value of the researcher doing the research.
That’s the topic that comes up here and there, and I try to be as intentional as possible to position the researcher and position the organization to give the space for the researcher to be a product leader, not only a research delivery mechanism, if that makes sense. And with that comes some of the expectations that end up changing. Fingers crossed it worked for me throughout my career. At the same time, I always want to acknowledge that when I say it works for me, there’s also a right-time-at-the-right-place component of it, and it’s not always about the way that the research is conducted or what the researcher is doing. Steve: Let’s just switch topics again. We haven’t talked at all about, you know, your overall trajectory. And it’d be great maybe to get a — maybe a summary of how you found research, what you started off doing, some of the things that you did that kind of led you to this role. Maybe that will set some context that we haven’t talked about for what you have been sharing. Nizar: Sure, yeah, I could take it many, many years back. So I actually went to the University of Jordan to study industrial engineering and I was one of the few people who actually cared to have an emphasis on human factors. I don’t know why, but I was always fascinated by the intersection of humans, computers, and business and psychology and all of these together, and I didn’t really know what you could do with it. And it was as early as undergrad that I thought that, okay, this domain seems to cover a lot of those areas. I graduated and worked at a company in Jordan under a title that back then was something along the lines of process engineer, but in reality it was more about understanding the inefficiencies in the process and how people are coming in and out of their day to day and how we can make it more efficient.
So it had a big kind of research component, and I was aggressively reading about what are some of those programs where I could continue learning in that space, because where I lived, nobody knew what that was. That wasn’t the thing. You could maybe say there’s visual design, but you couldn’t really say user experience or UX research. And moving on to grad school, I went to San Jose State for the master’s program there, and the beauty of that was just the amount of exposure that I had to a lot of different companies and different people who are doing some of the research work, and it was pretty much a straight shot from there. I went into consulting in a researcher role, then went into a startup where I built up a research and design org from the ground up. So I was managing research and design, moved on to Google, at YouTube specifically, where I spent about three years, and then when the opportunity came up at Snowflake, it was just too hard to say no to that opportunity. So I moved on. I’ve been there for almost three years now, which is crazy to think about. Steve: Do you think of yourself as someone that has a superpower? Nizar: It’s a humbling question, to be honest. The thing I take pride in is I’m always open to being wrong. I’m always open to challenging the status quo and being told that there’s a better way to do it. And I think where I take some of that pride is a lot of the times you hear people that you look up to throughout your career, and then you get to a point where you’re like, “I kind of disagree. I see a different way.” And trying to challenge the status quo for something that could be better is just something that gets me pretty excited. Is it a superpower? I don’t know. Maybe it hinders me at times, but at the same time, especially in the conversation that’s taking place around research right now, you start seeing a lot of the consistent perspectives of this is right, this is wrong, this is what you do, this is what you don’t.
And I try to be very intentional in hearing what are those different perspectives and why are they seeing things differently and what works for me and how do I acknowledge that what works for me at the environments that I’m in may not work for somebody else in the environment that they’re in as well and give people the benefit of the doubt and keep running with what I’m doing. Steve: There’s sort of two facets, I think. You started off saying that you’re okay being wrong yourself, but you’re also looking for when the conventional wisdom is wrong. Did I get that right? There’s sort of two aspects. It’s like you’re willing to forgo needing to be right, but you’re also embracing or curious about, hey, maybe something out there that’s established as right, the status quo, like you said. Maybe that’s wrong and you’re like challenging that. Nizar: And think of it like they’re intertwined. Think of it as somebody who is a researcher. For the most part, you’re kind of looking for best practices, you’re looking for the perspective, you’re looking for the voice of the crowd. They’re kind of intertwined a bit. And I think it’s a solid starting point. It’s much better than starting from zero. Learning from someone is always significantly better than just figuring it out on your own. I mentor researchers every now and then and I say, “If you can avoid being the first researcher out of grad school, I would avoid that.” But I want to acknowledge that not everybody has the luxury of picking and choosing especially their first job. So take it and learn on the job is better than not having anything. But it becomes a starting point of, okay, we think this is the best practice. I guess then there’s a tough conversation of, does the best practice make sense? Does the best practice work for me and my approach? Does the best practice work for the environment that I’m within? And where do we continue optimizing it? 
And how do I continue doing the internal reflection, the internal research on what’s working in the processes that I’m establishing and what’s not? And how do we continue treating, honestly, my career trajectory as a product that you continue learning from, iterating on, and hopefully making better? Often it leaves you at odds with what a lot of voices have in place, but it’s okay accepting that as well and being like there’s no reason for everybody to align on one topic. And it’s always fun. It’s always fascinating when you’re the one who wants to look at things from both perspectives. Jon Stewart came back on The Daily Show last week and he got a lot of hate because he was in between two sides. But from my perspective, these are the voices that often bring in a lot of reason and just say let’s just call out everything as it is and see how we can look inwards at how we can continue to be better. But it’s always fascinating because you’re going to get some pushback from the side that agrees on one thing and then pushback from the side that disagrees on the other if you’re optimizing in your own way. Steve: Are you seeing patterns in the people that you’re mentoring in terms of what topics or questions you’re helping them with? Nizar: I think the biggest one is there’s an obsession with the research as the end goal. I think that’s the one that’s just becoming more and more and more apparent. There are different reasons that when somebody’s starting their career, it makes sense that they think, okay, I’m here to do research. There are some people that are more mid or later career where that’s what they’ve learned how to optimize throughout their career because that was their success criteria. So there are various flavors of that same thing. But you often hear a lot about like I want my methodology to be, like I’m focused on my methodology or I want to do more foundational research or it’s very anchored on research as the end goal.
Even in our interview process, we interview a lot of amazing candidates with amazing resumes, and as they’re presenting their case studies, they gloss over why they did the research or they gloss over what happened after it. But they take a lot of pride in the thing that they did, kind of the actions that they took as a researcher. I think that’s the biggest gap, for me, that I see between a lot of the conversations that I have and where I believe research should be positioned, as more of a tool to drive decision-making rather than an end goal. Steve: I don’t know a ton about how mentorship could or should work, but are there things that you are able to say or do in these interactions to help somebody shift their perspective to what you’re talking about? Nizar: It depends on the relationship I have with the person too. So to be honest, that kind of dictates a lot of the conversations that happen and honestly, like how hard I push back. There are some people that I used to manage in the past who I’m very comfortable telling, “You’re just absolutely wrong, and stop doing it the way you’re doing it, and here’s how you can be more effective.” You can’t do that with somebody you barely know. And you try to nudge it in terms of like, how do you expand your thought process beyond what you’re doing into why you’re doing it? And how do we kind of reset your tone in terms of the perceived outcomes of the work that you’re doing? And I do a lot of resume reviews, and I think that’s a place where people seek feedback, and I usually call out that a lot of resumes that I see for researchers read like job descriptions. And I try to tell them, “What’s your superpower? What’s your story?” When I read your resume and I see “conduct tactical and strategic research, conduct qualitative and quantitative research,” that reads like a job description that doesn’t give you an edge over other applicants, and I don’t…
01:07:17
36. Noam Segal returns
Episode in Dollars to Donuts
This episode of Dollars to Donuts features a return visit from Noam Segal, now a Senior Research Manager at Upwork. AI will help us see opportunities for research that we haven’t seen. It will help us settle a bunch of debates that maybe we’ve struggled to settle before. It will help us to connect with more users, more customers, more clients, whatever you call them, from all over the world in a way which vastly improves how equitably and how inclusively we build technology products, which is something that we’ve struggled with traditionally, if we’re being honest here. – Noam Segal Show Links Dental Hygienist Explains Ultrasonic Scaling Interviewing Users, second edition Steve Portigal on the Content Strategy Insights podcast Noam Segal on Dollars to Donuts, 2020 Noam on LinkedIn Upwork CatGPT shirt Young trader dies by suicide after thinking he racked up big losses on Robinhood Meta, TikTok and other social media CEOs testify in Senate hearing on child exploitation X is blocking Taylor Swift searches… barely Trust and safety Temperature in Generative AI The UX Research Reckoning is Here (Judd Antin) The Waves of Research Practice (Dave Hora) Strategic UX Research is the next big thing (Jared Spool) Sam Ladner Big Data needs Thick Data (Tricia Wang) Multipliers: How the Best Leaders Make Everyone Smarter Genway Help other people find Dollars to Donuts by leaving a review on Apple Podcasts. Transcript Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead research in their organization. I went to the dentist recently for my regular teeth cleaning. I was in the chair while the hygienist was working away. This obviously wasn’t the best situation to ask a question, but I had a moment of curiosity and I found a chance between implements in my mouth. I should say that at this dentist, their cleaning process is to first go over your teeth with something called an ultrasonic scaler.
I had assumed this was just like an industrial strength water pick, or like a tightly focused pressure washer for the mouth. After that, they follow up with a metal scraping pick. So during the metal scraping pick portion, I asked the hygienist, “Does the water soften it?” I was wondering if the first stage softens up whatever gets removed by this mechanical pick. Somehow my weird question prompted her to give me a 101 lesson on how teeth cleaning works, what is being cleaned and how the tools are used to accomplish that. Anyway, she starts off by telling me that the water is just to cool the cleaning head. The water isn’t doing the cleaning. There’s a vibrating cleaning head that does that work. I was very excited to learn this because I had the entirely wrong mental model. I had assumed that this device was just water and I hadn’t ever perceived any mechanical tip. Of course, I’ve never seen what this device looks like, other than when it’s coming right at my face when I’m the patient. And I had made all these assumptions based on what I experienced from being in that role. It was a lovely reminder about how we build mental models based on incomplete information, based on our role or interaction and how powerful those mental models are. And of course, this was also a reminder of the power of asking questions, where even this simple question in non-ideal circumstances led to a lot of information that really changed how I understood a process that I was involved in. It was a great reminder about one aspect of why I do this work and some of the process that makes it interesting and insightful. Speaking of interesting and insightful, we’ll get to my guest, Noam Segal, in a few minutes, but I wanted to make sure you know that I recently released a second edition of my book, “Interviewing Users.” It’s bigger and better. Two new chapters, a lot of updated content, new examples, new guest essays, and more.
It’s the result of ten more years of me working as a researcher and teaching other people, as well as the changes that have happened in that time. As part of the “book tour,” I’ve had a lot of great conversations about user research and interviewing users, and I want to share an excerpt from my discussion with Larry Swanson that was part of his podcast, Content Strategy Insights. Let’s go to that now. Larry Swanson: It also reminds me, as you’re talking about that, it’s like you show up at a place like in the old days, you drive up and you’re in the car with a team, and that’s a good reminder that this is like a business activity. In fact, you open the book with a chapter about business and the business intent of your interviews, and I also like that you close the book with a chapter on impact, which I assume is about the measurement and the assessment of satisfying that business intent. Was that bookending intentional, or am I just reading into that? Steve: This is where I just laugh confidently and say, “Oh, of course, you saw my organizing scheme.” I hadn’t thought about it as bookending, which that’s a little bit of nice reflecting back. In some ways, I think I was just sort of following a chronology, like why are we doing this, how do we do it, and then what happens with that? So no, but sure. Larry: Yeah, sorry, I didn’t mean to project on that. But anyhow, that’s sort of the — maybe just focusing on the business part of it, because I think that’s something that’s come to the fore in the content design world, particularly in the last couple years. I think it might have to do with the sort of economic environment we’re in, but also even before that, there were people talking about increasing concern with the ROI of our work and alignment with business values, and maybe we’re focusing too much on the customer and not balancing that. But how do you balance or kind of plant your business intent in your head as you go into an interviewing project?
Steve: I think it kind of — maybe it’s like a sine wave where it kind of comes in and out. We were just talking about transitioning into talking to Marnie, a hypothetical person, for 30 minutes. I really want people’s business intent to be absent during that interview, so that’s maybe the lower part of a curve. But leading up to that, who are we going to talk to, what are we going to talk to them about, who’s going to come with us? That’s very much rooted in — I don’t know why I made up this metaphor of the sine wave, but we’re very highly indexed on the business aspect of it. We designed this project to address some context that we see in the business, either a request or an opportunity that we proactively identify. So we think about what decisions have to be made, what knowledge gaps are there, what’s being produced, and what will we need to help inform decisions between different paths kind of coming up. I talk in the book about a business opportunity or a business question and a research question. So what do we as an organization — what decisions or tasks are kind of coming up for us? So what do we have to do? We want to launch a new X. We’re revising our queue. We need to make sure that people doing these and these things have this kind of information. That’s the business question. Then from that, you can produce a research question. We need to learn from other people, our users, our customers, people downstream from them, whatever that is. We need to learn from them this information so that we can then make an intelligent response to this business challenge that we’re faced with. So all the planning, all the logistics, all the tactics, what method are we going to use, what sample are we going to create, what questions are we going to ask, what collateral are we going to produce to evaluate or to use as stimulus or prompting? All that is coming from what is the business need and how we can go at it. Yes, there still is a sine wave.
So then we set that aside to talk to Marnie, to talk to everybody. We really embrace them. We have all this data. We have to make sense of this data. And then here, I think we sort of straddle a little bit because you’re going to answer the questions you started out with. I think if you do a reasonable job, you’re going to have a point of view about all the things that you wanted to learn about. But you always learn something that you didn’t know that you didn’t know beforehand. And I think this goes to the impact piece. This goes to sort of the business thing that’s behind all this. What do you do with what we didn’t know that we didn’t know? I want there to be this universal truth like, oh, if you just show people the real opportunity, then they’ll embrace it. And then everybody makes a billion dollars and the product is successful. I think of that principle from improv: yes, and. I think we have to meet our brief. We’re asked to have a perspective on something. Part of the politics or the compassionate way of having impact is to not leave our teammates and stakeholders in the lurch. So we have these questions. We have answers to these questions. And also, we feel like there’s some other questions that we should have been asking. We want to challenge how we framed this business question to begin with. We see there’s new opportunities. We see there’s insights here that other teams outside the scope of this initiative can benefit from. There’s all sorts of other stuff that you get. And I think it behooves us to be kind about how we bring that up, because no one necessarily wants a project or a thing to think about that they didn’t ask for. So how do you sort of find the learning-ready moment or create that moment or find the advocate that can utilize the more that you learn, that can have even more kind of impact on the business? That’s not a single moment. That’s an ongoing effort. It’s part of the dynamic that you have with the rest of the organization.
Again, that was me in conversation with Larry Swanson on the Content Strategy Insights podcast. Check out the whole episode. And if you haven’t got the second edition of Interviewing Users yet, I encourage you to check it out. If you use the offer code donuts, that’s D O N U T S, you can get a 10 percent discount from Rosenfeld Media. You can also check out portigal.com/services to read more about the consulting work that I do for teams and organizations. But now let’s get to my conversation with Noam Segal. He’s a research manager at Upwork, and he’s returning as a guest after four years. You can check out the original episode for a lot more about Noam’s background. Well, Noam, thank you for coming back to the podcast. It’s great to chat with you again. Noam Segal: It’s absolutely my pleasure, Steve. Great to see you and great to be here. Steve: Yes, if you are listening, we can see each other, but you can’t see us. So that’s the magic of technology, although Noam is wearing a shirt that says cat GPT on it. So we’ll see if we’re going to get into that or not. Noam: I do love silly t-shirts. I just ordered a few more silly t-shirts yesterday. My partner is not very happy about that particular aspect of who I am. But, you know, it is what it is and you get what you get. Steve: Right. You got to love all of you. Noam: Yeah. Steve: So that’s an interesting place to start. Let’s loop back to maybe a more normal discussion starter. We spoke for this podcast something like four years ago, early part of 2020. So, you know, I guess maybe a good place to start this conversation besides T-shirts and so on is what have you been up to professionally in the intervening years? Noam: A lot has happened. I can tell you that quite a lot, given it’s not that much of a long time period in the great scheme of things. Steve: Yes. Mm hmm. Right.
Noam: When we chatted last, I was working at Wealthfront, a wonderful financial technology company, and I was head of UX research there at the time. Steve: Right. Yes. Noam: I left Wealthfront for a very particular opportunity within Twitter, now X, because I was very interested in contributing to the health, so to speak, of our public conversation. And I had an opportunity to join what was known at Twitter, now X, as the health research team. But we don’t mean health as in physical or mental health. We mean indeed health as in the health of the public conversation. In other companies, these types of teams are called integrity or trust and safety, et cetera. And we were dealing with everything to do with things like misinformation and disinformation, privacy, security, and all sorts of other trust and safety related issues. Sadly, a few months, really, or less than a year after I joined, Elon Musk took over the company. And one of the first layoffs that happened at the company was of basically the entire research team. And so I left before the layoffs, but that was the situation there. And I’d love to talk more about what that means in terms of how we build technology, et cetera. We can jump into that. From Twitter, now X, I moved to Meta, and I joined the team working on Facebook feed, which some people might view as the kind of front page of Facebook or even the internet for some people. It’s a product used by billions of people daily. And it was a very interesting experience to work on both the front end of the feed and the back end as well, so to speak. So that was a very interesting experience.
And in addition, I was also playing a role in what we were calling Facebook Research’s center of excellence or centers of excellence, where we were trying to improve our methodologies, our research thinking, our skills and knowledge, kind of working on how we work, which is very related to what we spoke of in the last podcast we did together a few years ago, when we talked all about research methodology, et cetera. So that was an interesting experience. But in April of 2023, along with a couple of tens of thousands of other people, I was laid off from Meta, as was I think approximately half of the research team at Facebook at the time. And several weeks later, I joined Upwork, which is where I work now. Upwork, for those who don’t know, briefly is a marketplace for clients looking to hire people for all sorts of jobs and freelancers who are looking to work, primarily in kind of the knowledge worker space, I would say. Upwork also caters to enterprises who are looking to leverage Upwork as a way to, you know, augment their workforce and hire freelancers or people for full-time positions as well. And at Upwork, I’m a senior research manager. I focus on the kind of the core experiences within the product, which includes the marketplace for clients and freelancers, in addition to everything to do with payments and taxes and work management and trust and safety, which I’m very happy to still be involved in. It’s a topic I care a lot about. Steve: For Twitter, because you were particularly interested in that health, that trust and safety aspect of it, I guess I want to ask why. What is it about that part of designing things for people to use that, as a researcher or as a person, you strongly connect to? Noam: Yeah, I think we have a set of societal ills, let’s call it, very troubling societal ills that I think we need to address urgently and with great care and with great responsibility.
And one of those societal ills is the evolution of public conversation, of how we interact with each other as people and how hateful and nasty and unkind we can be to each other. And how much information put out there online is either inaccurate or completely false. This really came to be more salient in my mind during the 2016 US presidential election. But I think it’s become even more salient ever since for multiple reasons, including the incredible and tragic rise in antisemitism in the world over recent years and all sorts of information running about out there on the interwebs that is, again, factually incorrect around all sorts of topics. Election related, related to certain geographical regions, to certain groups, et cetera. And so this is something I care deeply about, just given my personal background, just given what I’m observing in society. When we last had a conversation, I was at Wealthfront in the fintech space, and I recall a case happening, which really shocked all of us to our core, with another company. I’m not going to name the company, but it’s another fintech company. A young person tried to use this other company to make certain types of investments and trades, but he was not well versed in how that world works, how those trades work, what options are and how to use them. And he believed that he had lost an incredible amount of money that he did not have and wasn’t able to lose. And it brought him to enough depths of despair that he ended up taking his own life. And that to me was just one story of many that made it incredibly clear that we need to be responsible and ethical in how we build technology products. And that few things could be more important than working on trust and safety. So yeah, it’s definitely an area I’m passionate about.
And we’re recording this a day after yet another senate hearing with all of the heads of different social media companies who were faced with difficult facts about the effects that they’ve had on society and on families who lost their loved ones and other incredibly tragic stories because of the way they built their platforms, because of things they ignored. And I think research can play an absolutely critical role in building trustworthy ethical experiences, responsible experiences that really matter in this world, probably more than anything else I could think of. So that’s the long answer to your question. Steve: What kind of information can researchers provide that can feed into situations like the ones that you’re describing? Noam: It’s a question of what sort of research we should be doing or not be doing, and at what level, at what altitude. If that company had put more effort into, first of all, age-gating the platform and ensuring that people have the knowledge and the skill to conduct certain trades, but beyond that, the usability of the platform. Going back to the basics, which we don’t do enough of, I would suggest, which is just making sure that the information one is seeing is clear, you know, and not open to interpretations that could have incredibly tragic consequences. Like thinking I lost $700,000, I think was the number, when that was in fact not the case at all. So for me, it goes back to those basics. Research can inform all sorts of more nuanced reactions than the one we’re seeing. Another thing that happened this week while we’re recording, which demonstrates what happens when you let go of your entire trust and safety team, including researchers, was that Taylor Swift, the incredible pop singer, artist extraordinaire, she was facing something incredibly tough to face online, whether you’re a celebrity or not, which was AI-generated nude images of her, fake images obviously.
These were all generated by AI such that if you searched for her name, for Taylor Swift’s name, on particular social platforms, you would see those AI-generated images and perhaps believe, because they were very realistic, that these were in fact images of Taylor Swift when they were not. The solution this company came up with was to remove any search results for Taylor Swift or Taylor or Swift or any combination of her first name and last name, which is moronic. I’m not sure what adjective to use. It’s incredibly aggressive, and I think as technologists we can do a lot better than cancel an entire search query because of that sort of thing happening. I think one thing for sure is that there’s no doubt there is a need for trust and safety professionals. There is a need for trust and safety researchers. We know how to inform the responsible and ethical building of these sorts of products and how to address these issues in much more nuanced and rational ways with much better outcomes. I mean, that seems pretty obvious, but it’s clearly not obvious to some of the people leading some of these companies. I hope that changes, and I’m very proud that at Upwork we do have these teams and we are working on these things. We care deeply about the trust and safety of the people we serve. Steve: I mean, you’re describing a failure with Taylor Swift AI images that there’s, you know, I guess the jargon is bad actors. People are behaving in a way that’s harmful. And when that happens, when there’s a system that can be exploited or manipulated or used to cause harm, you know, I think you’re identifying like there’s a gap. The system can be used that way. But you’re saying also that without researchers, companies are not as well set up to respond to those malicious behaviors. Noam: Yeah.
Steve: And I guess I’m just looking to have the dots connected for me a little bit more. You can see the failure of the systems and the failure of the humans that are the malicious users. But in that scenario, or analogous ones, how do researchers serve to either prevent or, you know, mitigate those kinds of malicious uses? Noam: So there are a few answers here. One example would be that at X and other companies, some of the research we did, and some companies are still doing, goes into supporting content moderators and agents and other such people who are reviewing this type of content. And the research informs building tools so they can get to those problems faster and eliminate them and get rid of that content in more efficient and more effective ways. So for the time being, as you probably know, there are often humans in the loop here reviewing content. They’re using certain tools to do so. Those tools make them better at their job, and building those tools requires research. So that’s one example I would suggest. Another example is that in certain companies that, again, care more about these trust and safety issues, research informs providing users with tools that enable them to control their experience. Whether it’s blocking certain people or removing certain things from their experience or a bunch of other things that we can do. But ultimately, some platforms choose to give people more agency and more control over their experience, and research has heavily informed those sorts of tools. And you end up with an experience that is catered to your needs and what you’re willing to see and what you’d prefer not to see. And a final example is that even though, as researchers, I think a lot of us think of ourselves as mostly informing the user-facing experience, you know, the actual designs that people end up seeing when they use a product, for example, Facebook’s feed. 
Several researchers in our field work more on the back end of things, helping companies sharpen and calibrate their algorithms such that the content that shows up for users makes more sense. We had that at Twitter, now X. We had that at Meta. Most companies that have any sort of recommendation systems and search systems and other such systems, they’re doing a lot of research on what to showcase to users and what sorts of underlying taxonomies make sense and various tagging systems. All sorts of inputs and parameters that go into these models and adjust these models. To give an even more specific example, in the realm of AI, we have all sorts of parameters, right? One of those parameters is called temperature, and when you adjust the temperature parameter, it influences how creative versus how fixed the algorithm’s responses are, right? Like, how much it kind of thinks out of the box, so to speak, versus not. When you change the temperature of an AI-based tool, that of course influences how people experience it, right? How empathetic that experience feels, or how aggressive it feels, or how insulting it feels, and so forth. And we need a lot of research going into these things to understand how tweaking all of these parameters affects how people perceive these tools, these technologies, these experiences that we’re building. So those are just some of the ways in which research, I think, can inform these topics. Steve: I don’t know. You used the word health early on here. There’s a quality to the experience that we have with these tools, these platforms, separate from bad actors and abuse and misinformation, disinformation. There are research questions to just set the tone or kind of create the baseline experience. 
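The temperature parameter Noam describes can be sketched in a few lines: in most language models, temperature divides the raw scores (logits) before a softmax converts them into a probability distribution over possible outputs. A low temperature makes the top choice dominate (more fixed), while a high temperature spreads probability across alternatives (more creative). This is a minimal illustrative sketch, not any particular model’s API; the function name and example scores are invented for the illustration.

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Turn raw model scores (logits) into probabilities.

    Lower temperature sharpens the distribution (responses feel more
    fixed and deterministic); higher temperature flattens it
    (responses feel more varied and creative).
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Three candidate outputs with raw scores from a hypothetical model.
logits = [2.0, 1.0, 0.1]
cool = softmax_with_temperature(logits, temperature=0.5)  # sharper
warm = softmax_with_temperature(logits, temperature=2.0)  # flatter
```

With these numbers, the top-scoring candidate takes most of the probability mass at temperature 0.5, while at 2.0 the three candidates come out much closer together. Tuning this one number changes how varied a tool’s output is, and therefore how the tool feels to the people using it, which is exactly the kind of perceptual effect research can evaluate.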
It sounds like if you’re working on the feed at Facebook, if you’re thinking about that algorithm, you’re using research to create, ideally in the best situation, a healthy versus unhealthy experience. I think there’s research that talks about, oh, when you compare yourself to others, if you see positive messages, you react this way, and if you see negative messages, you react this way. So you’re making me realize there are these questions around the healthfulness of the experience. I think I locked in on malicious behavior, bad actors, exploitation and so on. But I think I’m hearing from you that there’s just a baseline. Like, what’s it like to go on LinkedIn every day when people are being laid off, or when people are trying to get your attention, or when people are performing, as people do on all these platforms? There is an experience that research can help understand, and it can inform fine-tuning of algorithms and what’s shown to people, and how, in order to create the desired experience. Noam: Absolutely. Trust and safety is an incredibly complex space. It’s very layered. To your point, you can create more trustworthy and safe experiences if you stop bad actors from even entering the experience in the first place. And again, at X, and I imagine at other companies as well, part of what we did on the research side was inform things like account security. How do we help people secure their accounts, and how do we make it harder for bad actors to open accounts when their intentions are malicious? So you can create more trustworthy platforms by stopping bad actors at that stage. And then there are more lines of defense, and research can inform each and every one of those lines of defense to make sure that the end experience for each and every user is a healthy one, is a trustworthy one. 
And I really don’t think there’s any stage where research can’t have incredibly meaningful impact. And I just really, really hope that as we lean in even more to AI and other incredibly advanced and complex systems, which to many of us are a weird and wonderful black box that we simply do not understand, we increase our investment in trust and safety exponentially, because if we don’t, I really think the results will be horrific. And it’s our responsibility as insight-gathering functions, as researchers, whatever you want to call it, to take ownership of this, to advocate for this, and to make sure we’re doing this in a way that matches the incredible evolution and development of these platforms. It’s just incredible to witness. Steve: If we were to go into the future and write the history of, I guess I’ll just call it trust and safety research, what era are we in right now for that as a practice or in its adoption? Noam: That’s a tough one. That’s a tough question. I think the only response I have for you now, but let’s talk again in four years or so, is that trust and safety, you mentioned bad actors earlier, is in a sense always this ongoing battle between the forces of good, so to speak, and the forces of evil, trying to catch up to each other and match each other’s capabilities and then beat the other side with even better capabilities. I think what we’re witnessing now with AI-based systems is that the pace of innovation, the pace at which they are evolving and learning, is shocking and hard to comprehend. It’s really, really hard to comprehend. Now, that’s not to say that it won’t take a long time before some of these systems are fully incorporated in our lives. We’ve been talking about self-driving cars for a very long time, and they are absolutely out there right now in the streets of San Francisco and maybe Phoenix, Arizona, and maybe a few other cities, doing their thing and learning how to do their thing. 
But I think it’s going to be quite a while before every single vehicle on the road is a self-driving car. That said, these systems are just getting more and more complicated, and understanding them is getting very difficult. We have to figure out what tools we need to develop in order to catch up, in order for the forces of good, so to speak, to match the forces of evil. And we also need to remember that everything these systems are learning, they’re learning from us. And sadly, human history is riddled with terrible acts and a long list of biases and isms, racism and sexism and ageism and everything else. So these AI systems are sadly learning a lot of bad things from us and implementing them, so to speak. So again, we have a great responsibility to be better, because, a little bit like a child, AI systems are learning from what we are generating. We kind of have to be a role model to AI, and we have to make sure that we’re leveraging AI, maybe somewhat ironically, to deal with issues created by the incredible development of this technology. So I hope that sort of answers the question. Steve: We know as researchers, right, any answer that doesn’t answer your question reveals the flaw in the question. And my flaw is that I asked you to decouple research for trust and safety from everything else. And I think you answered in a way that says, hey, this stuff is all connected. The problems, society at large, the technology, and the building of things are all connected, and research is a player in that. So, yeah, you gave a bigger-picture answer to my attempt to segment things out. I think we’re going to come back to AI in a bit, but I wanted to ask you, in addition to the trust and safety that we’ve talked about over the four years and this issue of building responsibly that you’ve highlighted, are there other things that you have seen or observed about our field that you want to use this time to reflect on? Noam: Yeah, absolutely. 
I think, as I mentioned, because this happened to me as well, we’ve seen a large number of tech layoffs, certainly for research teams, but not only, of course. We’ve seen major reorgs happen. I mean, reorgs are a reality in tech, as everyone who’s worked in tech knows, but we’ve seen some major reorganizations. And in fact, we’ve seen entire research teams shut down, including the example I gave earlier of the team at Twitter, now X. And as part of that, we’ve seen some incredibly thought-provoking articles come out, and I’m sure you’ve read some of these. One of them, from Judd Antin, a former leader at Airbnb who was my skip-level manager, argued that the UX research reckoning is here. Another incredibly interesting article was around the waves of research practice, by Dave Hora. Jared Spool wrote an article about how strategic UX research is the next thing. And I think what all of these articles had in common was some sort of discussion of the value that insight-gathering functions, or research functions, bring to the table. And you might not be surprised by this, but I have a hot take for you on this that I would be happy to discuss. Steve: Bring it on. Hot take. Noam: Hot take time. Steve: I’m ready. Noam: Are you ready for this? So here’s the thing. If we stick to the UX research reckoning framing, I’m a bit of a stickler for words, and I believe that the relevant definition of reckoning that Judd meant to reference is the avenging or punishing of past mistakes or misdeeds. So basically, as UX researchers, we made some mistakes, we made some misdeeds, and now we are being punished for it by being laid off. And again, the broader point in that article, I think, is about the value we bring as researchers. And I am here to say that although I agree that we’ve made mistakes, everything that’s happened has very little, if anything, to do with value. To do with the value that we bring. 
And I think it has everything to do with valuation, which is a very different thing. If I take Meta as an example, Meta was going through a tough time as a company, spending a whole lot of money on AR, VR, and other capabilities. The stock was at one of its lowest points in recent years, if not in the history of the company. And so Mark Zuckerberg announced a year of efficiency. And part of his idea of efficiency was to lay off about half of the research organization. And we have to ask ourselves, is that because researchers did not bring value to the organization? Again, I would suggest not. I would suggest that these days at Upwork, and in every company I’ve been part of, I’ve seen some incredible value brought forward by researchers. Insights that can make a huge difference to everything from the experience to the strategy, to use Jared Spool’s and others’ terms. But there are a couple of problems. The first problem is a problem of attribution. How can you calculate the return on investment of research? How do you know, and how can you record and document, which decisions and which things were influenced by research and which weren’t? If I’m an engineer or designer and I’m working within Jira or Linear or whatever platform you’re using to manage your software development, then I have some sort of ticket. I have some sort of task. I write 10 lines of code. Everyone knows those 10 lines of code are mine, or mine and other people’s. Everyone knows what those lines of code translate into in the experience. And so the ownership of what that experience looks like, from design to engineering, is clear, because it’s clear who made the Figma and it’s clear who wrote the code. And everything is incredibly accurately documented. When it comes to research, when it comes to knowledge, you know, research is circulated in all sorts of ways, right? 
From Slack channels to presentations to a variety of meetings and one-on-one get-togethers with cross-functional partners. And in all of those meetings and all of those interactions, research is coming through in some way. But it’s incredibly fuzzy and unclear how that translates into impact on the products. That doesn’t mean research doesn’t have value. That means it’s hard to measure the value. And then one more thing that I think is going on, which you probably know very well as one of the most knowledgeable people on the topic of interviews that I know, is what happens when I’m responding to your question. In this case, maybe you have some questions about what insights we have learned. What happens as I’m giving you a response? What are you doing? Make a guess. Steve: I’m thinking about my next question. Noam: You are thinking about your next question. It’s so hard to avoid that tendency. And I think in many cases, product managers, product leaders, and other cross-functional partners of research are taking in the research, but they’re just thinking about their next question. And to be fair, I think one more thing that’s going on here is that we as researchers do not understand the feeling of being held accountable for certain metrics, and for millions, if not billions, of dollars in revenue that can be moved one way or the other by the quality of what we choose to build and what we choose not to build, the roadmaps we have, the strategy we have, et cetera. Usually those are product leaders who are accountable for that, and we’re not. And so the pressure is on them. And so as they take in our insights, they can’t help but think their own thoughts and think about their vision and maybe ignore certain things that we share. And then business leaders, ultimately, what do they care about? Again, valuation. The stock. That’s just how it works, which is why I said in the beginning that I don’t think this is about value at all. I think it’s about valuation. 
I think business leaders are optimizing their business for their valuation, for their stock price. They’re not laying off researchers because we didn’t deliver value or because we weren’t strategic enough. They’re laying off researchers and many other people because that’s one of a few ways to become more efficient, to look good in front of your shareholders. It’s not such a complicated game. You know, we’re doing this interview a day after a particular company started offering dividends to its shareholders, and that had a very expected effect on that stock in the market. It just went up quite a bit. That’s how the game works. Those are the dynamics of the market. And so we’re in this situation where, again, I’m not saying we haven’t made mistakes. I think Judd, for example, absolutely had a point when he discussed the different levels of research, and the fact that we’re making a mistake by looking at usability as some sort of basic, tactical type of research that only junior researchers should do and that we shouldn’t be focused on, and that we should only be looking at higher levels and higher altitudes of research. I couldn’t agree more. I absolutely agree with Judd on that. But this basic premise of research not delivering value, I think, is incredibly problematic, and I don’t think it’s correct. I don’t think we need to move into some third or fourth or whatever wave of research. I just don’t see that personally. I think many of us have already been in wave one and wave two and wave three of research. We’ve already been doing strategic research. We’ve already been affecting the business level, the product level, the design level. We’ve already been conducting all sorts of research, from usability to incredibly foundational, generative research. And I think we’re being very, very hard on ourselves, and we need to cut ourselves a little bit of slack. Just a little bit. 
Steve: I mean, I’m all about being kind to ourselves and not blaming ourselves for things that are beyond our control. We’re all susceptible to that, and it’s hard to watch that going on collectively. But when you’re in a situation where there is a misalignment of values, like you said, value versus valuation, when that misalignment, and that’s my word, not yours, when that exists, we can cut ourselves slack, but that’s not going to change the gap. I don’t want to say to you, well, you just outlined something systemic and deeply rooted, the nature of capitalism, it goes all the way up, so how do we fix that? I think that’s not a fair question, although, you know, take a shot if you have a hot take there. Are there mindset changes or incremental steps, or things that you’ve seen research teams do, that acknowledge this gap, that their concern is about valuation while we’re focused on bringing value, and how do we meet them where they’re at? Noam: So look, Steve, that’s an incredibly fair question. And I do want to be crisp about the fact that, yes, we need to be doing something. Something needs to change, even if I view the problem differently. But before I get to that, just to reiterate, we as researchers know very well that it’s absolutely critical to identify the problem, and to identify the correct problem at the right level. So before I get to what we should do, I just want to highlight that, in my view, some of us have misidentified the problem, and we need to be tackling the actual problem. And just to get to that, and to pivot to the second topic that we did cover last time and I want to cover again today: we talked in our original conversation about research methods and how we do research. 
And I do think, even though I believe we’ve brought a lot of value to the organizations we work in as insight-gathering functions, that given the broad evolution of the landscape we operate in, we do need to rethink how we operate. Not because we haven’t delivered value, but because the ways in which we can deliver value are rapidly changing, and I think we can now extend ourselves. I was very influenced by a book titled Multipliers, not sure if you’ve read it. The basic idea is that there are employees within any company who are multipliers, in the sense that they don’t just do great work, they make everyone else’s work even better. They level up everyone around them, they identify incredible opportunities, and they liberate the people around them to get to those opportunities and make the most of them. They create a certain climate, a comfortable climate for innovation, but at the same time an intense climate where a lot of incredible things can happen. Where I’m going with this, and this is probably not surprising to the people listening, is that the era of AI is upon us, and I think it’s incredibly important to acknowledge the ways in which we can extend our work and ourselves with AI tools. So my mind has moved a little bit from methods, so to speak, to leveraging AI to use similar methods, but at a scale that we’ve never experienced before and have never been able to offer to our partners. Yeah, I mean, I think there are certain paradigms in our industry that are changing, and perhaps AI is even eradicating those paradigms and rendering them useless. If it’s okay, one recent example I have is that we had this paradigm that we need to make a tough choice. We’ve talked about this, you and I, a little bit. 
We had to make a tough choice between gathering qualitative data at small scales, which can often be okay, by the way, unless you’re developing a very complex product, or unless you want to make sure that trust and safety is at the center of everything you do, and then maybe you need a little more scale and you just couldn’t get it, because you didn’t have the people to reach that scale of interviews or qualitative research. Or, of course, the other choice you could make was to gather quantitative data at any scale you like, as long as you can afford it, namely by sending out surveys to hundreds or thousands of people. The issue is that survey data is shallow data, or thin data, or whatever you want to call it, whereas, I believe it was Sam Ladner who coined the term, qualitative data is “thick data.” And sometimes you need that thick data, and you need it at a scale that we were never able to reach before. And AI enables you to do that. I’ve personally witnessed tools, one of them being Genway, which are completely revolutionizing the way we conduct research. I’ve seen existing research tools, Sprig would be a good example, Lookback, there are so many incredible tools that have incorporated AI into their workflows. And they are making paradigms like the one I mentioned, this choice between thick and thin data, irrelevant, absolutely irrelevant. Which is very interesting to me. And it ties to this idea of multipliers, this idea in this book I love. Because AI research tools, like the ones I mentioned and so many more that we could talk about all day, enable us, in a sense, to be multipliers. They liberate us to do a lot more than we could ever do before. And hopefully that translates into us enabling our cross-functional partners and the teams we work in to deliver their best thinking and their best work as well. So that’s, I think, where our field is going, in a nutshell. 
Steve: Can you describe, with maybe a little bit of specificity, a work process or set of work tasks that a researcher might go through where AI tools like the ones you’re describing come in? What are they doing, what’s coming to them, what does that AI-enabled process look like? Noam: I can give a couple of examples. The first example, if I think of a tool like Genway, an interview tool, is that interviewing is tough, as you know well. You’ve written what I consider, and many people in our industry consider, kind of the Bible of interviewing people. No offense to the actual Bible. And as someone who’s written one of the primary guides to how to interview people, I think you appreciate more than others how complex being an interviewer can be. It’s something that you can learn over years and years of training and mentorship and still not nail some pretty critical aspects of interviewing. For example, asking the right, the best, the ideal follow-up question, and actually listening actively to what’s being told to you, rather than thinking about that follow-up question all the time, because listening is what enables you to ask a good follow-up question. Systems like these can train on an unlimited number of past interviews and an unlimited number of texts, like your book, and learn from all of that how to conduct the best possible interviews, right? And with these types of abilities to learn and then apply that learning in an interview situation, I believe it’s fair to say it would be technically impossible for any researcher to achieve that level of learning, certainly in a matter of hours or days or weeks, or months at the most, rather than years, as in the case of a researcher. You know, one of my hot takes, I hope the audience doesn’t kill me for this, is that David Letterman interviewed people for many, many years, and I personally think David Letterman is a horrible interviewer. 
I never understood why he asked the questions that he did, and everything about his interview style is very, very odd to me. But putting that aside, interviewing is a very complex skill. None of us can really ever witness how other people do it, and we all have to spend years of practice learning how to become better interviewers. It’s a deceptively difficult skill to build, and these AI tools are coming in and, at least in theory, can learn all of that shockingly quickly. That’s one example, and I’m very curious to see how the research community responds to these types of tools and uses them, what issues they find in the quality of these types of interviews, and how they can be improved. The second example is that I recall from even my undergrad psychology studies, not to mention my graduate studies, that our ability to hold information in our brains is quite limited. And so even when we’re synthesizing five interviews, not to mention 500, because sometimes you need 500, it’s very, very challenging. If you do five 30-minute in-depth interviews with people, organizing your thoughts and synthesizing those interviews has never been a trivial task. And I think there are a large number of biases and other issues and strange heuristics that we use to synthesize information, which might not lead to the optimal outcome, an outcome that’s as objective and as accurate a representation of the entirety of those interviews, and how they interact with each other, as we would want it to be. One particular task that generative AI, and AI in general, is very good at is summarizing and synthesizing information. And especially as we collect more information, that becomes a lot more relevant, even critical, I’d suggest. When we entered the big data era, we needed to develop a bunch of tools. You know, so many companies came out of that era building tools that enabled us to analyze, and very easily visualize in beautiful dashboards, what those data are telling us. 
Now, we can also start collecting qualitative data at unimaginable scales. And not just qualitative or quantitative data, because I think that distinction is going to matter less and less as time goes by. More importantly, we will become so much closer to the people we serve, our users, our customers. I think we talked about this in the previous podcast, about diary studies and how diary studies used to be physically sent to people’s mailboxes, right? And so you as a researcher had to plan your studies, send out an actual diary, have people log their entries into it, and then they would have to send it back, and then you would have to very manually look into those entries. And obviously that takes a very long time. These days, and especially with the help of AI tools, you can be in touch with the people you serve all the time, as much as you want and as much as they want, and you can both collect data and synthesize data and even communicate those insights at a pace that’s hard to even fathom, for me at least. But it’s very immediate. Can you say very immediate, or can you not modify the word immediate? Is something either immediate or not? Okay, I don’t know. I’m just afraid of my mother and what she might say here about my grammatical choices. But anyway, yeah. But I think that’s what matters the most. Steve: We’ll ask her to fast forward over this part. Noam: People’s schedules don’t really matter anymore, because they can choose to interact with AI, for example, whenever they want to. And it can be in context, in real time. And then AI can immediately synthesize those learnings, and it can immediately improve the way it collects insights based on that interaction and all previous interactions. I was thinking about this a lot in this framing of multipliers. Like, who is the multiplier in this context? Is it the AI? Is it the researchers? Who is multiplying whom? Steve: What does this hold for researchers? There’s research, for which I think you’re describing a really audacious vision of what research will be, and I think that speaks to the point about valuation versus value. But researchers, which currently refers to humans, what’s your vision for that, or your anticipation for that? Noam: What I do think is that, well, first of all, I’m a techno-optimist, or whatever you want to call it. Even with everything that’s happened, even with all of the tragedies and the negative aspects of technology that we’ve discussed in this conversation and others, I am still at heart a techno-optimist. And so my deep belief is that, certainly for the foreseeable future, if not beyond, AI will become a valuable extension of ourselves and our work. And I do believe that even if we don’t have to deliver more value necessarily than we already are, even if our value just goes underappreciated and there’s nothing terribly wrong with how we’ve approached things, AI will augment our work, will amplify our work, will enable us to really invite the teams we work with to do their best work ever. Because AI will help us see opportunities for research that we haven’t seen. It will help us settle a bunch of debates that maybe we’ve struggled to settle before. It will help us connect with more users, more customers, more clients, whatever you call them, from all over the world, in a way which vastly improves how equitably and how inclusively we build technology products, which is something that we’ve struggled with traditionally, if we’re being honest here. A very clear example of that is that an AI can speak a bunch of different languages and connect with people across all time zones and languages, and even mimic certain characteristics of a person so they feel more comfortable in that context. 
So for multiple reasons, I feel like if we’re at all concerned about getting buy-in from our partners, if we’re concerned about the value we bring, the impact we have, I definitely think AI tools can really improve our chances of getting to where we want to be. And I think it’s going to be a very long time, if ever, before these tools replace us as researchers. You know, the reason I chose to be a researcher, the reason I chose to be a psychologist, is the incredible complexity of the human mind. The people listening to this can’t see it, but, for my friends who are physicists, I’m holding up my phone right now in front of the camera, and if I drop my phone onto my desk or onto my floor, it’s a very easy calculation for physicists to say how quickly the phone will hit the desk, what energy there will be when the phone hits the desk, and what the chances are that the phone will break, given the speed of its fall. Physics is a beautiful thing, but it’s also a fairly reliable scientific practice. There are rules in physics, and they’re fairly clear. And even though physicists sometimes, on occasion, like to look down upon people like me, with a PhD in psychology and a background in psychology, I think that in many ways the field of psychology, and other fields that deal with the human experience, with the human mind, are so much fuzzier and so much more complex. And I’m saying this because I think many, many other professions will be replaced by some form of AI before researchers ever are. And that’s because of this complexity, this fuzziness that’s hard to capture. I think in many ways our field is incredibly technical, but in other ways our field is not technical at all. There’s a lot of art to it, and there are all sorts of different aspects to it. 
It’s a lot easier for an AI to generate a piece of code, or to generate a contract, or to read a mammogram or an MRI and identify something, than to talk to another human being and understand them deeply. I think that’s a lot more complicated. So I’m not too concerned about the research fields, but we’ll wait and see, I guess.

Steve: So as we kind of head towards wrapping up, you know, since I’ve known of you and known you, I’ve always seen you doing different things to be involved with the community of research. What does that look like for you now?

Noam: I, like many people in our profession, have definitely been on a journey. And if I’m being honest with myself and the audience, it’s been a very challenging few years, certainly for me, and I know for many others out there, whether it’s COVID and layoffs and a bunch of other personal events and things in life that just happen to us. And that’s part of the reason why I decided, and I know quite a few other people in our community decided to do this, to pursue coaching, among other things. I took a certification in coaching. Even with my background in psychology, I felt like there was so much more to learn in this realm. It’s always been important for me to support people in our community, and I wanted to do that even better. And so one of the things I’m doing these days, to a limited extent, is some coaching, not just for UX researchers or UX professionals, but people in tech in general. And then I have some thoughts for the near future around sharing some of that coaching and ideas in other ways. In addition to continuing to teach in all sorts of ways, I’m still teaching at a bunch of different institutions and planning to restart some of my teaching on Maven, which is a wonderful platform for learning all sorts of things.
I think the general trend in my life and career right now, and maybe this will resonate with people, is that after a definitely challenging few years, I feel like maybe I’m now coming out of it a little bit, ready to take on certain other challenges in addition to my role at Upwork, etc. And I know that I really want to be there to support our community in particular as we all go through rather challenging times. So I just invite anyone who wants to get in touch to message me on LinkedIn or email me or get in touch in any way that works for you, and I’d be happy to chat and help. And then I also thought about, and we’ll see where this goes, but as you can tell, these topics of AI and where our field is going and how it’s evolving, that matters a lot to me, and I’m thinking about it constantly and want to be part of this evolution, if not revolution, in how we work. And so I’d love to have conversations similar to this, whether out in public or privately, around these topics to continue to understand them. And I’m just looking forward to seeing what is next for our industry. I feel like when we spoke a few years ago, we had a solid sense of what was to come, and I think in many ways we discussed things that did end up manifesting in some way or another. But in this conversation, Steve, I don’t know what we’ll be talking about in four years, if you give me the opportunity to talk to you again. And I can’t decide with myself if that’s exciting or incredibly anxiety-provoking. So I don’t know. Why not both? Or if I’m going to be an optimist, or say I’m an optimist, then I’ll choose to be an optimist and say, maybe that’s exciting. Maybe it’s exciting that I really don’t know what’s coming down the line. But I do know that I want to thank you again so much for taking the time to do this. It’s always such a pleasure to talk to you. So thank you for giving me the opportunity.

Steve: That’s my line, man. I’m saying thank you.

Noam: Well, for me, it’s a special treat.
Maybe I can also share with whoever’s listening to this that we did get a chance to meet in person finally, not that long ago. And that was even more of a treat. And I really do hope that our community of research can get together more often moving forward and meet up and discuss all of these issues.

Steve: These are some really encouraging, I think, provocative things to think about, and some really positive and encouraging sentiments for everyone. And there will be show notes that go with this podcast as always, and so the stuff that you’ve mentioned, Noam, and ways to get in touch with you, we’ll put that all in there so people can connect with you if by some chance they aren’t already connected with you. So yeah, I’ll flip it back as well and say thank you for taking the time and for thinking so deeply about this and sharing with everybody. It was lovely to have the chance to revisit with you and kind of catch up on some of these topics after a few years, and I look forward to doing this again four years from now, if not sooner.

Noam: Can’t wait. Adding it to my calendar right now.

Steve: All right.

Noam: Cheers, Steve. Have a good rest of the day.

Steve: Cheers. Well, that’s another episode in the can. Thanks for listening. Tell everyone about Dollars to Donuts and give us a review on Apple Podcasts. You can find Dollars to Donuts in most of the places that you find podcasts. Or visit portigal.com/podcast to get all the episodes with show notes and transcripts. Our theme music is by Bruce Todd. The post 36. Noam Segal returns first appeared on Portigal Consulting.
01:15:48
New episodes! New book!
Episodio en Dollars to Donuts
Today we’ve got a quick program note about new episodes of Dollars to Donuts, an announcement about my new book, and an interview with Steve Portigal. Show Links Interviewing Users (2nd Edition) Steve on The Rosenfeld Review Podcast “How-to with John Wilson” on HBO Introduction Steve Portigal: Hi, everyone, it’s Steve Portigal. The podcast has been sitting quietly for a little while now, I guess, but I’m back today with two quick announcements and a bonus. First of all, new episodes are coming. I don’t know exactly when, but the wheels are in motion. I’m looking forward to some great conversations about leading research, and I’m excited to share those with you. Second, I have a new book just about to come out. I wrote Interviewing Users in 2013, and it’s become a classic text on user research. Now it’s 2023, 10 years later, and lots of things have changed. So I’ve updated it! The book, Interviewing Users, second edition, will be published on October 17, 2023, and is available NOW for pre-order from Rosenfeld Media, at a 15% discount. The bonus is my conversation with Lou Rosenfeld, where we talked broadly about user research, as well as the second edition of Interviewing Users. This interview originally appeared on the Rosenfeld Review Podcast. So, welcomed by the sound of the trumpeting elephant, we’ll go there now, and I’ll see you back here before too long with more episodes of Dollars to Donuts. The post New episodes! New book! first appeared on Portigal Consulting.
37:25