Developer User Research extends your runway, with Ana Hevesi Episode 21


Ana Hevesi is a Developer Experience Researcher and Ecosystem Consultant. She has experience marshalling technical products and communities at companies including Stack Overflow, Nodejitsu and MongoDB.

· 17:34


Jack: Hi, everyone. You're listening to Scaling DevTools. The show that investigates how DevTools go from zero to one. I'm joined today by Ana Hevesi, who is a DevTools UX researcher and developer ecosystem consultant.

Ana has a ton of experience in DevRel and open source community management at companies like Stack Overflow, Nodejitsu, and MongoDB. The very observant among you will notice that I also worked at Stack Overflow, and we were there at the same time, but we unfortunately did not meet. Ana, thank you so much for joining.

Ana: Jack, absolutely. It is great to be here. I'm sorry that, you know, we didn't meet while we were actually officially colleagues, but, like, better late than never, right?

Jack: Absolutely. And I'm really excited about this episode, because I think that messaging, user research, just really understanding your customer, your developer, is so important. So thank you, really, for your time.

Ana, what drew you to start doing user research on developers?

Ana: I have been in open source community management and developer relations for a long time, over a couple of cycles and a lot of different companies and ecosystems. And what I found over time was that, with a lot of the efforts that we employed to reach users, whether me, my colleagues, my teammates, or my direct reports, we were not always as clear as I felt we could have been about why a given thing was the right thing to do and why it was having the impact that we wanted it to have. So somewhat recently, I was on the community team of a fairly mature, public organization, and we were working out what exactly to do next, and how the community team fit into the broader set of developer relations activities and so on. And I said, hey, I think there's really an opportunity to get specific with our users, especially given the maturity of this product. Let's sit down, let's actually do some developer user research with our users. And what came out of that was all of these very specific opportunities that weren't really surfacing via existing marketing processes, or weren't necessarily on the radar of the developer advocacy team.

They were, for example, really exciting use cases that we hadn't even known about, and that might have made sense to continue to support. Or it became very clear that folks were eager to start learning from each other about a certain area of the product. I shared the findings with everyone, and folks said, this is great, why haven't we been doing this?

And at that point, I felt like I was probably onto something. That's when I went out as an independent consultant, and I've been offering this ever since.

Jack: And Ana, how do you usually conduct developer user research at startups?

Ana: So I really like to start with a question that I think of as the anchor question, and you want this to be big, open ended, and crucial to the business. An example, very often when me and my clients are just starting off together, is: in what ways does this product solve meaningful problems for our users? But depending on where things are, it can also be a little more fine grained than that, things like: why do people get involved in our support community? Or why aren't we attracting more open source contributors? Or is this pivot the right evolution for our product? And when we're early in this practice of getting to know our users, as so many of my early stage clients are, I recommend we cover a broad swath of different types of users.

I like to go as wide as possible. So we will optimize for people who have varying educational backgrounds, who have spent really different amounts of time in the field, who have different roles within their orgs, and who have different representations of race, gender, and so on and so forth. And I tend to find that going wide that way means we surface the most opportunities to find the things we should be drilling into further, the things that are going to be useful down the line. So from there, we reach out and we ask users for their time. For what it's worth, I take that piece very seriously, because when users are giving us their time, that is an absolute gift. Then I'll write a script which uncovers two things together. The first is where they are in their journeys as technologists,

so we can understand what their goals are, next in their careers and next in their lives. And then we also get into a lot of nitty gritty about how they are using the product. Typically, we will either take recordings of the conversations, or I will have someone riding shotgun and taking notes. From there, I sit down and dive deep into an analysis phase. I'm a big fan, by the way, of the user research tool

Dovetail; I frequently use it for these steps. And there are sort of two axes, or information types, that I look for. One is the things that are very easy to observe: okay, this person has spent five years in the industry, or this is their title in their current job, et cetera. After I've done a pass for those clear and observable things, I will dig deeper. That's when I start to suss out: oh, from that comment about their experience with this product, I'm getting frustration, or I'm getting relief. Having that cut and dry, objective information first is really necessary to know the product and the person well enough to contextualize that emotion. But from there, these start to become patterns, right? They become themes. So what happens is I begin layering these instances of different themes, reactions, et cetera, on top of each other. And then what we have is quotes that I pull out, that me and other folks within the company can refer to, so we can look at them together and say: this looks like this, especially around this point in the product. Is that making sense to everybody?

Jack: Absolutely, great.

Jack: So you're kind of bringing all of those interviews down into actionable things. How do you use those findings that you pull out?

Ana: What I will try to do here is give you some specific enough pictures, or paths forward, while being clear, for you, me, and everyone who's listening, that the details vary so widely based on the people, the situation, and the product at hand. And it's really that variation that we're using as a point of power and leverage.

But nonetheless: so we've conducted these user interviews, and, as I said, we started out with a very deliberate anchor question. There are some things that we are going to know after this process. I spoke to this already, but we're going to have a clearer, demographic level, very clear cut picture of what kinds of users are currently having the best time with the product. To speak to some of the attributes that can be revealed there: sometimes we get clear on, oh, okay, there's, for example, a lot of TypeScript usage that goes along with our product. Okay, cool. Or: alright, a lot of people using the product right now are tech leads, sort of straddling this place between management and individual contributorship. Just as a couple of examples. We're going to better understand who is being impacted by the tool and by the product. From there, as I mentioned, we're also going to have discussed where the product is giving people superpowers and where it's getting to be onerous for them. So we're going to have a much better understanding of the gaps. That could be what we identify as gaps in how the product itself is working; maybe it's gaps in how clear or direct the error messages are; maybe it's gaps in the documentation. We're going to start to understand where there's a delta between our model of this tool, how it works and how it will be received, and our users' model. So from there, we can start to infer a lot about what we do next and how we spend our time. That can mean we're going to focus on improving documentation for some more specific use cases, past just the get up and running stage of the product. Or it can point to some really good other tools or products to spend time building integrations with. We could also have it revealed to us that a lot of our users want to better understand how to improve in one specific area. That can be an area of this tool and this product, but it's sort of the tip of the iceberg of a larger skill that they want to improve. So maybe there's an opportunity to start getting folks together, you and your DevRel team, in sort of structured or semi-structured lightning talks or some other sort of peer learning opportunity. Things like that.

Jack: Ana, one of the things that you've spoken a lot about is how user research can extend our runway at a startup. How can user research extend our runway?

Ana: So it's a matter of conducting developer user research making it possible to gain insights that we don't otherwise have the ability to gain, which allows us to run much more deliberate experiments. And if you, as a founder, are looking at time spent, headcount, the different directions you go in, and the effort you put in, this is going to allow you to draw way more specific pictures of what you're doing, over what amount of time, and why. All of that is going to enable you to chart a much clearer navigation path. One thing that comes up sometimes, that I want to mention: some folks, when I have a chat with them, will say, well, I'm actually talking to my users all the time, so I don't think we really need to do this. In fact, we hang out in Discord together all the time, and sometimes people will drop a feature request and we'll implement it on the spot. And I always want to pause at that moment, because that sort of in the weeds, on the front lines, perpetual rapid interaction with our users, that very high responsiveness with and to our users, can look and sound from the outside the same as the developer user research practices that I'm talking about. However, they're actually distinct, because, as I spoke to a little bit already, when I work with a client we're basically putting a box around things and making a little laboratory. And that makes it possible to isolate the variables, which is what allows you to see how one problem connects to one potential solution you want to experiment with, and how much of your cycles you're going to spend doing that.

Whereas if you are having tons of ongoing interaction with your users, and maybe implementing things based on their requests in a very unstructured fashion, it makes it incredibly hard to isolate the variables. So to be clear, I'm absolutely not saying don't interact with your users, or don't build relationships with your users.

I'm not for a moment saying that. But I am saying that taking a step back, implementing these practices deliberately, and then deciding what the next best and right thing to do is, based on the tangible evidence we've gathered, is going to allow you to chart a deliberate course in pace with your runway. So there are going to be so many fewer surprises.

Jack: That makes so much sense. It can so often feel, at a startup, like you're lurching from one direction to another based on random pieces of feedback or conversations. And if you're making big business decisions, directional decisions, it completely makes sense that it's worth doing this properly, in a lab, as you say.

Ana: You know, a lot of that is understandable, though, right? Because the rate at which people in the world are becoming software developers has just been accelerating and accelerating. And if we go back several decades, someone being a programmer or a software engineer meant sort of one reasonably clear thing. And now? I was looking up these numbers the other day: the market research firm SlashData said in 2019 that there are 25 million developers in the world right now, and that number is expected to increase to 45 million by the year 2030. So between boot camps, more traditional paths, and the wide variety of different jobs in tech and subsets of the industry, it makes sense that it is difficult to figure out which action connects to which form of impact in our product experience and in our user adoption efforts. But it is actually possible to draw this box, to create this lab, as I said, and to calibrate our instruments so much more finely. I'm really excited about what that can do for people and how it can help startups get further.

So what I would like to leave folks with is that this can help you make better trade-offs. You're already doing developer user research, whether or not you plan to. The question is: how expensive is it? What is it costing you? How much sooner could you have this information, and what could you do with the difference in your runway if you had it now? So basically, developer user research is a means of taking your navigational instruments and calibrating them much more finely and thoroughly, so that everyone in your org knows where they're going, and so that you as a founder can make an informed decision about how much of your fuel you're using to get to your next destination.

Jack: Amazing. Where can people learn more about you, and contact you if they want help?

Ana: You can find my website at anahevesi.com. That'll get you everything you need to know about my approach and how to reach out to me. Once again, that is A-N-A-H-E-V-E-S-I.

Jack: Amazing. And Ana, thank you so much for joining. And everyone, thank you so much for listening, and we'll see you again very soon.
