Episode 250: Building Smart: Clarity Over Perfection

In this episode of the Product Thinking Podcast, Melissa Perri answers a Dear Melissa question from a junior product manager: how do you know when a feature is well defined enough to move into delivery? She walks through two gates every feature should pass, validation (is it worth building?) and specification (is it clear enough for developers to start?).

Melissa explains how cost, technical, user impact, and business risk should shape how much confidence you need before building, and why you'll never reach 100% certainty. She also shares the red flags that signal a feature isn't ready for development, from endless clarifying questions to debates over fundamental assumptions.

If you've ever struggled with how much detail to pin down before handing work to your team, this episode offers practical guidance on balancing clarity with over-specification. Tune in to learn how to get features into delivery without falling into analysis paralysis.


Episode Transcript:

PreRoll: [00:00:00] Creating great products isn't just about product managers and their day-to-day interactions with developers. It's about how an organization supports products as a whole: the systems, the processes, and the cultures in place that help companies deliver value to their customers. With the help of some boundary-pushing guests and inspiration from your most pressing product questions, we'll dive into this system from every angle and help you find your way.

Think like a great product leader. This is the Product Thinking Podcast. Here's your host, Melissa Perri.

Melissa Perri: Hello, and welcome to another episode of the Product Thinking Podcast. It's time for Dear Melissa. This is the time of the week where I answer all of your burning product management questions. If you have a question for me, go to dearmelissa.com and let me know what it is. I answer them here every single Friday.

This week's question is from a junior product manager, and it's all about knowing when a feature is ready for delivery. [00:01:00] Let's dive in.

PMs make great investors. If you're a product leader curious about angel investing, check out Angel Squad. It's where over 2,000 operators from Google, Meta, and Apple learn to invest in high-growth startups alongside Hustle Fund. I've been a member for years and highly recommend it. They've given me a few 30-day guest passes to share, so head over to go.angelsquad.co/melissa and make sure to act fast, as the passes are limited.

Dear Melissa, one challenge I often think about is knowing when a feature is well defined enough to move into delivery. Do you usually follow a framework or rely more on experience and intuition?

This is a great question and a big struggle for product managers when they're first starting out in their careers. So if you've had any kind of agile training, they do usually have a ready-for-dev type of system where things are specified and there are acceptance criteria that define when something that's been worked on is actually done.

You can follow that as a framework, but I try to keep it a little bit looser [00:02:00] in my area and make it more collaborative with developers. What I've also seen is that sometimes people take that acceptance criteria and those specs so far that developers won't start anything unless everything is completely clear and detailed all the way down into the little nitty-gritty on all of these features, and I don't think that's realistic. So we always wanna balance "we need more clarity" with "we're overthinking this" or "we can't actually start this until it's really ready or very well specified."

So for me, the answer isn't a very rigid framework. It's more about understanding what types of confidence you need based on your specific situation.

So there are actually two gates that you should think about. The first gate is validation, and then you should think about specification. So, two different hurdles, right?

Is it validated enough to build, and is it defined enough for developers to actually start coding it? There are different questions that we need to answer in [00:03:00] both of these phases, and you cannot skip validation and just go straight into delivery, right? But the level of specification that you do when you're thinking about delivery can vary dramatically based on your risk.

So let's start with our first gate, which is validation. We wanna think about: do we have enough confidence that this is actually worth building? In validation, we should think about de-risking the why and the what, not the how. So we should be thinking about what evidence we have that users will actually want this solution.

What's the bar for validation when we think about risk tolerance? In this area, we wanna think about high costs, and we wanna think about risk to the users. We wanna make sure that when we're actually looking at how much validation and how much confidence we need, it matches both the investment and the impact that we wanna put on this.

So there's a lot of different types of validation that you can do. You can do user interviews, prototypes, AB testing, competitive analysis, all of these different [00:04:00] things so that we can make sure that what we're building is actually right. So once we've gone through all that data then we actually have to think about what kind of confidence we need here to say this is ready to go.

So this is the hard part, and this is where a lot of junior product managers get stuck: trying to be 100% confident that this is the right thing to build.

You will never be 100% confident unless it's like a reported bug that comes back all the time and people are like, just fix this. And that's not what we're really talking about here. So when you think about how confident do you need to be, think about the risk, what's the risk profile? And risk happens in a couple different ways.

I think about things like cost risk: how expensive is it to actually build and maintain? The higher the cost, the stronger the validation we need. We can think about technical risk: does this touch sensitive systems or data integrations? Does it affect the code base for a very long time? Is it really hard to build?

The higher that risk, the more [00:05:00] we're gonna need clear specifications and to make sure it's very well validated before we move it into delivery. Then there's user impact risk: could this break existing workflows or frustrate users? How many users does it actually impact? Is there sensitive data involved?

Again, the higher the impact, the more validation we need. And then we've got business risk, right? Is this the bet the company's making that will put us into the future? Are there implications for the business in the way that we're gonna change or evolve in the future? Is it more of a nice-to-have, or is it something that we really need to do to help this business succeed? The higher the risk, the more validation we need. Now, again, you're never gonna get to a hundred percent, so what is good enough?

That's where this becomes a little more art than science. If you're like, I'm 60% confident in it, but your risk is, let's say, medium to high, that's actually pretty good; 60% confidence is actually pretty high. A lot of times it's like 50/50, and you could do that. [00:06:00] How fast is it to test, right? Can you actually put this out and see if there's any traction, if the risk is lower? If it's higher risk and you really gotta nail it, maybe we need a little more validation. So again, that's gonna come a little bit more with experience than from a hard-and-fast rule. But you should be talking to your team and to your leaders to figure out what your appetite for risk is at different stages.
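To make that risk-to-validation idea concrete, here's one rough way you might sketch it as a rubric. This is purely an illustration, not a formula from the episode; the risk levels and confidence thresholds below are made-up placeholders you'd calibrate with your own team and leaders.

# Hypothetical rubric for "how much validation is enough?", loosely following
# the cost / technical / user-impact / business risk dimensions above.
# The thresholds are placeholders, not numbers from the episode.

RISK_LEVELS = {"low": 1, "medium": 2, "high": 3}

# The riskier the bet, the more confidence you want before calling it
# "validated enough to build" (you'll never reach 100%).
CONFIDENCE_BAR = {1: 0.50, 2: 0.60, 3: 0.75}

def validation_bar(cost, technical, user_impact, business):
    """Return a rough confidence level (0-1) to aim for before moving to delivery."""
    highest = max(RISK_LEVELS[r] for r in (cost, technical, user_impact, business))
    return CONFIDENCE_BAR[highest]

# Example: cheap to build, but high technical risk -> aim for roughly 75% confidence.
print(validation_bar("low", "high", "medium", "low"))  # 0.75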

Now, the second part of this is specification: is it clear enough for development to start? That's probably what you're really asking about here. So we're thinking about having development start on something that has already been validated; it's past that first gate. This isn't about having every detail figured out, but it is about having enough clarity and context so that they can start intelligently.

So a core question you can ask yourself: can a developer read these specifications, the user stories and product details that I'm actually writing out here, and understand what they're building without guessing? You need things like clarity on user [00:07:00] interactions and data requirements, right?

What data do we need to actually pull out of the systems, digest, and understand? What data do we need from our customers? Those types of things become really important because they shape the databases and underlying architecture for developers. You also need to think about things like integrations and all of that different stuff.

So you don't usually need pixel-perfect designs for a developer to get started. You don't need every edge case to actually be solved, but you do need enough clarity on what it is that we're building here after it's been validated. So I like to think through things like the user flow: how does someone actually use the feature?

Can you walk through a happy path? Can you actually do this? Maybe you have a prototype or a concept that the developer can get an idea from. Usually that would be enough, combined with data requirements: what information do we need to collect, store, and display? And integration points: does this need to talk to other systems, APIs, or services? All of those might be enough [00:08:00] for developers to say, hey, I can get started on the backend of things while you fine-tune the front end. We have enough to go on where we can start exploring, handle this a little bit, and then we'll work through it iteratively.

You also wanna make sure that before they start, you have things like success criteria figured out: how will we know if this is working as intended? That will be the last hurdle that we go through, which is the acceptance criteria that we think of in Agile. Does it work? Are they able to do these things?

Are we catching errors? All of that stuff will need to be figured out before we can actually ship it. But there is actually a lot of stuff that we can figure out along the way. Detailed UX polish and micro-interactions usually improve through iteration. So maybe the developers are putting together the first iteration of it, showing it to you and the UX designer, and the UX designer is like, move this, do that. You might be clicking through and say, oh, we forgot to include this edge case, or we forgot to have this work like X, Y, Z for these types of customers over here. You can iterate through that.

[00:09:00] So I don't want you to think that you need to have everything figured out before a developer can touch it. But you do need to largely have the concept available, start to understand the flow, start to understand the functionality, and then you can usually iterate and optimize that way.
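If it helps to see the specification gate written down, here's a lightweight sketch of a "clear enough to start" checklist built around the items above: the happy-path flow, data requirements, integration points, and success criteria. The structure and field names are illustrative assumptions, not an official template from the episode.

# Hypothetical "ready to start" checklist; open questions are fine to iterate on.
from dataclasses import dataclass, field

@dataclass
class ReadyCheck:
    happy_path_flow: bool = False     # can a developer walk through how someone uses it?
    data_requirements: bool = False   # what we need to collect, store, and display
    integration_points: bool = False  # other systems, APIs, or services it talks to
    success_criteria: bool = False    # how we'll know it's working as intended
    open_questions: list = field(default_factory=list)  # resolve these through iteration

    def ready_to_start(self) -> bool:
        # Enough clarity to start intelligently; edge cases and UX polish can iterate.
        return all([self.happy_path_flow, self.data_requirements,
                    self.integration_points, self.success_criteria])

check = ReadyCheck(happy_path_flow=True, data_requirements=True,
                   integration_points=True, success_criteria=False,
                   open_questions=["error states for a failed sync"])
print(check.ready_to_start())  # False: pin down success criteria before kickoff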

And that's going to be faster than specking out every single little detail. So how can you tell where the balance is here without a ton of experience in it? Let's talk about some red flags that you're just not ready for development. If developers are asking you the same clarifying questions over and over and over again, you did a really bad job of building context upfront and explaining what they're building to them.

So they might not have enough definition when it comes to flows. They might not have enough definition on the why to be able to fill in some gaps. So you're gonna wanna go back to the drawing board. Explain the why, explain the concept, explain the product, how it should work, how users are gonna use it.

Maybe you need some more specified flows and functionalities, or more on how it should actually [00:10:00] work. That's a really good sign that it's not quite there yet. And this is a big thing that does happen on a lot of teams; a lot of product owners and product managers, when they're first starting out, will tell me, oh, I'm so frustrated 'cause my developers won't stop asking me questions.

That's on you. You did not build that context for them. Now, some developers, again, are a little bit more junior, need a little bit more handholding, or are used to that, so you also need to know your team a little bit. In this case, if your developers are, let's say, a little more junior or need more specified requirements, or if they're consultants or a third party, this goes for that too, then you're probably gonna need more specifications. So you need to get to know your team. Here's another sign that it might not be ready for developers: let's say they get into it and they're surfacing issues where it does not work for a large group of customers or users.

They come back and they say, hey, we were using this data to test on and it's just not working; there are way too many edge cases. That's telling me that you didn't think [00:11:00] through that product in enough detail to find something that works and scales for most of your customers. And that's what you wanna do.

So you either did not scope it correctly and say, Hey, it's only gonna work for these customers, or you didn't think through a solution that actually satisfies a lot of the cases that you wanna go through. So again, that would tell me to go back to the drawing board. If your team is debating fundamental assumptions rather than implementation details, again, that means that it's not ready for development.

So if your product is validated, you've tested the concept, the users understand it, there seems to be value here, and you feel pretty confident depending on your risk profile, and your developers are more focused on how are we gonna implement this, they understand the concept, they understand the flows, they understand how it's going to work, and they're not just asking, what is this that we're building?

That's telling me that you are ready for development. You do not wanna get into analysis paralysis; a lot of stuff can be figured out along the way. When we're building on our team at Product Institute, my developers and I are going back and forth, usually off of a prototype or some [00:12:00] concept, as we build it.

And that's okay. That's what you want. You want it to be collaborative; you wanna be working together on these things. That's really what's gonna keep you agile, keep you iterating, and keep you making things great for your customers. So it does not have to be perfect, but it should be defined enough where we can all understand it and you're building that context.

So I hope that answers your question. Thank you so much for sending in a Dear Melissa question. Again, if you have a question for me, go to dearmelissa.com and let me know what it is. And next Wednesday, we'll be back with another amazing guest. Make sure that you like and subscribe to this podcast so that you never miss an episode. We'll see you next time.
