Scrum, AI, and the Real Costs: Q&A (Part 2)
In Part 2 of this Q&A series stemming from questions in the webinar, Managing Your AI Teammate, Eric Naiburg continues the conversation with Darrell Fernandes, diving deeper into how AI is reshaping the way Scrum Teams work.
Together, they explore practical applications of AI in Scrum — from drafting and refining user stories to strengthening Definitions of Done and improving Gherkin statements. Darrell shares how AI can help teams create clearer, more consistent, and testable backlog items, while also warning against over-reliance.
Eric and Darrell examine AI’s impact on team dynamics, including how meeting-recording tools can summarize conversations, capture action items, and support retrospectives. They also address the human side of adoption: differing mental models, fear of change, and the critical role Scrum Masters play as enablers — not gatekeepers — of AI experimentation.
Finally, they tackle a topic many teams overlook: total cost of ownership. As AI capabilities expand, Product Owners must understand infrastructure, data, and operational costs to avoid unintended financial consequences.
If you're navigating how to thoughtfully integrate AI into your Scrum Team — balancing opportunity, risk, and cost — this episode offers practical insights and grounded guidance.
Transcript
moderator 0:00
Welcome to the Scrum.org Community Podcast, the podcast from the home of Scrum. In this podcast, agile experts, including Professional Scrum Trainers and other industry thought leaders, share their stories and experiences. We also explore hot topics in our space with thought-provoking, challenging, energetic discussions. We hope you enjoy this episode.
Eric Naiburg 0:25
Hi everybody, and welcome back. We've got another episode talking about AI again. This is Eric Naiburg, and I'm joined by Darrell Fernandes. We will talk about some of the questions from our webinar. In this section, I want to talk a little bit about AI and how it plays with the Scrum Team, AI and Scrum Masters, AI and Product Owners, and some of the practical uses of AI. Maybe we can start with you, Darrell. Are you seeing Scrum Teams using AI to help in their work today? Things like creating and updating Definitions of Done, writing user stories, creating plainer, more natural language, or even pulling a description out of more complex ideas and thoughts and simplifying it for folks?
Darrell Fernandes 1:27
Absolutely, we're seeing that, and the best teams we've seen do some of those things. I'll pull out user stories as one example, because there are so many examples of user stories out in the dataverse, if you will. Where does AI get its patterns? It gets its patterns from the dataverse. If you have a specific format that you want your user stories to follow, based on your industry, your geography, your culture, whatever the case may be, you can actually ask AI whether a certain example is well suited to the user story frame that you've built. A lot of companies and a lot of teams use a user story frame in order to construct consistent, testable, followable, traceable user stories, and in those cases, those teams are actually using AI to interrogate and improve their user stories. I think the same can be said of the Definition of Done, and the same can be said of Gherkin statements. All of those things can take advantage of the benefits of AI. Teams can certainly use that, and we've seen some of them do it very well. It's quite interesting when you put an incomplete user story, I'll say, into the query, if you will, and see what response AI gives you. You can get one of two things. Sometimes AI will just try to fill in the blanks, and I think you have to be very careful with that; you don't want AI to overextend with information and insight that you haven't given it context for. The best case, and you can specify this, is when you ask AI where the story either doesn't fit the model, where it's incomplete, or where it could be better, and then go do the work yourself to finish that job. Unless you're really good at providing context, unless you're really good at setting AI up for success,
it's a pretty big order for AI to finish your thoughts on how the user story could be better constructed. Again, with all of the data in the dataverse, it may not have the right context for the problem you're trying to solve.
Eric Naiburg 3:52
Yeah, and think about the user story example: it's also making it easier for people to understand. I may write something and be so close to it that I write it in a way that may be interpreted incorrectly. It may be misunderstood. It might go way too deep or way too high, because I forget that not everybody knows what I'm thinking or what I know. So I can use AI to refine what I have and better describe what I've done, and even have it start asking me questions. Just like you can question the AI, you can have AI question you and ask you for more details so that it can describe that scenario better. What do you think, Darrell?
Darrell Fernandes 4:41
Yeah, I think you hit on something there, Eric, that's so important, especially to the Scrum Master role. As the Scrum Master is trying to facilitate the process and help the team get better, both from a Product Owner perspective and a Development Team perspective, AI can be a really nice conflict manager. If things aren't coming together and you're seeing the communication not happening, what better tool for the Scrum Master to use than to go to AI and say: tell me why this story, this vision statement, this Definition of Done, whatever the case may be, didn't meet the expectations of all parties on the team? It's fairly agnostic; AI is not going to have emotion, per se, in that response, but it really can help the Scrum Master be more effective at facilitating that connection between all parties.
Eric Naiburg 5:39
And facilitating the team to start to come together. Think about a Definition of Done: the organization may have their definition, and different people on the team may have theirs. Being able to bring that together into a single cohesive statement, or set of statements, rather than a whole bunch of things, keeps us out of those arguments of, well, mine's better, yours is better, this is different, no, this is the same. Really feeding those in and then coming up with something that we can discuss again. It's not coming up with the final Definition of Done; it's coming up with something that we can then bring together and discuss as a team. There's another place I think this plays: listening. With AI technology now, we can turn it on and have it listen to conversations, have it listen to what's going on. So where do you see this playing, for example, in stakeholder feedback?
Darrell Fernandes 6:41
I think it can be very powerful. There are so many logistics around that, whether they're compliance questions or something else; there's so much to it. But if you can take advantage of it, and if you have the availability of that tool set, it's really important. Because as we're all interacting, sometimes we're trying to communicate, so we're trying to speak and therefore not listen. Sometimes we're trying to take a note, because we're listening so intently that we don't want to lose something, yet we miss something. That's where the ability for AI to produce a summary comes in. Recently, I forgot that AI was listening to a call I happened to be on, and at the end I got an email that I totally forgot about, and it said, here are the actions for the meeting. And I thought, oh my, I forgot an action that AI had captured because it was listening agnostically in the background the whole time. It was just helpful to me as, I'll call myself, a product owner in that particular instance, because that action would have been completely lost, or lost until another party reminded me that we didn't follow up on it. It brought it right to the fore. So even simple things like that. You talk about big problems or big opportunities, like clarifying the Definition of Done, but it can help with even small things like action items and commitments. So I think it's tremendously valuable throughout that process.
Eric Naiburg 8:11
And it's not just the listening, but it's then the interpreting and bringing back and, in some ways, understanding what's important. If we're busy taking notes, we're not paying attention. If we're stopping, we're breaking flow. Certainly, at my age, I'm forgetting things midstream sometimes. And we have the opportunity to take advantage of something that's not just recording it, where I'm going to have to listen to it and figure it out, but is actually interpreting it along the way and then providing that feedback. I think that's so critical in really all of the roles on the Scrum Team: whether it's the Scrum Master and members of the Scrum Team in a Retrospective, trying to figure out where we can improve and how we can improve; whether it's a Product Owner or Developers working with stakeholders and getting that feedback; whether it's Developers working with each other. There are so many opportunities to take advantage, with little to no downside in this case, because it's not changing anything. It's not doing anything business critical, if you will, but it's getting the information that we need and bringing it to the forefront.
Darrell Fernandes 9:33
Yeah, just clicking back to Episode One really quickly, in case someone didn't have a chance to listen to Episode One and jumped right into Episode Two: the whole team dynamic. In a Retrospective, AI, because it's listening, can actually provide the Scrum Master, the Product Owner, and the Dev Team insight on the team dynamic, on what's happening, where things are headed, and where the leanings are, if you will. So I think it can certainly play an interesting role. And if you're going to go down some of those paths, there's that whole test and learn: what are you getting out of that recording? Is it what you want? Do you need to do something different? Should you be prompting it differently? Those are all things to learn as we go, because it's going to continue to evolve so quickly underneath us.
Eric Naiburg 10:27
Absolutely. So when we think about this, and there are all these different models beneath it, where do you see things happening? How do we start to mitigate the different mental models of AI as the team is starting to use it? We've got different mental models among the team members, and we've got different models of the AI. How do we mitigate some of the risk that comes into play?
Darrell Fernandes 10:59
Yeah, I think there are a couple of pieces to this, and the biggest piece is talking and communicating. What are our expectations? How are we all thinking AI is going to help the team? There's no right answer to that, and there's no wrong answer to that, because AI can help in so many different ways. We all have different problems that we face every day, and AI can help us in different ways with each of those problems. If you're a heads-down developer, AI has a very different value proposition for you than it does if you're the Product Owner, trying to set vision, do market research, and do all the things that a Product Owner needs to do. So on this whole notion of a single mental model: I think there's a collective set of needs that the team has, and everyone should be aware of them. I don't think everybody has to believe everybody else's need. You don't have to believe that my need is to do some admin tasks around error handling within my code. That's just my need, and I believe it's going to make me more effective and more efficient in delivering the product to market. And that's okay, unless it doesn't, and that's where that test and learn has to come back in.
Eric Naiburg 12:20
Right, exactly. And I think we have an opportunity, a new opportunity for Scrum Masters, and that's them starting to lead that charge. One of the questions I keep hearing, and I've heard it in the webinar, I hear it talking to friends, I see it in our forums, I read it in blogs and other places, is: who's leading the AI charge within the organization, within the teams? Is it the Product Owner? They're probably more focused on how to bring AI into their products. But who's leading that charge? I think we have an opportunity, and maybe it's even a reinvigoration of the Scrum Master role, to be that person or those people, really seeing and helping the team understand how they can leverage AI. Not only the Scrum Master leveraging AI themselves, because there are a lot of places it plays, like we've been talking about, but also them bringing that insight and reducing that fear. You can almost look at the fear of AI as an impediment, and getting over that fear and removing that impediment is part of the role of the Scrum Master.
Darrell Fernandes 13:32
Yeah, and I think the one thing the Scrum Master has to be very thoughtful about is not ending up being a gatekeeper for AI. Because, again, back to the team dynamic: everybody has to understand the role that AI is playing. Everybody has to understand where AI fits, and touch AI, if you will, when they need to, to feel its impact on the team. An uncomfortable place a team could get to is for the Scrum Master to say, my job is to use AI to help all of you. That would, a, be a bottleneck and, b, create a really interesting team dynamic around how AI fits with the team. So there's this notion that you laid out of the enabler for AI within the team, which I totally agree with; I think Scrum Masters are in a great position to do that, but not as the gatekeeper. Balancing that and walking that line is going to be a really important element. There are things happening in the industry now on what good looks like in setting a team up for success with AI. We talked about it in the paper, and we talked about it on the webinar: building context and making sure there's consistency in your context, and making sure that teams understand how to effectively use prompts. Scrum Masters should really understand those pieces. It'll be interesting to see how the industry evolves as to what kind of authority we give people who understand how to use those really well. But I think there's value in Scrum Masters understanding those pieces, especially context and prompting, because they go hand in glove in making sure you get consistent, productive results out of AI.
Eric Naiburg 15:26
Yeah, and as they're coaching the team, as they're supporting their teammates, as they're working within their Scrum Teams, as you said, as that enabler, helping to enable the teams to be successful. I think there's one more topic we should touch on in this episode: do you see a difference between AI product owner roles and traditional product owner roles? As a Product Owner, am I defining my products differently for AI versus traditional product ownership and product management?
Darrell Fernandes 16:05
Well, and you hit on this a minute ago, Eric. I think there are two pieces to this. First, what role does AI play in my product? As a Product Owner, how do I have to think about AI as a feature of my product? I need to understand the AI market, understand AI capability, and understand what elements AI can add, in a productive way, to the value my product has in the market. Then there's the role AI can play as a team member around me, to help me deliver my product and my product features to market better and faster, to drive more value and differentiation for my product. If you think about those two things, I think an AI-enabled product requires a different skill set than a traditional product owner has, because you really have to understand a little more deeply how AI works. You certainly don't need to be an AI expert, but you do have to understand enough about the underpinnings of AI to know what data will be important to your product. How can you differentiate with the data you have access to? How can you efficiently use that data? And by efficiently, I mean mostly the cost profile. Running AI is not cheap. It's power intensive; it's infrastructure intensive. So when you start to think about adding AI to your product as a feature, you have to understand how to do that effectively and efficiently, to see that your cost of ownership for your product doesn't go through the roof inadvertently because you added AI. So the product owner of an AI-enabled product, I believe, has to be thinking with a new set of information and a new foundation of knowledge that they may not have had in the past, and they've got to build that. The product owner using AI as a team member, I think, has a lighter lift, a little bit different way to think about things. Where does AI fit in the team? How am I going to take advantage of it? How am I going to make sure it's giving me the right answers?
How am I going to test whether AI is helping me and being effective in what it delivers for me? So I think there are two different ways for a PO to think about AI in their role, and I think they have different levels of investment, if you will, in where AI fits and how much you have to get under the covers to be productive with it.
Eric Naiburg 18:29
And something you said in there, I think, is super important, and I'm not sure people are thinking about it as much as I'd hoped, or maybe I'm naive and they are thinking about it a lot: it's that total cost of ownership, the expense. I think what happens is people get caught up in the buzz. They get caught up in all the excitement of, oh, look at what AI can do for me. And the next thing you know, they're talking about how they're going to add all these AI features into their product, ignoring the fact, or just not aware of the fact, that this is going to cost, and it may cost a lot more than we can earn. Yes, it's going to be better, and it's going to be super valuable, but are people going to be willing to pay what it's going to cost me to supply and provide it? As a Product Owner, P&L, profit and loss, is a big part of my job, and I need to make sure that I'm building a product that is profitable.
Darrell Fernandes 19:30
Yeah, as a former CIO, it's the thing that concerns me the most about how AI is coming into products, Eric, because I don't think we yet fully understand the total cost of ownership of adding that AI feature. It's almost difficult for a product owner to even put a price tag on the feature, because we don't know the true cost of all the infrastructure pieces that come together; I don't think the industry has normalized itself to that point yet. So until we get there, it's something to pay close attention to. I liken it to the cloud a little bit. We can rewind 10 years, when everybody made a quick push to the cloud, and then all of a sudden realized that code that isn't optimized for the cloud actually doesn't run any cheaper in the cloud. All of a sudden big bills started to come in, and we had to pivot a little bit and re-engineer some code. Nothing wrong with any of that, but let's learn a little bit from that, and let's understand what we're jumping into while we can, before we get the bill and realize we've got some real new work to do that we didn't anticipate.
Eric Naiburg 20:34
Yeah, and I think a big difference there is that with the cloud, the cost of storage and the scale of many people going to the cloud brought the cost of the cloud down. With this one, the vendors haven't even figured out their pricing models yet, and they haven't figured out how much it's going to cost. We see these vendors now building data centers next to power supplies, and even in lakes, because of the cost. They're going to figure out what this is going to cost over time, and it's probably not going to go down; it's probably going to go up. So we have to think about the future, not just the cost today, as we plan ahead. Well, this has been a great conversation, and I look forward to our next one. So thank you, and thank you to all who joined. Have a great day.
Transcribed by https://otter.ai