One-on-One with projekt202: CTO Rob Pierry
In the conclusion to his two-part interview, projekt202’s Chief Technology Officer Rob Pierry discusses dynamics and communications that drive successful development projects, the measure of a strong Agile team, and the most important aspect of team building.
We've talked about design, development and research. What other areas should be considered for team success?
We've talked about business a little bit here, too, making sure that business stakeholders are actually involved in these teams. The other big one to definitely mention is QA. A lot of times, especially in organizations with existing development capability, QA is this big waterfall stripe at the end.
So, we have this whole other process you have to go through before we get there?
Yeah, and all these QA guys are “just getting in the way of us shipping this thing, and we're ready to go.” Oftentimes, they're the ones who get shortchanged, too. "Development took a little longer. I guess we're just going to cut QA."
"For us, this really close integration goes beyond just designers, researchers and business folks, and includes QA as first-class members of the team."
That's the trickle-down effect, right? R&D has time to be creative and figure things out, then development doesn't get nearly enough time, then QA has a blink of an eye to make sure it's working right.
Yeah, everyone's trying to kick it down the road and there's nothing after QA. No one ever says, "I guess those users can wait a little longer." Obviously, that's a mistake, but the trap is it's a mistake that you don't immediately realize the cost of having made. You ship something without doing proper QA, maybe it's going to be fine. Maybe you've got a really good culture of developer-automated testing, so your development team has done some testing. Maybe you even have good user validation integration. You've got designers and researchers integrated with your teams, so launch day is not the first time that users have seen your system. You've gone out and done regular user validation, and it was built on a foundation of research in the first place.
The trap here is that developers are developers; they're not QA folks. If you're building anything non-trivial, something's going to break. That's just the reality of this.
For us, this really close integration goes beyond designers, researchers and business folks, and includes QA as first-class members of the team. That's not just, "OK, guess what. You're on the team." There's a mutual respect element that has to go with that.
They have to be included in the process so they have the context to be a good QA team, right?
Yeah, and fundamentally, when they speak, the rest of the team has to listen. In the same way that when the designer speaks or the researcher speaks, the rest of the team has to listen, because you've assembled a team of problem-solvers who you're recognizing for their experience, their input and their judgment, as opposed to someone who's going to do what you say. Traditionally, QA has been the ultimate order-fulfiller.
"As a business stakeholder, what I really care about is value delivered."
There's a different mindset and a different skill set with those folks, and they bring value to the team, so that's important to include as well.
Does that help your Agile velocity?
Velocity is a really interesting idea to talk about. When you think about it, it's a nice metric. It's a really easy thing to measure. "I estimated how hard it would be to build something and then, in some unit of time, I accomplished X." That's my velocity. As business stakeholders, we are sometimes led into this trap of the easily measurable. Maybe that's not what you should be measuring.
Velocity really just measures how long it took me to execute what I estimated. As a business stakeholder, what I really care about is value delivered. Is it something that's going to increase my conversion or provide a more positive experience? That's the thing that I really care about.
If I really want to know, "When will I have delivered sufficient value?", I have to derive it. I have to back into it. I have to go look at the backlog and interpret it, and say, "OK, I feel like we will have done it after these 20 stories, and those 20 stories are estimated at this many points, and my velocity is here, and that's when I'm going to get there." I think that's putting the emphasis on the wrong thing.
If you're telling the team -- by measuring, tracking, reporting, putting charts up -- that velocity is the important thing, that's what they're going to internalize as the important thing.
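That back-of-the-envelope derivation, backing a date out of the backlog and the team's velocity, is just arithmetic. Here is a minimal sketch; the point totals and velocity below are hypothetical, not figures from the interview:

```python
import math

def sprints_until_done(remaining_story_points, velocity_per_sprint):
    """Back into a delivery estimate: how many sprints until the
    value-bearing stories in the backlog are done, given velocity."""
    if velocity_per_sprint <= 0:
        raise ValueError("velocity must be positive")
    return math.ceil(remaining_story_points / velocity_per_sprint)

# Hypothetical numbers: 20 stories totaling 85 points, team velocity of 21.
backlog_points = 85
velocity = 21
print(sprints_until_done(backlog_points, velocity))  # prints 5
```

The point of the sketch is how indirect this is: the number the business actually cares about has to be reconstructed from estimates and throughput, neither of which measures value.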
It seems the business would be most interested in how quickly you're shipping. Do you feel the same way about that being the most important measurement?
I think it's what's contained in what you're shipping. It's the value delivered to your users or to your business within each story. It's subtle. I think this is why there's a trap, because it's really easy to measure. "I estimated this at five points and it took a week to do," or "I could do two of those in a sprint." For all their faults and difficulties, estimates are easy to make, even relative ones. It's easy to get better at them over time.
Business value, that's harder to conceptualize. What is that? What unit do I even measure that in? There are a bunch of things under that umbrella: X% conversion increase, average order values or just picking some e-commerce things.
I can measure those things, but how do I measure an individual user story, the effect that it has on one of those things? I can't directly; it's hard and, because it's hard to measure, I don't measure it. That's Agile's answer: What's easy to measure? "Points per sprint; that's easy to measure, so let's measure that. Let's make charts about it; let's talk about it, and let's compare teams across the organization, and how they're doing on things."
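To make that contrast concrete, here is a hypothetical sketch: the same sprint reported two ways, once as points shipped and once as a made-up per-story value score. The stories, points, and scores are all invented for illustration; nothing here is a real metric from the interview:

```python
# The same sprint, framed in points delivered vs. a hypothetical
# business-value score attached to each story when it was written.
stories = [
    {"name": "one-click reorder",    "points": 8, "value_score": 9},
    {"name": "internal admin tweak", "points": 8, "value_score": 1},
    {"name": "checkout error fix",   "points": 3, "value_score": 8},
]

velocity = sum(s["points"] for s in stories)              # easy to measure
value_delivered = sum(s["value_score"] for s in stories)  # hard to estimate

print(f"points shipped: {velocity}, value shipped: {value_delivered}")
```

Two stories with identical point costs can carry wildly different value, which is exactly the information a pure points-per-sprint chart throws away.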
How do you measure a good Agile team then?
The transition from tearing things down to building the new thing up is a little bit difficult. I do think there is some deeper insight to be had about a larger deviation from Agile, let's say. To me, it's context. Measuring velocity is probably inescapable, at this point.
Because it's part of the Agile process and that's what people are following?
And because it's easy to measure. What I'm saying is: don't measure this easy thing. You really should be thinking about this hard thing that I can't really tell you how to measure. That's, in some respects, not fair. It's okay to measure velocity, just …
You should be thinking about the most important thing for that team to achieve. So you're saying it's probably not velocity?
Right, the tools are set up to measure those things, so just know what you're doing. That's all. In some respects, I think that's all you have to do. You have to make sure that you occasionally remind yourselves, as team members, that, "Hey, velocity is not the important thing. It's an indicator, but it's not the fundamental metric that we're running our business on." I think you lose sight of that, especially the further you go down this path into teams as order-takers.
They've got stories that they have to deliver, and there's some manager walking around with a clipboard that shows their velocity, and that's how they figure out whether they're a good team or not. That short-circuits the entire business-value conversation. It's just, "How many points can I ship per sprint?"
That comes into play in the measurement of velocity, which is really not doing anything for team performance, right?
Yeah, and the more dysfunctional the organization is, the more that happens. If you've built a really order-taker-y culture, you haven't empowered teams. If they know that they're measured on velocity, the smart ones will figure out how to artificially inflate it, how to make the metrics look like they're supposed to look, to your detriment as a business.
I think that's where refocusing and thinking about business value is reinforced by that cultural thing. If your teams are problem solvers, they have the context that they need, and they're fundamentally empowered to look out for what's going to be the most useful thing to build.
For example, you have a user who spends 10 to 12 hours a day at work because of bad business processes that the software was built around. We need to break down those processes and rethink them so the software we're building serves her, and not processes that don't need to be there, correct?
Absolutely. It's hard to talk about things in isolation. That's why I keep coming back to the culture thing so much, because that's fundamentally what you want. If I, as a business stakeholder, give a hypothesis to a team and it's a bad idea, I need to have the kind of organization built where the team will tell me that. The team will say, "Hey, wait a minute. I know this is what you think you want, but this isn't actually what we need. What we need is X, Y and Z."
For folks who have their own development organizations: Can you actually, right now, imagine your development teams coming back to you and telling you that? If you can't, I think you're selling yourself and your organization short. As leaders, that’s what we need to try to build: teams that will tell us when we're full of it.
That's really important because we're not going to be right all the time. By assembling these multi-disciplinary, truly empowered teams, you make up for those blind spots and those off-days that individuals are going to have. You get everybody looking out for each other, valuing each other's opinions.
One obvious outcome of that is overall efficiency: first, from the team telling you whether it's a bad idea to do the thing you told them to do, potentially saving hundreds of thousands of dollars in wasted spending; second, because these folks are operating in an environment where they feel their opinions are valued, they've got support networks around them.
Since everyone feels empowered to challenge each other in a respectful way, you have checks and balances?
Absolutely, and it does wonders for motivation to know that if something occurs to you and you bring it up, it's going to be heard. It doesn't have to be acted on every time, but it's going to be heard and evaluated.
That's really motivating, especially if you contrast that with Agile nomenclature stuff like sprint: "We're going to sprint all the time. Someone's going to keep pointing you in the next direction to sprint to, and it's not your job to question." That's demoralizing, especially when you know that the thing that you're working on is not going to be of value when it gets launched.
There are all these mutually reinforcing things. I know my voice is going to be heard. I understand why I'm being asked to build what's in the backlog, and I have a way to correct it if I don't feel like it's right. I've got a channel to communicate through. I've got people who will listen to me and evaluate my ideas on their merits. That just makes me more interested in doing my job, and it makes me better at my job.
Then, if you combine all that soft cultural stuff with some fundamental things like, "OK, QA is integrated. I'm going to have fewer defects. I'm going to have less remediation. I'm going to have fewer production outages," that's obviously an efficiency gain. "OK, this overall picture that's coming into focus is clearly better than the way things have been done before." It's only because it's hard to build that cultural reinforcement mechanism that more folks aren't doing this.
This team of people -- researchers, designers, developers, QA -- they're challenging each other and working well together. What are some other ways that this builds efficiency?
When you can take as a given that you've got this multidisciplinary team, it frees you up to change the inputs to, and outputs from, that team in ways that are more efficient.
For example, if I know that I've got developers on my team that understand and respect design and are familiar with design, that means I don't have to fully specify the design inputs. I can do something like use a high-fidelity wireframe and a style guide. I don't have to do red-lined comps for every single screen in the entire system. They're not following the letter of the law. They're following the spirit of the law.
If they see a mistake, they can reach out to the designer and say, "I feel like this isn't right. Did you make the right design decisions based on how we're developing this software?"
Yeah, or even to validate intent. Whether it's challenging or whether it's just clarifying, you reach out. It's oftentimes literally reaching their arms out and touching the person who's next to them because they're on the team.
If you think about the cost involved in making red-lined comps for every single screen you might want developed versus just having your wireframe and your design language that you've invested in, that’s a huge return on productivity.
I think that thread follows through if you understand that QA is on your team, and the team values and respects QA, and QA values and respects design. Then what you're going to have is a QA team who's actually capable of evaluating design fidelity.
"It's a tool to remind your talented teams of what they should be paying attention to, and that starts with having the right team culture."
That's something we run into a lot with clients that have existing development organizations. They've got a QA team, and that QA team might be really, really good, but they're good at executing test plans. They're good at evaluating whether the system works or not, but they may not be trained to recognize what's important to check for visually, or in a user-experience context.
It's a learning process for developers to understand that when the visual designer says, "I want 8 pixels of margin," they mean 8 pixels. They don't mean 7. They don't mean 9, and it's different from having 4. Developers have to learn that.
QA has to learn that, too, so what you end up with -- with these in-sprint folks who are similarly invested -- is they can actually evaluate the output. They can look at what the developer built, and they can be that true second set of eyes.
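As a toy illustration of that "8 pixels means 8" fidelity check, a QA script could compare measured values against the style guide with zero tolerance. The spec and measurements below are hypothetical; in practice the measured values would come from a rendering or visual-regression tool, not a hand-written dictionary:

```python
# Hypothetical style-guide spec: element property -> required pixel value.
STYLE_GUIDE = {"card.margin": 8, "button.padding": 12}

def fidelity_issues(measured):
    """Return every property whose measured value differs from the spec
    at all. 7 is not 8; there is no 'close enough' in design fidelity."""
    return {
        key: (expected, measured.get(key))
        for key, expected in STYLE_GUIDE.items()
        if measured.get(key) != expected
    }

print(fidelity_issues({"card.margin": 7, "button.padding": 12}))
# flags card.margin: the spec says 8, the build shows 7
```

The exact-match comparison is the design choice worth noticing: a tolerance of plus or minus one pixel would quietly reintroduce the "7 is close enough" mindset the team is trying to train out.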
Which means catching mistakes quicker, at the beginning, and not later down the line where it's costly because it means redeveloping or redesigning?
Absolutely. It also means protecting this investment that you've made in user experience.
Again, if we're able to fully incorporate QA, not just into the team, but into that environment of mutual respect and collaboration, then you get those checks and balances that you need. QA understands the design language. QA understands the intent of building what they're building, so they're equipped to notice when things are going off the rails, just like developers are and designers are. You've got one more set of people who can say, "Hey, this isn't right here," or "I noticed when we did this user validation session, that people struggled with X and Y, so I made sure that in my test plan, those types of things were included. What we've built doesn't seem to me like it fixes that issue."
That's not just fewer defects coming back to you from the field, but happier users and better return. These mutually reinforcing things really lead to this team being more productive and more efficient.
When talking about integrated teams, what's the most important takeaway from all of this?
The most important thing is the culture that you're building. Methodology is great; methodology tells you how to proceed. It gives you some guidance, but you're not going to be able to design a methodology that will make inefficient teams execute perfectly. It's a guide, it's a scaffolding, it's a foundation.
At its best, it's a tool to remind your talented teams of what they should be paying attention to, and that starts with having the right team culture. I feel like we're doing a good job of that. It's hard to do, but you have to start early, and you have to continually reinforce it.
For folks looking to make a transition, you can start small, but you have to really be aware of the messages that you're sending. If you want an empowered team, you have to give them some power.