Unlocking Transparent Roadmap Prioritization: A Case Study in Collaborative Decision-making for Platform Teams

Magalie R
Peaksys Engineering
8 min read · Sep 20, 2023

Target

Anyone managing the activity of Platform Teams who wants concrete elements to prioritize their roadmap and/or explain the choices in it.

Short version

In the Architecture Department, we’ve revamped the prioritization process for our Platform Teams. We launched the Platform Tech-Value Alignment (PTVA) method to increase user satisfaction.

We’ve introduced a homemade “tech ROI” model and engaged external stakeholders through “valuation workshops.” ROI is now integrated into Jira, fostering data-driven prioritization. We’ve set up regular meetings to ensure transparency and enhance collaboration. The result: a prioritization process that is transparent to everyone.

Long version

Introduction to Platform Tech-Value Alignment (PTVA)

This story starts in the Architecture Department.

Our seven teams provide internal products, aka platforms (Kubernetes, Keycloak, Kafka, etc.), to the other departments of our company. So, their clients are all tech profiles (Devs, Devops, etc.).

These teams are called Platform Teams and are led by a PO (Platform Owner) and a PTL (Platform Technical Leader).

Until the end of 2022, teams would work on their roadmap independently: they used common sense to plan their topics on their own and made their own assessment of client expectations.

It worked… but it had its limits:
— How to decide objectively which topic is more important than another?
— How to be sure we respond to our clients’ needs?
— How to explain why some topics are not handled in the short-term?

We decided to address these pain points: make prioritization easier for teams and include our internal clients in our prioritization process. And guess what? It worked!

In the rest of this article, we will detail how we got there. Keep in mind that we used a “pilot team” to test our approach before deploying it to all teams.

Step 1 — Prepare the ground: standardize.

To make sure that what we were about to test would be replicable on a larger scale, we had to standardize some elements.

Our company made the choice to use Jira for daily work and to have an overview of roadmaps and prioritization using the “Jira Structure” and “Advanced Roadmap” plugins.

So, our first aim was to make teams speak the same language when they manipulate Jira objects (same definition, same time granularity, etc.).

Some Jira standards had already been defined in our company but had not yet been adopted by our teams. No need to reinvent the wheel: we decided to use them.

To sum up the standard, in our Jira we have this hierarchy: Theme > Epic > User Story/Task.

A “Theme” is defined as a problem we want to solve to improve a platform, and which has an impact on its clients.

“Theme”, for us, is the right level of object to use on the roadmap.

Do not neglect this part: standardization can be a long process if teams are far from the target. Standardization changes the way that teams work.

Step 2 — Find the right method: our “homemade” ROI is born.

There are so many ways to prioritize a roadmap: RICE, ROI, Eisenhower Matrix, MoSCoW, or WSJF, to name just a few.

We had two main concerns:
— A lot of methods are mostly business-oriented, whereas our topics clearly place technical value first.
— Our Platform Teams and stakeholders were not used to this kind of exercise, at all.

We didn’t want a complex method and we were searching for something we could set up as easily as possible across the company.

Thus, we decided to create our own value model, using a “tech ROI”: the value provided, divided by the effort required.

Regarding value: we chose to use a simple 1000-point scale, with 5 criteria.

To find technical criteria that make sense for our topics and, above all, that can be used by all our Platform Teams, we worked with our pilot team. We broke down existing subjects to identify which indicators could be useful and exclude those that were overly specific to the team.

At the end, we agreed on the following criteria: security, obsolescence, product technical improvement, run reduction, and customer satisfaction.

We then went a little bit further, weighting our criteria, as it was obvious to us that the five criteria are not equally important. We defined the following grid:

Criteria and points grid

Regarding effort, we decided to use the “t-shirt size” system, which teams know well, and for each size we defined a time range.

But we couldn’t stop here: to obtain our ROI, we also had to convert effort to… points.

We followed our agile coach’s advice to use classic scrum methodology, as everyone is familiar with it (even if our teams use Kanban, that doesn’t mean they don’t know scrum).

Thus, we took as a reference: 2 weeks = 1 sprint = 1 point.

We had our formula to convert t-shirt size into points.

Effort grid: time and point equivalence for each t-shirt size

And there we are: we had our formula to obtain our ROI (“value” points / “effort” points).
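To make the mechanics concrete, here is a minimal sketch of the calculation. The criterion weights and t-shirt week ranges below are illustrative placeholders, not our real grids (those are shown in the images above); only the structure matches the method: five criteria sharing a 1000-point value scale, effort converted at 2 weeks = 1 sprint = 1 point, and ROI = value points / effort points.

```python
# Sketch of the "tech ROI": value points divided by effort points.
# NOTE: the weights and week values below are hypothetical examples.

# Value: five criteria sharing a 1000-point scale (illustrative weights).
CRITERIA_MAX = {
    "security": 300,
    "obsolescence": 250,
    "product_improvement": 200,
    "run_reduction": 150,
    "customer_satisfaction": 100,
}  # sums to 1000

# Effort: t-shirt size -> duration in weeks (illustrative values).
TSHIRT_WEEKS = {"S": 2, "M": 6, "L": 12, "XL": 24}

def theme_value(impacts: dict) -> int:
    """impacts maps a criterion to an impact ratio in [0, 1] (stars / max stars)."""
    return round(sum(CRITERIA_MAX[c] * impacts.get(c, 0.0) for c in CRITERIA_MAX))

def effort_points(size: str) -> float:
    """Convert a t-shirt size to points: 2 weeks = 1 sprint = 1 point."""
    return TSHIRT_WEEKS[size] / 2

def tech_roi(impacts: dict, size: str) -> float:
    """Our homemade ROI: value points divided by effort points."""
    return theme_value(impacts) / effort_points(size)
```

With these example numbers, a Theme with maximum security impact and medium obsolescence impact, sized “M”, would score 425 value points over 3 effort points; the higher the resulting ratio, the higher it ranks.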

Step 3 — Create a community

From the start, we were strongly convinced that value should be set collaboratively, with people outside of our department.

So, we supported the teams in defining their stakeholders.

As their end users are all the company’s tech profiles, we needed profiles capable of representing different groups (Devs, Devops, Lead Dev, Technical Referent, etc.) from different entities (Business & Tech).

We did this exercise team by team.

For each team, we defined at least five external stakeholders.

Why five? Because even if two can’t participate, the stakeholders still outnumber the team members.

Keep in mind that the more participants you have, the longer the discussions will be. Nobody wants to have 2-hour workshops… so we set a limit of seven stakeholders per team at most.

Step 4 — Define the process: how we created a “Valuation Workshop”

OK then. We had our standards, our ROI formula and our community. We had to figure out how to collect our values.

So, we decided to organize a “Valuation Workshop” with stakeholders who belong to other departments.

Once the community participants were identified, we had to agree on the format for the Valuation Workshop.

The idea was still to keep it as simple as possible, because it is a new exercise for our teams and their guests.

We used the Miro tool for several reasons: some attendees could attend remotely, teams know the tool, it wouldn’t take too long to prepare the workshop, and it’s easy to use.

What does it look like?

Miro template for the Valuation Workshop

We wanted to hide the “complexity” of our criteria weighting, to focus the discussions on the impact, instead of on points. That’s why you mainly see stars, and not the points inside the stars.

How does the workshop proceed?

The Platform Owner and the Platform Tech Lead pitch a Theme (remember, in our standardization phase, we asked them to fill out a template in Jira): who is the target, what do they want to do, for what benefit, etc.

After the pitch, there is time to answer stakeholder questions.

At this point, the difficulty is to remind attendees that we are valuating “the topic and its benefits”, not a technical solution.

When questions are over, we go criterion by criterion and ask stakeholders, “In your opinion, what is the impact of Theme A on this criterion?”

It’s not a vote: it’s a discussion that should converge towards a level of impact.

Someone in the workshop shares their point of view, then we ask the others if they agree or disagree.

If we are not able to agree, we take the lowest shared level.
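The fallback rule can be stated in one line of code. This sketch assumes impact levels are represented as integers (for example, a number of stars), which is an illustration choice, not part of the method itself:

```python
def agreed_level(proposed_levels: list[int]) -> int:
    """Return the impact level the group settles on.

    If everyone proposes the same level, that is the consensus; otherwise
    we fall back to the lowest proposed level, as described above.
    In both cases this is simply the minimum.
    """
    return min(proposed_levels)
```

For example, `agreed_level([3, 3, 3])` is a consensus at level 3, while `agreed_level([2, 3, 4])` falls back to level 2. Taking the lowest shared level keeps the valuation conservative when opinions diverge.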

From our experience, the key here is the pitch of the Theme.

If the PO/PTL are not able to clearly define why they want to do a certain Theme and what the expected benefits are, the stakeholders won’t be able to arbitrate.

After the workshop:
— The Platform Owner simply has to total the points (visible in the stars) and report this value in Jira.
— The Team then macro-estimates the effort needed to complete the Theme and puts the t-shirt size in Jira.

And that’s it: thanks to a Jira-calculated field, the ROI (value/effort) is displayed directly on the Theme ticket.

Step 5 — Reap the rewards: prioritize and communicate.

Now that teams have their ROI in Jira, there is a new rule to apply: when they prioritize their subjects, ROI becomes the number-one criterion.

Of course, it doesn’t mean that there are no exceptions. But if there are, the PO must justify why a topic with a lower ROI goes before another with a higher ROI.

To make sure we stay aligned between teams and hierarchy, we have set up a 45-minute meeting every two months, called a “Product Committee”. The “Valuation Workshops” also take place at least every two months, before this meeting, so that the PO can adjust their roadmap if necessary.

In the meeting, the PO and PTL present their roadmap and prioritization, among other things.

So far, with our new transparent valuation process, prioritization has rarely been challenged by management. There is no need: the Platform Owners have incorporated ROI into their prioritized roadmaps in a relevant and intelligent way.

Once the prioritization is approved, to give visibility to our stakeholders as well as to everyone interested in our teams’ activities, we decided to launch a larger informative meeting, which takes place every quarter.

Teams present their roadmap to all Peaksys employees: 10 minutes to present what has been done in the previous quarter, and the new priorities to come.

Quarterly Roadmap Meeting: each team shares its roadmap.

All our committee materials (Product Committee and Quarterly Roadmap) are accessible to everyone on Confluence.

Final Step — Congratulate, observe, improve.

After all the above steps are completed, you can first congratulate your teams: they will certainly have done a great job, and for some of them it will have meant profound changes in the way they work.

Then, you can start collecting feedback.

Our teams are satisfied that they have a better perspective on their own topics, and especially that they have additional factual elements to help them prioritize in an objective way.

We did a post-mortem on our “Valuation Workshop” and the feedback sometimes exceeded our expectations. Our stakeholders were happy to have visibility on incoming topics, and both our teams and our guests were enthusiastic about having this space and time to discuss matters together, and to see that they could always reach a compromise on the value of our topics.

We also recently did a poll on our “Quarterly Roadmap Meeting” and feedback was good as well: attendees were pleased to have visibility and transparency on what happened and what is coming for them from our teams.

To conclude, what we have put in place in the teams undeniably brings value to the way we work. Nevertheless, we know that we must keep observing ourselves and making the necessary adjustments so that teams get the maximum benefit.
