Show me the money: six common pricing models for a tech solution

Triggerise
Published in Frontier Tech Hub
10 min read · Aug 12, 2022

Our collaborative exploration with Networking HIV & AIDS Community of Southern Africa (NACOSA) and other partners is well underway. As part of the reflective process that runs alongside the sprints, we're writing regular blog posts to share our learnings as we go.

Here’s a bit more background on the pilot we’re working on that examines the possibility of digitally verifying results for social and development impact bonds (SIBs and DIBs respectively):

Our hypothesis is that we can demonstrate the effectiveness and trustworthiness of using technology platforms like ours to verify outcomes directly and more efficiently. This is important because, if successful, adopting a tech-driven approach to verify outcomes could drastically cut the time and cost currently involved without compromising the accuracy and quality of the results. This would allow teams to spend more resources where it really matters: delivering impact. Our Tiko platform features real-time data monitoring and project dashboards that give a detailed, up-to-date overview of a project's status, enabling more empowered and efficient decision-making within each programme.

Excitingly, we are testing our hypothesis within the environment of a real SIB. The Imagine SIB, implemented by NACOSA, is running an HIV prevention and treatment programme for adolescent girls and young women in South Africa. At Triggerise in Kenya, we used a digital platform called Tiko to administer a similar model of service delivery. We’ll be applying Tiko technology in this new context.

Constraints to designing a bond
Even if all stakeholders in the SIB/DIB agree that digital verification of outcomes and outputs can reduce overall costs and time while increasing transparency, there are other constraints and incentives to take into account, any of which could prevent the successful implementation of digital verification. A brief outline of these constraints follows:

  1. Budget. It is critical for the cost of the digital verification solution to be embedded in budgets early on. Within this pilot, NACOSA has had to find funds to pay for the integration of the digital verification solution after the design and budget approvals, which has been a challenge and required reshuffling the budget for planned activities. The core principles of our budgeting are simple, predictable, and transparent pricing; we expand on these factors in the second half of this blog post through an exploration of common pricing models for technology solutions.
  2. Time. The digital verification solution must be aligned with the timelines of a SIB/DIB. By incorporating the digital verification solution in the design of the bond, the solution can be developed in time for the bond’s implementation.
  3. Effectiveness. The solution must add value to all parties involved in the bond — even if they are not users of the system — and, most importantly, not create resistance if one party feels that they are losing out. Throughout this pilot, we have explored the value drivers for each party and will share more about this in our next blog post.

Review of the second sprint
Bearing all of these factors in mind, our second sprint was largely focused on evaluating the technical feasibility of integrating the verification solution with the Imagine SIB's current data collection and monitoring and evaluation (M&E) tool. In addition, we had discussions with other impact bond implementers to explore whether there is interest in a standalone digital verification system that can send data back to a database but is not integrated with a data collection tool or system.

Further learnings from this sprint

  • It's possible to integrate Triggerise's current digital verification solution into the Imagine SIB's M&E system. However, that does come with a one-off technology integration cost.
  • The implementers (of the other DIB we explored) showed interest in the technology, which was encouraging. These are two points the team was particularly interested in:
  1. Digitising field-level data through a simple smartphone app (the DIB is largely going to be implemented with a paper-based data collection system in the field, so their ears pricked up at the opportunity to take all of that online instead)
  2. Managing a digital voucher system with dynamic subsidies (based on user parameters, demographics, or type of user) and issuing micro-rewards; a rough sketch of how such a subsidy might work follows below
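
Purely to illustrate what "dynamic subsidies" could look like in practice, here is a minimal Python sketch of a subsidy calculation driven by user parameters. The field names, rates, and rules are hypothetical assumptions for illustration, not the actual Tiko voucher logic.

```python
# Hypothetical sketch: a dynamic voucher subsidy based on user parameters.
# Field names, tiers, and amounts are illustrative assumptions only.

def voucher_subsidy(user: dict, base_price: float) -> float:
    """Return the subsidised price a user pays for a service voucher."""
    subsidy_rate = 0.50  # assumed default: half the base price is subsidised

    # Example demographic adjustments (assumed, for illustration only)
    if user.get("age", 99) < 20:
        subsidy_rate += 0.25        # deeper subsidy for adolescents
    if user.get("first_visit", False):
        subsidy_rate += 0.10        # micro-reward for first-time users

    subsidy_rate = min(subsidy_rate, 1.0)  # never exceed a full subsidy
    return round(base_price * (1 - subsidy_rate), 2)

print(voucher_subsidy({"age": 17, "first_visit": True}, base_price=10.0))  # 1.5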

As in the first sprint, we provided a budget (covering the three-year duration of the DIB) for the project. Ultimately, the proposal didn't come to fruition because a) the budgets for the programme were largely set already, with little wiggle room due to projected rates of inflation, and b) the DIB is due to start soon and there are very tight timelines for implementation of, and training on, a new system.

In this hypothetical exploration, we were hampered by timing and financial uncertainty. And in our actual collaboration with NACOSA and partners, a crucial learning has been how important it is to budget for technology solutions early in the contracting phase of new impact bonds. Ideally, we would be involved as early as possible, before contracting discussions begin. When a bond is being designed and outcomes decided upon, it is useful for all parties to examine the tech perspective, as the technology powering the digital verification can strongly influence the type of outcomes being collected. Getting into the process early affects both the cost and the potential outcomes.

In Sprint One, for example, we learned that we needed to build an integrated system (blending components of the Tiko platform into an existing M&E system for the Imagine SIB). Of course, this integration is coming at a cost that neither party anticipated, and we are now exploring avenues to pay for this technology work. This is proving to be difficult as budgets are mostly already set.

Verifying outcomes is central to achieving impact efficiently, which is why it's crucial to allocate funds properly. Knowing how to budget matters, and every cent can contribute to a desired outcome. In parallel to finding ways to pay for the tech integration with NACOSA, we spent time in Sprint Two exploring another impact bond to test the digital verification system with. Here, the learning is the same: without the technology solution's cost embedded in the budget upfront, it is difficult for donors and/or implementers to find the funds to cover it.

Simple. Predictable. Transparent: three criteria for pricing a digital verification solution
We have noticed that there is a real need for donors and funders to have simple, transparent, and predictable (we’ll expand on these terms below) costs so they can appropriately budget for, say, a digital verification solution like the one we are piloting. That way, enough resources can be assigned to effectively fund and maintain it over a multi-year impact bond. Let’s explore the three key elements in more detail:

A pricing model must be simple, predictable, and transparent for the payer:

  • Simple. We believe that a simpler pricing model will be easier for the payer to “sell” to the other players of the impact bond and/or the donor. A complicated pricing model (with multiple pricing variables like users, data storage capacity, or features) requires additional analysis to map out each potential scenario for the future
  • Predictable. Not all surprises are fun. The predictability of software and related maintenance costs is going to be a critical factor for the payer who will almost always be bound by a pre-set budget
  • Transparent. Accessibility of data within the chosen model will be key in building trust in a solution. In the case that a pricing model is usage-based, it is important that the usage metrics are trackable by the solution, available in real-time, and supported by details that can be made available to the payer

Six common pricing models: their merits and limitations
In light of the need for simple, predictable, and transparent costing for technology solutions, we have spent some time in this sprint reflecting on pricing models. We’ll outline six common ones and explore their pros and cons. It’s also possible to combine a number of pricing models to tailor one. For example, one could choose a flat fee for the use of the technology with variable costs pulled out so they can be properly budgeted based on a specific project’s requirements — stay with us, as we get into all the nitty gritty below. Our outline is based on work by Gartner, an information technology (IT) research and consultancy company.

1. Charging a flat rate

Charging a flat rate is a simple solution that saves time by being easy to buy and quick to understand. However, most impact bonds are long-term commitments and a flat rate solution can be difficult to upsell as project requirements and needs change. We don’t need quick and easy solutions, but nuanced and innovative ones.

2. Charging per registered or active user

A per-user pricing model can be based on "registered users" or "active/concurrent users", and can also cover other user types, such as staff members delivering a service (as opposed to users who receive a service). The advantage of this model is that we could offer feature plans at different prices (some users may need a wider set of functionality on the digital verification system than others). Furthermore, it is simple to understand and scalable as impact bonds grow. However, charging per user can discourage expanding usage of a given solution, which would be counterintuitive for an impact bond trying to reach or exceed its targets. Additionally, keeping track of users may become complicated if more than one tech system is integrated.
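
To make the arithmetic concrete, here is a minimal sketch of what per-user pricing with feature plans could look like. The plan names and per-user prices are assumptions for illustration, not Triggerise's actual pricing.

```python
# Hypothetical sketch of per-user pricing with feature plans.
# Plan names and prices are illustrative assumptions only.

MONTHLY_PRICE_PER_ACTIVE_USER = {
    "basic": 0.50,     # verification only
    "standard": 0.80,  # verification + dashboards
    "full": 1.20,      # verification + dashboards + M&E integration
}

def monthly_per_user_cost(active_users: int, plan: str) -> float:
    """Cost for one month, charged per active (not registered) user."""
    return active_users * MONTHLY_PRICE_PER_ACTIVE_USER[plan]

# Example: 12,000 active users on the assumed "standard" plan
print(monthly_per_user_cost(12_000, "standard"))  # 9600.0
```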

3. Charging on a consumption or usage basis

Also known as a variable or pay-as-you-go model, this approach charges the payer only when they use the digital verification solution, based on a usage parameter (for example, data volumes, hours, or API requests).

Through this approach, the solution could scale up or down easily and provide a low barrier to entry for users; if it’s cheap to get started, it’s a great way to test something. However, the total costs of the solution using this model would be difficult to predict, making it hard to provide an accurate budget upfront and creating a higher risk for the payer of under- or over-budgeting. If this model is used, budget floors (to protect the technology provider) and ceilings (to protect the payer) could help mitigate this risk.
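
As a rough sketch of how a budget floor and ceiling could be layered onto a pay-as-you-go charge, consider the following. The usage metric, unit price, floor, and ceiling values are all assumptions for illustration.

```python
# Hypothetical sketch of a pay-as-you-go charge with a budget floor
# and ceiling. Metric, unit price, floor, and ceiling are assumptions.

def monthly_usage_charge(verified_outcomes: int,
                         price_per_outcome: float = 0.40,
                         floor: float = 2_000.0,    # protects the tech provider
                         ceiling: float = 15_000.0  # protects the payer
                         ) -> float:
    raw_charge = verified_outcomes * price_per_outcome
    return min(max(raw_charge, floor), ceiling)

print(monthly_usage_charge(1_000))   # 2000.0  -> floor applies
print(monthly_usage_charge(20_000))  # 8000.0  -> pure usage-based charge
print(monthly_usage_charge(60_000))  # 15000.0 -> ceiling applies
```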

4. Charging by plan or package

This popular pricing model would offer packages or sets of features for a specific price, usually based on the user persona. Typically, three to five plan options are offered. This is a simple and transparent option for payers and donors, and with this pricing model, the digital verification solution can appeal to prospective payers with different budgets. Furthermore, tiered pricing packages are a good way to show increasing value at different price points.

On the flipside, plans can be confusing to the payer if they are overly complex. Confused payers may not understand the value of the plan and may fall into the trap, so to speak, of switching to a competitor purely on cost. Additionally, annual pricing increases may not align with value that’s visible to decision-makers, and this lack of resonance can lead to doubt in the solution and tech provider. This is where the need for transparency is key.
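
For illustration, here is a minimal sketch of plan-or-package pricing with three tiers. The tier names, feature sets, and prices are hypothetical and only meant to show the shape of the model.

```python
# Hypothetical sketch of plan/package pricing with three tiers.
# Tier names, features, and prices are illustrative assumptions only.

ANNUAL_PLANS = {
    "verify":    {"price": 20_000, "features": ["digital verification"]},
    "monitor":   {"price": 35_000, "features": ["digital verification",
                                                "real-time dashboards"]},
    "integrate": {"price": 55_000, "features": ["digital verification",
                                                "real-time dashboards",
                                                "M&E system integration"]},
}

def plan_for_budget(annual_budget: float) -> str:
    """Pick the richest plan that still fits the payer's annual budget."""
    affordable = [name for name, plan in ANNUAL_PLANS.items()
                  if plan["price"] <= annual_budget]
    if not affordable:
        raise ValueError("No plan fits this budget")
    return max(affordable, key=lambda name: ANNUAL_PLANS[name]["price"])

print(plan_for_budget(40_000))  # "monitor"
```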

5. Charging by outcome

Through this model, a donor is charged a percentage of some financial outcome of the product, such as the number of impact behaviours achieved or other factors related to the value of the solution. Advantageously, it aligns the technology provider's revenue with the positive results driven by the solution, motivating everyone involved to hit targets with specificity and efficiency. Some digital verification solutions are geared towards simply verifying (as opposed to also driving better outcomes), so this model would be better suited to solutions that also drive outcomes.

On the other hand, it can be difficult to justify how the fee aligns with the outcome, and this model can put everyone involved at the mercy of project delays. Once implemented, longer-term friction can arise because it is difficult to predict costs over the duration of the grant; they can fluctuate month on month. A solution here would be to have upfront capital made available from a third party to cover the setup and running costs (a model similar to that used by service providers on impact bonds).
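
Here is a minimal sketch of the arithmetic behind outcome-based charging, assuming a hypothetical fee rate and outcome payment value (neither reflects a real bond's figures).

```python
# Hypothetical sketch of outcome-based pricing: the technology fee is a
# share of the outcome payments triggered by verified results.
# Fee rate and payment values are illustrative assumptions only.

def outcome_based_fee(verified_outcomes: int,
                      payment_per_outcome: float,
                      tech_fee_rate: float = 0.03) -> float:
    """Technology provider's fee as a share of outcome payments."""
    outcome_payments = verified_outcomes * payment_per_outcome
    return outcome_payments * tech_fee_rate

# Example: 5,000 verified outcomes paid at $50 each, 3% technology fee
print(outcome_based_fee(5_000, payment_per_outcome=50.0))  # 7500.0
```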

6. Charging on a donor’s parameter basis

Here, the pricing model is set based on a donor's parameters, such as number of employees, grant size, or number of beneficiaries. Payment bands can be flexible; for example, with higher usage, the marginal price per user could drop. This works best as an option integrated with a pricing plan or package. This model is often used to scale pricing easily and aligns with the donor's ability to fund a solution. However, a single donor parameter doesn't always align with the value the donor receives, which can cause resistance to inevitable price increases.
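
To show how flexible payment bands can make the marginal price drop at higher volumes, here is a minimal sketch based on one hypothetical donor parameter (number of beneficiaries). The band boundaries and prices are assumptions for illustration.

```python
# Hypothetical sketch of banded pricing on a donor parameter (number of
# beneficiaries), where the marginal price drops in higher bands.
# Band boundaries and unit prices are illustrative assumptions only.

BANDS = [
    (10_000, 1.00),        # first 10,000 beneficiaries at $1.00 each
    (40_000, 0.60),        # next 40,000 at $0.60 each
    (float("inf"), 0.30),  # everything above 50,000 at $0.30 each
]

def banded_price(beneficiaries: int) -> float:
    total, remaining = 0.0, beneficiaries
    for band_size, unit_price in BANDS:
        in_band = min(remaining, band_size)
        total += in_band * unit_price
        remaining -= in_band
        if remaining == 0:
            break
    return total

print(banded_price(60_000))  # 10000*1.00 + 40000*0.60 + 10000*0.30 = 37000.0
```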

Our main learning from the second sprint is that the pricing structure for the solution needs to be simple, predictable, and transparent so it can be embedded in multi-year budgets. Pricing is complex, and the pricing model for the digital verification solution will require thoughtful analysis and testing as we build our business case. Given the current budgetary constraints and tight timelines for piloting the solution on an impact bond such as the Imagine SIB, in our next sprint we will collaborate with NACOSA as thought partners rather than using the project as a testing ground for the solution. We'll focus on outlining the features of the solution and touch on examples of the types of outcomes it will measure. By remaining open, collaborative, and connected, we'll navigate our way through this complex and engaging challenge of tailoring the right financial model.
