μ Architecture offers Next Best Offer

Daniel Buchta
5 min read · Sep 16, 2022


Source: https://en.wiktionary.org/wiki/%CE%BC

This article presents a real implementation of μ Architecture in a Next Best Offer use case.

Next best offer (also known as next best action) is a form of predictive analytics that helps marketers and their organizations better judge customer spending habits and guide marketing efforts toward connecting with customers to close a deal.

We want to bring value to the customer, engage them, and make a deal with us. So we need the right offer at the right time.

Offer value is time dependent. Its highest value is in the Primary Interest Period. In the Secondary Interest Period the offer value is lower. After that, the offer has no relevant value for the customer. Moreover, there is a Point of Negative Impact, where the offer starts to generate negative value for the customer.

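The value curve described above can be sketched as a simple piecewise function. This is only an illustration: the period boundaries and values below are made-up numbers, not taken from the article.

```python
def offer_value(days_since_trigger, primary_end=7, secondary_end=30, negative_from=90):
    """Illustrative offer-value curve over time (days since the triggering
    event). The thresholds are hypothetical, not prescribed by the article."""
    if days_since_trigger <= primary_end:
        return 1.0   # Primary Interest Period: full offer value
    if days_since_trigger <= secondary_end:
        return 0.5   # Secondary Interest Period: lower offer value
    if days_since_trigger < negative_from:
        return 0.0   # offer no longer has relevant value
    return -1.0      # Point of Negative Impact: negative value for the customer
```

The exact shape of the curve will differ per product and customer; the point is that reacting late moves you down, and eventually below, this curve.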

That’s where data comes into play. The data has to be both current and relevant. Moreover, we want it to be personalized.

Current data means the data has to be event-driven.

Relevant data means the data has to be polyglot.

Personalized data means the data has to be decentralized.

That’s the complete opposite of standard ETL/batch data processing, where we want to get all of the data into one place to run a complex analytics process on it.

And, in fact, standard ETL/batch data processing fails here on these terms:

  • Reaction time is too long, often going not only beyond the Interest Periods but falling behind the Point of Negative Impact.
  • Complexity hurts time-to-market: feature delivery grows extensively, hand in hand with growing engineering costs.
  • Company culture concentrates on How can we do it? and When will it be done? instead of What can we do?, leaving the customer on the periphery of interest.

Let’s create the Next Best Offer

To create the Next Best Offer we need data. The data is polyglot in different ways:

  • character: some of the data covers characteristics, some covers dynamics, some covers relations
  • speed, i.e. real-time, batch, API
  • environment, i.e. DWH, OLTP, 3rd parties

Offer Pool is a standard DWH batch output. Customer Profile is a CRM connector. Interaction Events are a continuous track of the Customer Journey.

When creating the Next Best Offer, you need:

  • a pool of offers, typically prepared by the DWH through a batch process
  • a pool of customers, typically served by the CRM in an OLTP manner
  • events that cover the customer journey; they come in a continuous flow
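The three inputs above can be sketched as plain data structures. The field names here are illustrative assumptions, not the article's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Offer:
    """One entry in the offer pool, produced by the DWH batch process."""
    offer_id: str
    product: str
    valid_from: datetime
    valid_to: datetime

@dataclass
class Customer:
    """One customer profile, served by the CRM in an OLTP manner."""
    customer_id: str
    segment: str

@dataclass
class InteractionEvent:
    """One customer-journey event from the continuous flow."""
    customer_id: str
    event_type: str   # e.g. "page_view", "call", "purchase"
    timestamp: datetime
```

The key difference between the three is not their shape but their delivery: offers arrive in batches, customer profiles via request/response, and interaction events as a never-ending stream.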

Customer journey events are in fact points in time that trigger action. When we want to serve the customer the Next Best Offer, we have to know the current customer state. Otherwise, when we try to create the Next Best Offer from a previous customer state, we will frustrate the customer with a previous best offer. That’s when we reach the Point of Negative Impact.

That’s why a continuous flow of events is an inevitable prerequisite of a successful Next Best Offer solution.

Once we have the data sources defined, we can create value for the customer.

The first step towards creating a valuable Next Best Offer is to create a pool of valid personalized offers.


The most interesting part of the diagram above is those tiny black arrows. In fact, their smoothness will determine whether your application will be valuable. If you don’t pay primary attention to the borders between the data sources and the app itself, your product will soon become complex, your feature delivery pipeline will grow longer and longer, and your engineering costs will keep growing.

Zhamak Dehghani covers this theme with the term orthogonality in her book Data Mesh.

μ Architecture addresses this with the pattern depicted as those hexagonal structures: a recursively repeating pattern of self-similar data micro-processors, each consisting of the triplet source connect, processor, and sink connect, which we will call data μ-products.
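The source connect / processor / sink connect triplet can be sketched in a few lines. This is a minimal, hypothetical illustration of the shape of a data μ-product, not the article's actual implementation.

```python
from typing import Callable, Iterable

def mu_product(source: Callable[[], Iterable],
               processor: Callable[[object], object],
               sink: Callable[[object], None]) -> None:
    """Run one data μ-product: read records from the source connect,
    transform each one in the processor, and emit it to the sink connect."""
    for record in source():
        sink(processor(record))
```

Because the sink of one μ-product can serve as the source of the next, the pattern repeats recursively, which is what makes the μ-products self-similar.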

We will demonstrate this with the Valid Offers μ-product.

We will also need Validations and Eligibilities to create it.

Valid Offers μ-product.

When using a streaming technology like Apache Kafka, we can define communication boundaries by topics. They are performant, resilient, and scalable.

The Valid Offers μ-product then consists of sources represented by the topics offers, customers, and events; business logic delivered through the topics validation_rules and eligibilities; and a serving side represented by the valid_offers topic.

Valid Offers μ-product: sources, business logic, and the serving side as Kafka topics.

The KStream library (Kafka Streams) serves here as the place for business logic. Divided into smaller steps, it is able to absorb the whole Next Best Offer business logic.

This diagram illustrates the previous steps.
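In production this business logic would live in a Kafka Streams topology in Java; the following is a hypothetical Python sketch of the per-record logic that joins the sources with the validation rules and eligibilities. All names are illustrative assumptions.

```python
def valid_offers(offers, customers, validation_rules, is_eligible):
    """Yield (customer_id, offer_id) pairs for offers that pass every
    validation rule and for which the customer is eligible. This mirrors
    joining the offers, customers, validation_rules and eligibilities
    topics into the valid_offers topic."""
    for customer in customers:
        for offer in offers:
            if all(rule(offer) for rule in validation_rules) \
                    and is_eligible(customer, offer):
                yield customer["id"], offer["id"]
```

In the streaming version, each new record on any input topic re-triggers this logic, so valid_offers is continuously kept up to date rather than recomputed in batches.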

The same pattern is then used to create the final best offers by filtering them out of the valid offers.

Next Best Offer as a μ-product at the end of continuous data flow.
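The final filtering step can be sketched as picking the top-scoring valid offer per customer. The scoring function here is an assumption for illustration; the article does not prescribe one.

```python
def next_best_offer(valid_offers_for_customer, score):
    """Filter the single best offer out of a customer's valid offers,
    using an illustrative scoring function. Returns None when the
    customer currently has no valid offers."""
    return max(valid_offers_for_customer, key=score, default=None)
```

As a μ-product, this sits at the end of the flow: its source is the valid_offers topic and its sink serves the marketing channels.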

Realized this way, this kind of architecture continuously produces the Next Best Offer personalized for each customer.

Note that no database is used. When you realize the application as a self-similar pattern of producers, processors, and consumers, you see that its active part is not retrieving results via queries but the continuous data flow, which makes the application react in step with the customer journey.

The whole application then serves various marketing channels, making them part of the Next Best Offer platform by paying primary attention to the smoothness of the data flow that, in fact, brings value to our customers.

Next Best Offer platform including Marketing Channels.

The true crème de la crème magic happens when the marketing channels’ customer feedback is propagated back into the platform via Interaction Events.

That’s where I use the term Continuous Customer Experience:)

Continuous Customer Experience:)
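The feedback loop can be sketched as appending channel reactions back onto the interaction-event stream, where the platform treats them like any other customer-journey event. The event shape below is an illustrative assumption.

```python
def backpropagate_feedback(interaction_events, channel_feedback):
    """Feed marketing-channel reactions (clicks, rejections, purchases)
    back into the interaction-event stream, closing the Continuous
    Customer Experience loop."""
    for fb in channel_feedback:
        interaction_events.append({
            "customer_id": fb["customer_id"],
            "event_type": "offer_" + fb["reaction"],  # e.g. offer_accepted
            "source": "marketing_channel",
        })
    return interaction_events
```

In the Kafka realization, the channels would simply produce these records onto the events topic, so the next round of valid offers already reflects how the customer reacted to the previous one.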

This type of architecture can be enhanced with machine learning implemented directly into the flow within μ-products. Check my previous article μ Architecture goes machine learning for more information.

Thanks for reading:)

Check out also previous articles about μ Architecture:

μ Architecture

μ Architecture goes machine learning



Daniel Buchta

Architect | Data & AI/ML Enthusiast🚀 | Quantum Computing Pioneer🛸 | Chaos Theory PhD.