μ Architecture goes machine learning

Daniel Buchta
4 min read · Sep 9, 2022


Source: https://en.wiktionary.org/wiki/%CE%BC

This article is a continuation of the introduction to μ Architecture. This time we will discuss how machine learning can be organically embedded into this kind of data architecture.

First, let’s repeat the main principle of μ Architecture:

Queries are passive and data are active.

Let’s also repeat the definition of μ Architecture.

μ Architecture is a recursively repeating pattern of self-similar data micro-processors, each consisting of a triplet of source connect, processor, and sink connect, which we will call data μ-products. These μ-products then form a mesh. This pattern allows us to spread the complexity of processing across the mesh through distribution and scalability.
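The triplet can be sketched in a few lines of plain Python. This is an illustrative stand-in only: the `MicroProduct` class and its field names are my own invention, not part of any concrete framework.

```python
from typing import Callable, Iterable, List

class MicroProduct:
    """A μ-product: source connect -> processor -> sink connect."""

    def __init__(self,
                 source: Callable[[], Iterable[dict]],
                 processor: Callable[[dict], dict],
                 sink: Callable[[dict], None]):
        self.source = source        # where events come from
        self.processor = processor  # the μ transformation
        self.sink = sink            # where results go

    def run(self) -> None:
        for event in self.source():
            self.sink(self.processor(event))

# Wire up one μ-product: read two events, enrich them, collect the output.
out: List[dict] = []
mp = MicroProduct(
    source=lambda: [{"user": "u1"}, {"user": "u2"}],
    processor=lambda e: {**e, "processed": True},
    sink=out.append,
)
mp.run()
```

A mesh then emerges by letting one μ-product's sink feed another μ-product's source.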

μ product/μ neuron

Zhamak Dehghani presents the data product as the architectural quantum. I will not use this term, as quantum will better fit the next generation of the architecture, which I call ω Architecture.

The Data Mesh principles, and μ Architecture as their data architecture interpretation, are based on the idea of a neural network. The building block therefore looks more like a neuron.

That’s why I’ll use the term μ neuron for the basic building block of μ Architecture.

The basic building block of μ Architecture is the μ product/μ neuron. Source: https://martinfowler.com/articles/data-mesh-principles.html

μ Architecture as a μ Neural Network consisting of μ neurons

One of the main advantages, and challenges, that μ Architecture offers is to represent what is today a plain from-to pipeline as a topological structure that resembles a neural network.

Three μ products/μ neurons connected in a linear way

Looking at the linear layout depicted in the diagram above, it seems that nothing really interesting happens. In fact, the complexity challenge may be the first impression that arises in one’s mind :)

Let’s look at a scenario where both the users μ neuron and the pageviews μ neuron supply the users pageviews μ neuron.

Scenario where both the users μ neuron and the pageviews μ neuron supply the users pageviews μ neuron.

Here it starts to make sense. The users pageviews μ neuron serves as an endpoint for some services, or sends information on to other μ neurons.
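This fan-in can be sketched as a simple join. The in-memory lists below are hypothetical stand-ins for the two upstream μ neurons, and the hash join stands in for whatever join the processor actually performs.

```python
# Toy data standing in for the users and pageviews μ neurons.
users = [{"user_id": 1, "name": "Alice"}, {"user_id": 2, "name": "Bob"}]
pageviews = [
    {"user_id": 1, "page": "/home"},
    {"user_id": 2, "page": "/docs"},
    {"user_id": 1, "page": "/pricing"},
]

def users_pageviews(users, pageviews):
    """The users pageviews μ neuron: join the two inputs on user_id."""
    by_id = {u["user_id"]: u["name"] for u in users}  # build lookup side
    for pv in pageviews:                              # stream side
        yield {"name": by_id.get(pv["user_id"]), "page": pv["page"]}

joined = list(users_pageviews(users, pageviews))
```

In a streaming deployment the lookup side would be a changelog-backed table rather than a static dict, but the shape of the computation is the same.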

In fact, as with any kind of architecture, including LEGO :), it is only as reliable as its basic building block.

From the perspective of technology, event data streaming seems to be the best fit for this kind of architecture. This is due to its throughput performance and the ability to store/transform information along the stream(s).

I especially like the Apache Kafka ecosystem (but do not hesitate to choose one by your own decision :), where we can find:

  • Connectors, both sink and source, and topics with storage to interconnect neurons
  • Streams API and KSQL to do μ transformations inside μ neuron.
Source: https://itnext.io/is-kafka-a-message-queue-or-a-stream-processing-platform-7decc3cf1cf
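The Streams-style μ transformation inside a μ neuron can be sketched as a chain of map/filter steps. The tiny `Stream` class below is a hypothetical Python stand-in for the (Java-based) Kafka Streams DSL, and the field names are invented for the example.

```python
class Stream:
    """Minimal stand-in for a Kafka Streams style topology builder."""

    def __init__(self, events):
        self.events = list(events)

    def filter(self, pred):
        return Stream(e for e in self.events if pred(e))

    def map(self, fn):
        return Stream(fn(e) for e in self.events)

    def to(self, sink):
        sink.extend(self.events)  # stand-in for writing to an output topic

topic_in = [{"page": "/home", "ms": 120}, {"page": "/docs", "ms": 950}]
topic_out = []

(Stream(topic_in)
    .filter(lambda e: e["ms"] > 500)     # keep only slow pageviews
    .map(lambda e: {**e, "slow": True})  # the μ transformation
    .to(topic_out))
```

The same transformation could equally be expressed as a KSQL `CREATE STREAM ... AS SELECT` over the input topic.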

Having one core technology to materialize μ Architecture seems like a benefit worth considering.

μ Architecture goes machine learning

And what about embedding machine learning into our neural network?

μ neuron in the centre uses ML.

Every μ neuron can operate with, or within, an ML model. This is because we can repeat the self-similar pattern within the μ neuron itself.
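A hedged sketch of this self-similar idea: a μ neuron whose inner processor is itself a tiny source-model-sink pipeline. The "model" here is a stub threshold rule standing in for any real ML model; the field names are assumptions.

```python
def model_predict(event):
    # Hypothetical model: flag sessions longer than 300 seconds as engaged.
    return {**event, "engaged": event["session_s"] > 300}

def ml_neuron(in_events, out_events):
    """Event comes in, is scored by the inner model, event goes out."""
    for event in in_events:            # inner source connect
        scored = model_predict(event)  # inner processor: the ML model
        out_events.append(scored)      # inner sink connect

out = []
ml_neuron([{"session_s": 120}, {"session_s": 640}], out)
```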

Event comes, event goes.

We want to build the flow depicted in the diagram above.

Let’s start this way. We create the Flow and derive the Feature from it by:

  • sending the data into another topic
  • enriching them with other data/metadata
Feature data flow derived from basic Flow.
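Those two steps can be sketched as follows. The Flow events, the metadata table, and all field names are illustrative assumptions; in a real deployment the enriched events would land in a separate feature topic.

```python
# Basic Flow: raw pageview events.
flow_topic = [
    {"user_id": 1, "page": "/home"},
    {"user_id": 2, "page": "/docs"},
]

# Metadata used for enrichment (a stand-in for a lookup table or changelog).
user_metadata = {1: {"country": "SK"}, 2: {"country": "DE"}}

# Derive the Feature: copy each event into another "topic" and enrich it.
feature_topic = [
    {**event, **user_metadata.get(event["user_id"], {})}
    for event in flow_topic
]
```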

Now we can add the ML part of the μ neuron. The Model + Model Update Topic continuously supplies the ML technology both with data to update the model and with the model to serve for output. The Serving Layer then serves data back to the Flow.

The Model + Model Update Topic continuously supplies the ML technology both with data to update the model and with the model to serve for output. The Serving Layer then serves data back to the Flow.
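A minimal sketch of this update-and-serve loop, assuming a toy online model: the same stream of events both updates the model and is scored by it, and the serving layer emits predictions back to the Flow. The running-mean "model" is a deliberate simplification standing in for real ML technology.

```python
class OnlineMeanModel:
    """Toy online model: predicts whether a value is above the running mean."""

    def __init__(self):
        self.n, self.mean = 0, 0.0

    def update(self, x):
        # Fed from the model-update topic: incremental mean update.
        self.n += 1
        self.mean += (x - self.mean) / self.n

    def serve(self, x):
        # Serving layer: score the event and emit it back to the Flow.
        return {"value": x, "above_mean": x > self.mean}

model = OnlineMeanModel()
flow_out = []
for x in [10, 20, 30]:
    model.update(x)                  # continuous model update
    flow_out.append(model.serve(x))  # prediction served back to the Flow
```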

This kind of architecture is challenging: it combines both flow complexity and performance requirements. The best fit for it is a cloud environment. I also recommend considering techniques such as chaos engineering, and not only because I am a chaos theory PhD :). Configuration, notification, logging, and automation are de facto prerequisites.

μ Architecture as a μ Neural Network-like topology leveraging Machine Learning.

In the end, when using Reinforcement Learning/Q-learning, this seems to me to be la crème de la crème of modern data architecture. Continuity and the ability to provide constant feedback make it what I call the Infinite Data Experience.
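To make the Q-learning ingredient concrete, here is a minimal tabular sketch: the architecture's constant feedback plays the role of the reward signal. The two-state toy environment and all constants are assumptions for illustration; a real deployment would read states and rewards from the event streams.

```python
# Q-table over two states and two actions, all initialized to zero.
q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}
alpha, gamma = 0.5, 0.9  # learning rate and discount factor

def step(state, action):
    # Toy environment: action 1 yields reward 1 and moves to state 1.
    return (1, 1.0) if action == 1 else (0, 0.0)

# Sweep all state-action pairs repeatedly (exhaustive exploration),
# applying the standard Q-learning update each time.
for _ in range(100):
    for s in (0, 1):
        for a in (0, 1):
            nxt, reward = step(s, a)
            best_next = max(q[(nxt, 0)], q[(nxt, 1)])
            q[(s, a)] += alpha * (reward + gamma * best_next - q[(s, a)])
```

After convergence the table prefers action 1 everywhere, which is the learned feedback loop in miniature.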

Infinite Data Experience, ML empowered.

I also want to turn your attention to the work of other great people around modern data architecture.

First of all, big thanks to Zhamak Dehghani, who created Data Mesh as a decentralized sociotechnical approach to sharing, accessing, and managing analytical data in complex and large-scale environments.

I also found great inspiration in the article Data mesh and monoliths integration patterns by Ugo Ciraci.

Thanks for reading:)


Daniel Buchta

Architect | Data & AI/ML Enthusiast🚀 | Quantum Computing Pioneer🛸 | Chaos Theory PhD.