MuleSoft Catalyst
Centre for Enablement Playbook: Establishing the Foundation
Identify Assets to Be Created and Reused Across Initial Projects
In this article I look, from a different perspective, at identifying the assets to be used across the initial projects of a Centre for Enablement (C4E) that is emerging from its protostar stage into a fully formed C4E capable of delivering significant benefit to the organisation in which it is born. As I have said previously, these assets must be foundational and reusable, but there remain a number of different facets we can use to decide where our initial efforts should be expended.
The facet we shall explore in this article is integration best practice. It is still important to remember that foundational assets can include documentation such as standards, policies and patterns, but here I will look at particular aspects of integration best practice and cover the three categories suggested by the Catalyst methodology and, in particular, the Centre for Enablement playbook.
When implementing an API integration capability in an organisation, one of the first things you are trying to demonstrate is that the capability can add value to all aspects of the organisation; after all, MuleSoft is not, and would never claim to be, an inexpensive technology. You must therefore ensure that all ‘types’ of data sharing, both within and external to the organisation, can be served efficiently and securely. To do this you will probably need to develop APIs in the three categories suggested by the Centre for Enablement playbook. Let’s look at each in turn.
Request / Response
Probably the most recognised business case, and the one people will immediately think of when anyone mentions APIs, is the ‘Request / Response’ API. These are an accessible way to take data from a source and deliver it to a target system either within or external to the organisation. Selecting the correct strategic and reusable APIs to develop in the first tranche of deliverables, keeping the three-layer API architecture in mind at all times, will give you a head start in demonstrating to management that MuleSoft’s API capabilities provide the opportunity to accelerate business strategies, share data more consistently and quickly, and engage with customers in new and innovative ways. While the initial investment in developing and delivering these foundational assets may seem expensive at the time, it must always be borne in mind that, as long as the APIs being developed are reusable, overall cost will reduce over time as new projects ‘avoid’ repeated development cost through the reuse of existing APIs.
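To make this concrete, the following is a minimal sketch of what a request/response System API flow might look like in Mule 4 XML configuration. The listener configuration, path, and DataWeave mapping are illustrative assumptions on my part, not a prescribed implementation; a real System API would sit behind a RAML/OAS specification and call an actual backend.

```xml
<!-- Minimal sketch of a synchronous request/response flow (Mule 4).
     Names, port, and the response mapping are illustrative assumptions. -->
<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:http="http://www.mulesoft.org/schema/mule/http"
      xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core">

    <http:listener-config name="customer-sapi-httpListenerConfig">
        <http:listener-connection host="0.0.0.0" port="8081"/>
    </http:listener-config>

    <flow name="get-customer-flow">
        <!-- Expose a synchronous HTTP endpoint; the caller blocks for the response -->
        <http:listener config-ref="customer-sapi-httpListenerConfig"
                       path="/api/customers/{id}"/>
        <!-- In a real System API the backend call (database, SaaS, etc.) goes here -->
        <!-- Map the result into the canonical response shape with DataWeave -->
        <ee:transform>
            <ee:message>
                <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
{
    id: attributes.uriParams.id,
    retrievedAt: now()
}]]></ee:set-payload>
            </ee:message>
        </ee:transform>
    </flow>
</mule>
```

The value of the three-layer model shows up here: because the flow exposes a canonical shape rather than the backend’s native one, Process and Experience APIs built later can reuse it unchanged.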
While ‘Request / Response’ APIs will undoubtedly provide the bulk of your initial API developments, it is also important to remember that most businesses will need mechanisms for processing large volumes of data, and hence we come to batch APIs.
Batch
Within almost every organisation there will be a requirement to process large volumes of data in a single run, whether this be financial data such as a payroll, synchronisation of data from one system to another backend system, or the updating of pricing data, to highlight some of the examples I have seen.
The MuleSoft platform can process messages in batches, and I would suggest that some of the opportunities the Centre for Enablement will find to prove its worth will fall in this area.
A word of warning, though, when deciding which use cases to tackle in the initial tranche of work. While batch processing can be tuned to perform efficiently and reliably, the sheer volume of data involved increases your exposure to large-scale failures. By its very nature batch processing is more complex, and it requires confident integration architects and developers to ensure its success, reliability and recovery from failure. It should also be remembered that large volumes of data will require significant and robust processing power to achieve the required throughput.

While the benefits are there, the risk is always present, so I would suggest you ensure you have a good understanding of the goals you are trying to achieve and the services of people who know how to get you there, from both the MuleSoft and the business perspective.
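As a sketch of the shape such work takes, here is an illustrative Mule 4 batch job (namespace declarations omitted; job name, step names, frequency, and sizes are all assumptions). Note how the failure-handling knobs called out above — accept policies, failed-record thresholds, and the on-complete report — are first-class parts of the configuration rather than an afterthought.

```xml
<!-- Illustrative Mule 4 batch job; names and thresholds are assumptions -->
<flow name="sync-prices-flow">
    <scheduler>
        <scheduling-strategy>
            <fixed-frequency frequency="1" timeUnit="HOURS"/>
        </scheduling-strategy>
    </scheduler>
    <!-- ...load the source records here, e.g. a database select... -->
    <batch:job jobName="price-sync-batch" maxFailedRecords="100">
        <batch:process-records>
            <batch:step name="transform-step">
                <!-- per-record transformation to the target format -->
            </batch:step>
            <batch:step name="upsert-step" acceptPolicy="NO_FAILURES">
                <!-- only records that have not failed reach this step -->
                <batch:aggregator size="200">
                    <!-- bulk-write the accumulated records to the target system -->
                </batch:aggregator>
            </batch:step>
        </batch:process-records>
        <batch:on-complete>
            <!-- the payload here is the batch job result, used for reporting/recovery -->
            <logger level="INFO"
                    message="#['Processed: ' ++ payload.processedRecords as String ++ ', failed: ' ++ payload.failedRecords as String]"/>
        </batch:on-complete>
    </batch:job>
</flow>
```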
Event-driven
The third category suggested by the Centre for Enablement playbook is event-driven architecture. Event-driven architecture and API-led connectivity are two very different styles: API-led connectivity generally moves data from a source to a target, whereas in event-driven architecture a single event typically drives the delivery of data to one or many endpoints, one-to-many. For some business scenarios this makes a perfect combination to solve a business problem.
Although perhaps not as widely used as the previous two categories, event-driven architecture will still be an important aspect for a lot of businesses.
To explore event-driven architecture in a bit more depth, let’s look at the style itself. Event-driven architecture looks for events that occur in the organisation’s business domain and that are required to initiate an action in one or more applications, either within or external to the organisation. The triggering event can be something simple, such as a business event that triggers some processing through a real-time or near-real-time API; a complex event, where a series of events must occur before a response is generated; or a streaming event, where a stream of data is received and processed over a period of time. Whereas the APIs discussed above are synchronous, event-driven architecture relies on asynchronous communication, and you need to know how to deal with both.
One good example of the use of an event-driven architecture is where IoT devices generate information that needs to ‘trigger’ one or more APIs to deliver data to multiple endpoints or systems. Current technology is producing a plethora of sensors that detect changes in the state of the devices they monitor, capturing useful information and producing streams of data. API-led connectivity cannot effectively manage this constant stream of data, so a better approach is to utilise an event-driven architecture.
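The one-to-many fan-out described above can be sketched as a Mule 4 flow that consumes events from a queue and forwards each one to several consumers. I have assumed the Anypoint MQ connector and invented the queue and endpoint names; a JMS or Kafka source would follow the same shape.

```xml
<!-- Illustrative event-driven fan-out (Mule 4); connector configs, queue
     and endpoint names are assumptions, not a prescribed design -->
<flow name="device-event-subscriber-flow">
    <!-- Asynchronously consume events published by IoT sensors;
         no caller is waiting for a response -->
    <anypoint-mq:subscriber config-ref="AMQ_Config" destination="device-events"/>
    <!-- Fan the single event out to every interested consumer: one-to-many -->
    <scatter-gather>
        <route>
            <http:request method="POST" config-ref="Alerting_API" path="/alerts"/>
        </route>
        <route>
            <http:request method="POST" config-ref="Analytics_API" path="/readings"/>
        </route>
    </scatter-gather>
</flow>
```

The key contrast with the request/response sketch earlier is the source: a queue subscriber rather than an HTTP listener, so producers and consumers are decoupled in time and the stream can be absorbed at the platform’s own pace.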
A different perspective
And so we conclude with a different perspective that can be used when assessing the initial tranche of APIs to be developed: the strategic APIs required to prove the worth of the Centre for Enablement and of the API capability being established.
Initially it may seem quite a daunting task, and to a certain extent it is, because there is a responsibility to get it right and establish your foundations. To this end, looking at the different ways of selecting those first APIs is a tool that is definitely worth having in your toolbox; I wish I had had it when I first started. Hopefully some of this experience will help you take those first steps more easily.