Crypto Kim
IT Architecture
Nov 26, 2018

Key Takeaways

  • Event-driven architecture helps with decoupling applications
  • Think about the names used for events
  • Let publishers publish and filter within the subscribers
  • Use metadata within event headers
  • Standardize the entities within events, so that the event is a first-class data object

Introduction

There is a lot of hype around microservices and the use of events for implementing the choreography pattern. That is nice for companies like Netflix and Twitter, but a lot of organisations are still struggling with files and ESB-like products. My current client also uses an ESB, namely Oracle SOA Suite 12c, for its integrations. We cannot just throw away this ESB, but we can make use of the event mechanism built into it. This blog describes the way we use the EDN (Event Delivery Network) component, which is used within SOA composites to publish events and to subscribe to events.

EDN

Oracle has a component that you can use to publish events and to subscribe to events within a SOA composite. To publish, you use the invoke activity with the event name and the content of the event. Within a composite you can subscribe to events and set filters. You can also configure the "oneAndOnlyOne" consistency property and indicate whether you want a durable or non-durable subscriber. The EDN hides the underlying JMS provider, which can be changed (WebLogic JMS or Oracle AQ). Separate topics can be defined for each event, or you can use one topic for all events.
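
As an illustration, an EDN event is defined in an EDL file. Below is a minimal sketch of such a definition; the namespaces, file name and event name are illustrative, and the exact schema can differ per SOA Suite version:

    <!-- InvoiceEvent.edl: sketch of an EDN event definition -->
    <definitions xmlns="http://schemas.oracle.com/events/edl"
                 targetNamespace="http://example.com/events/invoice">
      <!-- import the XSD that describes the event payload -->
      <schema-import namespace="http://example.com/events/v1"
                     location="xsd/BusinessEvent.xsd"/>
      <!-- one event definition per business event; content points at the payload root element -->
      <event-definition name="InvoiceEvent">
        <content xmlns:be="http://example.com/events/v1" element="be:businessEvent"/>
      </event-definition>
    </definitions>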

Notes:

  • Applications must always be abstracted by a corresponding SOA Composite. Applications should not use JMS directly
  • EDN cannot be used directly from within Oracle OSB

Event

Our events are based on business entities and the operations executed on those entities. An event has the following structure:

  • BusinessObject: the name of the business entity, e.g. Invoice or Customer
  • Operation: Created, Updated or Deleted
  • TrackingId: a tracking ID so that flows can be followed
  • SourceSystem: the system that published the event
  • TargetSystem: the target system to which the event must be sent explicitly; this is an optional field
  • Identifiers: used to add metadata, e.g. invoice number, location or order number
  • Message: the content of the business entity
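
To make this concrete, here is a sketch of what an event instance could look like. The element names and namespaces are illustrative, not our actual schema:

    <be:businessEvent xmlns:be="http://example.com/events/v1">
      <be:header>
        <be:businessObject>Invoice</be:businessObject>
        <be:operation>Created</be:operation>
        <be:trackingId>8d4f1c2e-0001</be:trackingId>
        <be:sourceSystem>ERP</be:sourceSystem>
        <!-- targetSystem omitted: the event is broadcast to all subscribers -->
        <be:identifiers>
          <be:identifier name="invoiceNumber">INV-2018-001</be:identifier>
          <be:identifier name="location">Rotterdam</be:identifier>
        </be:identifiers>
      </be:header>
      <be:message>
        <!-- the full, standardized content of the business entity -->
        <inv:Invoice xmlns:inv="http://example.com/entities/invoice">...</inv:Invoice>
      </be:message>
    </be:businessEvent>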

Master data entities

This works fine for master data entities, because you want to use events to indicate that entities are created, updated or deleted. We use this to decouple the MDM solution from the interested applications. New applications are easily added by creating a subscriber SOA composite for each of them.

Transactional data

For transactional data this solution falls short, because context is missing. For example, when an invoice is received from a customer, you want to send the event InvoiceReceivedFromCustomer. In our case we use the Created operation on Invoice, but that does not say much, so we had to add more context information to the metadata.
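
As a sketch, reusing the illustrative elements from above, the extra context ends up in the identifiers instead of in the event name:

    <be:header>
      <be:businessObject>Invoice</be:businessObject>
      <be:operation>Created</be:operation>
      <!-- "Invoice Created" alone says too little, so the context is added as metadata -->
      <be:identifiers>
        <be:identifier name="context">ReceivedFromCustomer</be:identifier>
        <be:identifier name="customerNumber">C-1001</be:identifier>
      </be:identifiers>
    </be:header>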

Lessons learned

  • Think about the names you use for events; Invoice_Create is too vague and needs extra metadata to filter out the correct events
  • Think about the data you put in the message
  • Master data entities are especially good candidates for publishing

Data will change

The content of entities will change over time, e.g. fields are added. After such a change you want to synchronize the interested applications with the new information. This is especially needed for master data. So the publishing and subscribing applications are decoupled, but they are still strongly coupled to the data, and you need to synchronize the new dataset to the applications (in batch). There are several options:

  • Use ETL to synchronize the data

This can be a good option when there is a lot of data. However, extra tooling and data transformation are needed, which is exactly what you tried to avoid, and the source and target systems are coupled again.

  • Publish all data and make use of the pub-sub implementation

This works fine when there are fewer entities. You don't need extra ETL tooling but reuse the publish-subscribe mechanism. We use this approach, and we also use the targetSystem field within the event when we know it is only relevant to one or more particular applications (see the sketch below).
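
Reusing the illustrative event header from above, a resynchronization event that is only relevant to one application could look like this; the system names are hypothetical:

    <be:header>
      <be:businessObject>Customer</be:businessObject>
      <be:operation>Updated</be:operation>
      <be:sourceSystem>MDM</be:sourceSystem>
      <!-- explicitly addressed: only this application needs the resync -->
      <be:targetSystem>CRM</be:targetSystem>
    </be:header>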

Lessons learned

  • Think about data reconciliation in case data fields are changed
  • Try to avoid ETL because this couples the source and target systems again

Filters

What I often see in integration projects is that for each interested application a new integration is built specifically for that application. For example, a new supplier wants to retrieve invoice messages from the company, but in its own format (for example CSV) and over its own protocol (for example SFTP). The application that generates the invoices then just exports yet another file. What I propose instead is to let the publishing application generate Invoice events. It should have no knowledge of the interested applications; it just has to do its job. Then, for each supplier, build a subscriber composite that filters the invoices targeted at that supplier and maps the invoice to the supplier's format and protocol. This is where the subscriber composite filters come in.

With a filter you can define which events you want to receive. Note that you can also indicate whether you want a durable subscriber; in that case you still receive the events that were published while the composite was stopped.
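
As a sketch, a per-supplier subscriber composite declares its subscription in composite.xml. The component and event names are illustrative, and the exact subscribe/filter syntax should be checked against your SOA Suite version:

    <!-- composite.xml of the subscriber composite for one supplier (sketch) -->
    <component name="InvoiceToSupplierX">
      <implementation.mediator src="InvoiceToSupplierX.mplan"/>
      <business-events>
        <subscribe xmlns:evt="http://example.com/events/invoice"
                   xmlns:be="http://example.com/events/v1"
                   event="evt:InvoiceEvent"
                   consistency="oneAndOnlyOne"
                   durable="true">
          <!-- XPath filter on the metadata header, not on the message content -->
          <filter>/be:businessEvent/be:header[be:targetSystem='SupplierX']</filter>
        </subscribe>
      </business-events>
    </component>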

Lessons learned

  • Let publishers publish all data and let the subscribers filter out the correct data
  • Filter on the metadata part of the event, not on the event message content

Error handling

Functional and technical errors are unavoidable. This is also the case with events, but with the extra complexity that the publisher is decoupled from the subscriber(s). Once the event is published from the source, the publisher's work is done, but errors can still occur within the subscribers, both functional and technical. You have to think about the following scenarios:

  • Do I want to be able to re-publish the same event?
    The consequence is that target systems must implement idempotency.
  • Does the sending application need to know that all targets have successfully handled the event?
    In my opinion you should try to solve the problem at the target, because otherwise you introduce strong coupling again. This also depends on the business requirements, of course.

Conclusion

Event-driven architectures can be a good solution to decouple applications! We are now able to connect new applications faster than before, because of the pub-sub pattern and the standardization of the event data.

Think about the events you publish, because otherwise you still need a lot of filter logic within the subscribers. Also think about the way you want to solve data synchronization/reconciliation in case the data changes.

There are more topics that could be discussed, e.g. event versioning and data security. Maybe another time.

As always, feedback and questions are very welcome!

src: https://rogervdkimmenade.blogspot.com
