Digital Process Transformation

Digital Transformation: Creating Process Transparency in Highly Regulated Industries

Brian Timmeny
12 min read · Feb 26, 2018


Introduction

Digital Transformation & Process Automation

Often when we consider the importance of a digital transformation, we are immediately brought to strategies surrounding agile, agile-at-scale, DevOps, Cloud computing, and machine learning/AI. However, within large and small organizations alike, the need to drive consistent delivery remains paramount to the success of a digital transformation.

Establishing a well-defined process is essential to a well-ordered digital transformation.

Setting the Stage

As an organization approaches digital transformation, there remain three cornerstone elements to drive that change on a global scale. These are: cultural change, process and operational change, and common global delivery methods.

This post will focus primarily on the second area (process and operational change), discussing how an organization may drive digital operations and process transformation at scale. This process change drives automation into the fabric of the business and technology organization, and is the catalyst that ensures a well-ordered digital change at global scale.

Establishing a well-defined process is important. Automating that process, and driving consistency, is the foundation of a well-executed digital transformation.

Driving Digital Process Automation in Five Phases

The Five Phases of Automation

The digital transformation of a process framework can be discussed in five phases. These begin with the definition of a minimum end-to-end delivery process, and end with the deployment and utilization of that automated process on a global scale.

The importance of driving a global automated process lies in allowing individuals within a global organization to operate within the guidelines of regulatory and company policy, without the personal burden associated with traditional process management and enforcement.

Minimum Viable Process vs Detailed Level Definition

Global organizations are, by definition, extremely complex in their diversity of culture, operations, process, and delivery methods. During the early phases of digital transformation, it will be critical to define a single (minimal), common process definition for the future-state digital organization.

A cornerstone that will drive success in this phase of process definition will be the ability to drive and depict a minimum end-to-end process definition in order to facilitate an early start of the automation exercise.

Driving a minimum process definition is critical in order to begin, as quickly as possible, the automation of that process … it is only by automating processes that an organization will reap the benefits of reduced overhead and increased compliance.

Challenge. Early debates will quickly arise around the need and importance of local process and operational variations. This will occur at the business unit level, as well as the geographic level. Examples will include unique business operations, such as risk and advisory, and will also stem from areas such as audit and regulatory requirements. While there is no doubt that these are important factors at maturity, during the early stages of this process, it is far more important to focus on beginning the process of automation (the culture of automation), over the perfection of a detailed end-to-end process definition.

Phase One: Defining an End-To-End Process

Why Process Is Necessary

In the age of agile transformation, there is often spirited debate around the need for process, and whether agile and lean methodologies should simply assume the role of process engineering. In some cases, this is certainly a viable possibility, and methodology and process should always be defined concurrently. However, process is the manner by which we know all appropriate steps are followed, security is implemented, and regulatory compliance is met. Methodology is our “how”.

An articulate definition of an organization’s end-to-end delivery process is critical to ensure appropriate function of the operation, as well as to drive security, regulatory, and organizational policy compliance.

The method by which these steps/milestones are met may vary, and therefore before embarking upon a methodology revamp, it is first important to ensure that a single common process (including key milestones) is established across the organization. Automating these steps is the manner by which this process is then crystalized into the organization, and into the fabric of the culture of the group.

Defining Process within an Agile Context

The process cycle must be well understood by the entire organization. The timing of these cycles must also be embraced in order to drive consistent delivery, and to support those same delivery efforts. Below is a view of the end-to-end cycle for the process automation that will serve as the example foundation for the remainder of this post.

Two Baseline Components of Process Planning

Agile Process in Linear Model Format

As we consider a global end-to-end delivery process, there remain two important phases within that cycle.

The first revolves around understanding the corporate needs and product approval, ensuring those solutions are feasible (within reason), and approving those products to start or continue (agile-at-scale execution). This phase, at global digital scale, will normally occur on a quarterly cycle.

The second context is delivery execution. This involves the constant planning (agile, DevOps), refinement, and delivery of our products. This is an evergreen cycle, influenced by quarterly approvals and funding cycles. This phase will normally occur at two levels, the program increment (estimated six weeks) and sprint cadence (estimated every two weeks).

Milestone Definition & Process Planning

Defining the end-to-end process drives a consistent understanding related to the phases of delivery. That is, when defining a process, the keys to understand and denote will be the milestones along the way. This is what will eventually allow for the monitoring of the global end-to-end process.
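The milestone-centered process definition described above can be sketched in code. The following is a minimal illustration, assuming a hypothetical set of milestone and state names; a real organization would derive these from its own end-to-end process work.

```python
from dataclasses import dataclass, field
from enum import Enum

# Hypothetical milestones along the end-to-end delivery process.
class Milestone(Enum):
    PRODUCT_APPROVED = "product_approved"
    FUNDING_APPROVED = "funding_approved"
    SECURITY_REVIEWED = "security_reviewed"
    RELEASE_APPROVED = "release_approved"

# Each milestone carries a simple, globally comparable "state".
class State(Enum):
    PENDING = "pending"
    IN_PROGRESS = "in_progress"
    COMPLETE = "complete"

@dataclass
class DeliveryRecord:
    """Milestone states for one program or product."""
    product: str
    states: dict = field(
        default_factory=lambda: {m: State.PENDING for m in Milestone}
    )

record = DeliveryRecord("payments-platform")
record.states[Milestone.PRODUCT_APPROVED] = State.COMPLETE
```

Because every product shares the same milestone vocabulary, the "state" of any delivery effort can be compared and monitored consistently across the organization.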

Phase Two: Providing an Automation Framework

Defining and Housing the Process Data Architecture

When beginning this stage, the assumption is that we now have a baseline foundation process and milestone definition.

With that context in mind, the organization must now establish a common data store where this information (milestones) can be housed and scaled to the needs of the global organization.

Process Data Store

This data store will house the key milestones associated with each of our delivery programs and products. In doing so, we create the ability to understand holistically the end-to-end process of the organization and, more practically, the day-to-day delivery of each global business unit and team.

Once we have established a milestone definition of our end-to-end processes, these must then be placed into a single, globally accessible repository.
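A minimal sketch of such a repository follows, keyed by business unit, product, and milestone. The names and shape here are illustrative assumptions, not a prescribed schema; in practice this would be a shared, globally replicated data store rather than an in-memory object.

```python
class MilestoneRepository:
    """A single, globally accessible store of milestone 'state',
    keyed by (business_unit, product, milestone)."""

    def __init__(self):
        self._states = {}

    def update(self, unit, product, milestone, state):
        # Any delivery team may record the state of one of its milestones.
        self._states[(unit, product, milestone)] = state

    def get(self, unit, product, milestone, default="pending"):
        # Unreported milestones are treated as not yet started.
        return self._states.get((unit, product, milestone), default)

    def by_unit(self, unit):
        # A holistic view: every recorded milestone for one business unit.
        return {k: v for k, v in self._states.items() if k[0] == unit}

repo = MilestoneRepository()
repo.update("emea", "risk-advisory", "funding_approved", "complete")
```

The single keying scheme is what makes the view "global": any team's progress is addressable the same way, regardless of geography or business unit.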

Scaling a Global Process Architecture Solution

As the early process phases and milestones are defined, it will then be important to ensure that each of these components is built in a loosely coupled fashion. That is, every actor (application suite) must exist in a self-sufficient manner whereby milestones are met locally, and then reported to the wider global organization via the data store.

Service Based Process Model Architecture

Within this global process architecture, all applications (actors) are then able to complete their milestones and update that “state” to a central repository that will then log that “state” across the complexity of the entire organization.

By placing milestone definitions within a common, global repository, we now have the ability to report individual delivery teams’ milestone “state” in a globally consistent manner.

The importance of building a service-based data access model cannot be overstated. An open model, accessed via suites of micro-services will allow for data model interaction across the enterprise, without the direct need to constantly extend infrastructure as new communication channels are created.

The concept of micro-services and resources is further explained in this article, written by Martin Fowler.

Scaling Global Interactions

The ability for a complex ecosystem to interact seamlessly across a global suite of applications is a cornerstone to establishing a globally scalable interaction model.

Process Store Federated Interaction Model

Rarely will a global organization begin this process from zero. Rather, in many cases, local process applications have already been in use for some time.

By establishing a loosely coupled process architecture, we now enable a complex process framework to report consistently on a global scale.

We must, therefore, build a framework that will allow for information to be centrally housed and utilized, without creating a tightly coupled process architecture (which will limit progress and speed of adoption).

A Common Source of Process Truth

A cornerstone to establishing a federated process model is the ability to communicate across the breadth of the global organization. Driving a common source of truth, related to each critical milestone across the wider group is important in order to scale processes across business units and geographic areas.

Common Data Shared State

In this manner, all participating applications may update (post) and understand (get) real-time process milestone “state” without having to tightly couple to the entire global process ecosystem.
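The post/get interaction can be sketched as follows. In practice these would be HTTP operations against a milestone resource; here a dictionary stands in for the shared store so the example is runnable, and the actor and product names are invented for illustration.

```python
# A dict standing in for the shared milestone resource.
STORE = {}

def post_state(product, milestone, state):
    """An actor publishes (posts) the state of one of its milestones."""
    STORE[(product, milestone)] = state

def get_state(product, milestone):
    """Any other actor reads (gets) that state, with 'pending' as default."""
    return STORE.get((product, milestone), "pending")

# Actor A (say, a funding system) posts its milestone state;
# it never calls Actor B directly.
post_state("trade-portal", "funding_approved", "complete")

# Actor B (say, a deployment pipeline) later reads the same resource.
status = get_state("trade-portal", "funding_approved")
```

Because both actors couple only to the shared resource, neither needs to know the other exists, which is precisely what keeps the global ecosystem loosely coupled.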

All applications can update a common milestone resource, thereby allowing for complex relationships to be understood through common, global milestone definitions.

A Proactive Resource Driven Model

A federated process model will require the ability to interact with asynchronous data streams. In this way, processes that are not necessarily tightly coupled may interact with one another, concurrently updating the state of any given step asynchronously.

Dynamic Data State Model

The power of a shared data source lies in the ability to interact with events, and then ensure that those events drive downstream process updates. That is, once a first step is completed (for example, once updates are received from all required input actors), an update can then be published (to all downstream consumers) that step two is ready to begin.
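That coordination pattern can be sketched as a small publisher: downstream consumers subscribe, and the "step ready" event fires only once every required input actor has reported. Actor and event names are illustrative assumptions.

```python
class StepCoordinator:
    """Publishes a 'step ready' event once all required input
    actors have asynchronously reported completion."""

    def __init__(self, required_actors):
        self.required = set(required_actors)
        self.reported = set()
        self.subscribers = []

    def subscribe(self, callback):
        # Downstream consumers register interest in the event.
        self.subscribers.append(callback)

    def report_complete(self, actor):
        # Input actors report in any order, at any time.
        self.reported.add(actor)
        if self.reported >= self.required:
            # All required inputs are in: notify every subscriber.
            for callback in self.subscribers:
                callback("step-two-ready")

events = []
coord = StepCoordinator({"funding-system", "security-scan"})
coord.subscribe(events.append)
coord.report_complete("funding-system")  # not all inputs yet: no event
coord.report_complete("security-scan")   # all inputs in: event published
```

A production system would use a message bus or reactive stream for this, but the principle is the same: steps interact through published state, never through direct calls.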

This article does not intend to review the manner or merits of reactive programming models. For a more detailed discussion surrounding reactive programming, please see the following GitHub update.

Phase Three: Provide Automated Process Visibility

Process and component “states” are recorded in a uniform manner, and are universally accessible (via the resource-based model). The power of the model now comes to life via logs and visualization.

The diagram below is an example of a process visualization state that can be created utilizing the data hosted in the common process data store.

Process State Visualization

Because milestone information for all portfolios and products is now stored in a single logical construct, updates to the “state” of any portfolio product team may be viewed in order to understand its current state of delivery, against a common set of approved funding criteria.
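A minimal sketch of that visualization step, assuming the same hypothetical store shape used earlier (`{(product, milestone): state}`); a real dashboard would render charts rather than text, but the aggregation is the same.

```python
def dashboard(states):
    """Summarize milestone completion per product from the common store.

    `states` mirrors the shared data store: {(product, milestone): state}.
    """
    lines = []
    for product in sorted({p for p, _ in states}):
        done = sum(
            1 for (p, _), s in states.items()
            if p == product and s == "complete"
        )
        total = sum(1 for (p, _) in states if p == product)
        lines.append(f"{product}: {done}/{total} milestones complete")
    return "\n".join(lines)

view = dashboard({
    ("trade-portal", "funding_approved"): "complete",
    ("trade-portal", "release_approved"): "pending",
})
```

Because every team reports into the same store, one aggregation function yields a globally consistent view of delivery state.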

Dynamic Process Step Updates

As depicted above, any application suite across the enterprise may now interact with the associated resource to inquire about, or update, the “state” of a given process milestone from the vantage point of the associated business organization or geographic unit.

The Importance of a Visible Process Model

Common visibility (through logs or a dashboard) allows up-to-date interrogation of, and traceability into, the end-to-end processes established by the global organization. In doing this, the group is then able to verify that all required steps and safeguards were enforced throughout the end-to-end delivery life-cycle.

Establishing a common process milestone repository is important. Establishing visibility utilizing that global data is the key to driving consistency of digital processes at global scale.

These steps are critical during reviews that verify all appropriate security, regulatory, and organization policies have been observed and enforced.

Phase Four: Early Application Integration

Process Automation Pilots

The roll-out of process integration at-scale will be complex. The objective must be to drive early adoption and practical integration of the most important components within the process. This will form the foundation for an organization’s MVP (minimum viable product) definition.

Best Candidates for Early Adoption

In many cases, the most urgent processes will be those that ensure an organization’s awareness of two high level process concepts. The first will be that of understanding which portfolios, programs and products have been approved (and funded) for delivery. The second will be to know that all required compliance components are met (approved) and visibility exists across deployed components.

For this reason, the first two areas to integrate into the end-to-end process would be: 1) portfolio, program, and product approvals, and 2) release deployment approvals.

Milestone Area №1: Portfolio, Program, Product Approval. When considering our first integrations, one of the key elements is to monitor and view the state of funding approval related to our portfolios, programs, and products.

Milestone Area №2: Release Deployment Approval. Release deployment approval is what allows us to govern the business and technology release process, and ensure that all components released to production have been validated through the appropriate (automated) security, compliance, and bank policy regulations.

Establishing early integration across 1) funding approval and 2) release approval allows for a critical first step in global process awareness.

Implementing within a “Redundant” Model

A cornerstone of driving this process at scale is not to replace what exists immediately, but rather to allow the processes already in place to continue while we improve and evolve a global model, using the very same infrastructure, now integrated as part of a global, connected process model.

When driving a global process, it remains important to maintain current process systems in place, while focusing initially on the connectivity of those systems through a central process data store. This evolves the global process, without endangering the current operations of the organization.

Phase Five: Scaling the Automated Process Framework

Loosely Coupled Process Framework Through Resource Models

As discussed above, it will remain important to ensure that all components within the process ecosystem interact through a series of established resources that then communicate this information back to the common process (milestone) repository. By doing so, we ensure that the processes already in place remain, and we evolve each local component (status information) for use within the wider global process ecosystem.

Dynamically Expanding the EcoSystem

By creating this loosely coupled framework, the process ecosystem can evolve over time and add components (new input systems) without breaking the integrity of the overall system already in place. The process, therefore, can continue while new sources of input are added and evolved over time.

The above is a depiction of how the existing ecosystem (starting with two application suites) can later be amplified to include additional information stores. These additional stores can then update and refine the information held within the common process data store, improving the data and strengthening stability without breaking existing data sources.
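The dynamic-expansion idea can be sketched as follows: input systems register with the ecosystem over time, and each pushes its updates into the shared store independently, so adding a new source never disturbs the existing ones. System and product names here are invented for illustration.

```python
class ProcessEcosystem:
    """Loosely coupled ecosystem: input systems register over time,
    each contributing milestone updates to a shared store."""

    def __init__(self):
        self.sources = {}   # name -> fetch function for that system
        self.store = {}     # the common process data store

    def register(self, name, fetch):
        # New input systems can be added at any time; existing
        # registrations and stored data are untouched.
        self.sources[name] = fetch

    def refresh(self):
        # Pull the latest milestone states from every registered system.
        for fetch in self.sources.values():
            self.store.update(fetch())

eco = ProcessEcosystem()
eco.register(
    "suite-a", lambda: {("product-x", "funding_approved"): "complete"}
)
eco.refresh()

# Later, a new input system joins without breaking the existing ones.
eco.register(
    "audit-feed", lambda: {("product-x", "security_reviewed"): "complete"}
)
eco.refresh()
```

Each source only knows how to produce its own updates; the ecosystem grows by registration, not by rewiring existing integrations.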

Processes evolve. Establishing a global process model that can scale is critical to the success of the long-term digital transformation.

Conclusion

Driving the next generation of digital and DevOps transformation does not leave mature process architecture behind. Rather, it protects and automates that same framework. In doing so, it creates an easy-to-adopt process, one that can be integrated into the very fabric of the organization.

There are five key phases in driving automated process maturity. These phases are:

  • Process end-to-end definition
  • Automated process framework definition
  • Automated process visibility
  • Early application pilot integration
  • Automated process framework global scaling

Defining and automating processes across the global organization ensures that the digital transformation evolves into the very fabric of the organization. It gives a global organization a method by which process compliance can be established and automated across its natural complexity.
