I recently attended a fintech conference where I met the founder and CEO of a blockchain startup similar to PeerNova. Like everyone else at the conference, he was extremely bullish about the disruptive effect of DLT in areas like mortgages, syndicated loans, and post-trade derivatives settlement, and he passionately extolled the virtues of his technology platform. What was most interesting about this particular conversation was his assertion that his company would only pursue DLT initiatives in which digital assets exist solely on the ledger for their entire lifecycle. According to him, this model would spare his company from integrating with any internal systems, and would prevent potential conflicts with internal IT teams who might view his company as a threat to their own visions and plans.
At first glance, this strategy sounds like a smart one. After all, a small startup can be overwhelmed by the herculean effort of integrating its DLT platform with the internal systems and data sources of every network participant (each of them likely a global financial institution). As vendors, we also know first-hand how internal opposition can quickly stonewall or derail even the most well-conceived projects. It is no wonder that integrating DLT platforms with internal systems is something nobody wants to talk about. But can we really ignore this elephant in the room? To objectively evaluate this “on-chain only” perspective, we need to answer some basic questions:
- Is it realistic to focus solely on tokenized assets and on-chain workflows, and ignore the topic of integrating DLT platforms with internal systems?
- What impact, if any, does this approach have on success rate and time-to-market for DLT initiatives?
To answer the above questions, we need to develop a deep understanding of customer business drivers, operating environments, and pain points (pretty standard stuff we all learn in any foundational business course). PeerNova’s target customers are global financial institutions and market infrastructure providers (or utilities). They are highly regulated and have multiple lines of business. They generate revenue based on fees (e.g., advisory, asset management, custody, and payment servicing businesses), trading (e.g., broker/dealer businesses), or infrastructure (e.g., trade execution and centralized clearing). As shown in Figure 1, their operating environments are siloed due to their diverse lines of business, geographical distribution, and multiple regulatory jurisdictions, as well as mergers and acquisitions. Consequently, all data (transactional data, reference data, metadata) and workflows are siloed too, leading to a proliferation of disparate applications and data sources. This results in multiple copies of the same asset within an institution. For example, an inter-company trade between the New York and Tokyo entities may be booked as two separate legs in two different local systems, in different formats, which then need to be reconciled. As the trade is enriched through various lifecycle events and workflows, still more copies are created. In addition, various centralized functions depend on aggregating data from multiple business silos:
- Sub-ledgers for different lines of business need to be reconciled with the general ledger.
- Delivery versus Payment (DvP): Cash balances need to be updated in various accounts on payment, and asset ownership accordingly transferred. This is typically done through well-established processes by entities like custodian banks and CSDs (asset transfers), and payment platforms (SWIFT, Fedwire, CHIPS, CHAPS, etc.).
- Liquidity management: Collateral needs to be categorized (available/pledged/received), loan and cash balances reconciled and reported, high-quality liquid assets (HQLA) and non-HQLA metrics calculated from aggregating data from multiple sources and reported.
- Data from external providers (counter-parties, exchanges, CCPs, market data providers) needs to be reconciled with internal data.
The list goes on and on… but you get the general picture. In other words, there is no single golden source of data (asset data, market data, reference data, metadata); data exists across an organization’s silos in disparate, non-standard formats. To a lesser extent, the same holds true for business logic, which is distributed across multiple applications, often resulting in non-standard and redundant workflows.
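The inter-company trade example above can be sketched in code. A minimal illustration, assuming hypothetical field names and record shapes (real booking systems will differ): each local system books its own leg in its own format, so reconciliation first requires normalizing both into a common representation before breaks can even be detected.

```python
# Hypothetical record formats for the same trade booked in two local systems.

def normalize_ny(rec):
    """New York system (hypothetical): notional in units, ISO trade date."""
    return {"trade_id": rec["TradeID"], "notional": round(rec["Notional"], 2),
            "ccy": rec["Ccy"], "date": rec["TradeDate"]}

def normalize_tokyo(rec):
    """Tokyo system (hypothetical): different field names, notional in thousands."""
    return {"trade_id": rec["trade_ref"], "notional": round(rec["amount_k"] * 1000, 2),
            "ccy": rec["currency"], "date": rec["value_date"]}

def reconcile(ny_legs, tokyo_legs):
    """Match legs by trade id after normalization; report breaks."""
    ny = {r["trade_id"]: r for r in map(normalize_ny, ny_legs)}
    tk = {r["trade_id"]: r for r in map(normalize_tokyo, tokyo_legs)}
    breaks = []
    for tid in sorted(ny.keys() | tk.keys()):
        a, b = ny.get(tid), tk.get(tid)
        if a is None or b is None:
            breaks.append((tid, "missing leg"))
        elif a != b:
            breaks.append((tid, "field mismatch"))
    return breaks

ny_legs = [{"TradeID": "T1", "Notional": 5_000_000.0, "Ccy": "USD",
            "TradeDate": "2024-01-15"}]
tokyo_legs = [
    {"trade_ref": "T1", "amount_k": 5_000.0, "currency": "USD",
     "value_date": "2024-01-15"},
    {"trade_ref": "T2", "amount_k": 1_250.0, "currency": "JPY",
     "value_date": "2024-01-16"},
]
print(reconcile(ny_legs, tokyo_legs))  # T1 matches; T2 has no New York leg
```

Even in this toy version, most of the work is in the normalization functions, not the matching logic, which is exactly why siloed, non-standard formats make reconciliation so expensive at scale.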
Now that we have assessed customers’ business drivers, operating environments, and challenges posed due to siloed operations, let’s consider what happens if a specific type of asset is tokenized and transacted on a DLT platform.
- The tokenized asset still needs to be reconciled with the multiple internal representations of the asset.
- Cash payments continue to flow through traditional channels, since central bank cash is not yet tokenized (experiments in this area are still very early); cash accounts stored off-chain must therefore be updated to reflect any on-chain transactions.
- Many lifecycle events occur off-chain (e.g., credit defaults, corporate actions), and the affected tokenized assets need to be updated on the ledger.
- Centralized functions like risk and collateral management, finance (G/L), and firm-level investor and regulatory reporting need to accurately reflect data pertaining to the tokenized assets.
- The issues of internal reconciliation, non-standard asset representations, and redundant workflows persist even for tokenized assets, as shown in Figure 2.
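The first bullet above can be made concrete with a small sketch: checking a tokenized asset’s on-chain state against its multiple internal (off-chain) copies and flagging any drift. All system names and record shapes here are hypothetical, purely for illustration:

```python
# Sketch: compare a tokenized asset's on-chain record against off-chain copies.
# System names and fields are hypothetical.

def find_drift(on_chain, internal_copies):
    """Return (system, field, on_chain_value, internal_value) for each mismatch."""
    drift = []
    for system, copy in internal_copies.items():
        for field, expected in on_chain.items():
            if copy.get(field) != expected:
                drift.append((system, field, expected, copy.get(field)))
    return drift

on_chain = {"owner": "EntityA", "face_value": 1_000_000, "status": "pledged"}
internal = {
    # Stale status: the pledge has not yet propagated to this silo.
    "collateral_system": {"owner": "EntityA", "face_value": 1_000_000,
                          "status": "available"},
    "general_ledger":    {"owner": "EntityA", "face_value": 1_000_000,
                          "status": "pledged"},
}
print(find_drift(on_chain, internal))
# → [('collateral_system', 'status', 'pledged', 'available')]
```

The point of the sketch is that even a perfectly consistent ledger does not help if off-chain copies lag behind it; some process must continuously detect and resolve this drift until a single internal golden source exists.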
It is clear from the above that tokenizing specific asset types is not sufficient in itself: for financial institutions to adopt DLT, the networks cannot be hermetically isolated from the rest of the financial infrastructure. The newly tokenized assets must be integrated with off-chain assets and workflows in order to ensure process correctness and regulatory compliance. To make this integration seamless, it is imperative for financial institutions to identify and address the sources of internal friction, i.e., lack of front-to-back operational visibility, complex reconciliations, and lack of a single source of truth for data and business logic. Once an internal golden source of truth is established for data as well as lifecycle events, integration with DLT networks can be seamless, and the benefits of DLT networks can be fully realized by the network participants. Ignoring internal friction and focusing solely on tokenization will inevitably result in DLT initiatives being stuck in the PoC phase. DLT initiatives announced with much fanfare fail to materialize as production “go-live” dates keep getting postponed, like a desert mirage that is always within sight but just out of reach. This not only delays time-to-market for DLT initiatives, but also discourages new entrants from adopting DLT, thus reinforcing the perception that DLT is more hype than substance.
At PeerNova, we continuously strive to help financial institutions address both internal and external friction. Our Cuneiform platform is geared toward solving both, ensuring fast time-to-market for DLT initiatives and enabling our customers to derive all the benefits that DLT can offer (both across and within institutions). Stay tuned for more related topics, such as creating internal golden sources and DLT interoperability. Contact us if your organization would like to learn how our customers have benefited from our platform and expertise.