Why and how should you integrate blockchains into your information system?
Every individual, industry and organization is constrained by the lack of trust in digital infrastructures (leaks, diversions, undue influence, third-party administrators…). This leads to broken processes and a lack of verifiability, performance and agility. Resilient infrastructures are essential to enable confidentiality and verifiable institutions: more open than sovereign infrastructures, and more protective than those of the GAFAM. On top of this first level, a second level of specialized services can provide a transverse verification service for all information systems. We will see here how a new architecture is changing the game, and the technological choices this implies for blockchains.
Interoperability is key for the industries in a digital age
Interoperability refers to the ability of computer systems or software to exchange and use information. As it stands, interoperability issues are one of the major obstacles to realizing the value of digital transformation, according to the Institution of Civil Engineers. Interoperability is key to the performance and agility of co-production along the logistics, production, research, distribution, sales and marketing chain: all these processes are cooperative across industries.
In finance and among regulators, despite the partial nationalization of banks and their involvement with central banks, and despite strong regulations and KYC/AML processes, the FinCEN Files reveal the ineffectiveness of verifications and of the application of operational rules. Banks have thus financed and laundered the money of terrorism, fraud, corruption and crime, and enabled the circumvention of international rules, with dramatic human consequences. The information was present, but the required interventions did not take place, in particular for lack of interoperability and of means to verify operational execution by the actors.
Interoperability is a major lever for transforming the value of Industry 4.0, integrating multiple components without over-exposing oneself and without over-sharing, yet in a continuous flow. However, to remain competitive, industries are forced to integrate various new flows at an exponential rate, and are reaching a limit related to verifiability.
For the supply chain, interoperability is vital. Players need to cooperate to keep pace with increasingly demanding and variable pull-flow co-production and real-time global distribution. Flows are now synchronous between the players who contribute to a product: customization, configuration by country, increasing traceability, and ever more certification, customs, regulatory and accounting documentation to keep up to date. Yet the exchange of these states between players remains asynchronous, poorly verifiable, very partial, and siloed by player and by nature. The impact is illustrated by many difficulties: the integration of e-commerce (20 years later, the harsh reality of group purchasing sites, while those who succeeded are now the leaders), new emergency humanitarian logistics (e.g. low returns after natural disasters), new health emergencies (COVID: the lack of agility between circuits), and exponential requirements on (real) food traceability…
Moreover, a door is open to errors, fraud, malevolence, spying, and tensions between partners and with institutions whenever there is a lack of cooperation, a rupture between systems, or a non-verifiable operational status.
Why is business synchronous while interoperability remains asynchronous?
In the current state of affairs, interoperability entails a major risk of losing control over the data and information that flow between systems. With more or less proprietary components, third-party platforms, and infrastructures and data sources dominated by a few players or states, the centralization of digital data and its increasingly complex reprocessing require expertise that generally goes beyond the company's field of competence, and can lead to dominant positions and to governance hijacked through a lack of resilience. Unable to control this risk in the information system, flows, and sometimes entire projects, are interrupted.
Whatever the tools of the information system (TMS, ERP, BPM, CRM, etc.), whatever the security and data-integration measures, and whatever interfaces are available for exchanging information, a technical brick is missing: on input, to check the EDI flows, certificates, operations, reports, data, agreements and transactions received from the cross-functional processes of industries and platforms; on output, to guarantee the neutrality and security of processing, the respect of confidentiality, and the conditions under which data and metadata are exploited.
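To make this missing brick concrete, here is a minimal sketch in Python of input-side checking: an inbound flow is accepted only if its cryptographic fingerprint matches a proof previously recorded by the sender. The function names and the JSON payload are illustrative, not part of any existing product.

```python
import hashlib
import json

def fingerprint(payload: bytes) -> str:
    """Return a SHA-256 fingerprint of an inbound flow (EDI message, report, ...)."""
    return hashlib.sha256(payload).hexdigest()

def check_inbound(payload: bytes, recorded_proof: str) -> bool:
    """Accept an inbound flow only if its fingerprint matches the proof
    previously recorded by the sender in the shared evidence repository."""
    return fingerprint(payload) == recorded_proof

# An EDI-like message, serialized deterministically before hashing.
message = json.dumps({"order": "PO-1042", "qty": 250}, sort_keys=True).encode()
proof = fingerprint(message)          # what the sender records as evidence
assert check_inbound(message, proof)  # what the receiver verifies on input
assert not check_inbound(b"tampered", proof)
```

The same mechanism applies symmetrically on output: fingerprinting what is emitted gives each party a way to later prove what was actually exchanged.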
How to integrate a resilient distributed repository of proofs?
Without modifying existing flows, and via standard interfaces (standardized by the W3C), information systems can query an ad hoc repository run by stakeholders with no related interest, whether technical, in governance or in infrastructure. This repository provides evidence that allows each party to verify flows, certificates, states, history, rights, agreements and transactions… None of the parties can manipulate or influence the processing of evidence; yet each organization remains free to define the conditions of confidentiality, access, validation and dynamic enrolment in the ecosystem. In this way, each party proves, and gives the others the means to verify, the interoperability information.
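The record-and-verify cycle described above can be sketched as follows. The in-memory dictionary stands in for the distributed repository, and an HMAC stands in for a real digital signature scheme; both are illustrative assumptions, as are all the names used.

```python
import hashlib
import hmac
import time

# Illustrative in-memory stand-in for the distributed evidence repository;
# a real deployment would query it over a standard (W3C-style) interface.
REPOSITORY: dict[str, dict] = {}

def record_proof(document: bytes, party_key: bytes) -> str:
    """A party records a time-stamped, signed proof of a document's state."""
    digest = hashlib.sha256(document).hexdigest()
    REPOSITORY[digest] = {
        "timestamp": time.time(),
        # HMAC stands in for a real signature; only the hash leaves the party,
        # so the document itself stays confidential.
        "signature": hmac.new(party_key, digest.encode(), "sha256").hexdigest(),
    }
    return digest

def verify_proof(document: bytes, party_key: bytes) -> bool:
    """Any authorized party can check that the document was recorded and signed."""
    digest = hashlib.sha256(document).hexdigest()
    entry = REPOSITORY.get(digest)
    if entry is None:
        return False
    expected = hmac.new(party_key, digest.encode(), "sha256").hexdigest()
    return hmac.compare_digest(entry["signature"], expected)

key = b"demo-party-key"
record_proof(b"certificate-of-origin v2", key)
assert verify_proof(b"certificate-of-origin v2", key)
assert not verify_proof(b"never recorded", key)
```

Note that only fingerprints are published: the repository lets parties verify states without the documents themselves ever being shared.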
This lever, a repository of distributed evidence, can be deployed and integrated. It requires very little maintenance; transactions are essentially free (marginal costs depending on the risk to be covered and on certificate validation); there is no volume limit; and migrations and upgrades become easy thanks to the network layering. There is no longer a trade-off between private, public, federated or permissioned deployments, nor between resilience and performance; infrastructure governance no longer needs to be conditioned by third-party organizations; and energy consumption is minimal without the need for community rules. In short, the new frameworks have removed the constraints of blockchains and crypto-currencies. It is possible to build proof networks across several network levels: from the non-fungible towards the fungible, from the least resilient towards the most resilient, from large and dynamic volumes towards immutable states.
Thanks to advances in native multi-signature and native zero-knowledge proofs on protocols based on inputs and outputs (UTXOs) rather than on account addresses and contracts, the intelligence of transactions and issuances is managed by freely composing elaborate multi-signature conditions, advantageously replacing smart contracts. Multi-level "SPV with 2-way peg" mechanisms make it possible to resolve fork dependencies and scalability, and to rely on a very resilient PoW base layer that can certify, in a single transaction, the history of a thousand transactions on the upper layers, which then operate without mining, under simple and fast consensus. These upper layers are generally split between a level 2 that manages multi-level issuances and configurable interoperability rules between the parallelized level 3 blockchains. The level 3 blockchains (hosted on the level 2 nodes), confidential by default and managed according to each organization's governance, receive the flows and maintain the registry of unique proofs, time-stamped and signed, which are then delegated "blindly" to the third-party level 2 network.
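The key aggregation step, certifying a thousand upper-layer transactions in a single base-layer transaction, is typically done with a Merkle tree. Here is a minimal, protocol-agnostic sketch in Python: a batch of level 3 transaction identifiers is folded into one 32-byte root, which is the only datum the PoW layer needs to anchor.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold a batch of transaction payloads into a single root that a lower,
    more resilient layer can commit to in one transaction."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# A thousand level 3 transactions summarized into one 32-byte commitment:
# this is all that needs to be written to the PoW base layer.
batch = [f"l3-tx-{i}".encode() for i in range(1000)]
anchor = merkle_root(batch)
assert len(anchor) == 32
```

Because the root is deterministic, any party holding the batch can recompute it and check it against the anchored value; production protocols add inclusion proofs so that a single transaction can be verified without the full batch.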
How to choose a technology for a resilient distributed repository of proofs?
There is no longer any need for the PoS compromise; no need to expose yourself to the significant maintenance and administration burden, the flaws and the third-party systems of Hyperledger-type protocols; and no need to expose yourself to the insecurity of the EVMs of Ethereum-type protocols. It is no longer a question of private, federated or public projects. The task is to define the target architecture and to choose the appropriate framework to configure the conditions of the distributed network, its interconnections with the information system, and the certification of its history on a PoW chain (consumption being minimized by the periodic processing of aggregates).
On the market, at most 4 or 5 technologies allow legally strong evidence, confidentiality, and integration without compromising on resilience, performance, security, or freedom of governance. These are the minimum requirements for the large-scale deployment of a distributed evidence network, essential for information systems in a digital age.
For production projects, I recommend looking at tools and solutions such as Zendoo from Horizen or Elements from Blockstream. Equipped and mature, these solutions comply with the requirements described here. For your technology watch, Kadena is particularly promising, probably the only scalable blockchain without a tiered architecture.