Dom Steil

Multi-Protocol Blockchain Computing

The Enterprise Blockchain Computing market is growing rapidly, with Fortune 500 companies forming internal teams to develop blockchain applications that create separation in their respective industries. Yes, there are differences between protocols; that is clear. But being locked into a specific protocol or vendor is not advantageous to the enterprise customer. There are certain requirements that must be met in order for the Enterprise to be able to leverage blockchain and achieve maximum value from the technology.

Shared transactional processes and shared state are a must.

We are past provisioning and have entered the era of discovering known nodes that are given authority to read from and write to the ledger. Forming a consortium is a methodology for streamlining authentication and discovery. A DID (decentralized identifier) needs to be used to authenticate, create, sign, and record a transaction; this is important. Workflow and shared process are separate from the use of a DID at the Enterprise level.
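The authenticate-sign-record flow above can be sketched in a few lines. This is a hypothetical illustration only: HMAC stands in for a real DID signature scheme (such as Ed25519), and the DID string, key, and payload are invented for the example.

```python
import hashlib
import hmac
import json

# Sketch: a DID holder signs a transaction before it is recorded on
# the ledger, and any verifier with the key material can check it.
# HMAC is a stand-in for a real asymmetric signature scheme.

def sign_transaction(did, secret_key, payload):
    """Create a signed, ledger-ready transaction envelope."""
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(secret_key, body, hashlib.sha256).hexdigest()
    return {"did": did, "payload": payload, "signature": signature}

def verify_transaction(secret_key, tx):
    """Recompute the signature and compare before accepting the write."""
    body = json.dumps(tx["payload"], sort_keys=True).encode()
    expected = hmac.new(secret_key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tx["signature"])

tx = sign_transaction("did:example:acme", b"shared-secret", {"amount": 100})
assert verify_transaction(b"shared-secret", tx)
```

The point is the separation the paragraph draws: the signing identity (the DID) is independent of whatever workflow produced the payload.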

The current paradigm: there is a POC, then a pilot with partner company X, who will provide this set of data to the ledger that we share, which removes the need for us to manually reconcile the process, giving us the operational efficiencies needed to reduce our cost and increase our top line. This is what we are hearing from many enterprise customers in the US, Europe, and Asia. However, it is not a direct representation of the reality of enterprise blockchain solutions today. Where are the production applications being used for provenance, connectivity, immutability, and transactional efficacy? There needs to be a discovery mechanism for application-specific nodes that enterprises can offer and share: a network of enterprise nodes that leverages a B2B messaging protocol, shared process, shared state, transactional finality, process upgradeability, scalability, and partitioning, with interfaces tightly integrated into growth-critical data assets and third-party applications.

There needs to be a shared microservices approach to building these applications, so that participants can trust that the process, and therefore the result they are using, is shared; in other words, that there is consensus as to the process.

These processes need to be upgradeable. The immutable nature of smart contracts in today's contract-oriented programming abstraction does not provide this without another protocol layer on top of the core virtual machine: a kernel that is upgradeable, or a set of libraries inserted into the contract on original deployment that enables versioning or referencing for later use.
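The upgradeable-kernel idea can be sketched as a dispatch layer: the entry point stays fixed while the process behind it is versioned. This is a minimal illustration, not any specific protocol's mechanism; the `Kernel` class and fee function are invented for the example.

```python
# Sketch of an upgradeable kernel: callers always hit the same entry
# point, while the implementation is looked up in a version registry,
# so the shared process can be upgraded without redeployment.

class Kernel:
    def __init__(self):
        self._versions = {}   # version number -> {function name -> callable}
        self._active = None

    def register(self, version, functions):
        """Install a new process version and make it active."""
        self._versions[version] = functions
        self._active = version

    def call(self, name, *args):
        """Dispatch through the currently active version."""
        return self._versions[self._active][name](*args)

kernel = Kernel()
kernel.register(1, {"fee": lambda amount: amount * 0.02})
assert kernel.call("fee", 100) == 2.0

# An upgrade is a new registration, not a new deployment.
kernel.register(2, {"fee": lambda amount: amount * 0.01})
assert kernel.call("fee", 100) == 1.0
```

On-chain, the registry itself would live in contract storage so that peers agree on which version is active.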

Reserving state is currently a public good and is free beyond the one-time cost of deployment. This will need to change, or there will need to be a way to decide what on the chain is trash and can be removed over time due to inactivity or lack of incentive or payment, and what needs to remain in the world state. Enterprise Blockchain Networks, or Business Networks within the realm of permissioned, known-access networks, are essentially Intranet 2.0. There are innovations happening on both closed and open blockchains. Permissionless innovations are undoubtedly driving the future of this technology, doing things that were never before possible. Non-repudiation on permissioned networks, using virtualization to share a federated data set, is one enterprise application that is industry- and sector-agnostic, though it requires customization and services for each individual application.

Blockchain networks are enabling the permissionless tokenization of anything. Anything can be tokenized, and a market could be created from any one entry point on the network. Initial Coin Offerings are the initial phase of tokenization for capital formation; the tokenization of assets that are formally unowned or unapproved is interesting. Is there a way to provide an abstraction layer to approve tokenization of a digital or real-world asset? How does one tie tokenization of a real-world asset to the blockchain's digital world state? Are virtual inventories usable in the physical world, and is the overlay needed?

There is a need to build enterprise blockchain applications that enable a shared, single view of transactional processes to augment existing siloed systems. Opting into business networks that are completely decentralized, where assets are frequently moving between participants in the market, is the way that tokenization and assets should be moved onto ledgers. Where does the data need to come from to execute computer logic that is shared and agreed upon? The Oracle problem is a multi-protocol problem. It does not matter whether you are using Ethereum, Hyperledger, or Corda; there needs to be a trusted verification mechanism for the data that is being used to auto-execute a transaction flow or set of state changes.
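One common protocol-agnostic shape for that verification mechanism is agreement across independent data sources before any state change auto-executes. A minimal sketch, with invented source names and a simple median-plus-tolerance rule standing in for whatever attestation scheme a real network would use:

```python
from statistics import median

# Sketch: a transaction flow auto-executes only when independent
# oracle reports agree within a tolerance; otherwise it blocks.

def verified_value(reports, tolerance):
    """Return the agreed value, or None if the sources diverge."""
    values = list(reports.values())
    mid = median(values)
    if all(abs(v - mid) <= tolerance for v in values):
        return mid
    return None

reports = {"source_a": 101.0, "source_b": 100.5, "source_c": 100.8}
assert verified_value(reports, tolerance=1.0) == 100.8

# One outlier source is enough to block auto-execution.
reports["source_b"] = 250.0
assert verified_value(reports, tolerance=1.0) is None
```

The same check works regardless of whether the consuming contract runs on Ethereum, Hyperledger, or Corda, which is the multi-protocol point.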

Enterprise Smart Contracts consist of transaction types and transaction flows that are distributed on a log and executed on every node. This is true of Smart Contracts on Ethereum, which are imprinted into the log that every node shares, and true of the Chaincode that is instantiated on Hyperledger nodes in the network. There is a set of functions that cause state changes within a group of peers, whether it is thousands of them or four enterprise customers. The state is agreed upon, and the individuals or companies can trust that they are looking at the same state derived from the shared, agreed-upon processes.
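The claim above, that peers replaying the same log through the same functions arrive at the same state, can be shown concretely. This is a generic replicated-state-machine sketch, not any particular protocol's code; the transfer transaction type and party names are invented for the example.

```python
import hashlib
import json

# Sketch: a deterministic state-transition function shared by all
# peers means replaying the same log yields the same state, which
# peers can confirm by comparing a state hash.

def apply_transaction(state, tx):
    """Apply one transaction and return the new state."""
    balances = dict(state)
    if tx["type"] == "transfer":
        balances[tx["from"]] = balances.get(tx["from"], 0) - tx["amount"]
        balances[tx["to"]] = balances.get(tx["to"], 0) + tx["amount"]
    return balances

def state_hash(state):
    """Canonical hash of the state, independent of key order."""
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

log = [{"type": "transfer", "from": "acme", "to": "globex", "amount": 25}]

# Two independent peers replay the same shared log...
peer_a = {"acme": 100, "globex": 0}
peer_b = {"acme": 100, "globex": 0}
for tx in log:
    peer_a = apply_transaction(peer_a, tx)
    peer_b = apply_transaction(peer_b, tx)

# ...and can trust they are looking at the same state.
assert state_hash(peer_a) == state_hash(peer_b)
assert peer_a["globex"] == 25
```

Whether the group is four companies or thousands of nodes, the trust comes from the determinism of the shared functions, not from the size of the group.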

At its core, blockchain technology is a ledger, yes, yet a ledger with stored procedures; what is new is distributing verifiable value and state. Creating Business Networks and virtualization of data across multiple protocols at the application layer provides a better and faster way of doing business. Virtualizing value and distributing it amongst a variable set of unknown, trustless peers is new.

Is transactional efficiency the most important aspect of blockchain technologies? Has it created the separation and arbitrage? Is it as fast to send a payment and verify a transaction as it is to get the physical good there? Is there an opportunity to optimize and recalibrate the efficacy of the payment and transfer of goods? Which one is driving the speed of the other? If the transactional efficiencies create enhanced visibility, what part needs to be sped up?

Docker containers are core components of blockchain nodes. Each company transports a Docker container as a transferable set of its transaction history and its processes. Where a Docker container lives is unimportant as of now, but it is a question that must be answered in time: where and how a Docker container is transported and instantiated. Where do the images of state live, and are these images the immutable logs of verticalized enterprise operations in the future?

Automation and provisioning of nodes is the first step; using Step Functions and CloudFormation templates or Terraform, this first step of the process can be automated for the enterprise. Companies, developers, and system integrators are figuring out, though, that speed to provision the network does not lead to an immediate value-add for the enterprise customer. Automation of resources via API is a superpower we were gifted; it is powerful to have unlimited computational resources that you can set up with a JSON template: a JSON template to define a set of conditions and the requirements for the network, the API gateway, the containers, the subnets.
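What such a template might look like can be sketched as below. The resource types and properties here are invented for illustration (they follow a CloudFormation-like shape but are not a real AWS schema); only the categories, network, API gateway, containers, subnets, come from the text.

```python
import json

# Sketch: a minimal, CloudFormation-style JSON template describing
# the pieces of a provisioned blockchain stack. Types and properties
# are illustrative placeholders, not a real provider schema.

template = {
    "Description": "Minimal blockchain network stack (illustrative)",
    "Resources": {
        "NodeNetwork": {"Type": "Example::Blockchain::Network",
                        "Properties": {"MemberCount": 4}},
        "ApiGateway": {"Type": "Example::Api::Gateway",
                       "Properties": {"StageName": "prod"}},
        "NodeContainer": {"Type": "Example::Container::Service",
                          "Properties": {"Image": "ledger-node:1.0"}},
        "NodeSubnet": {"Type": "Example::Network::Subnet",
                       "Properties": {"CidrBlock": "10.0.1.0/24"}},
    },
}

rendered = json.dumps(template, indent=2)
assert "NodeSubnet" in rendered
```

The ease of writing this file is exactly the point the next paragraph makes: the template provisions infrastructure, but it says nothing about the KPIs the network is supposed to move.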

Though this power unfortunately does not lead to achievable Key Performance Indicators for the Enterprise Customer. It leads to less downtime for hosted transaction services, but there are different requirements for the enterprise customer. The Enterprise Customer needs to define and discover the KPIs for a blockchain network. What were the KPIs for the first intranets? Public blockchain networks don't need KPIs; the price of the coin is the KPI.

Multiple blockchain protocols are being used in the Enterprise, and Enterprise customers need to be able to understand the advantages and disadvantages of using certain protocols. The next step is actually getting the data into the protocol. The third step is actually implementing an agreed-upon process, or the agent, that is shared between the companies. Then determine a way to value the network by setting performance indicators as to whether or not the investment in the network is returning dividends for all participants. The most important part of the above is the data. The data that is used in the Enterprise on shared networks is of utmost importance. Why is this?

Shared, real-time data that is trusted can be used across other technologies. Is there something to this? The fact that blockchains can bifurcate is an important lesson, in that bifurcation is a failure in consensus. Until that failure of consensus, there is agreement as to state.

Enterprise agreement on state leading to verticalization may be the killer application, at least for Intranet 2.0. Blockchains have certain properties and attributes that are secured by a proof of X. These attributes and properties can be used as a solution to a challenge or a question. Logically centralized and organizationally decentralized databases that are secured by a proof provide transactional efficiencies, not data efficiencies. Immutable persistence for transactions is key, as is the petrified effect that blockchains provide as an attribute. The enterprise will adopt blockchain protocols and applications that provide a scalable and secure mechanism for using attributes that provide efficiencies by proof. These systems create separation in the respective industry.

For public networks, tokenization speaking from radical truth is the killer application. Decentralized organization through incentives creates virtual camps that can self-organize. Enterprise organization still needs a chief conductor for the orchestra. The microservices coordinator is important to improve, coordinate, automate, and actually actualize the value of the network.

Enterprise Blockchain Networks will take time, though it is no longer early. We are entering the node (org) discovery era, and this will be done in different ways; my bet would be on using a platform enterprises are already on and providing a way for them to discover who else is also already there.