Protofire Releases Infrastructure Templates for Gnosis Beacon Chain

Protofire Blog
Jun 13, 2022

Now, you can deploy Beacon Chain nodes and validators across multiple cloud providers and regions, improving fault tolerance, risk management, and geographic proximity to users.

Risks of running multiple validators on a single VM

Protofire has been working with Gnosis for over a year now. The collaboration has resulted, for instance, in the integration of a Gnosis Safe Multisig fork with Moonriver and of a Gnosis Safe fork with Evmos.

Several months ago, Gnosis launched its Beacon network, an equivalent of ETH2. In the wake of the release, Protofire assisted in deploying 800+ validators to the Gnosis Beacon Chain.

Since the technical requirements for becoming a validator are quite modest, it has been common practice to run thousands of validators on the same virtual machine. If that machine goes down, a significant share of the network's validators goes offline with it, which in turn may incur penalties for the affected validators.

What is the solution?

To address the issue mentioned above, the team at Protofire designed a set of tools that allows for distributing validators across multiple virtual machines, regions, and clouds. This approach brings decentralization to a whole new level, improving fault tolerance and risk management.

The general idea behind the solution is to:

  • set a hard limit of 128 validators per instance
  • distribute validators across multiple instances, geographical regions, and cloud providers
  • utilize basic infrastructure configurations across clouds for the sake of consistency
  • implement changes to the infrastructure exclusively via Terraform and Ansible
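The distribution logic behind the first two points can be sketched in Python. The instance list and the `distribute` helper below are illustrative only, not part of Protofire's actual tooling:

```python
from itertools import cycle

# Hypothetical instance slots spanning several clouds and regions.
INSTANCES = [
    ("aws", "eu-west-1", "node-1"),
    ("aws", "us-east-1", "node-2"),
    ("gcp", "europe-west1", "node-3"),
    ("azure", "westeurope", "node-4"),
]

MAX_VALIDATORS_PER_INSTANCE = 128  # the hard limit from the list above


def distribute(keystores, instances, limit=MAX_VALIDATORS_PER_INSTANCE):
    """Round-robin keystores across instances, capping each at `limit`."""
    capacity = len(instances) * limit
    if len(keystores) > capacity:
        raise ValueError(f"need more instances: {len(keystores)} > {capacity}")
    assignment = {inst: [] for inst in instances}
    targets = cycle(instances)
    for key in keystores:
        inst = next(targets)
        # Skip instances that have already reached the hard limit.
        while len(assignment[inst]) >= limit:
            inst = next(targets)
        assignment[inst].append(key)
    return assignment
```

Round-robin assignment keeps the load balanced, so losing any single instance takes out at most `limit` validators rather than the whole fleet.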

The diagram below shows how the approach is implemented, using AWS services as an example.

The default security rules allow inbound traffic to the P2P, JSON-RPC HTTP, JSON-RPC WebSocket, SSH, and AWS health check ports.

In addition to these resources, latency-based DNS records for each instance across all of the clouds are deployed into a designated hosted zone in AWS Route 53, along with a set of health checks that track the state of the endpoints.
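As a sketch of what such a record looks like, the following Python helper builds an UPSERT change in the shape accepted by Route 53's ChangeResourceRecordSets API. The record name and identifiers are hypothetical; note that latency-based routing always keys on an AWS region name, even for instances hosted in other clouds:

```python
def latency_record(zone_name, region, instance_ip, health_check_id, ttl=60):
    """Build a Route 53 UPSERT change for a latency-based A record.

    Follows the ChangeBatch shape of the ChangeResourceRecordSets API;
    the "rpc." subdomain and set identifier scheme are illustrative.
    """
    return {
        "Action": "UPSERT",
        "ResourceRecordSet": {
            "Name": f"rpc.{zone_name}",
            "Type": "A",
            "SetIdentifier": f"{region}-{instance_ip}",
            "Region": region,  # AWS region used as the latency routing key
            "TTL": ttl,
            "ResourceRecords": [{"Value": instance_ip}],
            "HealthCheckId": health_check_id,
        },
    }
```

Attaching a health check ID to each record means Route 53 stops returning an instance's address as soon as its endpoint fails, which is what makes the multi-cloud failover work.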

As for Google Cloud and Microsoft Azure, equivalent services are deployed: GCP Compute Network and Azure Virtual Network instead of AWS VPC, GCP Compute Instance and Azure Virtual Machine instead of AWS EC2, etc.

How to make it work

To start deploying your infrastructure, you’ll need:

  • Registered accounts with a cloud provider of choice (Amazon Web Services, Google Cloud Platform, Microsoft Azure), with programmatic access enabled.
  • Installed Python 3. It will be used to convert a JSON configuration file into Terraform configurations.
  • Installed Terraform to deploy infrastructure.
  • Installed Ansible to configure the environment on the newly created virtual machines.
  • Ready-to-use Gnosis Beacon Chain validator keystores.
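To illustrate the Python-based conversion step, here is a minimal sketch that turns a hypothetical JSON deployment config into Terraform's JSON syntax (Terraform reads `*.tf.json` files natively). The config schema and resource arguments are illustrative, not Protofire's actual format; a real `aws_instance` needs further arguments such as an AMI:

```python
def to_tf_json(config):
    """Convert a simple deployment config into Terraform JSON syntax.

    Illustrative `config` schema:
      {"nodes": [{"name": "node-1", "cloud": "aws",
                  "region": "eu-west-1", "type": "t3.large"}]}
    """
    resources = {}
    for node in config["nodes"]:
        if node["cloud"] != "aws":
            continue  # GCP/Azure nodes would map to their own resource types
        resources.setdefault("aws_instance", {})[node["name"]] = {
            "instance_type": node["type"],
            "tags": {"Name": node["name"], "Region": node["region"]},
        }
    return {"resource": resources}
```

The resulting dictionary can be written out with `json.dump` as a `main.tf.json` file and consumed by Terraform directly, which is what makes a single JSON config a convenient source of truth across clouds.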

As of now, an Amazon Web Services account is required, since it is used for DNS and health check management. By default, the AWS account also stores the Terraform state, though this behavior can be disabled.

Once all prerequisites are in place, start by signing in to your cloud accounts using each provider's CLI. Then, configure your deployment options and apply them with Terraform. Finally, deploy the application with Ansible. Now, you're all set.
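The Terraform and Ansible steps of this workflow can be expressed as a short list of CLI invocations; the directory, inventory, and playbook names below are placeholders, and running the commands (e.g. via `subprocess.run`) assumes Terraform and Ansible are installed:

```python
def deployment_commands(tf_dir="infra", inventory="inventory.ini",
                        playbook="validators.yml"):
    """Return the CLI steps of the workflow above as argument lists."""
    return [
        ["terraform", f"-chdir={tf_dir}", "init"],           # fetch providers
        ["terraform", f"-chdir={tf_dir}", "apply", "-auto-approve"],
        ["ansible-playbook", "-i", inventory, playbook],     # configure VMs
    ]
```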

Keeping all sensitive data in a single storage location may be cost-effective, but it risks losing everything at once. Deploying Gnosis Beacon Chain nodes and validators across several clouds with our set of tools protects the ecosystem against data loss and single points of failure.

Having tested the described templates on real-world projects, the team at Protofire has confirmed the efficiency of the Infrastructure as Code approach both in the short and long run. With distributed storage, running validators on multiple virtual machines enhances security and helps avoid validator penalties. As a result, ecosystem owners can refocus their effort from system support to platform diversification and modification.
