Lightweight WAX Hyperion API Node Setup Guide

Custom Lightweight WAX History solutions for limited blockchain history

If you are familiar with EOSIO history APIs, then you know there are only a few reliable solutions, and the most popular one at the moment is Hyperion, developed by the EOSRio team.

As you might be aware, setting up and maintaining a full history Hyperion node in production is time-consuming, expensive and resource-intensive. At the moment, the storage of a full history Hyperion API node on the WAX blockchain is growing by around 30GB per week, and that is only expected to increase as adoption of the network grows. The good thing about Hyperion is that it is built with scalability in mind: since it uses Elasticsearch, which is horizontally scalable, it is easy to add more resources as the network grows. But from a DAPP or game developer’s perspective, this can be an expensive thing to set up and maintain.

As a block producer, API-as-a-service provider and early adopter of the EOSIO ecosystem, we have been actively testing and deploying different history services for various purposes. Recently, a customer of ours wanted a Hyperion API node on WAX, but they were not interested in full history for their application use-case; they only needed history going back a couple of days.

So we came up with a custom solution that satisfies their requirement. You can follow the steps below to set up this custom solution or reach out to us for a fully managed service. You can also check out our standalone node offerings and pricing here: https://waxgalaxy.io/tools/api-services

Requirements:

  • API node hardware (minimum specs): multi-threaded CPU at 4GHz or above, 32GB RAM, 200GB NVMe SSD
  • Full state-history node hardware (recommended specs): i9 CPU, 128GB RAM, 6TB NVMe SSD [For partial state-history, you can use lower specs or run it on the same server as Hyperion. It can also be started from a snapshot]
  • Hyperion version: v3.3.4-rc8 or above
  • Dependencies: Elasticsearch 7.14.X, RabbitMQ, Redis, Node.js v16, PM2, Elasticsearch Curator 5.7+

Installation Instructions

  • Use the instructions here and install all the necessary packages required for Hyperion (see the quick sanity checks after this list).
  • Now that all the key components are in place, we will install Elasticsearch Curator (to manage and prune the indices in our ES database).
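
Once the packages are installed, a few quick checks can confirm each dependency is up before moving on. This is only a sketch that assumes everything runs locally on default ports; adjust hosts, ports and credentials if your setup differs:

curl -s localhost:9200       # Elasticsearch should return a JSON banner with a 7.14.x version number
sudo rabbitmqctl status      # RabbitMQ should report a running node
redis-cli ping               # Redis should reply with PONG
node --version               # should report v16.x
pm2 --version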

Hyperion Setup:

Follow the instructions here to set up your Hyperion config files, and make sure to follow the points below for this custom setup scenario:

  1. As we only need a couple of days of history, we can set the start block in the chain config file to a block from a couple of days back, or enable the live-only mode setting.
  2. Make sure to set the blocks and deltas “index_partition_size”: 10000 [adjust according to your requirements]; this is important for identifying and pruning old data from the indices (see the example config snippet after this list).
  3. Start Hyperion and make sure it’s working!
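
As a rough sketch of where these settings live, the relevant fragments of the chain config file (for example chains/wax.config.json) could look like the snippet below. Field names can vary slightly between Hyperion releases and the start block value is only a placeholder, so check it against your own config:

{
  "settings": {
    "index_partition_size": 10000
  },
  "indexer": {
    "start_on": 251000000,
    "live_only_mode": false
  }
}

Alternatively, set live_only_mode to true if you only want to index from the chain head onwards.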

Elasticsearch Curator Setup:

You can find the ES Curator guide here: https://www.elastic.co/guide/en/elasticsearch/client/curator/current/index.html

The first thing to validate is version compatibility with Elasticsearch; you can find the compatibility matrix here: https://www.elastic.co/guide/en/elasticsearch/client/curator/current/version-compatibility.html

Now that you have identified the necessary ES Curator version, you can go ahead with the installation process by following the instructions here, or just follow the steps below:

wget -qO - https://packages.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

Add one of the following lines (noting the correct path, debian or debian9) to a file with a .list suffix in your /etc/apt/sources.list.d/ directory, for example curator.list:

deb [arch=amd64] https://packages.elastic.co/curator/5/debian stable main

or

deb [arch=amd64] https://packages.elastic.co/curator/5/debian9 stable main
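
For example, on Ubuntu or a recent Debian release you could add the debian repository line in one step (use the debian9 path instead if that matches your distribution):

echo "deb [arch=amd64] https://packages.elastic.co/curator/5/debian stable main" | sudo tee /etc/apt/sources.list.d/curator.list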

Now, you can go ahead and install the Elasticsearch Curator.

sudo apt-get update && sudo apt-get install elasticsearch-curator

After the installation, make sure to double-check the version:

curator --version

Now that the installation is done, you can go ahead and write the configuration for ES Curator. Below you will find the example config files needed to curate the ES cluster.

For using Curator, we created a directory in root with the default path: /root/.curator/

Now that you have the directory, you have to create two config files: one is curator.yml and the other one is action.yml.

curator.yml defines the config Curator uses to connect to the ES cluster. Below is an example config for your reference:

---
# Remember, leave a key empty if there is no value. None will be a string,
# not a Python "NoneType"
client:
  hosts:
    - 127.0.0.1
  port: 9200
  url_prefix:
  use_ssl: false
  certificate:
  client_cert:
  client_key:
  ssl_no_validate: False
  username: <Your ES Cluster Username>
  password: <Your ES Cluster Password>
  timeout: 30
  master_only: False

logging:
  loglevel: INFO
  logfile:
  logformat: default
  blacklist: ['elasticsearch', 'urllib3']
action.yml defines what needs to be curated from the ES cluster. Here you use regex to identify the index names and perform different actions on them. In our case, we want to delete wax-action-* and wax-delta-* indices older than 4 hours to save space.

---
# Remember, leave a key empty if there is no value. None will be a string,
# not a Python "NoneType"
#
# Also remember that all examples have 'disable_action' set to True. If you
# want to use this action as a template, be sure to set this to False after
# copying it.
actions:
  1:
    action: delete_indices
    description: >-
      Delete indices older than 4 hours (based on index creation date)
    options:
      ignore_empty_list: True
      disable_action: False
    filters:
      - filtertype: pattern
        kind: regex
        value: '^(wax-action-|wax-delta-).*$'
      - filtertype: age
        source: creation_date
        direction: older
        unit: hours
        unit_count: 4

Now that you have set up the config, you can test the actions with a dry run before actually executing them, using the following command. You can also change the log level to DEBUG to monitor any issues (see the snippet after the command).

curator --config /root/.curator/curator.yml /root/.curator/action.yml --dry-run
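
If you want more verbose output while testing, you can raise the log level in curator.yml (this simply changes the loglevel value already shown in the example config above):

logging:
  loglevel: DEBUG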

After all the testing is done, you can set up a crontab entry to trigger ES Curator according to your requirements.
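
As a sketch, a crontab entry like the one below would run Curator hourly. The binary path and log file location are assumptions here, so adjust them for your installation (check the actual path with which curator):

# Run Curator every hour using our config and action files; adjust the schedule to your retention needs
0 * * * * /usr/bin/curator --config /root/.curator/curator.yml /root/.curator/action.yml >> /var/log/curator-cron.log 2>&1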

There are some limitations though:

  • Obviously, you won’t have the full history of transactions.
  • You also cannot fetch the list of different tokens a user holds; in our experience you’ll have to use a different API for that. You can use /v1/chain/get_currency_balance instead to fetch token balances (see the example request after this list).
  • We have tested this on Hyperion 3.3.4-rc6, so there might be a few changes or differences in the latest release.
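
For reference, get_currency_balance is a standard chain API endpoint that takes a token contract, an account and a symbol. The API URL, account name and contract below are placeholders, so substitute your own values:

curl -s -X POST https://your-wax-api-endpoint/v1/chain/get_currency_balance \
  -d '{"code": "eosio.token", "account": "youraccount1", "symbol": "WAX"}'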

That’s it! You are good to go with your custom WAX Hyperion API node, which only stores data from the last couple of days and deletes the rest according to your specifications.

You can also check out our standalone node offerings and pricing here: https://waxgalaxy.io/tools/api-services

That’s it, guys! Thanks for taking the time to read through our updates.

If you think we are doing a great job, don’t forget to vote and support us. You can vote for us by following this link.

If you have more feedback regarding our work and would like to partner with us, please reach out to @sukeshtedla on Telegram.

You can also follow us on Medium and Twitter to keep up to date.

WAX Galaxy’s goal is to produce blocks, tools and useful content for the WAX blockchain networks by engaging, empowering & educating the community.

Sukesh Tedla

CEO and Founder of Unbiased, Regional Head — Swedish Blockchain Association, Blockchain Advisor — Zeptagram