Published in APEX Network

Development Progress Update — Jimmy Hu follow-up interview

As we approach the end of 2019, the community management team and our CEO Jimmy Hu arranged a sit-down to talk about where APEX Network currently stands and what’s in store for the coming months. The purpose of the interview is to give the community new insights into development progress, and to answer follow-up questions that were submitted after the recent Blockchain Brad interview. We are confident this will give our community a deeper and more up-to-date picture of the interesting things that are coming.

Hey Jimmy, happy to talk to you again — how are you doing?

I’m fine, thanks. The year is coming to an end soon, and I hope everyone out there will enjoy their time as much as possible together with their family and friends. So before moving on to the more official part of the interview, I’d like to extend my warm wishes to everyone in the community and thank them for their support. A prosperous 2020 is coming soon, and we plan to show the enterprise ecosystem all the tools we have in store for them.

Well thank you Jimmy, that is very kind of you! We wish you all the best too. So let’s start with the questions, shall we?

GDPR compliant Data Protocols

Jude (product manager and one of the core developers) recently mentioned working on the GDPR-compliant data protocols. Could you tell us a bit more about that?

Sure. Blockchain enables innovative solutions to GDPR-related problems such as data privacy and permission-level protocols to ensure compliance. However, certain experts have stated that GDPR and blockchain form a paradox in which the security and privacy of data is uncertain or even compromised when a public ledger comes into the picture. Our solution to this, which we originally outlined in the development paper, is the ATDM (APEX Transactional Data Management) system. It involves storing “transactional data”, which refers to the data of the consumer/user, in off-chain nodes that run parallel to the APEX Network blockchain, while logging the cooperation between parties on-chain to ensure accountability. These are what we have referred to as Data Cloud Nodes. The ATDM runs on the APEX VM (our virtual machine, or smart contract system). However, ATDM functions like a program or smart contract itself and would require oracle-like data connectors to connect to the relevant off-chain data points.
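To make the split concrete, here is a minimal sketch of the idea described above: raw user data lives only in an off-chain node, while the chain records just a fingerprint of the data and the cooperating parties. All class and method names here are hypothetical illustrations, not the actual APEX API.

```python
import hashlib
import json
import time


class DataCloudNode:
    """Hypothetical off-chain store: holds raw user data, never the chain."""

    def __init__(self):
        self._store = {}

    def put(self, record_id, payload):
        self._store[record_id] = payload
        # Only a fingerprint of the data ever leaves this node.
        raw = json.dumps(payload, sort_keys=True).encode()
        return hashlib.sha256(raw).hexdigest()

    def erase(self, record_id):
        # GDPR "right to erasure": the raw data can be deleted off-chain
        # while the on-chain accountability log remains intact.
        self._store.pop(record_id, None)


class OnChainLog:
    """Stand-in for the on-chain, append-only accountability log."""

    def __init__(self):
        self.entries = []

    def record(self, record_id, data_hash, parties):
        self.entries.append({
            "record_id": record_id,
            "hash": data_hash,
            "parties": parties,
            "timestamp": time.time(),
        })


node, chain = DataCloudNode(), OnChainLog()
h = node.put("user-42", {"email": "alice@example.com"})
chain.record("user-42", h, parties=["consumer", "enterprise"])
node.erase("user-42")  # raw data gone off-chain; audit trail stays on-chain
```

The point of the separation is that erasing the off-chain record does not touch the ledger, so accountability and the right to erasure can coexist.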

Ok, so the data is stored off the actual blockchain on data nodes and not directly on the public ledger. What makes the ATDM different from a typical decentralized cloud-like storage solution, though?

Since the ATDM protocol is smart contract based, it is predefined with a set of non-alterable rules. This makes a strong case that the protocol can be used to enforce GDPR rules. The legal implications of this protocol and its effectiveness may vary across jurisdictions, though it is a significant upgrade from an opaque enforcement process that is hard to audit.

Sounds good, but — there’s always a but. In your words: why do we need this storage solution when there are trillion-dollar companies already running huge centralized storage solutions collecting user data from all of us?

Hah — well, that’s exactly why we need this, actually. The 21st century digital economy is transforming society and business right before our eyes. There has been a ubiquitous saying that “data is the new oil” of the 21st century, driven by a proliferation of data and AI. In an increasingly tech-driven landscape, a phenomenon that goes hand in hand with this trend is the concept of a “winner-take-all” economy. An easy example is companies like Google and Amazon, which use economies of scale and the internet to become near-monopolies. In this type of environment, mainly the tech monopolies and their associates benefit, leaving the working class behind, unable to capitalize on the value of data (especially their own data) and AI. We will see a trend of consumers demanding to take control of their own data and reap the value from it — this trend is not exactly what GDPR is trying to achieve, but it is a “problem” that fits in the same bucket.

Thanks Jimmy. Yeah, a large part of our existing community would agree with you there, we have had discussions about privacy and data monopolies in the past. Now, there’s a long way to go from thinking about the solution to actually implementing it. How far have you guys actually come?

We have actually already implemented prototypes in two of the pilots and are currently testing the system for robustness. We are still trying to perfect the off-chain data connection aspect. The problem we are working on currently is how to ensure maximum privacy while at the same time maintaining the performance needed for enterprise-level applications.

Ok, good to know that it’s currently being tested, which means it’s gone beyond the pure “thought” stage, so to speak. In the past there were talks of integrating blockchain services with existing APEX Technologies products; is this still on the table?

Yeah, definitely. The general technical route is clear, but we are working on improving some aspects, as I mentioned above. This development cycle will take another couple of months, but we expect to roll this out as part of the APEX SDK for our enterprise client base by March 2020, enabling them to experiment with the protocol, and it will also be available to be plugged into NEXUS. Considering the synergistic effects with our CDP products, completion of this module will make APEX Network one of the few real, commercially viable public chain projects.

Good stuff, looking forward to that! You mentioned the off-chain aspect and the data nodes running parallel to the blockchain. They are, however, smart contract based, and will be communicating with the blockchain. How does this impact upgradability — can features be expanded, or will it be static once launched?

Good question. Since how we fetch data from off-chain sources can be abstracted away from the APEX VM and the rest of the blockchain, updates to the Data Cloud nodes will not necessarily require a hard fork. We will try to ensure that the on-chain portion is ready ASAP — the bulk of this module’s logic is off-chain, which is a benefit to implementation flexibility.
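The abstraction Jimmy describes can be sketched as a connector interface: the on-chain logic is written against the interface only, so a new connector implementation can be rolled out to Data Cloud nodes without touching chain logic. The names below (`DataConnector`, `RestConnector`, etc.) are illustrative assumptions, not actual APEX components.

```python
from abc import ABC, abstractmethod


class DataConnector(ABC):
    """Hypothetical boundary: *how* off-chain data is fetched is hidden
    behind this interface, so connector upgrades don't require a hard fork."""

    @abstractmethod
    def fetch(self, record_id: str) -> dict: ...


class RestConnector(DataConnector):
    """First-generation connector shipped with the Data Cloud nodes."""

    def fetch(self, record_id):
        return {"id": record_id, "source": "rest"}


class GrpcConnector(DataConnector):
    """A later connector can replace RestConnector on the off-chain nodes
    without any change to the logic that consumes the interface."""

    def fetch(self, record_id):
        return {"id": record_id, "source": "grpc"}


def contract_logic(connector: DataConnector, record_id: str) -> str:
    # Written against the abstraction only; unaware of transport details.
    data = connector.fetch(record_id)
    return f"processed {data['id']} via {data['source']}"
```

Swapping `RestConnector` for `GrpcConnector` changes only the off-chain side, which is the flexibility benefit mentioned above.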

Node Desktop Client & APEX Node Manager GUI (by Aldo)

The community has been asking about the desktop client, what’s the progress with this item?

We’ve been working on two prototypes of the desktop node client: one using the JVM (Java Virtual Machine) and one using a Microsoft C# framework. As we mentioned previously, in light of limited core resources we’ve decided to prioritize APEX VM application-level features and chain features (sidechains, etc.) over node client development.

On the other hand, we have recognized the skillset of our community developer Aldo and have agreed to have him work, on a freelance basis, on a graphical, user-friendly and efficient application to manage and run supernodes. The concept has been written out, and Aldo has already delivered a quite promising proof of concept for this application — feel free to share the details, guys.


Provide a graphical application to manage and run supernodes. Features could be:

  • Installing, launching and updating the supernode core
  • Graphical editing of the supernode settings (“settings.conf”)
  • REST API for dApps. The management application functions as a proxy for
    the supernodes, so the raw RPC port of the node itself does not have to
    be exposed. This also allows custom users and custom authentication for
    each supernode.
  • Key management for dApps. Keys are stored in encrypted form inside the
    management application, so dApps/users do not have to store their own
    keys but can use them indirectly over the REST authentication.
    Example: dApp creates tx → pushes unsigned tx to management REST →
    management REST signs the tx with the user’s stored key and passes it
    to the supernode
  • Deployment as a Docker image which provides both the supernode core and
    the management application internally
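The proxy-and-sign flow from the list above can be sketched in a few lines. This is a toy illustration under stated assumptions: HMAC stands in for real transaction signatures (an actual chain would use elliptic-curve keys), the in-memory key store pretends to be encrypted at rest, and all names are hypothetical rather than taken from Aldo’s application.

```python
import hashlib
import hmac
import json


class KeyStore:
    """Hypothetical key store inside the management application.
    HMAC-SHA256 is a stand-in for real transaction signing."""

    def __init__(self):
        self._keys = {}  # imagine these encrypted at rest

    def add_key(self, user: str, key: bytes):
        self._keys[user] = key

    def sign(self, user: str, payload: bytes) -> str:
        return hmac.new(self._keys[user], payload, hashlib.sha256).hexdigest()


class Supernode:
    """Stand-in node: its raw RPC port is reachable only by the proxy."""

    def __init__(self):
        self.mempool = []

    def broadcast(self, tx: dict) -> dict:
        self.mempool.append(tx)
        return {"accepted": True, "txs_pending": len(self.mempool)}


class NodeManagerProxy:
    """REST-style proxy: the dApp submits an unsigned tx; the proxy signs
    it with the user's stored key and forwards it to the supernode."""

    def __init__(self, keystore: KeyStore, supernode: Supernode):
        self.keystore, self.supernode = keystore, supernode

    def submit(self, user: str, unsigned_tx: dict) -> dict:
        raw = json.dumps(unsigned_tx, sort_keys=True).encode()
        signed = {**unsigned_tx, "signature": self.keystore.sign(user, raw)}
        return self.supernode.broadcast(signed)


ks = KeyStore()
ks.add_key("alice", b"alice-secret-key")
proxy = NodeManagerProxy(ks, Supernode())
result = proxy.submit("alice", {"from": "alice", "to": "bob", "amount": 10})
```

The dApp never sees the key and never talks to the node directly, which is exactly what the REST-proxy and key-management bullets describe.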

Completed so far (status)

  • Automated button click node deployment
  • Graphical node configuration through application
  • Node shutdown and redeployment through application
  • Pretty Graphical User Interface
  • Docker rollout
  • 2FA at registration

The progress so far can be reviewed at:

Aldo’s Dev Force

Aldo told us he would love to get additional community developers and testers working on this application alongside him, kind of a community “Dev Force”. Any thoughts on that?

Right, seeing the amount of work that was needed to get this prototype rolling (already 2k+ lines of code contributed), it would clearly be beneficial to get additional developers involved. Aldo’s work is honestly quite encouraging to see, and it seems to have sparked an interest in the tech community. Currently we view this work and the engagement around it as an important additional data point for assessing potential supernode candidates. However, considering the amount of effort needed, we are also considering various milestone-based incentives for these kinds of contributions — a sort of milestone-based community development budget.


[Screenshots: a register screen where the user sets up an account with two-factor authentication, a login screen, a node screen for installation and deployment of a supernode, and five configuration tabs that allow the user to graphically edit the supernode settings]

Some of the community developers have already expressed great interest in joining Aldo’s Dev Force, so this is a very exciting item for us!

Team and Github

You recently stated that the blockchain team now consists of 12 people. What are their specific tasks and roles?

Most of the team members work on main chain development, with the remainder in product management. Two team members are involved in app-level development, while pilot-level implementation support is managed outside of this team and does not impact its resources.

A couple of community members have noticed that there haven’t been many GitHub commits lately. Is there any specific reason for this?

As the next release would be a major update, we are careful to ensure that only substantial, high-quality code is released — frankly, this also holds up better under scrutiny.

Token swap

Moving on to another big item that the community has requested an update on: the last estimation for the token swap was EOY 2019. Can you provide an update on the token swap and where we stand with regards to it?

It was previously announced that the token swap would take place around this time. However, looking at the current state of development and the technical road ahead, and following recent communications with different ecosystem partners and players, we have concluded the following:

  • The team feels that, at this point in time, performing a token swap would not actually drive token value or value for APEX Network in terms of ecosystem and logistics. One primary goal is to have the desktop client ready before organizing the token swap. As mentioned before, Ledger integration was key to the desktop client, so when this key feature was delayed there were some structural changes in the desktop client development. Lastly, the team is still looking into different routes for managing a safe token swap event, as the Ledger hardware wallet team has not yet worked out any concrete terms regarding an integration cooperation.
  • Exchange discussions have highlighted that exchanges would require a fully functioning blockchain that has been battle tested for a certain period of time before considering integration. This is very understandable from their perspective: on the one hand, the exchange needs to ensure the safety of its integration and its users’ funds, and on the other hand it will not have to account and plan for the burden of an extra future integration — in terms of human and working capital as well as extra safety measures — due to a future token swap.
  • We plan to potentially fork, and thus update, the mainnet a couple of times as we introduce new features (such as sidechains and alliance chains). A prime example would be the machine-learning GDPR-compliant data protocols that were mentioned previously. We plan to integrate these early next year.


Ok Jimmy, thanks for the update. Now for the final item on the agenda: what about liquidity?

So far several exchanges have been approached. Currently they either demand unreasonably high fees that do not correspond to the value they would bring the project (fees that would take up a disproportionate part of the development budget), or they require the token swap to be completed and the chain to have been running for a while, as mentioned above.
Hence we will keep monitoring the situation and maintain a dialogue with interested exchanges to see when it makes sense to strike a deal in the future.

Appreciate the update Jimmy, thanks for taking the time to sit down with us. We know many in the community have appreciated the increased frequency of updates lately and would love to see this continue! As for us here in the community management team, we look forward to seeing what’s coming next year, as well as getting more updates on the enterprise pilots during our next talk.

Thank you guys, and see you soon!

All the best,

Jimmy Hu and the APEX Community Management Team



