Account recovery in a decentralized system

Using a secret word and a list of friends to recover a smart contract wallet or digital identity

Peter Porobov
Upala Digital Identity
10 min read · Oct 24, 2019


Account recovery on blockchain

We are developing Upala, an identity proof system. Its purpose is to distinguish people from bots and clones (people with multiple IDs). One person, one ID. It is a huge goal. To get there, we started a series of posts showing our thinking process. We will later turn these posts into a whitepaper.

As discussed in the previous article, the first step in building digital identity is building a decentralized account recovery tool. This will allow us to start building a user base around a smart wallet.

By a smart wallet we mean a smart contract wallet controlled by a single address or a set of addresses. A smart wallet enables enhanced security, batched transactions (e.g. an ERC20 approval and transfer in one transaction), multisig, two-factor authentication, and other features. An excellent example is Gnosis Safe. However, we believe there is room for innovation in recovery procedures.

Here we categorize all our knowledge on the topic. This is a very high-level overview of the right and wrong directions in the research of decentralized recovery.

TL;DR Give me the code.
A smart contract to manage recovery methods (called agents there). An example of a recovery method using a secret word and a secret list of friends (addresses hidden inside a SNARK).

Features needed

Here we only discuss access recovery features (create, recover, delete).

We have to be able to recover the account of a person who has lost everything: phone, computer, e-mail password, and access to social networks.

Features:

  • Account recovery
    - Initiate
    - Confirm (by a number of confirmations or threshold weight)
    - Re-initiate (block, panic or mark as malicious)
  • Account deletion (on death or when hopelessly hijacked).
  • Anonymity (no personal data, including social connections, should be exposed anywhere)

There could be different flavors of re-initiation. For the case when someone notices their account being stolen, we could provide several options: re-initiate the recovery process, block the account, start some special panic procedure, etc. But we’ve come to the conclusion that there is no need for that. We simply assume that initiation means: block the current account, cancel the current initiation process (if any), and start a new one.
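To make the feature list concrete, here is a minimal interface sketch in Solidity. The names (IRecoverableAccount, initiateRecovery, and so on) are our own illustration, not taken from the linked POC contract.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Hypothetical sketch of the feature list above. Initiation doubles as
// re-initiation: it blocks the account and cancels any recovery in progress.
interface IRecoverableAccount {
    // Start (or restart) recovery on behalf of a proposed new owner address.
    function initiateRecovery(address proposedNewOwner) external;

    // Confirm by a number of confirmations or by threshold weight,
    // depending on the chosen recovery method.
    function confirmRecovery(bytes32 recoveryId) external;

    // Delete the account on death or when it is hopelessly hijacked.
    function deleteAccount() external;
}
```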

Searching the solution

The problem of recovery is really a problem of identification. Some trusted entity must correctly identify the one who requested recovery (the victim). The entity should also make sure that the victim is acting of their own free will.

Ways to identify a person:

  • Through friends (those who know the victim)
  • Through a third party (those who don’t know the victim)
    - Using biometry (face, DNA, fingerprint, walk-print, teeth, veins, retina)
    - Using state ID.

Conditions of trust and the responsibility of an identifying entity.
The victim must be confident that the entity they trust:

  • Will not betray (collusion attack).
  • Will not be tricked (social engineering attack — a doll, an actor with a mask, an android).
  • Will check that the victim is acting of their own free will (blackmail attack).

Victim’s possible sources of trust (proofs of identity):

  • Those that cannot be lost:
    - Friends
    - Biometry
    - Memory (can be forgotten though)
    - Location
  • Can be lost or broken:
    - Paper
    - Any hardware

Methods of account recovery
Now, from the high-level sources of trust, let’s derive more concrete methods (or procedures) of recovery.

Memory-based methods of account recovery

Secret code word. The user thinks up a secret code word. The phone app reminds them of the word in the form of a quiz every month or so (like Authy does). The app also reminds the user not to take the quiz in public, so that no one sees the word. When recovery is needed, the user enters the word to initialize or confirm recovery.
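A minimal sketch of how this might look on-chain, assuming the wallet stores only a salted hash of the secret word (contract and function names are ours, not the Upala POC):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Minimal sketch: the owner commits to a salted hash of the secret word;
// entering the word later starts recovery from any fresh address.
contract SecretWordRecovery {
    address public owner;
    address public proposedOwner;
    bytes32 private secretHash; // keccak256(secret word, salt)

    constructor(bytes32 _secretHash) {
        owner = msg.sender;
        secretHash = _secretHash;
    }

    // Caveat: the word becomes public once this transaction is broadcast,
    // which is one reason the article later hides it inside a SNARK.
    function initiateRecovery(string calldata word, bytes32 salt) external {
        require(keccak256(abi.encodePacked(word, salt)) == secretHash, "wrong word");
        proposedOwner = msg.sender;
        // ...further confirmations (friends, time locks) would go here
    }
}
```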

Spawn point. The app guides the user through a mnemonic process to help them memorize the place where the account was registered. When recovery is needed, the user or one of their friends must visit that place. Here we have to split the Earth’s surface into sectors to account for GPS inaccuracy, so the possible entropy is too low. To increase entropy we can ask the user to memorize an ordered set of such places, i.e. a path.

Control questions. The app offers a choice of questions: the name of your pet, where your first flight was, your first car. Cons. The possible entropy is too low: the answers will probably be among the ones listed in a password-cracking dictionary.

Friends-based methods of account recovery

Facebook has a 3-out-of-5 trusted friends recovery scheme. It asks a user to select 5 trusted friends, who all get notifications when recovery is requested. But for Facebook this is just one of several possible methods. We cannot guarantee that 3 out of 5 friends will still be using the service when recovery is needed, so we cannot rely on this procedure alone.

Attacks to keep in mind:

  • Friends can betray.
  • Friends can be blackmailed. (Why not blackmail the victim then? Is the victim unreachable or better protected?)
  • Most important of all, friends can be tricked through social engineering, actors, dolls and deep fakes.

Friends-based methods

User-selected friends (3 out of 5).
The user selects a number of trusted friends. Recovery requires confirmation from any 3 of them.
Pros. Simple.
Cons. Some friends may not be available. In the early stages there may not be enough people to trust.
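A minimal Solidity sketch of the 3-out-of-5 scheme, assuming a hypothetical FriendsRecovery contract that simply hands ownership over once three pre-selected friends confirm:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Minimal sketch: the owner pre-selects trusted friends; any 3 confirmations
// hand the wallet over to the proposed new owner.
contract FriendsRecovery {
    address public owner;
    address public proposedOwner;
    uint256 public constant THRESHOLD = 3;

    mapping(address => bool) public isFriend;
    mapping(address => bool) public hasConfirmed;
    address[] private confirmers;

    constructor(address[] memory friends) {
        owner = msg.sender;
        for (uint256 i = 0; i < friends.length; i++) {
            isFriend[friends[i]] = true;
        }
    }

    // Any friend can start recovery on behalf of the victim's new address.
    function initiateRecovery(address newOwner) external {
        require(isFriend[msg.sender], "not a friend");
        require(newOwner != address(0), "zero address");
        _reset();
        proposedOwner = newOwner;
        _confirm();
    }

    function confirmRecovery() external {
        require(isFriend[msg.sender], "not a friend");
        require(proposedOwner != address(0), "no recovery in progress");
        _confirm();
    }

    function _confirm() private {
        require(!hasConfirmed[msg.sender], "already confirmed");
        hasConfirmed[msg.sender] = true;
        confirmers.push(msg.sender);
        if (confirmers.length >= THRESHOLD) {
            owner = proposedOwner; // recovery complete
            _reset();
        }
    }

    function _reset() private {
        for (uint256 i = 0; i < confirmers.length; i++) {
            hasConfirmed[confirmers[i]] = false;
        }
        delete confirmers;
        proposedOwner = address(0);
    }
}
```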

Friends selected at random by the system.
The user has a lot of friends (>10). When recovery is initialized, 3 of them, chosen at random, get requests for confirmation. If any of them stays idle, another friend gets notified after a timeout. And so on.

Pros:

  • Easier on-boarding — the user doesn't have to choose trusted friends.
  • A better compatibility with future Sybil-protection features — may serve as an additional incentive to avoid bots.
  • Much harder to collude than with the 3-out-of-5 scheme.

Cons:

  • More prone to social engineering. A more distant friend may not bother to check the person’s identity or may not know the person well enough.
  • A more distant friend may be reluctant to cooperate (less social incentive).
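A rough sketch of the rotation logic, assuming a hypothetical RandomFriendRecovery contract (the blockhash-based randomness here is weak and purely illustrative; in practice selection and notification would be handled more carefully):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Minimal sketch: recovery requests rotate through the friend list; if the
// currently selected friend stays idle past the timeout, anyone can escalate
// to the next randomly chosen friend.
contract RandomFriendRecovery {
    address[] public friends;
    uint256 public constant TIMEOUT = 3 days;

    address public currentFriend; // friend expected to confirm next
    uint256 public notifiedAt;
    uint256 private nonce;

    constructor(address[] memory _friends) {
        require(_friends.length > 0, "no friends");
        friends = _friends;
    }

    function pickNextFriend() public {
        require(
            currentFriend == address(0) || block.timestamp > notifiedAt + TIMEOUT,
            "current friend still has time"
        );
        uint256 i = uint256(
            keccak256(abi.encodePacked(blockhash(block.number - 1), nonce++))
        ) % friends.length;
        currentFriend = friends[i];
        notifiedAt = block.timestamp;
        // an off-chain app would watch this state and notify the selected friend
    }
}
```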

Threshold percentage of all friends.
Say, more than half of all friends must confirm recovery.
Pros. Same as above, but even simpler to implement.
Cons. A lot of friends have to take action.

Incentive and responsibility

We assume that social responsibility is enough. Most probably, one will never forget a traitor who tried to steal their account. As for social engineering attacks, the app could ask a friend to go through a quiz before taking action: “Have you talked over the phone? Is the recovery said to be urgent? Has your friend mentioned this quiz?” No one wants to be seen as a fool or as irresponsible.

The same goes for incentives. Probably no incentive is needed except a real-world, off-chain “thank you”, a gift, or a recovery party.

How to counter social engineering

A deep-faked friend’s voice calls over the phone and asks for recovery: “Hey, I’m on a business trip. I need recovery… [then comes a list of very persuasive arguments and some intimate facts of your friendship; the voice and manner of speech sound very real]… It is urgent! As we cannot meet right now, you have to alter your phone’s GPS coordinates. Here are the instructions. Please do it ASAP! It is a life and death question!” The pressure could be intense. To counter this attack vector we can add additional security measures on top of any of the methods above. What can we do?

Force personal intersection
Additional measures to force people to meet each other in person.

Witnessed:

  • One friend initializes, another friend meets the victim in person and takes pictures, and the last friend confirms the picture. Cons. The picture could easily be faked as well. No real enhancement of security.
  • Meet two friends simultaneously. Cons. It could probably be even easier to trick two friends into spoofing their GPSes than to trick one.
  • Witnessed by a third person. Cons. It is hard to create responsibility for the third person. See below (organizations).

App forced:

  • Confirming device proximity with GPS, Bluetooth, sound or camera. Or all at once, so it is really hard to fake. Cons. It is still possible to fake all of it.
  • FOAM’s dynamic proof of location (tech from the future). Cons. One may just send over a phone and persuade the friend to confirm.
  • Any of the app-forced measures plus taking pictures together. Photos can only be taken with the designated app (Upala). Another friend confirms the photo. Pros. Let’s stick with that for now.

Ask intimate questions
A question should only be defined while the account is still under the owner’s control. This eliminates situations where, during recovery, a fake victim asks the friend to ask “Where did we go together yesterday?”. Such a question may seem intimate to the friend (only the real victim could ask it), but it could easily be forged by someone spying on the victim.

Next, we could expect two types of answers: exact and abstract.

An exact answer is just like a password. The victim needs to enter exact information to proceed with recovery… But if it works the same as a password, why bother inventing it?

An abstract answer can only be verified by a friend during a conversation. Otherwise we’d have to predefine all its possible forms. Thus it is no better than just a conversation. All we have to do is remind all parties to verify each other thoroughly.

A good use of intimate questions could be as follows. The user is offered an option: when selecting friends, one may define the questions he or she would like to be asked during recovery. No one should be able to see these questions before the procedure. A friend receives the questions as a simple message along with the recovery request.

Account recovery through an organization (DAO)

An organization could identify a person through biometry (face, fingerprints, DNA) or a state-issued ID. Centralized or decentralized, the only way for an organization to build trust is to be exposed to some risk. Traditional centralized organizations can be exposed to legal risks. Decentralized ones rely on penalties. But whether we punish an organization by slashing its stake or its reputation, we have to catch it cheating first. How can we do that?

DAO-based Methods

DAOs
We could provide an incentive to create DAOs that offer recovery services (and probably take part in the Sybil-protection mechanism). Whether an organization relies on biometry or a state ID, the recovery procedure would be the same: the person would have to visit an office and confirm their identity. The question is how to build trust, or how to catch an organization cheating.

Did you know?
Face authentication is cruel. Apple says the probability of breaking their Face ID is 1 in 1,000,000. But what is the probability of failing to unlock your own phone? If we tie the recovery procedure to face recognition, we incentivize malicious actors to make the owner's face unrecognizable. It would be simpler to break a face than to break Face ID. Sounds like a joke, but it could really be an issue.

Random user
A random user identifying a person is essentially the same thing as an organization. The system selects a random person and offers them a reward for participating in the recovery procedure. The problem is the same as with an organization: how to punish a malicious or mistaken user.

Responsibility (catching a cheater)

Slashing (reporting a stolen account)
We could invent a special procedure for recovering a stolen account, like asking friends to remove friendship connections, or asking another DAO (or a number of DAOs) to confirm that the previous recovery attempt was malicious. But all of this increases complexity, enlarges the attack surface, and causes more confusion in the long run. A good way to use slashing is after account recovery: if there is a malicious recovery attempt, the victim just re-initiates the recovery procedure. After restoring their account, the user could submit a complaint that punishes the malicious actor’s stake (similar to how Erasure by Richard Craib handles slashing). But it seems that in a decentralized system this all comes down to betting and then to politics. And there is no reliable way to confirm that an account was stolen.

Random known cheater (secret shopper)
Another approach is inspired by Truebit. As we cannot confirm a cheater, let’s create one. The system could randomly select a user and offer them a reward for tricking a random DAO, also chosen by the system. The DAO would then earn reputation only by catching these secret shoppers. The secret shopper earns a little just by participating and a considerable amount by outwitting the DAO. Cons. Though this method helps prevent identification mistakes, it doesn’t help when a DAO cheats intentionally.

General purpose oracles
There is a demand for publishing real-world data on the blockchain. There is probably a way to use such future oracles in the recovery process.

The role of hardware in account recovery

We cannot rely on hardware during the recovery process because it could be lost or stolen. The role of any hardware (a phone, a security key) is then only to secure day-to-day access. The same goes for a seed phrase on paper, and for biometry, considering the thoughts above.

The role of time locks

It would probably be beneficial to let users set time locks wherever they want in order to enhance security.
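For illustration, a time lock could sit between a confirmed recovery and the actual ownership change, giving the real owner a window to cancel. A minimal sketch (our own naming, not the linked POC):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Minimal sketch: a confirmed recovery only takes effect after a
// user-configurable delay, during which the current owner can cancel it.
contract TimeLockedRecovery {
    address public owner;
    address public pendingOwner;
    uint256 public executableAt;
    uint256 public delay = 2 days; // set by the user to taste

    constructor() {
        owner = msg.sender;
    }

    // Would be called by a recovery module once enough confirmations arrive.
    function scheduleOwnerChange(address newOwner) internal {
        pendingOwner = newOwner;
        executableAt = block.timestamp + delay;
    }

    function cancel() external {
        require(msg.sender == owner, "only current owner");
        pendingOwner = address(0);
    }

    function execute() external {
        require(pendingOwner != address(0), "nothing scheduled");
        require(block.timestamp >= executableAt, "time lock not expired");
        owner = pendingOwner;
        pendingOwner = address(0);
    }
}
```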

Upala

We believe that the right way to do recovery is to share responsibility among the different methods described above. By diversifying the recovery process we add flexibility and security. Here is our choice among those methods, and here is how the recovery process may look:

  • Initialize with a secret phrase through one of the pre-selected friends.
  • Get two confirmations from real-world, in-person meetings with friends.
  • Get at least one confirmation from a random friend. As a feature, a user may create a list of questions that their friend will receive during recovery. The friend then confirms the person’s identity by asking these questions.

Here is a POC smart contract where a user can choose which methods they would like to use for recovery.
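To show how such a combination might be wired together, here is a simplified illustration (not the linked POC): recovery completes only when all three kinds of evidence from the list above have been collected.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Simplified illustration: each recovery "method" reports its result here,
// and recovery completes only when every required condition is satisfied.
contract CombinedRecovery {
    bool public secretPhraseOk;      // set by a secret-word module
    uint256 public inPersonConfirms; // photo/proximity confirmations by friends
    bool public randomFriendOk;      // confirmation from a system-picked friend

    function recoveryComplete() public view returns (bool) {
        return secretPhraseOk && inPersonConfirms >= 2 && randomFriendOk;
    }

    // In a real contract, the setters would be restricted to the
    // corresponding recovery modules ("agents").
}
```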

Anonymity

Here is how we plan to keep social connections secret.

Secret addresses
On establishing a friendship, each party generates a special secret address. These addresses are never used except for recovery. When recovery is needed, the one who is recovering sends money to the friend’s secret address. The friend confirms recovery from that address and returns all the excess funds with the transaction. Thus the friend holds the private key controlling the address, but only the one who recovers leaves any trace of a connection to it.
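A minimal sketch of this idea, assuming a hypothetical SecretAddressRecovery contract that only knows the fresh, never-used addresses and therefore reveals no social connection until a friend actually confirms:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Minimal sketch: friends confirm from fresh "secret" addresses generated
// when the friendship is established and never used anywhere else. The friend
// can return leftover gas money in the same call.
contract SecretAddressRecovery {
    mapping(address => bool) public isSecretFriendAddress;
    address public proposedOwner;
    uint256 public confirmations;

    constructor(address[] memory secretAddresses) {
        for (uint256 i = 0; i < secretAddresses.length; i++) {
            isSecretFriendAddress[secretAddresses[i]] = true;
        }
    }

    // Called by a friend from their secret address. Any excess ETH the victim
    // sent them for gas can be forwarded back as msg.value.
    function confirmRecovery(address newOwner) external payable {
        require(isSecretFriendAddress[msg.sender], "not a secret friend address");
        isSecretFriendAddress[msg.sender] = false; // each address is single-use
        proposedOwner = newOwner;
        confirmations += 1;
        // ...threshold check and ownership transfer would go here
    }

    receive() external payable {}
}
```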

Secret secret addresses
Using SNARKs, we can even hide the list of pre-selected friends from everyone, including those friends. Here is an example contract. The allowed msg.sender address, along with a secret word, is hidden inside the SNARK, so that even a trusted friend is unaware whether he or she is trusted. The contract raises an alert if someone unauthorized tries to initiate recovery.

Thank you!

Clap and support

Please donate: Ethereum, Bitcoin, Zcash, PayPal. Or buy advertisement space (help Upala and charity). Join us / subscribe: Twitter, Telegram channel, Telegram chat, GitHub, Reddit, Medium.
