Part 2 — Building an API Token system

Continuing from Part 1, this post discusses the last component of the authentication server that is essential to preserving the CIA Triad: Confidentiality, Integrity and Availability.

We already discussed using a public-private keypair to verify the integrity of the token, i.e., that it was indeed generated by the authentication server. However, the information within the token, such as the email address, IP address or other data pertinent to the user, is not protected from the application that receives the token.

As the identity provider, it is essential that the tokens generated are "confidentially" secure, i.e., no information that the application is not authorized to see is made available to it. By this measure, the JWT as is will not be "confidentially" secure.

By Value vs. By Reference

Many token issuers use the terminology "by value" to denote that the token issued is "Bearer" in nature: it carries its value within itself and cannot be revoked before it expires; as long as it carries the necessary expiration and scopes, it will be honored.

"By Reference", on the other hand, means that the token issued is merely a pointer to the original token data stored elsewhere. It can be revoked at any point in time based on external factors, after which the token will be rejected.

This concept of By Reference is widely used to enforce the Confidentiality part of the CIA triad.

A simple analogy for comparison: a token-by-value is similar to cash, which inherently carries its value when presented, while a token-by-reference is similar to a credit card, whose value is ascertained only after verification against an authoritative system.

A sample By Value token claim might look like this:

# Token By Value
{
  "sub": "110169484474386276334",
  "name": "Test User",
  "iss": "https://www.ebay.com",
  "iat": 1433978353,
  "exp": 1433981953,
  "email": "testuser@gmail.com",
  "email_verified": true,
  "given_name": "Test",
  "family_name": "User",
  "locale": "en"
}

While a Token by reference might simply look like this:

# Token By Reference
{
"ref": "AgAAAA**AQAAAA**aAAAAAEWg**nY+sHZ2PrBmdj6wVnY+sEZ2PrA2dj6wMkIGkCJCGoA2dj6x9nY+seQ+/5wK1dsk3EOEY7BDg7VHK/CmDimCvVPbtJankHhzJUF8rU876Qzjs"
}

The token-by-reference obfuscates the original claims in the token before it is sent back to the application, preventing leakage of private user information.

Methods used for Token-By-Reference

  1. Every token issuer uses its own implementation to create a token-by-reference structure. A simple implementation is to encrypt the token-by-value to generate the reference.
  2. Another method is to persist the token-by-value in a persistent store and generate a random reference to serve as the token-by-reference.
  3. A third is to generate a new symmetric key for every user consent and encrypt the token with that key, so that revoking the token is as simple as revoking the symmetric key.
  4. Many more…
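The second method above can be sketched in a few lines. This is a minimal illustration with an in-memory dictionary standing in for the persistent store; the function names and the reference length are assumptions, not a prescribed API.

```python
import secrets
from typing import Optional

# In-memory stand-in for a replicated persistent store (assumption).
_token_store: dict[str, dict] = {}

def issue_reference(claims: dict) -> str:
    """Persist the token-by-value and return a random, opaque reference."""
    ref = secrets.token_urlsafe(48)  # unguessable; carries no claim data itself
    _token_store[ref] = claims
    return ref

def resolve_reference(ref: str) -> Optional[dict]:
    """Look up the original claims; None means unknown or revoked."""
    return _token_store.get(ref)

ref = issue_reference({"sub": "110169484474386276334", "scope": ["read"]})
assert resolve_reference(ref)["sub"] == "110169484474386276334"
assert resolve_reference("unknown-ref") is None
```

Because the reference is random rather than derived from the claims, the receiving application learns nothing about the user from the token itself.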

<image>

Token Revocation

The usage of token-by-reference also gives an additional, crucial advantage: revocability. At any given point in time, the identity provider and the authentication server should be able to revoke an individual token, the tokens issued to a single app, or all issued tokens. Token-by-reference provides a simple but effective mechanism for each of these revocation needs.

Consistency Models

Irrespective of the method chosen, it is imperative to follow the correct consistency models for token generation and validation. For instance, strong (atomic) consistency is paramount for new token generation and for token validation: a token generated on receiving consent must be honored when presented immediately after generation. Eventual consistency, meanwhile, may be acceptable for token auditing, cache replication, bulk token revocations and the like.

Authorization Scopes and Exposure range

While generating the token, it is important to contemplate the exposure of data or privileges that the token would create. Following the Principle of Least Privilege (PoLP), tokens should be generated with only the minimal authorizations (scopes) and the shortest acceptable validity period.
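In practice this means the issued scopes are the intersection of what the client requested and what the user granted, paired with a short default lifetime. A sketch of such a minting step; the field names follow the JWT claims shown earlier, while the function name and TTL are illustrative assumptions:

```python
import time

def mint_claims(subject: str, requested: set, granted: set,
                ttl_seconds: int = 600) -> dict:
    """Issue only the scopes that were both requested and granted,
    with a short validity window (TTL value is an assumption)."""
    now = int(time.time())
    return {
        "sub": subject,
        "scope": sorted(requested & granted),  # least privilege
        "iat": now,
        "exp": now + ttl_seconds,              # short-lived by default
    }

claims = mint_claims("user-1", {"read", "write", "admin"}, {"read", "write"})
assert claims["scope"] == ["read", "write"]    # "admin" was never granted
```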

Authorization is a bigger topic for exploration in its own right; for our purposes, the authorization attached to a token represents the responsibility it carries and its exposure if leaked. With that in consideration, a token and its associated scopes should be minimal, and every check against them should always fail closed.

There have been many instances where a "super token" exists with no scopes at all, and systems built as fail-open allow such a token through, which is a big security loophole.
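A fail-closed check treats a missing or empty scope list as a denial, never as universal access. A minimal sketch of such a check (function name and claim shape are assumptions consistent with the earlier examples):

```python
import time

def authorize(claims: dict, required_scope: str) -> bool:
    """Fail-closed: deny unless the token explicitly carries the required,
    unexpired scope. A scope-less token is rejected, never treated as a
    'super token' with universal access."""
    if not claims:
        return False
    scopes = claims.get("scope") or []
    if not scopes:                          # empty or missing scope list -> deny
        return False
    if claims.get("exp", 0) <= time.time(): # missing exp also fails closed
        return False
    return required_scope in scopes

assert authorize({"scope": ["read"], "exp": time.time() + 60}, "read")
assert not authorize({"exp": time.time() + 60}, "read")  # no scopes -> deny
```

The key design choice is that every missing field defaults toward denial; a fail-open check would instead grant access whenever a restriction is absent.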

Building for the future

A progressive authorization system needs to be multi-dimensional in its registration and scope-handling processes. It should be modeled around multiple tenants, with granular scopes and expirations differentiated by type of client, hosting type, requested usage, the authorizations allowed by the user, and so on.

Prepare for the worst

“Hoping for the best, prepared for the worst, and unsurprised by anything in between.”
― Maya Angelou, I Know Why the Caged Bird Sings

In spite of all the security measures, there remain multiple points of failure that might lead to a potential token leak, ranging from a few tokens to millions. It is imperative for a system of any scale handling token generation and validation to evaluate its readiness to react to potential disasters and to put measures in place that improve response and turnaround times.

It would be desirable to build the system with the capability to revoke all of the tokens issued for a particular user or a particular application. It is also prudent to estimate the time and resources necessary to undertake a full-fledged revocation of all issued tokens over a period of time, and to revisit this strategy periodically to keep the operating procedures up to date.

Conference

I spoke earlier this year at multiple conferences on the same subject; the presentation and videos are available on Medium as well.

Following these steps may not secure your APIs completely, but they form an important, must-have feature list for any token system developed to secure APIs.