Uniswap Insights 6 of 6 — Constructing LP Distributions
Price Behavior of Digital Assets
It can be useful for an LP to know the price dynamics of one's assets. If we look at the history of BTC, the oldest digital asset, since 2015 on a log-Y plot, covering 3,091 daily returns, we notice that, with the exception of a couple of outliers, the generalized hyperbolic distribution has historically been a good fit for daily returns.
The exceptions to our fit are the outliers on the right and left, which can be examined in a log-log plot, where we find the tail of negative returns in red and the positive outlier returns in blue.
The tails appear similar, but there is some mismatch among the outliers to the right. I use a kernel density estimate (KDE) to smooth out the histogram.
This suggests that a mixture of a hyperbolic distribution with asymmetric power laws can describe the dynamics of BTC. Note that I am using BTC because it is the oldest time series and the least volatile of all digital assets, which means every other digital asset an LP might hold will exhibit even more volatile behavior.
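As an illustrative sketch, such a fit can be reproduced with scipy's genhyperbolic; here synthetic heavy-tailed data stands in for real BTC returns, and the parameter values, sample size, and the choice to hold p and b fixed are all assumptions made for the example:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic heavy-tailed "daily returns" standing in for real BTC data
returns = stats.genhyperbolic.rvs(p=1, a=1, b=0, loc=0, scale=0.03,
                                  size=2000, random_state=rng)

# Maximum-likelihood fit; p and b are held fixed (fp, fb) to keep the optimizer stable
p_hat, a_hat, b_hat, loc_hat, scale_hat = stats.genhyperbolic.fit(returns, fp=1, fb=0)
print(a_hat, loc_hat, scale_hat)
```

On real data one would pass the daily-return series in place of the synthetic sample and possibly free b to capture asymmetry.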
Modelling Price Dynamics
There are dozens of statistical distributions one can mix to mimic such volatile behavior. For example, a popular approach in tradfi is to take Geometric Brownian Motion (lognormal distribution) and combine it with a jump process whose arrivals follow a Poisson distribution (a Lévy process) to account for jumps in price.
I've created a library of over 50 statistical distributions in Desmos to help users explore the distributions and how an LP position can replicate these distributions in Uniswap with Riemann integrals:
The interesting feature of Desmos is the ability to switch to a log-log plot, so one can see how the tails of each statistical distribution behave.
To determine which distribution best fits one's data, one can use the Kolmogorov–Smirnov test, which compares each candidate's cumulative distribution function against the empirical cumulative histogram. Alternatively, one can take the back-of-the-napkin approach below and simply assume the worst possible distribution.
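A minimal sketch of that tradfi approach, simulating log-returns as GBM plus Poisson-driven jumps (a Merton-style jump diffusion); all parameter values below are illustrative assumptions, not calibrated to any asset:

```python
import numpy as np

rng = np.random.default_rng(1)
n_days, dt = 1000, 1 / 365
mu, sigma = 0.05, 0.6              # annualized drift and diffusion volatility (assumed)
lam = 5.0                          # expected number of jumps per year (assumed)
jump_mu, jump_sigma = -0.10, 0.15  # mean and std of each log-jump (assumed)

# Brownian (GBM) part of the daily log-return
diffusion = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_days)
# Compound-Poisson jump part: n jumps in one day sum to a N(n*jump_mu, n*jump_sigma^2) log-move
n_jumps = rng.poisson(lam * dt, size=n_days)
jumps = n_jumps * jump_mu + np.sqrt(n_jumps) * jump_sigma * rng.standard_normal(n_days)

price = 100.0 * np.exp(np.cumsum(diffusion + jumps))  # assumed starting price of 100
print(price[-1])
```

The resulting return histogram has a lognormal core with occasional large moves, qualitatively closer to the heavy-tailed behavior seen above.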
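A sketch of that KS comparison, using synthetic Laplace "returns" as a stand-in for real data (the sample and the candidate list are assumptions for the example):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic Laplace-distributed "returns" standing in for empirical data
returns = stats.laplace.rvs(scale=0.02, size=2000, random_state=rng)

results = {}
for name in ('norm', 'laplace', 'cauchy'):
    dist = getattr(stats, name)
    params = dist.fit(returns)  # MLE fit of each candidate distribution
    # KS statistic: max distance between empirical and fitted CDFs (smaller = better fit)
    results[name] = stats.kstest(returns, name, args=params).statistic
print(results)
```

As expected, the Laplace candidate yields the smallest KS distance on Laplace data; on real returns one would add heavier-tailed candidates like genhyperbolic to the list.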
What if one has no idea what is going to unfold given limited data? Well, we can ask ourselves what the worst possible distribution in price space is: one whose tails are never-ending power laws. One such distribution is the Cauchy distribution (in price space it becomes the Log-Cauchy distribution).
A defining property of the Cauchy distribution is that the law of large numbers does not apply to it. You may take its average over the past 30 days, thinking you're seeing a pattern, only for it to trick you. One amusing example of this behavior is the average of the DOGE/ETH pair, with its lack of liquidity.
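The failure of the law of large numbers is easy to see numerically: the running mean of normal samples settles down, while the Cauchy's never does. A minimal sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 100_000
counts = np.arange(1, n + 1)

# Running means: cumulative sum divided by sample count
cauchy_mean = np.cumsum(stats.cauchy.rvs(size=n, random_state=rng)) / counts
normal_mean = np.cumsum(stats.norm.rvs(size=n, random_state=rng)) / counts

# The normal running mean converges toward 0; the Cauchy one never settles —
# at every n it is itself Cauchy(0, 1) distributed
print(normal_mean[-1], cauchy_mean[-1])
```

Plotting both paths makes the contrast vivid: the Cauchy running mean keeps taking large excursions no matter how many samples accumulate.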
We can see how the Cauchy distribution looks in price space relative to a lognormal distribution:
The Log-Cauchy is not as bad as a full-range Uniswap v2 position, but it's the second worst thing. Given our knowledge about optimizing for capital efficiency from parts 1 & 2, a cutoff around 80–90% for the lower bound can improve the position by not providing liquidity all the way to zero, since the distribution starts to grow as the price approaches the lower bound.
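That growth of the Log-Cauchy density near zero follows from the change of variables f_P(p) = f_X(ln p) / p, and can be checked numerically; the lognormal shape parameter below is an arbitrary assumption for comparison:

```python
import numpy as np
from scipy import stats

prices = np.array([1e-6, 1e-3, 0.5, 1.0, 2.0])

# Change of variables: if the log-price is Cauchy, the price density is f(ln p) / p
log_cauchy_pdf = stats.cauchy.pdf(np.log(prices)) / prices
lognormal_pdf = stats.lognorm.pdf(prices, s=1.0)  # unit log-volatility (assumption)

print(log_cauchy_pdf)  # blows up as the price approaches zero
print(lognormal_pdf)   # vanishes as the price approaches zero
```

The lognormal density vanishes toward zero while the Log-Cauchy density diverges there, which is exactly why a lower-bound cutoff spares the LP from parking liquidity in that region.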
On Power Laws in Complex Systems
But can a power-law distribution like the Cauchy become a thin-tailed distribution over time? It's impossible to get rid of power-law phenomena in a complex system such as crypto that is constantly evolving (see appendix), but one could bring down the degree of uncertainty.
If one thinks about it carefully, every asset had a moment of uncertainty at its inception. In fact, with the development of AMMs we have discovered a surprising link unavailable in traditional financial markets, where people rely on square-root laws to statistically estimate price impact. With AMMs we can predict exactly how the price is impacted purely as a function of concentrated liquidity; there is no need for volume or volatility to define the price impact at a moment in time. Take the argument to its extreme and suppose Jerome Powell downloads MetaMask, decides to LP in DOGE/ETH with his money printer, and provides trillions of dollars of liquidity in such a pair. Every person trying to sell DOGE would barely impact the price, and over time the returns distribution would shift into a lower-volatility regime, looking less and less like a Cauchy distribution.
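The claim that price impact is a pure function of liquidity can be sketched with a toy constant-product pool (x·y = k); the reserve depths and trade size below are illustrative assumptions:

```python
def price_impact(reserve_x: float, reserve_y: float, sell_x: float) -> float:
    """Relative price change of X (in units of Y) after selling sell_x into an x*y=k pool."""
    price_before = reserve_y / reserve_x
    k = reserve_x * reserve_y
    new_x = reserve_x + sell_x        # pool absorbs the sold tokens
    new_y = k / new_x                 # invariant fixes the new Y reserve
    return (new_y / new_x) / price_before - 1

# The same 1,000-token sell against ever deeper (hypothetical) pools
for depth in (10_000.0, 1_000_000.0, 100_000_000.0):
    print(depth, price_impact(depth, depth, 1_000.0))
```

No volume or volatility estimate appears anywhere: the impact is fully determined by the reserves, and it shrinks toward zero as the pool deepens, which is the mechanism behind the Powell thought experiment.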
So the interesting insight here is that it's possible to bring down the volatility of an asset merely by overloading an AMM with liquidity for a long enough time, if an LP with deep pockets is brave enough. Though I doubt many people have a digital money printer around to boost their bravery.
One way to achieve this without a money printer would be to bring onto the blockchain assets that give LPs assurances of constant purchases. Such tokens could include: large-cap dividend-yielding stocks (purchased by pension funds for retirees), bonds (purchased by banks and corporations for short-term financing), forex (a single, global, centralized fiat currency is difficult to achieve, so Yuan, USD, and EUR pairs will continue to be used), and commodities (food and heating will always be in demand). One can be much more comfortable providing liquidity in an LP pair like McDonalds/Corn, knowing there will always be some demand so liquidity isn't scared away; even in the event of a divergence loss, the LP can rest easy knowing he'll end up owning either a stake in a manufacturer of Happy Meals or a pile of corn.
- Thanks for reading! If you learned something useful, feel free to give this hedgehog a follow on Twitter: @CK_2049
Disclaimer: This research is for general information purposes only. The Uniswap Foundation was kind enough to sponsor the publication of this research. It does not constitute investment advice or a recommendation or solicitation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision. It should not be relied upon for accounting, legal or tax advice or investment recommendations. This post reflects the current opinions of the author. The opinions reflected herein are subject to change without being updated.
Appendix
On power laws and why crypto and tradfi will continue to have them:
Code to pull historic data from the web:
import numpy as np
import yfinance as yf  # make sure to 'pip install yfinance'
import pandas as pd
from scipy import stats  # used for the kernel density estimate below
import matplotlib.pyplot as plt
# Download BTC-USD and EUR-USD by default; ticker2 is only used to label the implied price ratio
ticker1="BTC-USD" #^GSPC, ^IXIC, CL=F,^OVX, GC=F, BTC-USD, JPY=X, EURUSD=X, ^TNX, TLT, SHY, ^VIX, LLY, XOM
ticker2="EURUSD=X"
t_0="2017-07-07"
t_f="2023-07-07"
data1=yf.download(ticker1, start=t_0, end=t_f)
data2=yf.download(ticker2, start=t_0, end=t_f)
dat=data1['Close']
dat = pd.to_numeric(dat, errors='coerce')
dat=dat.dropna()
dat_ret=dat.pct_change(1)
x = np.array(dat.values)
dat_recurrence=dat/max(dat)
xr = np.array(dat_recurrence.values)
fig, (ax1, ax2) = plt.subplots(nrows=1, ncols=2, figsize=(6.5,3))
# Plot the logistic map in the first subplot
ax1.plot(range(len(x)), x, '#056398', linewidth=.5)
ax1.set_xlabel('Time')
ax1.set_ylabel(str(ticker1)+'/'+str(ticker2)+' Price Ratio')
ax1.set_title(str(ticker1)+'/'+str(ticker2)+' Fluctuations since '+ str(t_0))
ax1.set_yscale('log')
n_end=len(x)
# Create a recurrence plot of the logistic map in the second subplot
R = np.zeros((n_end, n_end))
for i in range(n_end):
    for j in range(i, n_end):
        if abs(xr[i] - xr[j]) < 0.01:
            R[i, j] = 1
            R[j, i] = 1
ax2.imshow(R, cmap='viridis', origin='lower', vmin=0, vmax=1)
ax2.set_xlabel('Time step')
ax2.set_ylabel('Time step')
ax2.set_title('Recurrence Plot of ' +str(ticker1)+'/'+str(ticker2))
series = pd.Series(dat_ret).fillna(0)
fig, ax = plt.subplots()
density = stats.gaussian_kde(series)
series.hist(ax=ax, bins=400, edgecolor='black',color='#25a0e8', linewidth=.2,figsize=(6.5,2),histtype=u'step', density=True)
ax.set_xlabel('Log Returns')
ax.set_ylabel('Log Frequency')
ax.set_title('LogLog Histogram of Returns ' +str(ticker1)+'/'+str(ticker2))
ax.set_yscale('log')
ax.set_xscale('log')
ax.grid(None)
plt.scatter(series, density(series), c='#25a0d8', s=6)
fig, ax2 = plt.subplots()
series.hist(ax=ax2, bins=400, edgecolor='black',color='#25a0e8', linewidth=.2,figsize=(6.5,2),histtype=u'step', density=True)
ax2.set_xlabel('Log Returns')
ax2.set_ylabel('Log Frequency')
ax2.set_title('Log-y Histogram of Returns ' +str(ticker1)+'/'+str(ticker2))
ax2.set_yscale('log')
ax2.grid(None)
plt.scatter(series, density(series), c='#25a0d8', s=6)
plt.show()
Hyperbolic Distributions and Mixture Models
import numpy as np
from matplotlib import pyplot as plt
from scipy import stats
p, a, b, loc, scale = 1, 1, 0, 0, 1
rnge=15
x = np.linspace(-rnge, rnge, 1000)
#Mixture model for tails
w=.999
dist1=stats.genhyperbolic.pdf(x, p, a, b, loc, scale)
dist2=stats.cauchy.pdf(x, loc, scale)
mixture=np.nansum((w*dist1,(1-w)*dist2),0)
plt.figure(figsize=(16,8))
plt.subplot(1, 2, 1)
plt.title("Generalized Hyperbolic Distribution Log-Y")
plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale), label = 'GH(p=1, a=1, b=0, loc=0, scale=1)', color='black')
plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
         color = 'red', alpha = .5, label = 'GH(p=1, 1<a<2, b=0, loc=0, scale=1)')
[plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
          color = 'red', alpha = .2) for a in np.linspace(1, 2, 10)]
plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
         color = 'blue', alpha = .2, label = 'GH(p=1, a=1, -10<b<0, loc=0, scale=1)')
plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
         color = 'green', alpha = .2, label = 'GH(p=1, a=1, 0<b<10, loc=0, scale=1)')
[plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
          color = 'blue', alpha = .2) for b in np.linspace(-10, 0, 100)]
[plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
          color = 'green', alpha = .2) for b in np.linspace(0, 10, 100)]
plt.plot(x, stats.norm.pdf(x, loc, scale), label = 'N(loc=0, scale=1)', color='purple', dashes=[3])
plt.plot(x, stats.laplace.pdf(x, loc, scale), label = 'Laplace(loc=0, scale=1)', color='black', dashes=[1])
plt.plot(x, mixture, label = 'GH+Cauchy Mix(loc=0, scale=1)', color='blue', dashes=[1])
plt.xlabel('Returns')
plt.ylabel('Log Density')
plt.ylim(1e-10, 1e0)
plt.yscale('log')
x = np.linspace(0, 10000, 10000)
dist1=stats.genhyperbolic.pdf(x, p, a, b, loc, scale)
dist2=stats.cauchy.pdf(x, loc, scale)
mixture=np.nansum((w*dist1,(1-w)*dist2),0)
plt.subplot(1, 2, 2)
plt.title("Generalized Hyperbolic Distribution Tail Log-Y Log-X")
plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale), label = 'GH(p=1, a=1, b=0, loc=0, scale=1)', color='black')
plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
         color = 'red', alpha = .5, label = 'GH(p=1, 1<a<2, b=0, loc=0, scale=1)')
[plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
          color = 'red', alpha = .2) for a in np.linspace(1, 2, 10)]
plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
         color = 'blue', alpha = .2, label = 'GH(p=1, a=1, -10<b<0, loc=0, scale=1)')
plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
         color = 'green', alpha = .2, label = 'GH(p=1, a=1, 0<b<10, loc=0, scale=1)')
[plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
          color = 'blue', alpha = .2) for b in np.linspace(-10, 0, 100)]
[plt.plot(x, stats.genhyperbolic.pdf(x, p, a, b, loc, scale),
          color = 'green', alpha = .2) for b in np.linspace(0, 10, 100)]
plt.plot(x, stats.norm.pdf(x, loc, scale), label = 'N(loc=0, scale=1)', color='purple', dashes=[3])
plt.plot(x, stats.laplace.pdf(x, loc, scale), label = 'Laplace(loc=0, scale=1)', color='black', dashes=[1])
plt.plot(x, stats.cauchy.pdf(x, loc, scale), label = 'Cauchy(loc=0, scale=1)', color='blue', dashes=[1])
#Heavy-tail mixture model
plt.plot(x, mixture, label = 'GH+Cauchy Mix(loc=0, scale=1)', color='red', dashes=[1])
plt.xlabel('Log Returns')
plt.ylabel('Log Density')
plt.ylim(1e-10, 1e0)
plt.xlim(1e-0,1e4)
plt.xscale('log')
plt.yscale('log')
plt.legend(loc="upper right")
plt.subplots_adjust(right=1)
plt.show()
Since you’ve made it all the way to the bottom of the appendix, you’re rewarded with a documentary on the discovery of hyperbolic distributions. Fascinating story in and of itself.