Why do some people like taking risks?

Understanding people's appetite for uncertainty

Ferenc Huszar
Oct 9, 2013 · 4 min read

In this post I write about why people have a seemingly irrational appetite for uncertainty and risk. This is particularly relevant to me, as I work at a startup, where a significant portion of my compensation is in the form of equity: pretty much a lottery ticket that pays off only if the company I work for is acquired or files for an IPO. Wouldn't I be better off working for Google, where my base salary and bonuses are higher and predictable?

What triggered me to write this article is an interview with David Friedberg, founder of Climate Corp, a $1B startup that was recently sold to Monsanto. In the interview, Friedberg argues that doing a startup for financial reward is irrational:

There’s a 0.00006% chance of building a company that will grow to be worth more than a billion dollars. […] Assuming there are three founders, your median expected payoff would be $300,000 each. That’s the equivalent of $73,000 a year. And the probability of making nothing is 67%. So if your motivation for doing a startup is financial reward, you’re better off going to Google, a hedge fund, choosing a career with stable income potential.

Without questioning Friedberg’s entrepreneurial credentials or quantitative background, this argument is actually quite limited. Doing a startup for financial reward can well be a rational strategy. His calculation simply assumes too much: that your personal reward, your own measure of success, is a linear function of the money you make.

Linear reward function: your satisfaction is measured in $s you make

Indeed, if this were the case, Friedberg’s argument would be correct. It follows Bayesian decision theory, which says that a rational agent should choose the option with the highest expected reward; when reward is linear in dollars, that simply means maximising the expected $-value.
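The comparison under a linear reward function can be sketched in a few lines of code. The specific dollar amounts below are illustrative assumptions chosen to roughly match Friedberg's quoted figures, not exact data from the interview:

```python
# Expected reward of an uncertain deal: sum of probability * utility(payoff).
def expected_utility(outcomes, utility):
    """outcomes is a list of (probability, dollar_payoff) pairs."""
    return sum(p * utility(x) for p, x in outcomes)

linear = lambda dollars: dollars  # linear reward: satisfaction == dollars

# A stylised startup: 67% chance of nothing, else a positive payoff
# (illustrative numbers, picked to land near Friedberg's ~$300k median).
startup = [(0.67, 0), (0.33, 900_000)]
# A stylised salaried career: a certain payoff over the same period.
salaried = [(1.0, 400_000)]

print(round(expected_utility(startup, linear)))   # 297000
print(round(expected_utility(salaried, linear)))  # 400000
```

Under this linear mapping the predictable career wins, which is exactly Friedberg's conclusion.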

The reality: non-linear reward

The reason Friedberg’s calculation does not apply to me is that my money-to-reward mapping is not linear. I would argue most people do not have a linear reward curve, and this affects how risk-averse they are.

As an example, most entrepreneurs actually adopt the quite extreme get rich or die trying mantra, which can be characterised by a utility function that looks like this:

The ‘get rich or die tryin’ reward function: you are only satisfied if you make $1 million, otherwise you don’t care

This reward curve is pretty much indifferent to the precise amount of money you earn: as long as you made at least $1m, you’re happy; otherwise, you failed.

This is also a gross oversimplification of reality, but it already illustrates why people with different reward profiles may have different appetites for uncertain businesses: if your goal is to make, say, $1 million in two years, a predictable career at Google is not going to give you that. The rational decision is to choose something less predictable, perhaps with a lower expected $-value, but which at least gives you a 0.05% chance of succeeding.

Jensen’s inequality and the appetite for uncertainty

Lastly, I want to mention Jensen’s inequality, an interesting and simple observation which I think sheds further light on how your perception of reward can shape your appetite for uncertainty.

Without going into the details of how this inequality works, we can say that if your money-to-reward function is convex, you will have a rational tendency to prefer uncertain deals. Conversely, if your reward function is concave, rational behaviour pushes you towards avoiding uncertainty.

Below is a visual explanation of Jensen’s inequality (read the caption for details).

Illustration of Jensen’s inequality:
Consider two deals. Deal A gives you a fixed return of $50. Deal B gives you $25 or $75 with equal probability. The expected $-value of the two deals is equal. Yet, the uncertain strategy results in a higher expected reward if the reward function is convex.
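The caption's two-deal example can be checked numerically. The convex reward function u(x) = x² is an assumption made purely for illustration; any convex function would give the same ordering:

```python
# Jensen's inequality: for convex u, E[u(X)] >= u(E[X]).
convex = lambda x: x ** 2  # an illustrative convex reward function

deal_a = [(1.0, 50)]             # Deal A: a certain $50
deal_b = [(0.5, 25), (0.5, 75)]  # Deal B: $25 or $75 with equal probability

def expected(outcomes, f=lambda x: x):
    """Expected value of f(payoff) over (probability, payoff) pairs."""
    return sum(p * f(x) for p, x in outcomes)

print(expected(deal_a), expected(deal_b))                  # 50.0 50.0
print(expected(deal_a, convex), expected(deal_b, convex))  # 2500.0 3125.0
```

The two deals have the same expected $-value, yet under the convex reward function the uncertain Deal B yields the higher expected reward, exactly as Jensen's inequality predicts.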

In summary, whether starting or investing in a startup purely for financial reasons is a rational strategy depends largely on your goals, your measure of success, and your perception of how money maps to happiness.

Ferenc Huszar is a senior data scientist at @PeerIndex who applies machine learning to everything that moves, from humans to entangled photons.