How quantum computers would be able to process all of Wall Street financial models at once

Ashif Shereef
Published in The Startup · 6 min read · Mar 20, 2018

Let’s set the premise first. On the afternoon of May 6, 2010, an otherwise unsuspecting day, the New York Stock Exchange witnessed something analogous to a supernatural phenomenon.

Roughly $1 trillion in market value vanished within a matter of minutes, as if swallowed up by a gargantuan cyber black hole.

Traders stood on the floor with sweaty palms atop their heads, gasping at the big monitors that showed prices falling the way we had only seen in the movies. They were witnessing one of the fastest crashes in financial history, which came to be known as the “Flash Crash of 2010”.

The root cause was nothing but a sequence of computer algorithms gone rogue. Early explanations for the rout ranged from a fat-fingered trade to a cyber-attack.

The revelations came only later, after innumerable news flashes and an arrest.

Unbeknownst to the traders, somewhere in the depths of a billion-dollar trading infrastructure, an algorithm had chimed into action, flooding the market with bluff sell orders for E-mini S&P 500 futures contracts, high-value bets it never intended to honor, a tactic generically known as spoofing. The release of those bluffs set off a domino effect that rippled through the trading network, causing mayhem among buyers and sellers alike, who promptly followed the human reflex that usually follows a code-red situation.

Panic.

The entire thing turned into a madhouse. Unprecedented waves of selling and buying flooded in, driven by the laws of human psychology and tightly coupled to the sensation of fear. Following suit, another set of algorithms kicked into action: the high-profile, outright nerdish algorithms behind high-frequency trading (HFT). High-frequency trading was born of a simple observation. If stock prices rose and fell every minute, across millions of transactions per second, someone was obviously losing money and someone was making money.

Every New York minute, millions of people were selling and millions were buying. It was a greedy battle zone. If an algorithm could somehow be devised to buy and sell securities whenever their prices differed by even a minuscule amount, it would change the way people squeezed profit out of the stock market.
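To make that idea concrete, here is a deliberately simplified sketch of the decision such a strategy keeps repeating: compare quotes for the same security in two places and act only when the gap exceeds your costs. The venue names, prices, and fee threshold below are all invented for illustration.

```
# Toy illustration of the core HFT idea: act only when the same security
# is quoted at slightly different prices in two places. All numbers are invented.

FEES_PER_SHARE = 0.001  # hypothetical round-trip cost per share


def arbitrage_signal(bid_venue_a: float, ask_venue_b: float) -> str:
    """Signal a trade if buying on venue B and selling on venue A is profitable."""
    edge = bid_venue_a - ask_venue_b  # profit per share before costs
    if edge > FEES_PER_SHARE:
        return f"BUY on B at {ask_venue_b:.3f}, SELL on A at {bid_venue_a:.3f}, edge {edge:.3f}"
    return "do nothing"


# A stream of (bid on A, ask on B) quote pairs, fractions of a cent apart.
quotes = [(100.012, 100.010), (100.013, 100.013), (100.020, 100.011)]
for bid_a, ask_b in quotes:
    print(arbitrage_signal(bid_a, ask_b))
```

In practice, the entire race is about how fast that comparison and the resulting order can run, which is exactly where the next point comes in.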

HFT took off when there was a sudden influx of math and physics PhDs into finance. The nerds built complicated trading strategies, encoded them into algorithms, and let the software loose on the heart of Wall Street itself.

The whole point was speed. It has to be done really, really fast, and it’s a tight competition when thousands of others are knocking on the same door as you. If your algorithm shaves even one microsecond off execution time, you make more money. It’s actually hassle-free. You don’t even have to sit in front of the monitor or CNBC every damn day making terrible guesses that mostly lead to a rage-filled remote-control-throwing episode. The stock market is a mess. Just like Matthew McConaughey’s character tells our aspiring Leo in The Wolf of Wall Street, “nobody knows if a stock is gonna go up, down, sideways or in fucking circles”.

The mantra is to just hire an HFT outfit, head out to Starbucks, start dreaming about a life straight out of the roaring 80s, and Bob’s your uncle.

Welcome to a whole new world where you have better prospects of making more money if you have more computational resources.

But the catch is that every computer system out there is governed by an upper limit set forth by the laws of physics and thermodynamics. You can daydream all you want, but at some point in time, an alarm is going to go off (figuratively) and physics is going to stop you.

We are either in close proximity to that point or have already crossed it. Piling on more and more cores just doesn’t cut it anymore.

At that point, someone had to think about gaining more resources, something that could turn the tables and take the game to a whole new level. Chaining together ever more CPU and GPU cores was not going to work anymore. The greed of the dream stretched well beyond that.

Enter physics and Quantum computing.

It all began when the physicist Richard Feynman started dreaming about a world beyond digital computers. He was a man fascinated with things at the sub-atomic level, and he laid the keel for a machine capable of exploiting the quantum laws of nature for processing.

Hitherto, all the world’s data has been codified into binary representations for processing and storage. No matter how many futuristic technological roadmaps we have traversed, we have always been stuck in that binary prison. Even so, with our feeble capabilities at full throttle, we have achieved so much. We have put a man on the moon; our unmanned space probe Voyager 1, launched in 1977, is still transmitting back to Earth. We have seen more than 13 billion light-years into the past, and we have detected gravitational waves from black holes that collided billions of years ago. All of that with computers that operate on just 1s and 0s.

And now we are past that barrier.

A computer that mirrors the way nature operates on data, well beyond the speed of binary logic. That’s what is meant by a Quantum computer. I will elaborate, but I won’t go into root level specifics. That’s a story for another blog.

In a binary processing environment, capability depends on the number of transistors. With beyond-ultra-large-scale integration and extreme-ultraviolet lithography, we can now squeeze more than 100 million transistors into a square millimeter of silicon.

Our question remains. How does a QC differ from a classical binary computer?

The gist is simple.

Quantum machines take advantage of parallelism. That power lets quantum computers churn through billions of possibilities in parallel, while traditional binary computers work through them one at a time.

For dilettantes, well, it’s not quite that simple. But anything beyond that explanation kind of defeats the purpose of this blog by adding layers of complexity. Let’s discuss that messy stuff in the next post.
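Still, a toy calculation hints at where the “all at once” intuition comes from. A register of n qubits is described by 2^n amplitudes, so even a tiny state vector already carries every basis state simultaneously. The sketch below is just plain linear algebra simulating that idea with NumPy; no quantum hardware is involved.

```
import numpy as np

# A single-qubit Hadamard gate: sends |0> to an equal mix of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

n = 3                           # three qubits
state = np.zeros(2 ** n)        # the state vector has 2^n = 8 amplitudes
state[0] = 1.0                  # start in |000>

# Apply a Hadamard to every qubit (tensor product of three H gates).
H_all = H
for _ in range(n - 1):
    H_all = np.kron(H_all, H)
state = H_all @ state

# Every one of the 8 basis states now carries equal weight: this single
# state vector "contains" all 2^n possibilities at once.
print(np.round(state, 3))       # eight amplitudes, each 1/sqrt(8) ≈ 0.354
print(np.round(state ** 2, 3))  # equal probabilities of 0.125 each
```

The catch, which the next post can dig into, is that measuring the register collapses it to a single outcome, so useful quantum algorithms have to choreograph interference so that the answer you want becomes the likely one.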

So how are quantum computers supposed to process the whole of Wall Street’s financial models in one go?

Financial data and stock investment these days work on the principles of modern portfolio theory. It describes how a portfolio of assets can be constructed to maximize expected return (or profit) for a given level of risk at various points in time. At its core, it is an optimization problem, and quantum computers happen to excel at such optimization problems.

According to the theory, it is possible to find an efficient frontier of portfolios that maximize expected return for every given level of risk.
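As a rough illustration of what that trade-off looks like, here is a brute-force sketch in NumPy. The expected returns and covariances are invented, and real portfolio construction involves far more than this, but scoring thousands of random weightings by return and risk already traces an approximate frontier.

```
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical annual expected returns and covariance matrix for three assets.
mu = np.array([0.06, 0.10, 0.14])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])

# Sample random long-only portfolios and score each one.
weights = rng.dirichlet(np.ones(3), size=5000)                     # rows sum to 1
returns = weights @ mu                                             # expected return
risks = np.sqrt(np.einsum("ij,jk,ik->i", weights, cov, weights))   # volatility

# A portfolio sits near the efficient frontier if no other sampled
# portfolio offers more return at the same or lower risk.
order = np.argsort(risks)
best_so_far, frontier = -np.inf, []
for i in order:
    if returns[i] > best_so_far:
        frontier.append(i)
        best_so_far = returns[i]

print(f"{len(frontier)} of 5000 sampled portfolios lie near the efficient frontier")
```

Real solvers replace the random sweep with proper optimization, but the combinatorial flavor of the search, which only gets worse as assets and constraints pile up, is already visible.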

Ordinary computers can grind away at portfolio problems forever, looping through combinations whose number explodes as assets and constraints are added, without ever arriving at an exact model. Until now, the best our digital computers could do was cut corners and rely on approximations.

However, what if we could move away from an approximation-based solution toward a precise one? Quantum annealers, such as the adiabatic machines built by D-Wave (one of which Google and NASA have been experimenting with), are designed to tackle exactly this kind of portfolio problem in a finite amount of time.
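Annealers of that kind accept problems written as a QUBO, a quadratic unconstrained binary optimization: minimize x^T Q x over binary choices x. As a back-of-the-envelope sketch, with invented returns, covariances, and penalty weight, here is how a tiny pick-two-of-four asset selection can be packed into such a matrix and, at this size, solved by brute force to show what the hardware would be minimizing.

```
import itertools
import numpy as np

# Tiny, invented asset-selection problem: choose which of 4 assets to hold
# (x_i in {0, 1}) to balance expected return against risk, with a penalty
# for holding more or fewer than 2 assets. This is the QUBO form that
# annealing hardware expects: minimize x^T Q x over binary vectors x.

mu = np.array([0.06, 0.10, 0.14, 0.08])          # hypothetical returns
cov = np.array([[0.04, 0.01, 0.00, 0.01],
                [0.01, 0.09, 0.02, 0.00],
                [0.00, 0.02, 0.16, 0.03],
                [0.01, 0.00, 0.03, 0.06]])        # hypothetical covariances
risk_aversion, budget, penalty = 2.0, 2, 10.0

n = len(mu)
Q = risk_aversion * cov - np.diag(mu)            # risk minus reward on the diagonal
# Add the constraint (sum(x) - budget)^2, expanded into quadratic and linear terms.
Q += penalty * (np.ones((n, n)) - 2 * budget * np.eye(n))

# Brute force the 2^n candidate portfolios; an annealer would explore this
# search space in hardware for problems far too large to enumerate.
best = min(itertools.product([0, 1], repeat=n),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print("selected assets:", [i for i, xi in enumerate(best) if xi])
```

In practice the matrix would be handed to the annealer through the vendor’s tooling rather than enumerated, but the point stands: portfolio selection drops naturally into this binary-optimization mold.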

Though still in their infancy, quantum computers have already started working on such problems. A paradigm shift is on its way.
