Democratizing Powerful Computing Resources

Allan Boyd
Dec 8, 2017

--

Processing power has long been at the forefront of scientific discovery and commercial innovation. It is the engine powering notoriously complex fields like AI, IoT, VR, and AR, and has become critical even in manufacturing, communication, finance, and medicine, among others. According to IDC and the Information Technology and Innovation Foundation, enterprises spend an average of $3M on high-performance computing (HPC) per innovation project.

What’s more, another IDC study (PDF download) found that over 97 percent of HPC-adopting organizations “say that they could no longer compete or survive without it.” It’s clear that access to processing power, and increasingly, quality processing power, has become a fundamental cost of doing business.

Unfortunately, it’s not a cost most can afford, as Wolox co-founder Guido Marucci Blas discovered when he attempted to turn his blockchain idea into a reality.

“Getting access to such technology for someone that wants to do research or bootstrap a startup idea could be a deal breaker when you consider the actual cost of hardware or cloud services.” — Guido Marucci Blas, Wolox Co-founder

If we don’t make compute more accessible, we leave the advancement of the tech economy, and arguably the global economy, in the hands of the few who can afford to pay for it.

How can we unleash greater computing power, the primary driver of technological innovation, when it is consolidated in inflexible, expensive, often distant clouds?

Decentralize computing by tapping geographically appropriate resources

Today, millions of computing devices sit cold with inactivity while clustered GPUs are busy generating enough collective heat to melt Greenland. Whether they’re cell phones idly charging, desktops running the 213th consecutive geometric screen saver loop, or Raspberry Pis in between DIY projects, there are enough latent computing resources around to power hundreds of Amazon’s “clouds”.

Because those latent computing resources surround us, they often have the benefit of being geographically close to devices that need processing power. Why, then, is data sent hundreds of miles to the nearest cloud for processing, when it passes so much latent computing power on the way?

If we can harness that untapped compute as part of a continuum of available processing power, we can bring an essential resource closer to developing countries, bootstrapped startups, and research institutions.

We have the technology to connect these disparate resources today, through fog computing. Incentivizing billions of device owners to lend their computing power to strangers is another challenge.

The solution: turn processing power into a commodity.

Commoditizing the processing power that individuals, businesses, schools and organizations already own not only solves cloud computing’s distance problem, it demolishes its main hindrance to affordability: a lack of competition.

Commodification transforms processing power from a high-price luxury into a low-latency revolution.

Access to compute is increasingly essential to the development of industry, and it will only prove more essential in time.

Luckily, with computer owners numbering in the billions and counting, there will be no shortage of host providers to rent processing power to those who need it. As a result, computers become a source of passive income for their owners instead of a depreciating asset.

This method of commodification has already helped spread solar power, giving participants the tools to harness a valuable renewable resource, and make a profit by selling what they don’t use back to the grid.

Applied to computing, the same model benefits everyone. Computer owners sell processing power they’re not using in a global marketplace, promoting competitive pricing and making compute accessible in regions of the world where the cloud is still too far away and too expensive to be an option.
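To make the marketplace idea concrete, here is a minimal sketch of how a matching step might work. This is purely illustrative, not a description of any real system: the `Host` fields, the 500 km radius, and the cheapest-nearby-host rule are all assumptions chosen for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    price_per_hour: float  # the owner's asking price, in dollars
    lat: float
    lon: float

def distance_km(lat1, lon1, lat2, lon2):
    # Rough equirectangular approximation; good enough for a toy example.
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6371 * math.hypot(x, y)

def match(hosts, job_lat, job_lon, max_km=500):
    """Pick the cheapest host within max_km of where the job originates."""
    nearby = [h for h in hosts
              if distance_km(h.lat, h.lon, job_lat, job_lon) <= max_km]
    return min(nearby, key=lambda h: h.price_per_hour, default=None)

hosts = [
    Host("idle-desktop", 0.05, 40.7, -74.0),  # near New York
    Host("campus-lab", 0.03, 41.9, -87.6),    # near Chicago
    Host("far-cloud", 0.10, 37.4, -122.1),    # near Palo Alto
]

# A job submitted from central New Jersey gets the nearby idle desktop,
# even though the distant campus-lab host is nominally cheaper.
winner = match(hosts, job_lat=40.4, job_lon=-74.5)
print(winner.name)  # idle-desktop
```

A real marketplace would of course need to weigh price against latency, verify results, and handle payment, but the core loop is just this: filter by proximity, then let hosts compete on price.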

Researchers in remote areas of the globe could have the same access to the processing power needed to train neural networks. Students of all ages could train their own AI and learn how to develop consumer goods with the same power as the pros. And owners of computers worldwide could take part in the rapidly growing marketplace for computational resources.
