Part V — But What About The Computational Resource? The Natural Gas To Oil In The Digital Realm!

TCT Program Lead - JD Sutton (DeepSea)
Coinmonks
15 min read · Jun 22, 2019


When oil, one of the resources that powered the industrial revolution, was discovered, a by-product was found alongside it when drilling. Natural gas was located in the same bedrock reservoirs and was considered useless at first. Derricks and refineries initially just flared off the natural gas that poured out of the ground so they could access the oil.

As we have seen, data is the oil that will power the 4th Industrial Revolution, and the byproduct of digitizing that data is the computational resource. Connected devices take physical readings such as environmental temperatures, saturation levels, market prices, and machine actions, and it is these connected devices that digitize the physical world into data streams. All of this is done with computational resources: CPUs, FPGAs, and GPUs. Yet our sensors, cars, and machines, which collect these immense amounts of data, sit idle when not actively used. Since the Qubic network is already built on the secure Tangle to create a marketplace for data paid for with feeless IOTA tokens, it is not difficult to create a computational resource marketplace as well. Just as the industrial revolution created commodities such as oil and gas, the Internet of Things and the 4th Industrial Revolution will create commodities such as data and computational resources.

First, let us look at the definition of a gigaflop. Techopedia states that "a gigaflop is equal to one billion floating-point operations per second. Floating-point operations are the calculations of floating-point numbers." Terms like "gigaflop" are typically used to describe processor speed and how a computer handles the data-intensive operations common in some scientific or quantitative work. Disclaimer: I'm not a techie, and I'm only trying to relay the very broad concept of what the future is bringing and how IOTA, the Tangle, and the Qubic Network can be a solution. So for the examples below, let's treat floating-point operations as a very loose quantitative method for valuing computation as a resource. This may be incorrect, and it may not be the method ultimately used, but let's adopt floating-point operations for the sake of discussion.

The K computer is a supercomputer manufactured by Fujitsu and, for what it's worth, runs on the open-source Linux kernel. It consists of roughly 80,000 compute nodes, each with an eight-core SPARC64 VIIIfx processor, for a total of 705,024 cores. Together the K computer can reach 10.51 petaFLOPS. The current leading supercomputer is the American Summit at 143.5 petaFLOPS (based on Wikipedia's TOP500 page). The same page notes that the top 10 supercomputers all run the Linux operating system. Linux, of course, is an open-source platform supported by the non-profit Linux Foundation.

Now, for comparison's sake, here are SOME VERY ROUGH ESTIMATES. With limited personal time, I unfortunately cannot dive into the foundations of hardware processing, but here is an attempt to show the big picture. In Nick Routley's article, 'Visualizing the Trillion-Fold Increase in Computing Power', an Apple Watch can produce about 3 billion FLOPS while an iPhone 6 can produce 8 billion FLOPS. A PlayStation 4 can reach 1,200 billion FLOPS. Everything from the simple watch on your wrist to sensors, smartphones, computers, and gaming systems has a resource that can conduct computations.

From ‘Part I — So What is The Internet of Things?’ we found that in 2020 there will be 30.73 billion connected devices, an estimated 4.04 devices per person. Given that some people have a computer, others a PlayStation, and in developing countries a smartphone may be a person's only connected device, we can assume that the computational power per device varies. If we pull a number out of the air and assume 350 million FLOPS is the average power of each device, this equates to a global combined computational resource of 10.7555 quintillion FLOPS (10.7555 × 10¹⁸). In the Wired.com article by John Timmer, ‘World's Total CPU Power: One Human Brain’, written in 2011, he stated that the planet's combined resources could conduct 6.4 quintillion operations per second, with GPUs making up 97% of that figure (note: supercomputers were not included in this analysis). Considering the article was written in 2011, it does not seem implausible that roughly 10.7555 quintillion FLOPS is what today's global computational resource amounts to.
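The estimate above is simple enough to reproduce. This sketch just restates the article's own assumptions (the 30.73 billion device count from Part I, and the admittedly arbitrary 350 million FLOPS average):

```python
# Back-of-the-envelope estimate of the global computational resource.
# Both inputs are the article's assumptions, not measured figures.
devices = 30.73e9               # projected connected devices in 2020 (Part I)
avg_flops_per_device = 350e6    # assumed average: 350 million FLOPS per device

total_flops = devices * avg_flops_per_device
print(f"{total_flops:.4e} FLOPS")  # ≈ 1.08e+19, i.e. ~10.7555 quintillion FLOPS
```

Change the per-device average and the total scales linearly, which is why the author stresses how rough this is.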

https://www.visualcapitalist.com/visualizing-trillion-fold-increase-computing-power/

With 10.7555 quintillion FLOPS, the global computational resource equates to either: a) 1,023 combined K supercomputers, or b) 74.94 times the world's most powerful supercomputer, the American Summit. Now consider that the American Summit is roughly estimated to have cost $200 million to build, and the K computer costs $10 million annually to run. So if the world has the computational power of 74.94 Summits, or 1,023 K computers, that equates to $14.988 billion to build (74.94 × $200 million) and $10.23 billion to run annually (1,023 × $10 million). Are these numbers correct, or even close? No, definitely not! But they show that computational power as a resource has, without a doubt, a very high value in the world, and more so, that there is a HUGE abundance of this resource in connected devices alone. The next logical thought becomes: "what if we could offer this global resource to a digital marketplace?"
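The supercomputer-equivalents and cost framing above can be sketched the same way, using the performance and cost figures as cited in the article:

```python
# Express the rough global FLOPS estimate in "supercomputer equivalents".
# All figures are the article's cited numbers, not independently verified.
GLOBAL_FLOPS = 1.07555e19     # assumed global total from the earlier estimate
K_COMPUTER_FLOPS = 10.51e15   # K computer: 10.51 petaFLOPS
SUMMIT_FLOPS = 143.5e15       # Summit: 143.5 petaFLOPS

k_equivalents = GLOBAL_FLOPS / K_COMPUTER_FLOPS      # ≈ 1,023 K computers
summit_equivalents = GLOBAL_FLOPS / SUMMIT_FLOPS     # ≈ 74.9 Summits

# Cost framing used in the article (very rough):
build_cost = summit_equivalents * 200e6   # ≈ $15 billion to build
annual_run_cost = k_equivalents * 10e6    # ≈ $10.2 billion per year to run
```

Mixing "Summits to build" with "K computers to run" is apples-and-oranges, as the author concedes; the point is only the order of magnitude.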

Part of the Qubic Network is the Qubic Computation Model (QCM), which Navin Ramachandran describes in 'somewhat' layman's terms in his article on how Qubics run on the QCM. Using Qubics and the QCM does not necessarily mean, nor have I personally read, that there is a quantitative approach to classifying computation as a resource. That is, if person (A) requests to purchase N FLOPS, hashes, or epochs, then the person or device (B) offering that resource can quantify it and offer a price. What is important here is that Eric Hop, the IOTA Foundation, and the community have found the solution and are building the Qubic Computation Model right now. Eric wrote a six-part article series, 'Qubic: Explaining the Qubic Computation Model'. If you couldn't understand Navin's article, you certainly will not be able to comprehend Eric's; even so, I implore you to give both a read. What both show is that they have created a foundation on which to build. As for a digital marketplace evolving, this is as simple as an application being built by the community and economic forces finding price equilibriums and use cases. Not to beat a dead horse, but the facts show not only that computational power is a resource, but that there is an abundance of it in the world. Just as in Part IV, with the example of California, we found that many people will have sensors around their homes wasting data; those same people will have connected devices sitting there wasting their computational power. Just as we saw the use case for putting unused data to work, so will use cases be created for putting unused computational resources to work.

Let's not forget this is about the IoT and connected devices. John Timmer found that in 2007 the planet's combined power was 6.4 quintillion operations per second; however, 97% of this came from GPUs. Our smartphones have GPUs, but the majority of connected devices specific to the IoT will not. Most IoT sensors and connected devices will be built with two objectives: 1) to be energy efficient and 2) to be computationally lightweight. The two largely complement each other, because a device that is computationally lightweight will also be energy efficient.

For example, do you have a SmartThings or an Alexa? Or even a ThermoPro? I have all three. I put the SmartThings temperature-and-water sensor in my basement and the ThermoPro sensor outside. I quickly found that the water sensor burned through its battery within 8 months, and the ThermoPro dashboard killed its battery in 11 months. As the Internet of Things gives us a digital space where sensors digitize the physical world, devices use a lot of energy on computation: not only sending data, but sending secured data (data that can be hashed cryptographically). Energy efficiency is a high priority for connected and edge devices.

A connected device loses most of its battery energy through its computational operations. The Qubic Computational Model is written for ternary logic (trits of -1, 0, 1, versus standard binary bits of 0 and 1), and the end result is very energy-efficient hardware. In other words, the programming language that directs the hardware how to act (Abra) underpins the Qubic Computational Model, and this is estimated to yield 30% to 40% energy savings when combined with actual ternary hardware. This all gets technical, but the concept is that the Tangle, the Qubic Network, and the Qubic Computational Model combine into something specifically designed for the Internet of Things. So instead of changing the batteries in your water sensor or your soil-saturation sensor every 8 months, you may only have to change them every 4 years. Or, if the sensor has a very small solar cell on it, they may never need changing.
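To make "ternary" concrete: balanced ternary represents every integer with trits drawn from {-1, 0, 1} instead of bits from {0, 1}. This is only an illustration of the number system in ordinary Python, not Abra or the QCM itself:

```python
def to_balanced_ternary(n: int) -> list[int]:
    """Encode an integer as balanced-ternary trits (-1, 0, 1),
    least-significant trit first. Illustrative only, not Abra code."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        n //= 3
        if r == 2:        # a remainder of 2 becomes -1 with a carry upward
            r = -1
            n += 1
        trits.append(r)
    return trits

# Example: 8 encodes as [-1, 0, 1], since (-1)*1 + 0*3 + 1*9 = 8
print(to_balanced_ternary(8))
```

One trit carries log2(3) ≈ 1.58 bits of information, which is part of the argument made for ternary hardware efficiency.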

After eight months the battery died on the water sensor in my basement. I have a good secure router, the SmartThings hub, a water sensor, and a smartphone to receive alerts in case water floods my basement, but if the battery dies on the device, the whole system fails. Hmmm, again we find a single point of failure when considering battery life. To mitigate this, the Tangle and the Qubic network have been specifically designed to offer devices a huge increase in energy efficiency. A quick Google search turned up, on www.whatis5g.info (energy-consumption): "our energy calculations show that by 2015, the wireless cloud will consume up to 43 TWh, compared to 9.2 TWh in 2012, an increase in 460%". Wikipedia states that in 2011, global expenditure on energy totaled over $6 trillion. That comes out to about $685 million per hour! Just imagine if we could reduce the energy usage of connected devices in the Internet of Things by 35%. How much would this lower costs and help the world? Applied (very loosely) to that $6 trillion, that would be $2.1 trillion in savings; and 35% of 43 TWh is 15.05 TWh saved. How much would saving 15.05 TWh lower CO2 emissions and support the fight against global warming? The fact is, each year the world becomes more connected, more digitized, and more in need of being energy efficient as well as secure. So IOTA, the Tangle, and the Qubic Network are being developed not necessarily for the needs of today, but specifically for the needs of tomorrow.
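The savings arithmetic above, as a sketch. The 35% rate is a hypothetical mid-point of the 30-40% ternary-efficiency estimate, and applying it to all global energy spending is the article's deliberately loose upper-bound framing:

```python
# The article's energy-savings arithmetic, reproduced with its own inputs.
wireless_cloud_twh = 43.0        # projected 2015 wireless-cloud use (TWh)
global_energy_spend = 6.0e12     # 2011 global energy expenditure (USD)
savings_rate = 0.35              # hypothetical mid-point of the 30-40% estimate

twh_saved = wireless_cloud_twh * savings_rate        # ≈ 15.05 TWh
hourly_spend = global_energy_spend / (365 * 24)      # ≈ $685 million per hour
loose_upper_bound = global_energy_spend * savings_rate  # ≈ $2.1 trillion
```

The realistic savings would of course apply only to connected-device consumption, a small slice of that $6 trillion; the TWh figure is the better-grounded number here.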

So Are Computational Resources Wasted?

Left: CPU usage on my desktop while writing an article (7%). Right: CPU usage on my desktop while writing an article and running a 12-thread CPU mining program.

The question becomes: is your computer using all of its available resources when you are on it, and especially when you are not? How about your iPhone, your Apple Watch, your TV, your Alexa, or your thermostat? The two pictures above show the difference in my CPU's usage when I use it normally, and when I use it while also running a CPU crypto-mining program in the background. In regular use it runs at 7%, with a few spikes up to 30%; while running the mining program it runs above 90%. Now, I would not regularly run my CPU above 80% for long periods; I just want to show that our computers, and essentially all our devices, have a huge resource that goes unused. If I wanted to, I could run a CPU mining application on half of my threads (about 50% of the CPU power), plus a GPU mining program throttled to 50%, and it would not even kick up the fans, create much heat, or make my computer louder. In other words, if I wanted to, I could take advantage of the unused power. The question is, do I? I used to; however, I realized that blockchain mining yields only very small profits and does nothing but waste electricity and hurt the environment, so as an average-Joe miner, I don't do it. Yet if there were a way to sell my computational resource to a university, a business, or even the city where I live, I would. Not only would I be offering a resource that can power the Qubic Network, but that unused resource could go to a good cause like trying to cure cancer, supporting the growth of the Internet of Things, and helping societies be more efficient.

But let's talk about what tapping into such a resource can offer. Let's discuss something everyone has had experience with, either directly or through someone close to them: cancer. It's a fact, referenced in the Huffington Post by Aaron Durbrow in 'Computing a Cancer Cure: 7 Ways Supercomputers Help Scientists Understand and Treat the Disease', that we have a 40% chance of being diagnosed with cancer. That means if there are more than 3 people in your family, or you know more than 3 people in the world, you have most likely been connected to someone suffering through the disease. Finding a cure for cancer is, at bottom, finding mathematical and scientific solutions to given problems, and one of the many issues researchers and scientists face is getting access to the computational resources needed to run simulations of whether a treatment would work.

In the article on Cancertherapyadvisor.com by Dan Neel, 'Quantum Computing: The Future of Cancer Research?', Dan writes, "Garry Nolan, Professor of Microbiology and Immunology at Stanford University in Stanford, CA, finds himself frustrated by the limited access he is given to existing supercomputing power. There are simply too many scientists waiting in line, and too many projects in the queue". Dan also writes, "Nolan agrees. 'If the slowest aspect of cancer research, which is generally computing time, could be ported to a parallelized system like a quantum computer, it would turn hours of a problem into minutes. Maybe seconds,' he said".

Currently, IBM's Watson supercomputer is being trained and used for cancer research, which Casey Ross and Ike Swetlitz discuss on Statnews.com in 'IBM pitched its Watson supercomputer as a revolution in cancer care. It's nowhere close'. Ross and Swetlitz write that many different issues are inhibiting Watson's success in solving medical problems, partly due to limited data and the years it takes Watson to improve its AI. But the article makes something very clear: "In response to STAT's questions, IBM said Watson, in health care and otherwise, remains on an upward trajectory and 'is already an important part' of its $20 billion analytics business. Health care is a crucial part of the Watson enterprise. IBM employs 7,000 people in its Watson health division and sees the industry as a $200 billion market over the next several years". There is no doubt that Watson is helping medical professionals, but we also see that it is a market. Watson's computational power is a valuable resource, and it is currently controlled by one company, IBM. Yes, that's right: a digital resource generating profits and being controlled. We saw this same type of control with oil and gas during the 19th and 20th centuries, specifically during the 3rd industrial revolution.

So what does that tell us?

  1. There are numerous needs in the world for computational resources: curing not only cancer but other diseases, solving problems of the universe, potentially solving a problem that creates an endless supply of clean energy. Given enough computational power, nearly any problem can be attacked. Archimedes said, "give me a lever long enough and a fulcrum on which to place it, and I shall move the world". Well, data combined with computational power is that lever, and the Qubic Network is that fulcrum. By tapping into the world's 'unused' computational power we can change the world.
  2. The Wikipedia article 'Watson (computer)' states, "According to John Rennie…. Its Linpack performance stands at 80 TeraFLOPs". As we discussed previously, the 10.7555 quintillion FLOPS available in the world equates to 134,443 Watson supercomputers. The same Wikipedia article states that "IBM Watson's former business chief, Manoj Saxena, says that 90% of nurses in the field who use Watson now follow its guidance". In short, the world has the equivalent of hundreds of thousands of supercomputers, combined within our regular connected devices, going unused.
  3. It is a fact, agreed upon by scientists, researchers, and intellectuals, that the lack of access to supercomputing resources limits progress. Demand outweighs the limited supply of "centrally" owned and controlled powerful machines. So what if there were a way to integrate all of the world's connected devices and use their abundant computational resources in a decentralized, secure, and distributed way? There is an answer, and that answer could be IOTA, the Tangle, and the Qubic Network.
  4. Edge computing is progressing at a rapid pace, yet these very resource-limited connected devices will not be able to conduct heavy computational work themselves. This alone is difficult to explain and will be covered in a later article. Still, the need is there, and just as we are seeing data marketplaces form, like water being traded between farmers in Part IV, so will we see computation being outsourced through the Qubic Network using the QCM, until eventually someone creates the application that becomes that marketplace.
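The Watson comparison in point 2 above is the same style of arithmetic as before, using the 80 teraFLOPS Linpack figure cited from Wikipedia:

```python
# How many Watson-class machines the rough global estimate represents.
GLOBAL_FLOPS = 1.07555e19   # assumed global total from earlier in the article
WATSON_FLOPS = 80e12        # Watson's cited Linpack performance: 80 teraFLOPS

watson_equivalents = GLOBAL_FLOPS / WATSON_FLOPS
print(int(watson_equivalents))   # ≈ 134,443 Watson supercomputers
```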

Conclusion:

We know that computation, in a very loose sense, is to data what natural gas is to oil. Data, being the new oil, generates a companion resource whenever it is processed: computational power. Rather than letting it sit there unused and wasted, we can harness it for good. Just as a digital marketplace for data can evolve over the Qubic Network, so too will a marketplace for computational power evolve. As shown, the Qubic Network, by using the QCM (Qubic Computation Model) with connected devices throughout the Internet of Things, can create a very efficient network, saving an estimated 30% to 40% in energy usage. This is without a doubt a need for IoT devices: to cryptographically secure data into anonymous streams through Masked Authenticated Messaging (MAM) so that sensors do not have to be regularly replaced and the risk of battery failure is mitigated. More than that, it lays the groundwork for a foundational layer that will enable computational power to be accessed on a global scale as a resource. Once the foundation is present, it only takes the community or companies developing applications to take advantage of such a valuable resource. And once this combined power can be harnessed in a secure and decentralized way, industry, science, governments, and organizations will all take advantage of it. Put together, this allows for a distributed global supercomputer more powerful than a hundred thousand supercomputers. A force so powerful should only be built in a decentralized, distributed, secure, and open-source way.

