The Evolution of Commercial Computing

HiveNet
Official HiveNet Blog
4 min read · Sep 16, 2019


Computers have truly changed the world.

Since their initial invention, several important milestones have taken the concept of the ‘computer’ to new levels.

The first computers were huge mechanical beasts that were extremely expensive and required maintenance by several professionals at all times. The invention of microprocessors and the introduction of mass production led to the widespread adoption of personal computers, which in turn started the digital revolution. Although many people were suddenly put out of work, countless new jobs were created in the meantime, and our entire business and private lives were changed forever. Today, in developed countries, it is common for people to have one or several computers at their disposal.

Globalization and the exponential growth of civil aviation drove the business demand for mobile computers. The ongoing miniaturization of computer components eventually enabled the development of notebook computers.

Advancements in software development led to time-sharing operating systems, which allowed computing resources to be shared. Standardization and further technical breakthroughs in cabling led to Local Area Networks (LANs). Together, these advances enabled the setup of dedicated, on-premise data centers, which allowed companies and universities to pool large parts of their computing resources in one place and achieve significant efficiency and security gains.

The next major revolution was brought about by the internet. Today, the internet has penetrated every aspect of our lives. We communicate online, we buy online, we learn online, we publish online, we stream online, we even work or fall in love online. The immense amounts of transferred data inevitably required the setup of public data centers, which house the servers that keep the internet’s data highways running.

During the internet revolution, data volumes quickly became so overwhelming that improvements in data handling and organization were essential. Service-oriented architectures soon allowed data centers to take file searches and computational business processes to the next level. The companies leading this development recognized the value of their software and realized that it could create a new business on its own: cloud computing was born.

Today, the big cloud market leaders offer their customers so many opportunities and such unimaginable amounts of computing resources that on-premise data centers are increasingly being replaced by cloud computing. Not long ago, it was common to have all the software one needed for work or private use on a physical storage device (first floppy disks, then CDs, later DVDs). Today, however, many companies no longer sell their software outright. Instead, it is offered as a subscription delivered through cloud computing. The cloud computing industry is therefore growing at an astonishing speed and is expected to reach revenues of 330 billion USD in 2022 (to put this in perspective: that is more than Apple Inc.’s entire revenue in 2018 and more than twice that of General Motors in the same year).

But as we have learned from human history, progress never stops. So, what’s the next major revolution and when will it arrive? In our opinion, it has already started and is happening right now.

Blockchain technology, for the first time, allows the distribution of ‘originalized’ digital information (for example, digital money). Previously, it was easy to copy, replicate and manipulate digital data at nearly no cost, and it was almost impossible to determine which set of data was the original, true, correct one. This caused many problems (just ask the software, film or music industry), and it also prevented computer users from trusting each other anonymously. Blockchain technology is changing that: wherever it is applied, there is now a technological solution to distinguish true from false.
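To make this concrete, here is a minimal, illustrative sketch (not HiveNet’s implementation, and greatly simplified compared to a real blockchain) of the core idea: each record commits to a cryptographic hash of its predecessor, so manipulating any earlier entry breaks every later link and is immediately detectable.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's contents (including the previous block's hash).
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    # Each new block commits to the hash of the previous one.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def chain_is_valid(chain: list) -> bool:
    # Recompute the links; tampering with earlier data breaks a later link.
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

ledger = []
append_block(ledger, "Alice pays Bob 5 coins")
append_block(ledger, "Bob pays Carol 2 coins")
print(chain_is_valid(ledger))              # True

ledger[0]["data"] = "Alice pays Bob 500 coins"   # manipulation attempt
print(chain_is_valid(ledger))              # False: the chain exposes the change
```

Real blockchains add consensus, signatures and decentralization on top of this chaining, but the tamper-evidence shown here is what makes digital information ‘originalized’.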

This advancement unlocks a new revolution: the beginning of a Distributed Era in computing. Since the early days of the internet, there have been various attempts to distribute computing tasks to computer owners around the world (e.g. SETI@home), but these setups were not suitable for wide adoption and commercial use. Now, blockchain technology allows computing tasks to be distributed to computer owners anywhere in the world and enables monetary rewards for their computing resources.

However, this way of distributing computing tasks still lacks a feature that is crucial for mass adoption: validation. Checking whether a computed result is correct or wrong (e.g. due to technical failure or fraud) requires full knowledge of the computational task at hand, and it requires developing and implementing a specific validation process for each use case and algorithm individually. This is neither convenient nor efficient nor economically sound.
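To illustrate how algorithm-specific such checks are, here is a hypothetical example (not taken from HiveNet): Freivalds’ algorithm can probabilistically verify a claimed matrix product far more cheaply than recomputing it, but it works only for matrix multiplication; every other kind of task would need its own, hand-crafted validation procedure.

```python
import random

def freivalds_check(A, B, C, trials: int = 10) -> bool:
    """Probabilistically verify that C equals A @ B (square matrices as lists of lists)."""
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]                      # random 0/1 vector
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]    # B·r
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]  # A·(B·r)
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]    # C·r
        if ABr != Cr:
            return False            # definitely wrong
    return True                     # correct with high probability

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
good = [[19, 22], [43, 50]]          # correct A·B
bad = [[19, 22], [43, 51]]           # manipulated result
print(freivalds_check(A, B, good))   # True
print(freivalds_check(A, B, bad))    # False (with overwhelming probability)
```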

Adapting modern insights from artificial intelligence enables a general, standardized validation setup. This makes it unnecessary to develop an individual validation process for every single algorithm; instead, it allows plausibility checks, risk evaluations and a risk-based revision of results. In this way, distributed computing is unlocked for a wide range of computational tasks, enabling its mass adoption and the next major step in the history of computing.
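The following sketch illustrates the general idea only; it is a toy example under our own assumptions, not HiveNet’s actual validation logic. Each returned result gets a plausibility-based risk score, and only the riskiest results are redundantly recomputed within a fixed budget.

```python
import statistics

def plausibility_score(result: float, history: list) -> float:
    # Rough risk score: how far a result deviates from what similar tasks returned before.
    if len(history) < 2:
        return 1.0                                    # no history -> treat as risky
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0
    return min(abs(result - mean) / stdev, 10.0)      # capped z-score

def validate(results: dict, history: list, recheck_budget: int, recompute) -> set:
    # Recompute only the riskiest results within a fixed budget; return rejected task ids.
    ranked = sorted(results, key=lambda t: plausibility_score(results[t], history), reverse=True)
    rejected = set()
    for task_id in ranked[:recheck_budget]:
        if abs(recompute(task_id) - results[task_id]) > 1e-9:
            rejected.add(task_id)                     # fraud or technical failure detected
    return rejected

# Toy usage: workers report the square of their task id; one worker cheats.
truth = lambda t: float(t * t)
reported = {t: truth(t) for t in range(1, 21)}
reported[7] = 999.0                                   # manipulated result
history = [truth(t) for t in range(1, 21)]
print(validate(reported, history, recheck_budget=3, recompute=truth))   # {7}
```

The point of such a scheme is that the same scoring-and-spot-checking loop works regardless of what the underlying tasks compute, which is exactly what per-algorithm validation schemes lack.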

We believe in this revolution, and therefore we are developing HiveNet, which aims to enable computer owners anywhere in the world to securely and efficiently rent out their idle computing power to paying customers.

If you want to learn more about HiveNet, feel free to visit us on our website, subscribe to our newsletter or follow us on social media:

Website · Newsletter · Facebook · Twitter · LinkedIn · Medium · Telegram · YouTube
