Published in DataSeries
Data storage as an enabler for Artificial Intelligence

For thousands of years, human beings have sought ways to preserve data of some kind.

Whether it be an experience, a discovery, or simply an observation.

When we talk about data storage, the spectrum runs from a painting in a cave to quantum storage processes: the objective is the same; what changes is the technique.

A bit of history

Data storage started thousands of years ago with cave paintings.

Until just over 150 years ago, the only reliable and consistent way to store information was writing.

It is on those writings that our understanding of human history rests, which is why it is called "written history."

Punched cards

There are records of a machine language based on punched cards from 1725, created by Basile Bouchon.

In 1837 came a considerable advance in the use of punched cards for mathematical calculations, beginning with Charles Babbage's designs and later extended by Herman Hollerith.

Herman Hollerith used his punch card system for the 1890 census in the United States.

Punched cards were the norm for feeding data into different types of mechanical and electrical instruments, reaching their peak around 1950.

Even today, punched cards are used for processes such as ballots or exams.

Magnetic Storage

Magnetic storage is a technology patented by Fritz Pfleumer in 1925.

Magnetic drums, created in 1932 by Gustav Tauschek, are considered the first examples of magnetic storage devices.

Beginning in the 1960s, magnetic storage gradually replaced punched cards as the standard for data storage.

From music to input methods for computer systems, by 1990 magnetic storage had replaced nearly 100% of punched card systems.

IBM was one of the main drivers of this technology, as creator of both the "floppy" disk and the magnetic hard disk.

Optical Discs

The concept was created in the 1960s by James T. Russell, who thought of using light as a mechanism to record data on a surface so it could later be reproduced.

Sony funded the project to have a working prototype by the 1980s.

This technology evolved into what we know as CDs, DVDs, Blu-rays, and derivatives.

Flash Drive

The first “flash” storage devices came onto the market in 2000.

Unlike magnetic or optical storage devices, they have no moving parts.

Storage is carried out using microchips and transistors.

An important advance in this type of technology was the ability to easily write, read, and erase these devices.

Solid State Drives

Solid state drives work under the same concept as a flash storage device.

However, due to their technology they are faster and more reliable, since they connect directly to the computer's motherboard.

Storage evolution

Storage capacity is measured in bytes. A byte is composed of 8 bits of memory, reflecting the fact that a computer operates in a binary system.
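The byte-and-binary-prefix arithmetic above can be sketched in a few lines of Python; `to_bytes` is a hypothetical helper written for this illustration, using the binary convention where each prefix step multiplies by 1024:

```python
# A byte is 8 bits; with binary prefixes, each step (KB, MB, GB, ...) multiplies by 1024.
def to_bytes(value, unit):
    """Convert a capacity in a binary unit to a whole number of bytes (illustrative helper)."""
    units = {"B": 0, "KB": 1, "MB": 2, "GB": 3, "TB": 4, "PB": 5}
    return int(value * 1024 ** units[unit])

print(to_bytes(1, "KB"))     # 1024
print(to_bytes(1.44, "MB"))  # 1509949 -- the classic floppy-disk capacity
print(to_bytes(1, "GB") * 8) # the same gigabyte expressed in bits
```

Note that marketing figures often use decimal prefixes (1 GB = 10^9 bytes); the binary convention shown here is the one implied by the 8-bit byte discussion above.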

“Over the past 6 decades, the price of computer data storage has halved approximately every 2 years, which is equivalent to being a thousand times cheaper every 20 years.”

http://jcmit.net/memoryprice.htm
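The arithmetic behind that quote is worth making explicit: halving every 2 years over 20 years is 10 halvings, and 2^10 = 1024, which is roughly the "thousand times cheaper" figure:

```python
# Halving the price every 2 years over a 20-year span means 10 halvings.
years = 20
halvings = years // 2        # 10 halvings in 20 years
factor = 2 ** halvings       # 2^10 = 1024
print(factor)                # 1024 -- roughly a thousand times cheaper
```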

In 1950 the standard permanent storage capacity was effectively 0 bytes.

In 1960, the first bytes came into permanent storage.

In 1970 permanent storage held around 0.5 KB.

In 1980 it was possible to have a storage device of 0.5 MB.

In 1990, 1.44 MB floppy disks were launched.

In 2000 hard drives and external storage had capacities measured in GB.

In 2010 average storage capacity had exceeded 100 GB.

In 2015 the term terabyte came into everyday use.

By 2020, the petabyte begins to be considered.
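To get a feel for the scale of this timeline, a quick sketch can compute the growth factor implied by the decade figures above (the figures are approximate, and the dictionary below is just this article's timeline restated in code):

```python
# Rough growth factor implied by the decade-by-decade figures above (illustrative).
KB, MB, GB, TB = 1024, 1024**2, 1024**3, 1024**4

capacity_bytes = {
    1970: 0.5 * KB,    # ~0.5 KB of permanent storage
    1980: 0.5 * MB,    # ~0.5 MB devices
    1990: 1.44 * MB,   # the 1.44 MB floppy
    2000: 1 * GB,      # gigabyte-class drives
    2010: 100 * GB,    # 100+ GB average
    2015: 1 * TB,      # terabyte enters everyday use
}

growth = capacity_bytes[2010] / capacity_bytes[1970]
print(f"1970 -> 2010: ~{growth:,.0f}x")  # roughly 200 million times more capacity
```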

How does data storage enable Artificial Intelligence?

In a very simplified way, we can understand a basic level of intelligence as the ability to take a certain amount of data and pass it through a process or algorithm to obtain information as a result.
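That data → algorithm → information pipeline can be illustrated with a deliberately tiny, hypothetical example: stored temperature readings (the data) are reduced by an algorithm to a single actionable piece of information:

```python
# Hypothetical example: stored sensor data in, a decision (information) out.
readings = [21.5, 22.0, 23.8, 24.1, 25.2]   # the stored data

def decide(data, threshold=23.0):
    """The algorithm: average the data and turn it into an actionable answer."""
    average = sum(data) / len(data)
    return "cooling on" if average > threshold else "cooling off"

print(decide(readings))  # "cooling on" -- the average (23.32) exceeds the threshold
```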

Human beings do this daily to make decisions, and the process even remains permanently recorded in our subconscious for autonomous tasks.

However, those same autonomous tasks first had to be learned.

For artificial intelligence to undergo a significant expansion, data storage capacity first had to become large enough.

General-purpose artificial intelligence works with a data processing and analysis mechanism, making logical decisions through algorithms based on the information obtained from the data.

Machine Learning

The machine learning process is based on the ability of an entity to obtain results from large collections of data.

Elements of artificial intelligence such as Big Data or Reinforcement Learning base part of their operation on storage capacity, the speed at which data can be extracted for processing, and its cost.
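A minimal way to see why learning depends on storage is a 1-nearest-neighbour classifier, where the model's entire "knowledge" is literally its stored examples, so capacity and retrieval speed directly bound what it can do. This is a generic textbook technique, not a method from the article, and the data is made up for illustration:

```python
import math

# A minimal 1-nearest-neighbour classifier: the stored examples ARE the model.
training_data = [((1.0, 1.0), "small"), ((1.2, 0.8), "small"),
                 ((8.0, 9.0), "large"), ((9.5, 8.5), "large")]

def classify(point):
    """Predict by retrieving the label of the closest stored example."""
    nearest = min(training_data, key=lambda ex: math.dist(point, ex[0]))
    return nearest[1]

print(classify((1.1, 0.9)))  # "small"
print(classify((9.0, 9.0)))  # "large"
```

More stored examples generally mean better predictions, but also more bytes to hold and more distances to compute per query, which is exactly the storage-capacity/extraction-speed/cost trade-off described above.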

The human brain

The brain has an estimated storage capacity of 74 terabytes in the cerebral cortex alone, which suggests that its total capacity to retain information is much greater.

Advances toward creating neural networks that are efficient in terms of accessibility and cost will depend heavily on the optimization of storage.

Conclusion

Information storage is only one part of computing that has advanced to enable artificial intelligence.

Processing capacity, as well as the logic in programming languages and algorithms, plays a perhaps even more important role.

What is clear is that the evolution of our hardware must go hand in hand with the ability of software to bring general artificial intelligence within a clear goal for everyone.

Carlos Villegas

Medium Writer for Tech, Artificial Intelligence and Productivity.