Deep Learning Terms to Boost Your HPC Knowledge

Kelly Kirkham
Published in THG Hosting
May 12, 2020

Without being hyperbolic, deep learning is a technology that is truly making the impossible possible. Driverless cars, automation, cloud computing, improved healthcare, and cheaper manufacturing are all being reshaped by deep learning.

WHAT IS DEEP LEARNING?

Deep Learning, according to Investopedia, is “an artificial intelligence function that imitates the workings of the human brain in processing data and creating patterns for use in decision making” and “a subset of machine learning in artificial intelligence (AI) that has networks capable of learning unsupervised from data that is unstructured or unlabeled.”

Most of us already benefit from deep learning in the form of smarter traffic routing and product recommendations. Paired with parallel processing and high-performance computing, deep learning will soon transform far more of our world.

EXPAND YOUR DEEP LEARNING VOCABULARY

The average tech user may not understand the intricacies of deep learning. However, there are a few terms you can familiarize yourself with to learn more about the technology. We’ve gathered a glossary of terms that can help you better understand the deep learning process. Some of the definitions are our own; others we’ve borrowed from industry experts like our partner NVIDIA, the inventor of the GPU and an industry leader in deep learning. See below…

Artificial Neural Networks

Artificial Neural Networks are the main tool used in deep learning. They are loosely modeled on the neuronal structure of the human brain in an attempt to mimic how we learn and comprehend data, but applied to far more data than any human could process.
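
To make this concrete, here is a minimal sketch of the basic building block, a single artificial neuron, in Python with NumPy; the inputs, weights, and bias values are purely illustrative:

```python
import numpy as np

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of inputs plus a bias,
    passed through a nonlinear activation (here, a sigmoid)."""
    z = np.dot(inputs, weights) + bias
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid squashes output to (0, 1)

# Toy example: three input signals, illustrative weights and bias
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.4, 0.7, -0.2])
print(neuron(x, w, bias=0.1))
```

A full network stacks many of these neurons into layers, which is where the “deep” in deep learning comes from.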

Big Data

Big Data refers to the computational analysis of large quantities of data in search of trends, patterns, associations, or other revealing information. These insights only emerge from processing extremely large data sets, particularly those recording human behavior and interactions.

Convolutional Neural Networks

A Convolutional Neural Network is a type of deep learning algorithm that takes an input, such as an image, and assigns importance to various aspects or objects within it through learnable weights and biases. The network can then differentiate one aspect or object from another.
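
Below is a minimal sketch of the core operation, a 2-D convolution, in plain NumPy; the toy image and the edge-detecting kernel are illustrative, not a trained network:

```python
import numpy as np

def conv2d(image, kernel):
    """Slide a small weight kernel over the image and take dot products.
    This weighted sum is how a convolutional layer assigns importance
    to local regions of its input."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge-detecting kernel applied to a toy 4x4 "image"
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
edge_kernel = np.array([[-1, 1],
                        [-1, 1]], dtype=float)
print(conv2d(image, edge_kernel))  # strong response where the edge sits
```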

Embedding

According to NVIDIA, an embedding “is a representation of input, or an encoding.” For example, a neural word embedding is a vector that represents a word. Embeddings are considered a key breakthrough for deep learning and are used for challenging natural language processing problems.
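
Here is a toy sketch of a word-embedding lookup in NumPy; the vocabulary and vector values are made up for illustration, not trained:

```python
import numpy as np

# A toy embedding table: each row is the dense vector for one word.
vocab = {"gpu": 0, "cloud": 1, "server": 2}
embedding_matrix = np.array([[0.9, 0.1, 0.3],   # "gpu"
                             [0.2, 0.8, 0.5],   # "cloud"
                             [0.3, 0.7, 0.6]])  # "server"

def embed(word):
    """Look up the vector that represents (encodes) a word."""
    return embedding_matrix[vocab[word]]

def cosine(a, b):
    """Similarity between two vectors; nearby vectors suggest related meanings."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(embed("gpu"))                              # -> [0.9 0.1 0.3]
print(cosine(embed("cloud"), embed("server")))   # close to 1 (related)
print(cosine(embed("cloud"), embed("gpu")))      # smaller (less related)
```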

Feedforward Network

A feedforward network ensures that signals can travel in only one direction, from input to output. This design avoids the feedback loops that could distort the network’s output.
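
A minimal sketch in NumPy, with illustrative layer sizes and random (untrained) weights, shows the one-way flow:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def forward(x, weights, biases):
    """Signal flows strictly one way: input -> hidden layer -> output.
    There are no loops; each layer's output feeds only the next layer."""
    for W, b in zip(weights, biases):
        x = relu(x @ W + b)
    return x

# Illustrative 2-layer network: 3 inputs -> 4 hidden units -> 2 outputs
rng = np.random.default_rng(0)
weights = [rng.normal(size=(3, 4)), rng.normal(size=(4, 2))]
biases = [np.zeros(4), np.zeros(2)]
print(forward(np.array([1.0, 0.5, -0.5]), weights, biases))
```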

Generative Adversarial Networks

According to NVIDIA, Generative Adversarial Networks are a “type of AI algorithms used in unsupervised machine learning, implemented by a system of two neural networks competing against each other in a zero-sum game framework.” Learn more in this article that outlines Generative Adversarial Networks and the work of Ian Goodfellow.
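
As a rough sketch of the zero-sum idea, the snippet below computes the value the two networks fight over, using toy stand-in functions (not trained models) for the generator and discriminator:

```python
import numpy as np

# Toy stand-ins for the two competing networks:
# the generator maps noise z to a fake sample; the discriminator maps a
# sample to a probability that it is real.
def generator(z):
    return 2.0 * z + 1.0             # hypothetical fake-sample producer

def discriminator(x):
    return 1.0 / (1.0 + np.exp(-x))  # sigmoid "real vs. fake" score

rng = np.random.default_rng(0)
real = rng.normal(loc=3.0, size=100)   # samples from the "real" data
noise = rng.normal(size=100)

# The zero-sum objective: the discriminator tries to maximize this value
# (score real as real, fakes as fake), while the generator tries to
# minimize it by producing fakes the discriminator scores as real.
value = np.mean(np.log(discriminator(real))) + \
        np.mean(np.log(1.0 - discriminator(generator(noise))))
print(value)
```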

Jitter

Jitter is artificial noise, or small disturbances, added to inputs during training to help generalize or regularize a neural network.
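
A minimal sketch, assuming Gaussian noise as the disturbance:

```python
import numpy as np

def jitter(inputs, scale=0.05, rng=None):
    """Add small Gaussian noise to each training input. Seeing a slightly
    different version of the data on every pass helps the network
    generalize instead of memorizing exact values."""
    rng = rng if rng is not None else np.random.default_rng(0)
    return inputs + rng.normal(scale=scale, size=inputs.shape)

batch = np.array([[0.20, 0.80],
                  [0.55, 0.10]])
print(jitter(batch))  # the same batch, lightly perturbed
```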

Loss Function

NVIDIA defines the loss function by what it measures: “for each prediction, there is an associated number which is the loss. For a true prediction, the loss will be small and for a totally wrong prediction the loss will be high.”
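
Here is a small sketch using cross-entropy, one common loss function, to show the small-versus-large behavior NVIDIA describes:

```python
import numpy as np

def cross_entropy(predicted_prob, true_label):
    """Small when the predicted probability matches the true label,
    large when the prediction is confidently wrong."""
    p = predicted_prob if true_label == 1 else 1.0 - predicted_prob
    return -np.log(p)

print(cross_entropy(0.99, true_label=1))  # near-true prediction -> ~0.01
print(cross_entropy(0.01, true_label=1))  # totally wrong -> ~4.6
```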

Natural Language Processing

Natural Language Processing is the deep learning effort to help machines understand and compute with natural human language, written or spoken. This includes its structure, common usage, and meaning, so that deep learning can take place using natural sentences.
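
As a taste of the very first step, here is a toy tokenizer in plain Python that turns a raw sentence into normalized word tokens; the normalization rules are deliberately simplistic:

```python
from collections import Counter

def tokenize(sentence):
    """A first NLP step: split raw text into normalized word tokens
    so a model can work with structure and usage, not raw characters."""
    return sentence.lower().replace(".", "").replace(",", "").split()

sentence = "Deep learning helps machines process natural sentences."
tokens = tokenize(sentence)
print(tokens)
print(Counter(tokens))  # simple word counts, a bag-of-words view
```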

Unsupervised Learning

A type of machine learning that draws inferences from input data without labeled responses. Cluster analysis is a common form of unsupervised learning, most often used in exploratory data analysis to discover hidden patterns or groupings in the data.
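
A minimal sketch of cluster analysis using k-means (this assumes scikit-learn is installed; the points are made-up, unlabeled data):

```python
import numpy as np
from sklearn.cluster import KMeans

# Unlabeled 2-D points with two obvious groupings
points = np.array([[1.0, 1.1], [0.9, 1.0], [1.2, 0.8],
                   [8.0, 8.2], [7.9, 8.1], [8.3, 7.8]])

# Cluster analysis: the algorithm discovers the groups on its own,
# with no labeled responses to learn from.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(model.labels_)           # e.g., [0 0 0 1 1 1]
print(model.cluster_centers_)  # the center of each discovered group
```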

Yoshua Bengio & Yann LeCun

Two pioneers of deep learning and, together with Geoffrey Hinton, recipients of the 2018 Turing Award, often referred to as the Nobel Prize of computing.
