Geometric Deep Learning Introduction
Do you want to know why Deep Learning works so well, and what its mathematical underpinnings are? Then look no further than Symmetry.
“Symmetry, as wide or as narrow as you may define its meaning, is one idea by which man through the ages has tried to comprehend and create order, beauty, and perfection,” wrote Hermann Weyl, a German mathematician born in the late 19th century.
The last decade has witnessed an experimental revolution in data science and machine learning, epitomized by deep learning methods. Many high-dimensional learning tasks previously thought to be beyond reach — such as computer vision, playing Go, or protein folding — are in fact tractable given enough computational horsepower.
Remarkably, the essence of deep learning is built from two simple algorithmic principles: first, the notion of representation or feature learning, and second, learning by local gradient-descent-type methods, typically implemented as backpropagation.
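To make these two principles concrete, here is a minimal sketch, assuming PyTorch (the framework, the toy sine-fitting task, and all names below are illustrative choices, not taken from the text): a small multilayer perceptron learns a representation of its input, and its weights are fitted by local gradient-descent updates whose gradients come from backpropagation.

```python
# A minimal sketch of the two principles: a small network learns a representation
# of its input, and its weights are fitted by local gradient-descent steps
# whose gradients are computed with backpropagation.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: inputs x and targets y = sin(x), standing in for any supervised task.
x = torch.linspace(-3.0, 3.0, 256).unsqueeze(1)
y = torch.sin(x)

# Principle 1: representation / feature learning.
# The hidden layers learn features of x; only the last layer reads them out.
model = nn.Sequential(
    nn.Linear(1, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)

# Principle 2: local gradient-descent optimisation via backpropagation.
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.MSELoss()

for step in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()    # backpropagation computes the gradients
    optimizer.step()   # a local gradient-descent update of the weights

print(f"final training loss: {loss.item():.4f}")
```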
While learning generic functions in high dimensions is a cursed estimation problem, many tasks are not uniform and have strong repeating patterns as a result of the low dimensionality and structure of the physical world.
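As a rough illustration of how such structure tames the estimation problem, consider how much smaller a model becomes once the repeating pattern of images is exploited. The sketch below (again assuming PyTorch; the 32x32 image size and layer choices are hypothetical) compares a generic fully connected map, which learns one weight per input-output pair, with a single 3x3 convolutional filter reused at every position.

```python
# How structure shrinks the problem: a generic dense map on a 32x32 image
# versus one small shared filter that exploits the repeating local pattern.
import torch.nn as nn

dense = nn.Linear(32 * 32, 32 * 32)               # generic map on a 1024-dim input
conv = nn.Conv2d(1, 1, kernel_size=3, padding=1)  # one 3x3 filter shared everywhere

count = lambda m: sum(p.numel() for p in m.parameters())
print(f"fully connected parameters: {count(dense):,}")  # 1,049,600
print(f"convolution parameters:     {count(conv):,}")   # 10
```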
Geometric Deep Learning unifies a broad class of ML problems from the perspectives of symmetry and invariance. These principles not only underlie the breakthrough performance of convolutional neural networks and the recent success of graph neural networks, but also provide a principled way to construct new types of problem-specific inductive biases.
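The symmetry behind convolutional networks can even be checked numerically: shifting the input and then convolving gives the same result as convolving and then shifting, a property known as translation equivariance. The sketch below assumes PyTorch and uses circular padding so the check is exact; the filter and image are random placeholders.

```python
# Translation equivariance of convolution: conv(shift(x)) == shift(conv(x)).
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(1, 4, kernel_size=3, padding=1,
                 padding_mode="circular", bias=False)

x = torch.randn(1, 1, 16, 16)  # a random single-channel "image"
shift = lambda t: torch.roll(t, shifts=(3, 5), dims=(-2, -1))

shift_after = shift(conv(x))   # convolve, then translate the output
shift_before = conv(shift(x))  # translate the input, then convolve
print(torch.allclose(shift_after, shift_before, atol=1e-6))  # True
```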