Paper Reading #3: Poincaré Embeddings for Learning Hierarchical Representations

Why this paper: I believe this paper opens new directions of research in deep learning applied to language modelling. Language does not have a purely linear structure, which is part of why sequence models such as RNNs and LSTMs struggle to capture hierarchical linguistic concepts. The paper proposes learning embeddings in hyperbolic space (specifically the Poincaré ball model) rather than Euclidean space. Explaining exactly why this helps requires a deeper study of hyperbolic geometry; for the scope of this article, we will take it as given that hyperbolic spaces are better suited to modelling hierarchical relationships. Building neural networks for hierarchical structures is still an open problem, which is why this paper stands out.
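
To make this concrete: the paper embeds symbols in the open unit ball, where distance grows without bound as points approach the boundary, so a hierarchy's root can sit near the origin and its many descendants near the rim. Below is a minimal sketch (my own, not the authors' code) of the paper's distance function and its Riemannian SGD update, assuming plain numpy:

```python
import numpy as np

def poincare_distance(u, v):
    """Distance between points u, v inside the unit ball (Poincaré ball model):
    d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))."""
    sq_dist = np.dot(u - v, u - v)
    denom = (1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v))
    return np.arccosh(1.0 + 2.0 * sq_dist / denom)

def rsgd_step(theta, euclidean_grad, lr=0.01, eps=1e-5):
    """One Riemannian SGD step as in the paper: rescale the Euclidean gradient
    by (1 - ||theta||^2)^2 / 4 (the inverse of the metric tensor), then pull
    any point that escaped back to just inside the ball."""
    scale = (1.0 - np.dot(theta, theta)) ** 2 / 4.0
    theta = theta - lr * scale * euclidean_grad
    norm = np.linalg.norm(theta)
    if norm >= 1.0:
        theta = theta / norm * (1.0 - eps)  # retract onto the open ball
    return theta
```

Because distances blow up near the boundary, the ball has room for the exponentially growing number of nodes per tree level, which is the intuition behind "hyperbolic spaces suit hierarchies".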

What is interesting: Word embeddings are a popular technique now. However, the relationships they capture are linear, e.g. king − queen ≈ man − woman. Most relationships in language, though, have multiple levels of hierarchy; case in point, dependency trees, which are the output of any syntactic parser, and regular graph grammars.
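
To get a feel for how deep such hierarchies run, the short sketch below walks a hypernym chain in WordNet, the taxonomy the paper itself embeds. It assumes nltk is installed and the WordNet corpus has been downloaded:

```python
from nltk.corpus import wordnet as wn  # requires nltk.download('wordnet')

# Walk up the is-a chain from a concrete noun toward the taxonomy root.
chain = [wn.synset('dog.n.01')]
while chain[-1].hypernyms():
    chain.append(chain[-1].hypernyms()[0])
print(' -> '.join(s.name() for s in chain))
# dog -> canine -> carnivore -> ... -> entity (a dozen or so nested levels)
```

No single linear offset of the king − queen kind can encode a dozen strictly nested is-a relations at once, which is exactly the gap Poincaré embeddings target.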

How to reproduce results:
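
The authors released an official implementation (the facebookresearch/poincare-embeddings repository on GitHub). A lighter-weight route is gensim's PoincareModel, which implements the same training scheme; the toy relations and hyperparameters below are my own illustrative choices, not the paper's setup:

```python
from gensim.models.poincare import PoincareModel

# Toy transitive-closure data: (child, parent) is-a pairs.
relations = [
    ('kangaroo', 'marsupial'), ('marsupial', 'mammal'),
    ('cat', 'mammal'), ('mammal', 'animal'), ('animal', 'entity'),
]
model = PoincareModel(relations, size=2, negative=2)
model.train(epochs=50)

# Hyperbolic distance between two learned vectors; related pairs end up close.
print(model.kv.distance('kangaroo', 'mammal'))
```

To reproduce the paper's actual numbers, you would train on the transitive closure of the WordNet noun hierarchy and evaluate reconstruction and link prediction by mean rank and MAP, as the paper does.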

Afterthoughts:

References:

Maximilian Nickel and Douwe Kiela. 2017. "Poincaré Embeddings for Learning Hierarchical Representations." Advances in Neural Information Processing Systems 30 (NIPS 2017). arXiv:1705.08039.
