What is Embedding in AI? Explained in Everyday Language for Beginners

A. Zhang
AI for Absolute Beginners
4 min read · Jan 27, 2024


Updates: AI for Absolute Beginners is an ongoing series.

Last time, we talked about RAG (What is RAG in AI? Explained in Everyday Language for AI Beginners), a critical component of AI chatbots that retrieves external knowledge and integrates it into their responses. This time, let’s explore another crucial concept in Large Language Models (LLMs): “embedding.”

“Embed” words in Math

The term “embedding” originates from mathematics, particularly topology. You don’t need to grasp the full mathematical definition (and I’m no expert in these areas either). In a basic sense, much like implanting something into something else in the physical world, embedding in math refers to mapping ‘something’ into a mathematical structure.

In natural language processing (NLP) and Large Language Models, embedding means using a mathematical structure, typically a vector of numbers, to represent words or other data, such as images. The sketch below shows what that looks like in practice.
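
To make this concrete, here is a minimal sketch in Python. It assumes the open-source sentence-transformers library and its all-MiniLM-L6-v2 model, neither of which is mentioned in this article; they are simply one common way to try embeddings yourself. Each word goes in, and a fixed-length vector of numbers comes out, and words with related meanings end up with similar vectors.

```python
# Minimal sketch (assumes the sentence-transformers library and the
# publicly available "all-MiniLM-L6-v2" model): turn words into embedding
# vectors and compare them with cosine similarity.
from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("all-MiniLM-L6-v2")

# Each word (or sentence) becomes a fixed-length vector of numbers.
vectors = model.encode(["king", "queen", "banana"])
print(vectors.shape)  # e.g. (3, 384): three inputs, 384 numbers each

def cosine_similarity(a, b):
    """Cosine similarity: closer to 1 means the meanings are more similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(vectors[0], vectors[1]))  # "king" vs "queen": relatively high
print(cosine_similarity(vectors[0], vectors[2]))  # "king" vs "banana": lower
```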

Sound familiar? But embedding is different.

Many might be familiar with Morse code. To some extent, if you consider dots as 0s and dashes as 1s, Morse code can be seen as a mathematical form of…
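
For illustration only (a hedged sketch, not from the article): the dots-as-0s, dashes-as-1s idea amounts to a simple lookup table, a fixed symbolic code with no notion of meaning or similarity, which is exactly where embeddings differ.

```python
# Toy sketch of the idea above: treat Morse dots as 0s and dashes as 1s.
# (Illustration only; the letters and codes are standard Morse, nothing model-specific.)
MORSE = {"S": "...", "O": "---", "E": "."}

def morse_to_bits(code: str) -> list[int]:
    """Map '.' to 0 and '-' to 1, giving a purely symbolic 0/1 representation."""
    return [0 if symbol == "." else 1 for symbol in code]

for letter, code in MORSE.items():
    print(letter, code, morse_to_bits(code))
# S ... [0, 0, 0]
# O --- [1, 1, 1]
# E .   [0]
# Note: these codes vary in length and say nothing about meaning or similarity,
# unlike the dense, fixed-length embedding vectors shown earlier.
```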
