What is Embedding?

A numerical representation of text that captures its meaning as a vector.

Definition

An embedding is a way to represent text (words, sentences, or documents) as a list of numbers (vector) that captures semantic meaning. Similar concepts have similar embeddings, making it possible to measure how related two pieces of text are. Embeddings are fundamental to search, recommendation systems, and RAG applications.

💡 Example

The embeddings for "dog" and "puppy" would be very close together in vector space, while "dog" and "refrigerator" would be far apart. This allows AI systems to understand that a search for "puppy care" is related to "dog health tips."
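This intuition can be sketched in code. Below is a minimal illustration using made-up 4-dimensional vectors (real embedding models produce hundreds or thousands of dimensions, and the values here are chosen only to show the idea); relatedness is measured with cosine similarity, the most common comparison metric for embeddings.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of the vector magnitudes.
    # Values near 1.0 mean the vectors point in a similar direction (similar meaning).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical toy embeddings -- invented for illustration only.
dog          = [0.8, 0.6, 0.1, 0.0]
puppy        = [0.7, 0.7, 0.2, 0.1]
refrigerator = [0.0, 0.1, 0.9, 0.8]

print(cosine_similarity(dog, puppy))         # high score: related concepts
print(cosine_similarity(dog, refrigerator))  # low score: unrelated concepts
```

A search system applies the same comparison at scale: it embeds the query, embeds every document, and returns the documents whose vectors score highest against the query vector.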

Related concepts

RAG (Retrieval-Augmented Generation)

A technique that lets AI access external knowledge bases to provide more accurate answers.

Vector Database

A database optimized for storing and searching AI embeddings at scale.
