
Embeddings

Numerical vector representations of text used for semantic search and similarity.

Definition

Embeddings are numerical vector representations of text (typically 384–3,072 dimensions) that preserve semantic meaning as geometric proximity: texts with similar meanings map to nearby vectors. Embeddings power semantic search, retrieval-augmented generation (RAG), recommendation systems, and clustering. Modern embedding models (OpenAI text-embedding-3, Cohere, Voyage AI) are inexpensive and fast.

Example

The embeddings of "how do I cancel my subscription" and "I want to end my plan" are close in vector space, so semantic search matches them as similar even though they share no words.
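The closeness above is usually measured with cosine similarity. A minimal sketch, using tiny hand-made 4-dimensional vectors as stand-ins for real model output (real embedding APIs return 384–3,072 dimensions; the variable names and values here are illustrative, not produced by any model):

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" — in practice these come from an embedding model.
cancel_subscription = [0.9, 0.1, 0.3, 0.0]
end_my_plan         = [0.8, 0.2, 0.4, 0.1]
chocolate_cake      = [0.0, 0.9, 0.0, 0.8]

print(cosine_similarity(cancel_subscription, end_my_plan))     # high, ~0.98
print(cosine_similarity(cancel_subscription, chocolate_cake))  # low, ~0.08
```

Semantic search is then just ranking documents by this score against the query's embedding.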

When to use

Semantic search, RAG retrieval, document clustering, similarity matching, and recommendation systems.

