In-Context Learning

Model behavior of adapting to a task using only examples shown in the prompt.

Definition

In-context learning is the phenomenon where a large language model adapts its behavior to a new task using only examples shown in the prompt, without any weight updates. This is the mechanism behind few-shot prompting: the model recognizes the input-output pattern in the examples and applies it to your real input. The phenomenon was first described in the GPT-3 paper (2020) and is now a foundational concept in prompt engineering.

Example

Showing three examples — 'cat' → 'kitten', 'dog' → 'puppy', 'horse' → 'foal' — lets the model infer the pattern and produce 'calf' for the new input 'cow'.
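The example above can be sketched as simple prompt construction; the model call itself is omitted here, since in-context learning depends only on the prompt format (the function name and arrow separator are illustrative choices, not a fixed convention):

```python
# Minimal sketch of a few-shot prompt for in-context learning.
# The examples live entirely in the prompt text; no weights are updated.
examples = [("cat", "kitten"), ("dog", "puppy"), ("horse", "foal")]

def build_few_shot_prompt(pairs, query):
    """Format input-output pairs, then the new input for the model to complete."""
    lines = [f"{adult} -> {young}" for adult, young in pairs]
    lines.append(f"{query} ->")  # the model is expected to continue the pattern
    return "\n".join(lines)

print(build_few_shot_prompt(examples, "cow"))
# cat -> kitten
# dog -> puppy
# horse -> foal
# cow ->
```

Sending this prompt to any capable model should yield 'calf', because the consistent input-output pattern is inferable from the three demonstrations alone.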

When to use

Custom tasks where you can show a handful of examples but don't want to fine-tune a model.

Also known as

ICL
