in-context learning
In-context learning (ICL) occurs when a pretrained model performs a new task by conditioning on natural-language instructions and, optionally, a small number of input–output demonstrations included in the prompt, without updating its internal parameters.
ICL enables what appear to be zero-shot or few-shot capabilities by treating the prompt itself as a temporary training set.
In practice, the success of ICL depends heavily on several factors: which examples are included, the order they appear in, how the instruction is phrased, and how the prompt is formatted.
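To make this concrete, here's a minimal sketch of how you might assemble a few-shot ICL prompt in Python. The task, example reviews, and labels are all hypothetical, chosen just to illustrate the structure of instruction plus demonstrations plus new input:

```python
# Hypothetical sentiment-classification task used to illustrate ICL.
# The demonstrations serve as the "temporary training set" inside the
# prompt; no model weights are updated.
instruction = "Classify the sentiment of each review as Positive or Negative."

demonstrations = [  # (input, output) pairs shown to the model
    ("The plot was gripping from start to finish.", "Positive"),
    ("I walked out halfway through.", "Negative"),
]

def build_prompt(instruction, demonstrations, query):
    """Format the instruction, demos, and new input as one prompt string."""
    lines = [instruction, ""]
    for text, label in demonstrations:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")  # blank line between demonstrations
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")  # the model is expected to complete this line
    return "\n".join(lines)

prompt = build_prompt(instruction, demonstrations, "Best film I've seen this year.")
print(prompt)
```

Because the demonstrations are plain text in the prompt, you can experiment with the factors above, such as swapping examples or reordering them, simply by editing the list and rebuilding the string.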
By Leodanis Pozo Ramos • Updated Nov. 3, 2025