Chain of Thought (CoT)
Chain-of-Thought (CoT) prompting is a technique for pretrained large language models (LLMs) in which the prompt is structured to encourage the model to generate intermediate reasoning steps (a chain of thought) before producing a final answer to a multi-step or complex problem.
The prompt typically includes either an instruction like "Let's think step by step" (zero-shot CoT) or few-shot demonstration examples that explicitly show the reasoning steps leading to the answer.
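The few-shot variant can be sketched as a prompt-building function. The worked example below and the surrounding names are illustrative assumptions, not any specific provider's API; the resulting string would be sent to an LLM:

```python
# A hypothetical few-shot CoT demonstration: a worked example whose
# answer shows explicit intermediate reasoning steps.
COT_DEMO = """\
Q: A cafeteria had 23 apples. They used 20 and bought 6 more. How many are left?
A: They started with 23 apples. After using 20, they had 23 - 20 = 3.
Buying 6 more gives 3 + 6 = 9. The answer is 9.
"""

def build_cot_prompt(question: str) -> str:
    """Prepend a worked example so the model imitates step-by-step reasoning."""
    return f"{COT_DEMO}\nQ: {question}\nA: Let's think step by step."

prompt = build_cot_prompt("If I have 4 boxes of 7 pens, how many pens do I have?")
print(prompt)
```

The demonstration primes the model to emit its own intermediate steps before the final answer, rather than answering directly.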
A notable variation, self-consistency, enhances CoT by sampling multiple reasoning paths and then selecting the final answer that appears most often across those paths, effectively a majority vote.
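The aggregation step of self-consistency reduces to a majority vote over the final answers. A minimal sketch, assuming the answers have already been extracted from each sampled reasoning path:

```python
from collections import Counter

def self_consistent_answer(final_answers: list[str]) -> str:
    """Return the most frequent final answer across sampled reasoning paths."""
    return Counter(final_answers).most_common(1)[0][0]

# Hypothetical final answers parsed from five sampled chains of thought:
samples = ["9", "9", "8", "9", "12"]
print(self_consistent_answer(samples))  # → 9
```

Even when individual reasoning paths go wrong, the correct answer tends to be reached by more paths than any single wrong one, which is why the vote improves accuracy.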
By Leodanis Pozo Ramos • Updated Nov. 3, 2025