Chain-of-Thought

process · prompting · published

Also known as: CoT, Step-by-Step Reasoning, Think Step by Step

Definition

A prompting technique that elicits intermediate reasoning steps from an LLM before it produces a final answer. By asking the model to "think step by step" or by showing examples that include reasoning traces, Chain-of-Thought markedly improves performance on tasks that require multi-step reasoning, such as math, logic, and complex question answering.
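
The difference between a direct prompt and a zero-shot CoT prompt is easiest to see in code. The sketch below only constructs the two prompt strings; call_llm is a hypothetical placeholder for whatever LLM client you use, not a real API.

```python
# Zero-shot CoT vs. a direct prompt -- only the prompt text changes.
# `call_llm` is a hypothetical placeholder, not a real client API.

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; swap in your provider's client here."""
    raise NotImplementedError

question = "What is 23 × 17?"

# Direct prompt: the model is nudged to answer immediately.
direct_prompt = f"{question}\nAnswer:"

# Zero-shot CoT prompt: the trailing instruction elicits intermediate
# reasoning steps before the final answer.
cot_prompt = f"{question}\nLet's think step by step."

# reasoning_and_answer = call_llm(cot_prompt)
```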

What this is NOT

  • Not the same as planning (CoT is within a single response; planning structures multiple actions)
  • Not reflection (CoT is forward reasoning; reflection evaluates past reasoning)
  • Not just verbose output (CoT specifically improves accuracy, not just length)

Alternative Interpretations

Different communities use this term differently:

llm-practitioners

Prompting the model to show its reasoning process, either through explicit instruction ("Let's think step by step") or through few-shot examples that include reasoning traces. CoT prompting is now a standard technique for reasoning tasks.
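
A minimal sketch of the few-shot variant, assuming a plain-text prompt format: each exemplar carries a reasoning trace before its answer, and the new question is appended last. The build_few_shot_cot_prompt helper and the Q:/A: layout are illustrative choices, not a fixed standard.

```python
# Few-shot CoT: exemplars include a reasoning trace before the answer,
# so the model imitates the step-by-step format on the new question.
# The exemplar reuses the 23 × 17 example from this entry.

EXEMPLARS = [
    {
        "question": "What is 23 × 17?",
        "reasoning": "First, 23 × 10 = 230. Then, 23 × 7 = 161. "
                     "So 23 × 17 = 230 + 161 = 391.",
        "answer": "391",
    },
]

def build_few_shot_cot_prompt(question: str) -> str:
    """Concatenate reasoning-trace exemplars, then append the new question."""
    parts = [
        f"Q: {ex['question']}\nA: {ex['reasoning']} The answer is {ex['answer']}."
        for ex in EXEMPLARS
    ]
    parts.append(f"Q: {question}\nA:")
    return "\n\n".join(parts)

print(build_few_shot_cot_prompt("What is 34 × 12?"))
```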

Sources: Chain-of-Thought paper (Wei et al., 2022), Zero-shot CoT paper (Kojima et al., 2022, "Let's think step by step"), prompt engineering best practices

Examples

  • Question: What is 23 × 17? Let's think step by step. First, 23 × 10 = 230. Then, 23 × 7 = 161. So 23 × 17 = 230 + 161 = 391.
  • Adding 'Let's think step by step' to a math word problem
  • Few-shot examples showing reasoning before answers
  • Self-consistency: generate 5 CoT paths and take a majority vote over the final answers (see the sketch after this list)
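
A minimal sketch of the self-consistency recipe from the last example, assuming hypothetical sample_cot and answer-extraction helpers (neither is a real library call): sample several CoT completions at nonzero temperature and majority-vote their final answers.

```python
from collections import Counter

def sample_cot(prompt: str, temperature: float = 0.7) -> str:
    """Hypothetical: return one sampled CoT completion from your LLM client."""
    raise NotImplementedError

def extract_final_answer(completion: str) -> str:
    """Hypothetical: pull the final answer out of a reasoning trace,
    here by splitting on a trailing 'The answer is' marker."""
    return completion.rsplit("The answer is", 1)[-1].strip(" .\n")

def self_consistency(prompt: str, n_paths: int = 5) -> str:
    """Sample n_paths CoT completions and majority-vote their final answers."""
    answers = [extract_final_answer(sample_cot(prompt)) for _ in range(n_paths)]
    return Counter(answers).most_common(1)[0][0]
```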

Counterexamples

Things that might seem like Chain-of-Thought but are not:

  • Direct answer without reasoning steps
  • Verbose response that doesn't actually reason (just more words)
  • Planning future actions (that's planning, not CoT)

Relations

  • overlapsWith reasoning (CoT elicits reasoning from models)
  • overlapsWith few-shot-prompting (CoT can use few-shot examples with reasoning traces)
  • overlapsWith prompt (CoT is a prompting technique)

Implementations

Models and tools that implement this concept:

  • o1 (primary)
  • o3 (primary)