Prompt
Also known as: Input, Query
Definition
The text input provided to an LLM to elicit a response. A prompt can be a simple question, a complex instruction, or a carefully structured template with examples and context. Prompts are the primary interface between humans and LLMs—the quality of the prompt largely determines the quality of the response. The art and science of crafting prompts is called prompt engineering.
What this is NOT
- Not the model's response (that's the completion or output)
- Not training data (prompts are inference-time inputs)
- Not the model itself (prompts are inputs to the model)
Alternative Interpretations
Different communities use this term differently:
llm-practitioners
The complete input to an LLM API call, which may include system prompts, conversation history, retrieved context, and the current user message. In the Chat Completions API, the prompt is the messages array.
Sources: OpenAI API documentation, Anthropic prompt engineering guide, Prompt engineering literature
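In this sense, the "prompt" can be sketched as a list of role-tagged messages. The role names ("system", "user", "assistant") follow the OpenAI Chat Completions message format; the content strings here are invented for illustration:

```python
# Illustrative Chat Completions-style prompt: the full "messages" list,
# not just the latest user turn, is the prompt in this sense.
prompt_messages = [
    {"role": "system", "content": "You are a concise geography tutor."},
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
    {"role": "user", "content": "And of Spain?"},  # current user message
]

# System prompt, conversation history, and the current user message
# together form the complete input sent to the model.
print(len(prompt_messages))
```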
academic-nlp
A natural language input used to condition a language model's output, as opposed to fine-tuning which modifies model weights. "Prompting" as a paradigm emerged with GPT-3 and the in-context learning capability.
Sources: GPT-3 paper (Brown et al., 2020), Prompt-based learning surveys
Examples
- A simple question: 'What is the capital of France?'
- An instruction: 'Translate the following text to Spanish: ...'
- A structured prompt with system message, examples, and query
- An agentic prompt with tool definitions and conversation history
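The second and third examples above can be combined in a few-shot template. The template wording and example pairs below are invented for illustration, not taken from any particular library:

```python
# Minimal sketch of a structured few-shot prompt built from a template.
TEMPLATE = """You are a translation assistant.

{examples}

English: {query}
Spanish:"""

def build_prompt(examples: list[tuple[str, str]], query: str) -> str:
    """Render few-shot example pairs and the current query into one prompt string."""
    shots = "\n".join(
        f"English: {en}\nSpanish: {es}" for en, es in examples
    )
    return TEMPLATE.format(examples=shots, query=query)

prompt = build_prompt(
    examples=[("Hello", "Hola"), ("Thank you", "Gracias")],
    query="Good morning",
)
print(prompt)
```

Ending the template with "Spanish:" nudges the model to complete the translation rather than continue the instructions.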
Counterexamples
Things that might seem like a prompt but are not:
- The model's response text
- Training examples used to fine-tune the model
- The model weights
Relations
- overlapsWith system-prompt (System prompts are one part of the full prompt)
- overlapsWith user-message (User messages are one part of the full prompt)
- overlapsWith prompt-template (Templates generate prompts)
- overlapsWith context-window (Prompts are constrained by the context window)