LangChain
A framework for developing applications powered by language models. LangChain provides modular abstractions for building chains (sequential LLM calls), agents (LLM-driven decision loops), RAG systems (retrieval + generation), and memory (conversational and long-term). It is designed to make LLM application development more composable and to reduce boilerplate.
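To make the chain abstraction concrete, here is a minimal sketch of a prompt-model-parser chain composed with the LangChain Expression Language. It assumes the langchain-core and langchain-openai packages are installed and an OpenAI API key is configured; the model name is only an example.

```python
# Minimal LCEL chain sketch: prompt -> model -> output parser.
# Assumes langchain-core and langchain-openai are installed and
# OPENAI_API_KEY is set; the model name is illustrative.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("Summarize this in one sentence: {text}")
llm = ChatOpenAI(model="gpt-4o-mini")
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"text": "LangChain composes prompts, models, and parsers into chains."}))
```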
Implements
Concepts this tool claims to implement:
- Agent Orchestration (primary)
The AgentExecutor class and agent factory functions such as create_react_agent and create_openai_functions_agent. LangGraph, a companion library, extends this with graph-based agent workflows; see the agent sketch after this list.
- Retrieval-Augmented Generation (primary)
Retriever abstractions, vector store integrations (Chroma, Pinecone, Weaviate, and others), document loaders, text splitters, and retrieval chains; see the RAG sketch below.
- Tool Binding (primary)
Tool and StructuredTool classes for defining tools, plus the @tool decorator for quick tool creation. Tool schemas are compatible with the OpenAI function-calling format; see the tool-binding sketch below.
- Prompt Template (secondary)
PromptTemplate, ChatPromptTemplate, and related classes for structured prompt construction with variable interpolation; see the prompt-template sketch below.
- Agent Memory (secondary)
ConversationBufferMemory, ConversationSummaryMemory, and other memory classes for maintaining state across interactions; see the memory sketch below.
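Agent orchestration: a minimal sketch, assuming the langchain, langchain-openai, and langchainhub packages and an OpenAI API key. The word_count tool and the model name are illustrative choices, and the hub prompt id is the commonly documented ReAct prompt, not the only option.

```python
# Sketch of agent orchestration with create_react_agent + AgentExecutor.
# Assumes langchain, langchain-openai, and langchainhub are installed and
# OPENAI_API_KEY is set; the tool, hub prompt id, and model are illustrative.
from langchain import hub
from langchain.agents import AgentExecutor, create_react_agent
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def word_count(text: str) -> int:
    """Count the words in a piece of text."""
    return len(text.split())

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)
prompt = hub.pull("hwchase17/react")  # standard ReAct prompt from the LangChain Hub
agent = create_react_agent(llm, [word_count], prompt)
executor = AgentExecutor(agent=agent, tools=[word_count], verbose=True)

result = executor.invoke({"input": "How many words are in 'composable LLM applications'?"})
print(result["output"])
```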
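Retrieval-Augmented Generation: a sketch of a load-split-embed-retrieve-generate pipeline built from the pieces listed above, assuming langchain-community, langchain-openai, langchain-text-splitters, and chromadb are installed. The file path notes.txt, chunk sizes, and model name are placeholders.

```python
# RAG sketch: load a document, split it, index it in Chroma, then answer
# questions over the retrieved chunks. Paths, sizes, and model names are
# placeholders; any supported loader/vector store could be substituted.
from langchain_community.document_loaders import TextLoader
from langchain_community.vectorstores import Chroma
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

docs = TextLoader("notes.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)
vectorstore = Chroma.from_documents(chunks, OpenAIEmbeddings())
retriever = vectorstore.as_retriever(search_kwargs={"k": 4})

prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)

def format_docs(docs):
    return "\n\n".join(d.page_content for d in docs)

rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | ChatOpenAI(model="gpt-4o-mini")
    | StrOutputParser()
)
print(rag_chain.invoke("What does the document say about deployment?"))
```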
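Tool binding: a sketch assuming langchain-core, langchain-openai, and pydantic. The get_weather and multiply tools and the model name are made-up examples used to show the @tool decorator, StructuredTool, and binding tools to a chat model.

```python
# Tool-binding sketch: define tools two ways, then bind them to a chat model
# so they are exposed in the provider's function-calling format.
# Tool names and the model are illustrative.
from pydantic import BaseModel, Field
from langchain_core.tools import StructuredTool, tool
from langchain_openai import ChatOpenAI

@tool
def get_weather(city: str) -> str:
    """Return a short weather report for a city."""
    return f"It is sunny in {city}."

class MultiplyArgs(BaseModel):
    a: int = Field(description="First factor")
    b: int = Field(description="Second factor")

def multiply(a: int, b: int) -> int:
    return a * b

multiply_tool = StructuredTool.from_function(
    func=multiply,
    name="multiply",
    description="Multiply two integers.",
    args_schema=MultiplyArgs,
)

llm = ChatOpenAI(model="gpt-4o-mini")
llm_with_tools = llm.bind_tools([get_weather, multiply_tool])
response = llm_with_tools.invoke("What is 6 times 7?")
print(response.tool_calls)  # structured tool-call requests emitted by the model
```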
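Prompt templates: a small sketch of structured prompt construction with variable interpolation; the domain and question values are arbitrary examples.

```python
# Prompt-template sketch: string and chat-style templates with variables.
from langchain_core.prompts import ChatPromptTemplate, PromptTemplate

# Plain string template
summary_prompt = PromptTemplate.from_template("Summarize the following text:\n{text}")
print(summary_prompt.format(text="LangChain is a framework for LLM applications."))

# Chat-style template with system and human messages
chat_prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise assistant specializing in {domain}."),
    ("human", "{question}"),
])
messages = chat_prompt.format_messages(domain="astronomy", question="What is a pulsar?")
for m in messages:
    print(m.type, ":", m.content)
```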
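Agent memory: a sketch of conversation memory using ConversationBufferMemory from the langchain package; the saved turns are arbitrary examples.

```python
# Memory sketch: save conversation turns and replay them as chat history.
from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory(return_messages=True)
memory.save_context({"input": "My name is Ada."}, {"output": "Nice to meet you, Ada!"})
memory.save_context({"input": "What do I like?"}, {"output": "You haven't told me yet."})

# The stored history can be injected into a prompt's chat-history slot.
print(memory.load_memory_variables({})["history"])
```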
Integration Surfaces
Details
- Vendor: LangChain Inc.
- License: MIT
- Runs On: local, cloud, hybrid
- Used By: human, agent, system
Links
Notes
LangChain is one of the most widely adopted LLM application frameworks. It has evolved significantly since its 2022 launch, with LangGraph now handling more complex agent patterns. The ecosystem includes LangSmith for observability and LangServe for deployment.