Amazon Bedrock
A fully managed service from AWS that provides access to foundation models from multiple providers through a single API. Bedrock offers models from Anthropic, Meta, Mistral AI, Cohere, AI21 Labs, and Stability AI, as well as Amazon's own Titan family. It includes features for fine-tuning, retrieval-augmented generation (Knowledge Bases), and agent building.
Implements
Concepts this tool claims to implement (illustrative boto3 sketches for each follow the list):
- Inference Endpoint (primary)
Unified API for invoking foundation models. InvokeModel and InvokeModelWithResponseStream for synchronous and streaming inference. Converse API for chat-style interactions.
- Model Serving (primary)
Managed infrastructure for serving foundation models. Provisioned throughput for guaranteed capacity. No infrastructure management required.
- Retrieval-Augmented Generation (secondary)
Knowledge Bases for Bedrock - managed RAG service. Automatic chunking, embedding, and retrieval. Integrates with OpenSearch, Pinecone, and Redis.
- Agent Orchestration (secondary)
Agents for Bedrock - managed agent runtime. Action groups for tool definitions. Automatic orchestration of multi-step tasks.
- Fine-Tuning (secondary)
Custom model training with your data. Continued pre-training and fine-tuning options. Model evaluation and comparison tools.
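Inference Endpoint: a minimal sketch of the unified inference API using boto3, assuming a us-east-1 region and a Claude model ID chosen purely for illustration; it shows the Converse call and its streaming variant.

```python
import boto3

# Runtime client for inference; control-plane operations (fine-tuning,
# provisioned throughput) use the separate "bedrock" client.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Converse: provider-agnostic chat request/response format.
response = runtime.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # placeholder model choice
    messages=[{"role": "user", "content": [{"text": "Summarize Amazon Bedrock in one sentence."}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)
print(response["output"]["message"]["content"][0]["text"])

# Streaming variant: chunks arrive as an event stream instead of one payload.
stream = runtime.converse_stream(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    messages=[{"role": "user", "content": [{"text": "Write a haiku about clouds."}]}],
)
for event in stream["stream"]:
    if "contentBlockDelta" in event:
        print(event["contentBlockDelta"]["delta"]["text"], end="")
```

InvokeModel and InvokeModelWithResponseStream accept the same model IDs but require each provider's native request body; Converse abstracts that difference away.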
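Model Serving: a sketch of reserving provisioned throughput for guaranteed capacity via the control-plane client; the name, model ID, unit count, and commitment term are placeholders.

```python
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")  # control-plane client

# Reserve dedicated model units; all values below are illustrative placeholders.
pt = bedrock.create_provisioned_model_throughput(
    provisionedModelName="claude-prod-capacity",
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    modelUnits=1,
    commitmentDuration="OneMonth",
)

# Once the capacity is active, its ARN replaces the on-demand model ID in
# InvokeModel / Converse calls.
print(pt["provisionedModelArn"])
```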
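Retrieval-Augmented Generation: a sketch of querying a Knowledge Base with RetrieveAndGenerate on the bedrock-agent-runtime client; the knowledge base ID, model ARN, and question are placeholders.

```python
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# The service retrieves relevant chunks from the knowledge base, then generates
# a grounded answer with the chosen foundation model.
response = agent_runtime.retrieve_and_generate(
    input={"text": "What is our refund policy?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KB1234567890",  # placeholder
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-5-sonnet-20240620-v1:0",
        },
    },
)
print(response["output"]["text"])

# Citations link generated spans back to the retrieved source documents.
for citation in response.get("citations", []):
    for ref in citation.get("retrievedReferences", []):
        print(ref["location"])
```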
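Agent Orchestration: a sketch of invoking a deployed Bedrock Agent; the agent ID, alias ID, session ID, and task are placeholders, and the completion arrives as an event stream.

```python
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# The agent plans multi-step work against its action groups and knowledge
# bases; the caller only supplies natural-language input for a session.
response = agent_runtime.invoke_agent(
    agentId="AGENT12345",           # placeholder
    agentAliasId="ALIAS12345",      # placeholder
    sessionId="demo-session-001",   # groups turns into one conversation
    inputText="Check inventory for SKU-42 and draft a reorder request.",
)

# Assemble the streamed completion chunks into the final answer.
answer = b""
for event in response["completion"]:
    if "chunk" in event:
        answer += event["chunk"]["bytes"]
print(answer.decode("utf-8"))
```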
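Fine-Tuning: a sketch of submitting a model customization job; the job name, role ARN, base model, hyperparameters, and S3 URIs are placeholders.

```python
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

# Training data is JSONL prompt/completion pairs in S3; the IAM role must let
# Bedrock read the training bucket and write to the output bucket. All
# identifiers below are placeholders.
job = bedrock.create_model_customization_job(
    jobName="titan-support-finetune-001",
    customModelName="titan-support-v1",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",
    customizationType="FINE_TUNING",  # "CONTINUED_PRE_TRAINING" is the other option
    hyperParameters={"epochCount": "2", "batchSize": "1", "learningRate": "0.00001"},
    trainingDataConfig={"s3Uri": "s3://my-bucket/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://my-bucket/output/"},
)
print(job["jobArn"])
```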
Integration Surfaces
Details
- Vendor: Amazon Web Services
- License: Proprietary
- Runs On: cloud
- Used By: human, agent, system
Links
Notes
Bedrock is AWS's answer to Azure OpenAI Service and Google Cloud Vertex AI. Its main advantage is access to multiple model providers through one API, backed by AWS security and compliance controls. It is a good fit for enterprises already on AWS. Pricing is per-token on demand, with optional provisioned throughput for guaranteed capacity.