Portkey

platform · active · freemium

An AI gateway and observability platform for LLM applications. Portkey provides a unified API to access 200+ LLMs, with built-in features for reliability (fallbacks, retries, load balancing), security (guardrails, PII redaction), and observability (logging, analytics, debugging).
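
Because the gateway exposes an OpenAI-compatible REST API, an existing OpenAI client can typically be pointed at it by swapping the base URL and adding gateway headers. A minimal sketch using the OpenAI Python SDK follows; the base URL and the x-portkey-* header names are assumptions here and should be checked against Portkey's documentation.

    # Minimal sketch: route a chat completion through the gateway's
    # OpenAI-compatible endpoint using the standard OpenAI Python SDK.
    # The base URL and x-portkey-* header names are assumptions.
    from openai import OpenAI

    client = OpenAI(
        api_key="PROVIDER_API_KEY",            # key for the underlying LLM provider
        base_url="https://api.portkey.ai/v1",  # assumed hosted gateway endpoint
        default_headers={
            "x-portkey-api-key": "PORTKEY_API_KEY",  # assumed gateway auth header
            "x-portkey-provider": "openai",          # assumed provider selector
        },
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Summarize what an AI gateway does."}],
    )
    print(response.choices[0].message.content)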

Implements

Concepts this tool claims to implement:

  • Unified API for multiple LLM providers. Automatic fallbacks between providers. Load balancing across models. Request/response transformation.

  • Intelligent routing based on cost, latency, or custom logic. Fallback chains when primary providers fail. A/B testing between models (see the config sketch after this list).

  • Guardrails (secondary)

    Built-in content moderation and PII detection. Request/response filtering. Custom guardrail rules.

  • Caching (secondary)

    Semantic caching for similar queries. Simple caching for identical requests. Configurable cache strategies.
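
The routing, fallback, and caching behavior above is typically expressed as a declarative routing config rather than application code. Below is a hedged sketch of what such a config might look like, written as a Python dict for illustration; the key names (strategy, targets, retry, cache, override_params) are assumptions modeled on Portkey's config format and may differ from the current schema. Configs like this are usually saved in the dashboard or attached to a request.

    # Hedged sketch of a routing config combining fallback, retries, and caching.
    # Key names are assumptions; verify against the current Portkey config schema.
    import json

    routing_config = {
        "strategy": {"mode": "fallback"},   # try targets in order until one succeeds
        "targets": [
            {"provider": "openai", "override_params": {"model": "gpt-4o-mini"}},
            {"provider": "anthropic", "override_params": {"model": "claude-3-5-haiku-latest"}},
        ],
        "retry": {"attempts": 3},           # retry transient provider errors
        "cache": {"mode": "semantic", "max_age": 3600},  # reuse answers to similar queries
    }

    print(json.dumps(routing_config, indent=2))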

Integration Surfaces

  • REST API (OpenAI-compatible)
  • Python SDK
  • JavaScript/TypeScript SDK
  • LangChain integration
  • Web dashboard
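
Since the gateway speaks the OpenAI protocol, framework integrations often reduce to pointing an OpenAI-compatible client at it. The sketch below takes that route with LangChain's ChatOpenAI; the endpoint URL and x-portkey-* headers are the same assumptions as above, and Portkey's own SDKs or its dedicated LangChain integration may be the more idiomatic path.

    # Sketch: use the gateway from LangChain by pointing ChatOpenAI at an
    # OpenAI-compatible endpoint. Base URL and headers are assumptions.
    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(
        model="gpt-4o-mini",
        api_key="PROVIDER_API_KEY",
        base_url="https://api.portkey.ai/v1",        # assumed gateway endpoint
        default_headers={
            "x-portkey-api-key": "PORTKEY_API_KEY",  # assumed gateway auth header
            "x-portkey-provider": "openai",          # assumed provider selector
        },
    )

    print(llm.invoke("One sentence on why fallbacks matter in production.").content)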

Details

Vendor: Portkey AI
License: MIT (gateway) / Proprietary (cloud)
Runs On: cloud
Used By: human, agent, system

Notes

Portkey positions itself as an "AI gateway" more than a pure observability tool. The open-source gateway can be self-hosted. Strong focus on reliability features (fallbacks, retries) that are critical for production LLM apps.