Company

Portkey

Enterprise AI gateway, observability, and guardrails in a single control plane, positioned as the "control tower" for production LLMOps.

1. Core Product / Service

Portkey puts three layers in one stack:

  • AI Gateway: a single API fronting 1,600+ LLMs (OpenAI, Anthropic, Google, Azure, Bedrock, self-hosted models), with fallback, retry, load-balancing, and semantic caching. The Gateway is now fully open-source; in 2026-03 Portkey also open-sourced the previously SaaS-only governance, observability, auth, and cost-control features [3].
  • Observability: dashboards covering 40+ metrics across traces, cost, latency, and quality, sliceable by user, workspace, and provider.
  • Guardrails: 60+ built-in guardrails covering prompt injection detection, PII redaction, jailbreak blocking, output compliance filtering; integrated with security products such as PANW AIRS.
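A toy sketch of what one guardrail category (PII redaction) does at the gateway layer; the regexes and function below are purely illustrative, not Portkey's actual implementation (which ships 60+ such checks):

```python
import re

# Illustrative PII patterns applied to model output before it reaches
# the caller. Real guardrails use far more robust detection.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact_pii(text: str) -> str:
    """Replace detected PII with placeholder tokens."""
    text = EMAIL.sub("[EMAIL]", text)
    text = SSN.sub("[SSN]", text)
    return text

print(redact_pii("Contact john.doe@example.com, SSN 123-45-6789."))
# prints: Contact [EMAIL], SSN [SSN].
```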

Deployment options: managed SaaS (from $49/mo) or self-hosted (open-source gateway).
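The fallback / retry / load-balance behavior above is typically driven by a declarative routing config at the gateway layer. The schema below is a hedged sketch of that idea; the field names are assumptions for illustration, not copied from Portkey's docs:

```python
import json

# Illustrative routing config: "fallback" mode tries targets in order
# until one succeeds, so the app never hard-codes provider failover.
routing_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"provider": "openai", "override_params": {"model": "gpt-4o"}},
        {"provider": "anthropic", "override_params": {"model": "claude-sonnet"}},
    ],
}

# Such a config is typically serialized and attached to the gateway
# client at init time or sent per-request as a header.
print(json.dumps(routing_config, indent=2))
```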

2. Target Users & Pain Points

Target: enterprise AI teams with production LLM traffic.

Pain points:

  • Multi-provider switching / failover fallbacks must be written by hand
  • No cost visibility, token bills get out of control
  • No audit trail, can't pass compliance
  • Have to assemble guardrails themselves (prompt injection / PII / hallucination)

Portkey's selling point is that all four are handled at the gateway layer; application code changes only one base URL.
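The "one base URL" claim can be sketched as follows; the gateway URL, header names, and keys here are placeholders for illustration, not Portkey's documented API:

```python
import json

# Build the same OpenAI-style chat-completion request, pointed either
# at the provider directly or at a gateway: only the base URL (and the
# key sent with it) change, the payload shape stays identical.
def build_request(base_url: str, api_key: str, model: str, prompt: str):
    url = f"{base_url}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

# Direct to the provider (placeholder key):
direct = build_request("https://api.openai.com/v1", "sk-PLACEHOLDER", "gpt-4o", "hi")
# Through a hypothetical gateway, with everything else unchanged:
via_gateway = build_request("https://gateway.example.com/v1", "pk-PLACEHOLDER", "gpt-4o", "hi")
```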

3. Competitive Landscape

| Dimension | Portkey | OpenRouter | LangSmith | Helicone | LiteLLM | Eden AI |
|---|---|---|---|---|---|---|
| Form | Gateway + Obs + Guardrails | SaaS marketplace | Tracing / eval framework | Obs + lightweight proxy | OSS Python proxy | API aggregation + price comparison |
| Main differentiation | Production safety (guardrails / PII / audit) | One key to 200+ models, 5% markup | LangChain-ecosystem tracing, no routing | Rust high-performance observability | Fully self-hosted, maximum flexibility | Multimodal API price comparison |
| Billing | $49/mo + self-host OSS | Per-token + 5% | SaaS subscription | Free / self-hosted | Free OSS | Reselling |
| Native billing | No (BYO keys) | Yes | No | No | No | Yes |
| Fit for | Enterprise production | Individuals / small teams / quick onboarding | LangChain app debugging | Observability-heavy scenarios | Infra control freaks | Mixed multimodal use |

Differentiation: Portkey is the only one in this set that builds guardrails + governance into the gateway layer; LangSmith doesn't route, Helicone leans observability, LiteLLM leans flexibility, OpenRouter leans onboarding convenience.

4. Unique Observations

  • The 2026-03 open-sourcing of the previously SaaS-exclusive governance / observability / auth / cost control is a play to capture developer mindshare with open source and monetize via enterprise services. That puts it in direct confrontation with LiteLLM rather than OpenRouter, which isn't on the same battlefield.
  • The founding team has Indian roots with SF incorporation: a typical India-built, US-incorporated GenAI infra pattern (setting up regional competition with the French team at Eden AI and the YC team at Helicone).
  • In scenarios like the hermes-openrouter-models "agent multi-provider fallback" case, Portkey is the enterprise alternative to OpenRouter: once you need PII redaction / audit / guardrails on top of model routing, OpenRouter isn't enough.
  • Not doing native billing is a conscious choice: BYO keys lets enterprise customers retain their direct contracts with OpenAI / Anthropic, which is exactly the barrier that keeps OpenRouter out of the enterprise market.

5. Financials / Funding

  • Seed (2023-08): $3M, Lightspeed India + We Founder Circle lead [1]
  • Series A (2026-02): $15M, Elevation Capital lead, Lightspeed follow [2]
  • Total raised: ~$18M+ [2]
  • HQ: San Francisco (founders with Indian background)

6. People & Relationships

  • Founders: Rohit Agarwal (CEO), Ayush Garg (CTO, ex-Bingage). The two founded the company in 2023-01.
  • Investors: Elevation Capital (Series A lead), Lightspeed (seed + Series A), We Founder Circle
  • Competitors: openrouter (gateway layer, consumer-oriented), eden-ai (API price comparison, multimodal-oriented); non-linked competitors: LangSmith, Helicone, LiteLLM
  • Complements: together-ai (as one of the underlying inference providers routed by Portkey)

Sources

Last compiled: 2026-05-09