Staxly

Helicone vs Portkey

Open-source LLM observability — 1-line integration via proxy
vs. Enterprise AI gateway + observability + guardrails + prompt mgmt

Helicone website · Portkey website

Pricing tiers

Helicone

Hobby (Free)
10,000 requests/month. 7-day retention. 1 seat. Basic monitoring.
Free
Startup Discount
<2 years, <$5M funding: 50% off first year.
$0 base (usage-based)
Self-Hosted (OSS)
MIT-licensed. Run Helicone yourself for free.
$0 base (usage-based)
Pro
$79/month. 10k free + usage-based. Unlimited seats. Alerts, reports, HQL query language. 1-month retention.
$79/mo
Team
$799/month. 5 orgs, SOC-2 + HIPAA compliance, dedicated Slack, 3-month retention.
$799/mo
Enterprise
Custom MSA, SAML SSO, on-prem deploy, bulk discounts, forever retention.
Custom
Helicone website

Portkey

Developer (Free)
Free forever. 10k logs/month. Universal API + key management. 3 prompt templates. Basic observability.
Free
Gateway (OSS)
MIT-licensed gateway only (no observability UI). Self-host for routing/fallbacks.
$0 base (usage-based)
Production
$49/month. 100k logs ($9 per additional 100k). Fallbacks, load balancing, retries, semantic caching. Unlimited prompts. RBAC.
$49/mo
Enterprise
Custom. 10M+ logs/month. Custom guardrails, advanced evals, SSO, budget controls, VPC + on-prem, SOC2, HIPAA, GDPR.
Custom
Portkey website

Free-tier quotas head-to-head

Comparing Helicone's Hobby (Free) tier with Portkey's Developer (Free) tier.

Metric | Helicone | Portkey
No overlapping quota metrics for these tiers.

Features

Helicone · 16 features

  • Alerts: Thresholds on error rate, latency, cost, usage. Pro+.
  • Async Logging: Log after the LLM call via SDK; zero added latency.
  • Cost Tracking: Automatic cost calculation per call by provider/model.
  • Dashboard: Request tables, aggregate metrics, cost breakdowns.
  • Evaluators: LLM-as-judge + custom evaluators on runs.
  • Experiments: A/B test different models/prompts.
  • HQL (SQL over traces): Query your logged data with SQL. Pro+.
  • PII Redaction: Automatically scrub emails, credit cards, etc. from logs.
  • Prompt Caching: Cache identical requests to save money.
  • Prompts & Versions: Store, version, and A/B test prompts.
  • Proxy Mode: 1-line integration via base URL swap. Captures all requests.
  • Rate Limiting: Per-user and per-key rate limit policies.
  • Reports: Scheduled email reports with KPIs.
  • Self-Hosting: Docker + k8s deployment.
  • Sessions: Group related calls (chat sessions, agent runs).
  • User Metrics: Per-user cost and usage segmentation.
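Helicone's proxy mode amounts to swapping your provider's base URL for Helicone's proxy and adding an auth header. A minimal stdlib sketch, assuming Helicone's documented `oai.helicone.ai` proxy endpoint and `Helicone-Auth` header (confirm names against current Helicone docs before relying on them):

```python
import json
import urllib.request

# Assumed endpoint per Helicone's proxy pattern: the OpenAI base URL is
# replaced with Helicone's proxy, which logs the request and forwards it.
HELICONE_PROXY = "https://oai.helicone.ai/v1/chat/completions"

def build_proxied_request(openai_key: str, helicone_key: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style request routed via Helicone."""
    body = json.dumps({
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode()
    return urllib.request.Request(
        HELICONE_PROXY,
        data=body,
        headers={
            "Authorization": f"Bearer {openai_key}",       # provider key, unchanged
            "Helicone-Auth": f"Bearer {helicone_key}",     # routes logs to your Helicone org
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_proxied_request("sk-...", "sk-helicone-...")
```

In practice the same swap is a one-line change in the official OpenAI SDK (set `base_url` and pass the extra header); the point is that no logging SDK call sites are needed.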

Portkey · 18 features

  • AI Gateway: Unified OpenAI-compatible API to 250+ LLMs.
  • Alerts: Thresholds on latency, error rate, cost, usage.
  • Budget Controls: Per-key and per-team spending limits.
  • Evaluations: Built-in evaluator templates + custom.
  • Fallbacks: Config-driven provider fallback chains.
  • Guardrails: Pre/post processors for safety and compliance.
  • Load Balancing: Round-robin, weighted, least-latency across providers.
  • MCP Support: Use MCP servers as tools through the gateway.
  • Observability: Logs, traces, feedback, alerts, cost tracking.
  • OSS Gateway: Open-source gateway (portkey-ai/gateway).
  • Prompt Library: Shared prompt library + public marketplace.
  • Prompt Templates: Version, test, and collaborate on prompts.
  • Retries: Configurable retry policies per route.
  • Role-Based Access Control: Team permissions on prompts and keys.
  • Semantic Caching: Vector-based cache on query meaning.
  • Simple Caching: Exact-match cache.
  • Virtual Keys: Per-app keys with budgets, rate limits, and permissions.
  • VPC Deployment (Ent): Deploy in your own VPC for compliance.
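Portkey's gateway is likewise an OpenAI-compatible endpoint: you point your client at the gateway URL and identify yourself and the upstream provider via headers. A minimal stdlib sketch, assuming the hosted `api.portkey.ai` endpoint and the `x-portkey-api-key` / `x-portkey-provider` header names (verify against current Portkey docs; virtual keys use a different header):

```python
import json
import urllib.request

# Assumed hosted gateway endpoint; self-hosters of the OSS gateway would
# substitute their own deployment URL here.
PORTKEY_GATEWAY = "https://api.portkey.ai/v1/chat/completions"

def build_gateway_request(portkey_key: str, provider: str, provider_key: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style request routed via Portkey."""
    body = json.dumps({
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode()
    return urllib.request.Request(
        PORTKEY_GATEWAY,
        data=body,
        headers={
            "x-portkey-api-key": portkey_key,        # your Portkey account key
            "x-portkey-provider": provider,          # e.g. "openai", "anthropic"
            "Authorization": f"Bearer {provider_key}",  # upstream provider key
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_gateway_request("pk-...", "openai", "sk-...")
```

Because the request shape stays OpenAI-compatible, switching the `x-portkey-provider` value (or attaching a fallback config) changes the upstream model without touching application code.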

Developer interfaces

Kind | Helicone | Portkey
CLI | Helicone CLI | Portkey CLI
SDK | helicone (npm), helicone-python | portkey-ai (Node), portkey-ai (Python)
REST | Async Logging API, Helicone Proxy, Query API (HQL) | Portkey API (OpenAI-compat)
MCP | (none) | Portkey MCP
Other | Helicone Dashboard, Webhooks | Portkey Dashboard
Staxly is an independent catalog of developer platforms. Outbound links to Helicone and Portkey are plain references to their official websites. Pricing is verified against vendor pages at publication time — reconfirm before buying.

Want this comparison in your AI agent's context? Install the free Staxly MCP server.