Langfuse

Open-source LLM observability and eval platform with traces, datasets, scorers, and prompt management—self-host via Docker to keep data on your own network.

Evals / Observability · Open-source · Self-hosted · Trace

Best for

Teams wanting open-source, self-hostable, auditable LLM observability with unified tracing, prompt management, and offline evals.

Less ideal when

Teams unwilling to host, or very small traffic where a vendor dashboard is enough.

When comparing

Compared to LangSmith / Helicone / Phoenix: Langfuse wins on open-source licensing and data sovereignty; the trade-off is that you own operations, and the UI is somewhat less polished than closed-source SaaS alternatives.

Quick checklist

  • Plan self-hosting resources and upgrade cadence
  • Tie prompt versions + traces + evals into one pipeline
  • Bridge sampled live traces with offline golden sets
  • Set SSO / RBAC / PII policies up-front
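One way to approach the "bridge sampled live traces with offline golden sets" item is deterministic, hash-based sampling of trace IDs. This is a minimal illustrative sketch, not Langfuse API code; the function name and sampling scheme are assumptions for the example:

```python
import hashlib

def sample_trace(trace_id: str, rate: float = 0.1) -> bool:
    """Deterministically decide whether a live trace joins the offline eval set.

    Hashing keeps the decision stable across services and replays: the same
    trace_id always yields the same verdict, unlike random sampling.
    """
    digest = hashlib.sha256(trace_id.encode()).digest()
    # Map the first 8 bytes of the hash to [0, 1) and compare to the rate.
    bucket = int.from_bytes(digest[:8], "big") / 2**64
    return bucket < rate

# Traces selected this way can be exported and appended to a dataset
# alongside a hand-curated golden set for offline evals.
sampled = [tid for tid in ("t-001", "t-002", "t-003") if sample_trace(tid, rate=0.5)]
```

Because the decision is a pure function of the trace ID, re-running the export pipeline never changes which traces are in the set.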

Search-driven Q&A

Is Langfuse self-hosting complex?

Docker Compose and Helm charts can get you up and running in about an hour. The real work is ongoing: Postgres/ClickHouse backups, cost-table upkeep, multi-tenant RBAC. Run a disaster-recovery drill before going to production.
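The Docker Compose path sketched below follows the public Langfuse repository's quickstart; treat service details and the default port as assumptions to verify against the current docs before relying on them:

```shell
# Clone the Langfuse repository and launch the stack with Docker Compose.
git clone https://github.com/langfuse/langfuse.git
cd langfuse

# Starts the web app plus its database dependencies, as defined
# in the bundled docker-compose.yml. Runs detached.
docker compose up -d

# The UI is typically served at http://localhost:3000;
# create an organization and project there, then wire up backups.
```

For production, the Helm chart is the more maintainable route, since it lets you pin versions and manage upgrades declaratively.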

When to use it

The summary above should help you decide whether this tool fits your needs. When several options look similar, weigh how often you'll use it, your budget, and your data-privacy requirements before committing to one.

Related tools