This integration is currently in private beta. If you'd like to be added to the beta, email Lior@posthog.com.
We've teamed up with Langfuse, an open-source platform for monitoring LLM applications, to track metrics such as model costs, latency, token usage, and more.
Combining your Langfuse and PostHog data makes it easy to answer questions like:
- What are my LLM costs by customer, model, and in total?
- How many of my users are interacting with my LLM features?
- Are there generation latency spikes?
- Does interacting with LLM features correlate with other metrics (retention, usage, revenue, etc.)?
Here's an example dashboard in PostHog:
## Supported application types
Langfuse supports any large language model and offers simple integrations for popular libraries and frameworks such as OpenAI, LangChain, LlamaIndex, Flowise, and LiteLLM. See the Langfuse integration docs for more details.
## How to get started
To get started:
- First, add Langfuse Tracing to your LLM app (Quickstart).
- Then, enable the Langfuse integration in PostHog (currently in private beta).
Once you've enabled the integration in PostHog, you can use a dashboard template to set up the most relevant insights. To do this:
- Go to the Dashboards tab in PostHog.
- Click the New dashboard button in the top right.
- Select LLM metrics from the list of templates.
## Supported Langfuse events
The Langfuse docs have a full list of events and properties that are sent from Langfuse to PostHog.