Langfuse is an open-source observability platform for LLM applications that provides tracing, monitoring, and debugging. The integration is built into the Launchpad’s core using the native Langfuse SDK.
Why Langfuse?
- Complete Tracing: Track every workflow step, node execution, and LLM call
- Performance Monitoring: Monitor response times, costs, and success rates
- Debug Issues: Detailed logs and traces for troubleshooting failures
Quick Setup
1. Get Langfuse Account
Create a free account at langfuse.com and get your API keys.

2. Update Environment
Add to your .env files:
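The variable names below are the Langfuse SDK's standard environment variables; treat them as an assumption about what the Launchpad reads and confirm them against your project's configuration. The key values are placeholders from your Langfuse project settings.

```bash
# Standard Langfuse SDK environment variables (confirm these are the names your setup expects)
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
# EU cloud shown; use https://us.cloud.langfuse.com or your self-hosted URL instead
LANGFUSE_HOST=https://cloud.langfuse.com
```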
3. Enable Tracing in Your Workflow
Pass enable_tracing=True when initializing your workflow:
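A minimal sketch: the Workflow class, import path, and run() call are placeholders for your Launchpad's actual workflow API; only the enable_tracing=True flag comes from this integration.

```python
# Hypothetical import -- substitute your Launchpad's actual workflow class.
from launchpad.workflows import Workflow

# enable_tracing=True turns on Langfuse tracing for this workflow instance.
workflow = Workflow(enable_tracing=True)

# Every workflow step, node execution, and LLM call in this run is traced.
result = workflow.run({"input": "Hello, Langfuse!"})
```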
4. Test Integration
Run a workflow and check your Langfuse dashboard for traces.
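If traces don't appear, you can first confirm that your credentials are being picked up using the Langfuse client's auth check (a troubleshooting aid from the Langfuse SDK, not part of the Launchpad; the exact constructor may vary by SDK version):

```python
from langfuse import Langfuse

# The client reads LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_HOST
# from the environment configured in step 2.
langfuse = Langfuse()

# Returns True when the keys and host are valid.
print(langfuse.auth_check())
```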
How It Works
The Langfuse integration uses the native Langfuse SDK to create spans around workflow and node execution (a sketch of the resulting hierarchy follows this list):
- A parent span is created for the entire workflow execution
- Each node gets its own child span with inputs and outputs
- LLM calls within AgentNodes are automatically instrumented
- Errors are captured with full context
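The sketch below illustrates that hierarchy with the Langfuse Python SDK's v2-style low-level client (newer SDK versions expose a different, OpenTelemetry-based API). The trace, span, and model names are illustrative and not taken from the Launchpad code.

```python
from langfuse import Langfuse

# Assumes the Langfuse credentials from Quick Setup are set in the environment.
langfuse = Langfuse()

# Parent trace for the entire workflow execution.
trace = langfuse.trace(name="workflow-run", input={"question": "What is Langfuse?"})

# Each node gets its own child span with inputs and outputs.
node_span = trace.span(name="agent-node", input={"prompt": "What is Langfuse?"})

# LLM calls inside a node are recorded as generations (model, prompt, completion).
generation = node_span.generation(
    name="llm-call",
    model="gpt-4o-mini",
    input=[{"role": "user", "content": "What is Langfuse?"}],
)
generation.end(output="Langfuse is an open-source observability platform for LLM apps.")

node_span.end(output={"response": "Langfuse is an open-source observability platform for LLM apps."})
trace.update(output={"answer": "Langfuse is an open-source observability platform for LLM apps."})

# The SDK batches events in the background; flush before the process exits.
langfuse.flush()
```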
Enabling and Disabling Tracing
Tracing is controlled per workflow instance:
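A minimal sketch using the same hypothetical Workflow class as above; only the enable_tracing flag comes from this integration.

```python
from launchpad.workflows import Workflow  # hypothetical import path

# Tracing on: spans for this instance's runs are sent to Langfuse.
traced = Workflow(enable_tracing=True)

# Tracing off: the workflow runs normally and nothing is reported to Langfuse.
untraced = Workflow(enable_tracing=False)
```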
Core Integration Features
- Automatic Tracing: Every workflow execution is automatically traced when enabled
- Node-Level Visibility: Individual node executions, inputs, and outputs are captured
- LLM Call Tracking: All LLM interactions including prompts, responses, and metadata
- Error Monitoring: Failed executions with full stack traces and context
- Streaming Support: SSE streaming workflows are fully traced