Langfuse can be used as a fully managed solution through Langfuse Cloud, maintained by the Langfuse team, or self-hosted using Docker for teams that want more flexibility and control over their infrastructure. This article provides guidance on the different deployment scenarios, including some add-on features that require a license key.

Deployment Options

Langfuse offers two primary deployment options: Langfuse Cloud and self-hosted Langfuse. Teams can choose the option that best fits their specific needs and preferences.

Langfuse Cloud

Langfuse Cloud is a fully managed version of Langfuse, hosted and maintained by the Langfuse team. This option provides an easy and fast way to get started with Langfuse at an affordable price point. With Langfuse Cloud, teams can focus on developing their apps without worrying about infrastructure management.

Self-Hosted Langfuse

For those who prefer a self-hosted solution, Langfuse can be deployed using Docker Compose. This option is recommended for testing and low-scale deployments; it lacks the high availability, automatic scaling, and backup functionality that Langfuse Cloud provides.
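
As a minimal sketch, the documented quickstart for this option clones the Langfuse repository and starts the full stack with Docker Compose (requires Docker to be installed):

```shell
# Clone the Langfuse repository and start all services with Docker Compose.
git clone https://github.com/langfuse/langfuse.git
cd langfuse
# Brings up the web, worker, Postgres, Clickhouse, Redis, and blob storage containers.
docker compose up
```

Once the containers are running, the Langfuse UI is served locally (port 3000 by default).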

Architecture

Langfuse's architecture is designed to provide a robust and scalable infrastructure. The platform consists of two application containers, storage components, and an optional LLM API/Gateway. These components work together to provide a seamless user experience.

Application Containers

  • Langfuse Web: The main web application serving the Langfuse UI and APIs.
  • Langfuse Worker: A worker that asynchronously processes events.

Storage Components

  • Postgres: The main database for transactional workloads.
  • Clickhouse: High-performance OLAP database storing traces, observations, and scores.
  • Redis/Valkey cache: A fast in-memory data structure store used for queue and cache operations.
  • S3/Blob Store: Object storage persisting all incoming events, multi-modal inputs, and large exports.
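
For orientation, the application containers are wired to these storage components through environment variables. The variable names below are illustrative of this wiring, not an authoritative list; consult the Langfuse configuration guide for the exact, current names and all required settings:

```shell
# Illustrative wiring of the storage components (names/values are examples):
DATABASE_URL=postgresql://user:pass@postgres:5432/langfuse   # Postgres (transactional)
CLICKHOUSE_URL=http://clickhouse:8123                        # Clickhouse (OLAP)
REDIS_CONNECTION_STRING=redis://redis:6379                   # queue and cache
LANGFUSE_S3_EVENT_UPLOAD_BUCKET=langfuse-events              # blob storage for events
```

Both the web and the worker container need access to the same storage backends, since events flow from one to the other through the queue and blob store.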

LLM API / Gateway

Some features depend on an external LLM API or gateway. In high-security environments, Langfuse can be deployed within a VPC or on-premises; internet access is optional. See the networking documentation for more details.

Optimized for Performance, Reliability, and Uptime

Langfuse self-hosted is optimized for production environments, offering the same codebase as Langfuse Cloud, just deployed on your own infrastructure. The Langfuse team serves thousands of teams with Langfuse Cloud, providing high availability (see the status page) and strong performance.

Some optimizations include:

  • Queued trace ingestion: All traces are received in batches by the Langfuse Web container and immediately written to S3. Only a reference is persisted in Redis for queueing.
  • Caching of API keys: API keys are cached in Redis, so authentication does not require a database read and spikes of unauthorized requests are rejected without hitting the database.
  • Caching of prompts (SDKs and API): Prompts are cached in a read-through cache in Redis, allowing hot prompts to be fetched from Langfuse without hitting the database.
  • OLAP database: All read-heavy analytical operations are offloaded to an OLAP database (Clickhouse) for fast query performance.
  • Multi-modal traces in S3: Multi-modal traces can include large videos or arbitrary files. To enable support for these, they are directly uploaded to S3/Blob Storage from the client SDKs.
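
The queued-ingestion pattern above can be sketched with in-memory stand-ins. This is an illustration of the pattern only, not Langfuse's actual implementation: a dict stands in for S3/blob storage and a deque for the Redis queue, and the key point is that the queue carries only a small reference while the full payload lives in blob storage:

```python
import json
from collections import deque
from uuid import uuid4

# In-memory stand-ins for illustration only: the real deployment uses an
# S3-compatible blob store and Redis/Valkey.
blob_store: dict[str, bytes] = {}
queue: deque[str] = deque()

def ingest_batch(events: list[dict]) -> str:
    """Web container: persist the raw batch to blob storage immediately,
    then enqueue only a lightweight reference for the worker."""
    key = f"events/{uuid4()}.json"
    blob_store[key] = json.dumps(events).encode()  # full payload -> blob store
    queue.append(key)                              # queue holds the reference only
    return key

def process_next() -> list[dict]:
    """Worker container: pop a reference and load the batch from blob storage."""
    key = queue.popleft()
    return json.loads(blob_store[key].decode())
```

Keeping payloads out of the queue keeps Redis memory usage small and lets the worker process events asynchronously at its own pace.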

Features

Langfuse supports many configuration options and self-hosted features. For more details, please refer to the configuration guide.

Subscribe to Updates

Release notes are published on GitHub. Langfuse uses tagged semver releases (see the versioning policy). To get notified about new releases and new major versions, subscribe to the mailing list or watch the releases on the GitHub repository.

Support

If you experience any issues when self-hosting Langfuse, please:

  • Check out the Troubleshooting & FAQ page.
  • Use Ask AI to get instant answers to your questions.
  • Ask the maintainers on GitHub Discussions.
  • Create a bug report or feature request on GitHub.