
Async API

Introduction

INFO

This implementation is still in development. Contributions are welcome!

The Async API is an implementation of the Langflow API that uses Celery to run tasks asynchronously: a message broker sends and receives task messages, a result backend stores the results, and a cache stores task states and session data.
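To make the moving parts concrete, here is a toy, stdlib-only Python sketch of that pattern: the API enqueues a task on the broker, a worker picks it up, records the result in the result backend, and updates the task state. The names broker, results, states, and submit are illustrative stand-ins for this sketch, not Langflow or Celery APIs, and the "flow execution" is faked.

```python
import queue
import threading
import uuid

broker = queue.Queue()  # stands in for the RabbitMQ message broker
results = {}            # stands in for the Redis result backend
states = {}             # stands in for the task-state cache

def submit(payload):
    """Enqueue a task and return its id, like an async API endpoint would."""
    task_id = str(uuid.uuid4())
    states[task_id] = "PENDING"
    broker.put((task_id, payload))
    return task_id

def worker():
    """Consume tasks from the broker, like a Celery worker."""
    while True:
        task_id, payload = broker.get()
        states[task_id] = "STARTED"
        results[task_id] = payload.upper()  # pretend "flow execution"
        states[task_id] = "SUCCESS"
        broker.task_done()

threading.Thread(target=worker, daemon=True).start()

tid = submit("run my flow")
broker.join()  # wait for the worker to finish the queued task
print(states[tid], results[tid])  # SUCCESS RUN MY FLOW
```

The caller only ever holds a task id; polling the state store (or, in the real deployment, the result backend) is what makes the API asynchronous.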

Configuration

The ./deploy folder in the GitHub repository contains a .env.example file that can be used to configure a Langflow deployment. It defines the variables required to configure the Celery worker queue, the Redis cache and result backend, and the RabbitMQ message broker.
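For orientation, the broker and backend are addressed with standard Celery-style URLs pointing at the compose service names. The variable names below are illustrative placeholders, not necessarily the ones in .env.example; consult that file for the actual names.

```shell
# Illustrative placeholders — see ./deploy/.env.example for the real variable names
BROKER_URL=amqp://guest:guest@broker:5672//     # RabbitMQ message broker
RESULT_BACKEND=redis://result_backend:6379/0    # Redis result backend and cache
```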

To set it up locally, copy the file to .env and run the following command:

docker compose up -d

This will set up the following containers:

  • Langflow API

  • Celery worker

  • RabbitMQ message broker

  • Redis cache

  • PostgreSQL database

  • PGAdmin

  • Flower

  • Traefik

  • Grafana

  • Prometheus

Testing

To run the tests for the Async API, run the following command:

docker compose -f docker-compose.with_tests.yml up --exit-code-from tests tests result_backend broker celeryworker db --build
