This page recognizes the open-source technologies that power, extend, and strengthen LLM Controls. We are grateful to the communities and maintainers whose work forms part of the foundation on which the platform operates.

Core Architecture Foundations

LLM Controls is built on top of Langflow. Langflow provides the underlying flow-based orchestration engine that enables node-based workflow construction and execution. Within LLM Controls, this foundation has been extended and adapted to support expanded execution controls, additional abstractions, and platform-level capabilities. Flows originally created in Langflow can be migrated into LLM Controls with minor adjustments where required. We acknowledge and appreciate the Langflow team for releasing their work under the MIT License and enabling broader innovation in AI workflow tooling.

Open-Source Tools & Libraries (Backend)

LLM Controls is Python-first and integrates a focused set of open-source technologies across orchestration, retrieval, evaluation, and data processing.

Orchestration & LLM Integration

LLM Controls integrates:
  • LangChain and its provider ecosystem
  • LiteLLM for unified model routing
  • MCP (Model Context Protocol) for tool and agent interoperability
These integrations enable flexible model usage and structured tool execution within workflows.

Retrieval & Vector Infrastructure

LLM Controls ships with a native vector database, powered by Qdrant and hosted within the platform. This managed layer provides:
  • High-performance similarity search
  • Scalable embedding storage
  • Production-ready vector retrieval
LLM Controls also provides first-party connectors for:
  • Chroma
  • Qdrant
  • Weaviate
  • FAISS
Search-grade full-text indexing is supported through the open-source Elasticsearch integration. We extend particular appreciation to the Qdrant team for their contributions to the vector database ecosystem.
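To illustrate the kind of retrieval these stores provide, here is a minimal cosine-similarity sketch in plain NumPy. This is a conceptual toy, not the Qdrant API or LLM Controls internals; the vectors and function name are hypothetical.

```python
# Minimal cosine-similarity search -- the core operation behind vector
# retrieval. Plain NumPy for illustration; real stores like Qdrant add
# indexing, persistence, and filtering on top of this idea.
import numpy as np

def top_k(query: np.ndarray, corpus: np.ndarray, k: int = 2) -> list[int]:
    """Return indices of the k corpus vectors most similar to the query."""
    # Normalize so dot products become cosine similarities.
    q = query / np.linalg.norm(query)
    c = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    scores = c @ q
    return list(np.argsort(-scores)[:k])

# Toy 3-dimensional "embeddings".
corpus = np.array([
    [1.0, 0.0, 0.0],
    [0.9, 0.1, 0.0],
    [0.0, 1.0, 0.0],
])
print(top_k(np.array([1.0, 0.05, 0.0]), corpus, k=2))  # -> [0, 1]
```

In a managed store the nearest-neighbor search would be approximate (e.g. HNSW-based) rather than this exact brute-force scan, which is what makes it scale.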

Observability & Experiment Tracking

LLM Controls integrates MLflow for experiment tracking and evaluation workflows. MLflow enables structured logging, reproducibility, and lifecycle management across AI experiments.

Data Ingestion & Utilities

For ingestion and structured processing, LLM Controls leverages:
  • Beautiful Soup
  • DuckDuckGo Search integrations
  • Wikipedia and YouTube transcript APIs
  • pytube
For NLP and data handling:
  • NLTK
  • tiktoken
  • pandas
  • Apache Arrow (via pyarrow)
These tools support efficient handling of text, documents, and structured datasets.
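As a small sketch of how these utilities compose, the following parses an HTML fragment with Beautiful Soup and loads the result into a pandas DataFrame. The snippet and its data are hypothetical examples, not LLM Controls ingestion code.

```python
# Tiny ingestion sketch: extract links from HTML with Beautiful Soup,
# then load the structured result into pandas for downstream processing.
import pandas as pd
from bs4 import BeautifulSoup

html = """
<ul>
  <li><a href="https://qdrant.tech">Qdrant</a></li>
  <li><a href="https://mlflow.org">MLflow</a></li>
</ul>
"""

soup = BeautifulSoup(html, "html.parser")  # stdlib parser, no extra deps
rows = [{"name": a.get_text(), "url": a["href"]} for a in soup.find_all("a")]
df = pd.DataFrame(rows)
print(df.to_dict("records"))
```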

Infrastructure

LLM Controls relies on:
  • SQLAlchemy
  • psycopg
  • PostgreSQL
  • Redis
  • Docker for containerized deployment
These components provide persistence, state management, and operational reliability.
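A minimal persistence sketch with SQLAlchemy is shown below. It uses in-memory SQLite purely so the example is self-contained; the table and values are hypothetical, and the platform's actual persistence targets PostgreSQL via psycopg.

```python
# Minimal SQLAlchemy Core example: create a table, insert a row, read it
# back. In-memory SQLite here for portability; swap the URL for a
# PostgreSQL DSN (via psycopg) in a real deployment.
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///:memory:")
with engine.begin() as conn:  # begin() commits on successful exit
    conn.execute(text("CREATE TABLE flows (id INTEGER PRIMARY KEY, name TEXT)"))
    conn.execute(text("INSERT INTO flows (name) VALUES (:n)"), {"n": "demo-flow"})

with engine.connect() as conn:
    names = conn.execute(text("SELECT name FROM flows")).scalars().all()
print(names)  # -> ['demo-flow']
```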

Frontend Technology Stack

The LLM Controls frontend is built on a modern open-source React ecosystem:
  • React
  • React Router
  • Vite
  • Radix UI
  • shadcn/ui
  • Tailwind CSS
  • Framer Motion
  • React Flow
This stack enables a responsive and extensible visual workflow-building experience.

Acknowledgments

We extend our gratitude to:
  • The team behind Langflow
  • The maintainers of LangChain and LiteLLM
  • The contributors to Qdrant
  • The teams behind React and React Flow
  • The broader open-source AI and developer community
We remain committed to supporting and contributing back to the open-source ecosystem.