src/lfx/PLUGGABLE_SERVICES.md
LFX now supports a pluggable service architecture that allows you to customize and extend service implementations without modifying core code.
The pluggable services system supports three discovery mechanisms. Services are discovered in this order, with later sources able to override earlier ones:

1. Entry points declared by installed packages
2. `@register_service` decorators (applied at import time)
3. Config files (`lfx.toml` or `pyproject.toml`)
LFX also supports adapter registries -- collections of swappable implementations that share the same protocol.
Use them when you need to choose between several adapters at runtime (for example, deployment adapters keyed as local and remote in your own config/plugin setup).
| Concern | Service | Adapter registry |
|---|---|---|
| Lookup key | ServiceType enum | String key within a registry |
| Cardinality | One implementation per type | Many adapters per registry |
| Type safety | Protocol per ServiceType | Protocol-typed registry per AdapterType |
| Example | storage_service | deployment adapters by key, e.g. "local", "remote" |
Each adapter type exposes a typed accessor in `lfx.services.deps`:

```python
from lfx.services.deps import get_deployment_adapter

adapter = get_deployment_adapter("local")  # returns None if key is unknown
```
Adapters are created lazily on first access and cached as singletons — subsequent calls with the same key return the same instance.
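The lazy, per-key singleton behavior can be sketched as follows. `AdapterRegistry` and `LocalAdapter` are illustrative stand-ins, not the real LFX classes:

```python
# Minimal sketch of lazy, per-key singleton adapters.
class AdapterRegistry:
    def __init__(self):
        self._classes = {}    # key -> adapter class
        self._instances = {}  # key -> cached instance

    def register(self, key, cls):
        self._classes[key] = cls

    def get(self, key):
        # Instantiate lazily on first access; unknown keys return None.
        if key not in self._instances:
            cls = self._classes.get(key)
            if cls is None:
                return None
            self._instances[key] = cls()
        return self._instances[key]


class LocalAdapter:
    pass


registry = AdapterRegistry()
registry.register("local", LocalAdapter)
assert registry.get("local") is registry.get("local")  # same cached instance
```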
Discovery config is resolved internally (settings config_dir when available, otherwise current working directory).
The key ("local" above) is an example; adapter keys are fully user-defined.
```python
from lfx.services import register_adapter
from lfx.services.adapters.schema import AdapterType


@register_adapter(AdapterType.DEPLOYMENT, "local")
class LocalAdapter:
    ...
```
The decorator registers the class immediately at import time (same pattern as @register_service for top-level services).
Defaults to override=True; set override=False to keep an existing key untouched.
"local" is only an example adapter key.
`lfx.toml` (preferred when both files exist):

```toml
[deployment.adapters]
local = "my_package.deployment:LocalAdapter"
remote = "my_package.deployment:RemoteAdapter"
```
`pyproject.toml`:

```toml
[tool.lfx.deployment.adapters]
local = "my_package.deployment:LocalAdapter"
remote = "my_package.deployment:RemoteAdapter"
```
local/remote are illustrative keys; choose names that match your adapter set.
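Each config value is an import spec in `module:ClassName` form. A loader could resolve such a spec roughly like this (illustrative sketch, not the actual LFX loader):

```python
import importlib


def resolve(spec: str):
    """Import a 'package.module:ClassName' spec and return the class."""
    module_path, _, class_name = spec.partition(":")
    return getattr(importlib.import_module(module_path), class_name)
```

For example, `resolve("collections:OrderedDict")` returns the `OrderedDict` class from the standard library.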
Entry-point groups follow the naming convention lfx.<adapter_type>.adapters (e.g. lfx.deployment.adapters).
```toml
[project.entry-points."lfx.deployment.adapters"]
local = "my_package.deployment:LocalAdapter"
remote = "my_package.deployment:RemoteAdapter"
```
Entry-point names are adapter keys and are fully user-defined.
Adapter registration mirrors the top-level @register_service model:
- Decorators: `override=True` by default
- Entry points: `override=False`, discovered during `discover()`
- Config files: `override=True`, discovered during `discover()`

Effective precedence (what wins when multiple sources register the same key):
Config files > Decorators > Entry points
Config files are intended as deploy-time overrides and take highest priority.
Discovery is one-time per registry instance. Subsequent calls to discover() are no-ops.
Entry-point groups and config section paths are derived from the AdapterType value automatically (e.g. lfx.deployment.adapters and [deployment.adapters]).
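The derivation of both names from the enum value can be sketched like this (the `AdapterType` class below is an illustrative stand-in for LFX's enum):

```python
from enum import Enum


class AdapterType(str, Enum):  # illustrative stand-in for lfx's enum
    DEPLOYMENT = "deployment"


def entry_point_group(adapter_type: AdapterType) -> str:
    # e.g. "lfx.deployment.adapters"
    return f"lfx.{adapter_type.value}.adapters"


def config_section(adapter_type: AdapterType) -> str:
    # e.g. "deployment.adapters"
    return f"{adapter_type.value}.adapters"
```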
Adapters are instantiated lazily on the first `get_<type>_adapter()` call.

When adapters are part of core LFX, prefer namespacing them under `services/adapters`:
```
lfx/services/
    adapters/
        deployment/
            __init__.py
            base.py
            service.py
            exceptions.py
            schema.py
```
This keeps top-level services (DI-managed singletons) separate from adapter implementations (keyed registries with multiple implementations per type).
Payload contracts intentionally split ownership across layers:
- `lfx.services.adapters.payload` defines shared primitives (`PayloadSlot`, `ProviderPayloadSchemas`)
- Each adapter package owns its concrete payload models (e.g. `lfx.services.adapters.deployment.payloads`)
- Adapter packages populate a `*PayloadSchemas` registry with adapter-side models

This boundary allows both layers to share one slot taxonomy while keeping API exceptions and transformation logic outside adapter implementations.
Deployment is the concrete example in this repository:
- The adapter layer owns deployment payload fields and schemas (`DeploymentPayloadFields` / `DeploymentPayloadSchemas`)
- The API layer owns its request/response models (`DeploymentApiPayloads`) and mapper behavior
- Invalid config references (`module:ClassName` entries that fail to import) are ignored with warning logs
- New adapter families are keyed by `AdapterType`

Create an `lfx.toml` in your project root:
```toml
[services]
database_service = "langflow.services.database.service:DatabaseService"
storage_service = "langflow.services.storage.local:LocalStorageService"
cache_service = "langflow.services.cache.service:ThreadingInMemoryCache"
```
Then run:
```bash
lfx serve my_flow.json
```
Services will be automatically discovered from:
- Entry points from installed packages
- Decorators in imported modules
- Config files (`lfx.toml` or `pyproject.toml`)

Note: Service discovery happens automatically on first service access.
In your service module:
```python
from lfx.services import register_service
from lfx.services.base import Service
from lfx.services.schema import ServiceType


@register_service(ServiceType.DATABASE_SERVICE)
class MyDatabaseService(Service):
    name = "database_service"

    def __init__(self, settings_service):
        self.settings = settings_service.settings
        self.set_ready()

    async def teardown(self) -> None:
        # Cleanup logic
        pass
```
Simply importing this module will register the service.
In your plugin's pyproject.toml:
```toml
[project.entry-points."lfx.services"]
database_service = "my_plugin.services:MyDatabaseService"
storage_service = "my_plugin.services:MyStorageService"
```
When your package is installed, LFX will automatically discover these services.
LFX searches for configuration in this order:
1. `./lfx.toml` (project-specific config)
2. `./pyproject.toml` under `[tool.lfx.services]` (Python project integration)

Standalone `lfx.toml`:
```toml
[services]
database_service = "package.module:ClassName"
storage_service = "package.module:ClassName"
```
In pyproject.toml:
```toml
[tool.lfx.services]
database_service = "package.module:ClassName"
storage_service = "package.module:ClassName"
```
Service keys must match ServiceType enum values exactly:
- `database_service`
- `auth_service`
- `storage_service`
- `cache_service`
- `chat_service`
- `session_service`
- `task_service`
- `store_service`
- `variable_service`
- `telemetry_service`
- `tracing_service`
- `state_service`
- `job_queue_service`
- `shared_component_cache_service`
- `mcp_composer_service`
- `transaction_service`

Important: `settings_service` is not pluggable and cannot be overridden. It is always created using the built-in factory and provides the foundational configuration for all other services.
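A config loader can validate keys against the enum up front; a sketch using an illustrative subset of `ServiceType` (the helper name is hypothetical):

```python
from enum import Enum


class ServiceType(str, Enum):  # illustrative subset of the real enum
    DATABASE_SERVICE = "database_service"
    STORAGE_SERVICE = "storage_service"


def validate_service_key(key: str) -> ServiceType:
    """Reject config keys that don't match a ServiceType value exactly."""
    try:
        return ServiceType(key)
    except ValueError:
        valid = ", ".join(t.value for t in ServiceType)
        raise ValueError(f"Unknown service key {key!r}; expected one of: {valid}")
```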
```python
from lfx.services.base import Service


class MyCustomStorageService(Service):
    name = "storage_service"  # Must match ServiceType value

    def __init__(self, settings_service):
        """Dependencies are auto-injected based on __init__ signature."""
        self.settings = settings_service
        self.set_ready()

    async def save_file(self, flow_id: str, file_name: str, data: bytes) -> None:
        # Your implementation
        pass

    async def get_file(self, flow_id: str, file_name: str) -> bytes:
        # Your implementation
        pass

    async def teardown(self) -> None:
        # Cleanup when service manager shuts down
        pass
```
Option A: Decorator (recommended for libraries)
```python
from lfx.services import register_service
from lfx.services.base import Service
from lfx.services.schema import ServiceType


@register_service(ServiceType.STORAGE_SERVICE)
class MyCustomStorageService(Service):
    ...
```
Option B: Config File (recommended for CLI users)
Add to lfx.toml:
```toml
[services]
storage_service = "my_package.services:MyCustomStorageService"
```
Option C: Entry Points (recommended for plugins)
Add to your pyproject.toml:
```toml
[project.entry-points."lfx.services"]
storage_service = "my_package.services:MyCustomStorageService"
```
Then use the service via its typed accessor:

```python
from lfx.services.deps import get_storage_service

storage = get_storage_service()
await storage.save_file("flow_123", "data.json", b'{"key": "value"}')
```
Services can depend on other services. Dependencies are resolved automatically based on __init__ parameter type hints:
```python
from lfx.services.base import Service
from lfx.services.settings.service import SettingsService
# Note: adjust this import to wherever your CacheService protocol lives
from lfx.services.cache.service import CacheService


class MyDatabaseService(Service):
    name = "database_service"

    def __init__(self, settings_service: SettingsService, cache_service: CacheService):
        """
        Dependencies are auto-injected:
        - settings_service: SettingsService will be created first
        - cache_service: CacheService will be created second
        """
        self.settings = settings_service.settings
        self.cache = cache_service
        self.set_ready()
```
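This kind of signature-based injection can be sketched as follows (`build_service` is an illustrative helper, not the real ServiceManager):

```python
import inspect


def build_service(cls, available: dict):
    """Instantiate cls, passing available services that match __init__ parameter names."""
    params = inspect.signature(cls.__init__).parameters
    kwargs = {
        name: available[name]
        for name in params
        if name != "self" and name in available
    }
    return cls(**kwargs)
```

A real manager would also create missing dependencies recursively before injecting them; this sketch only matches what already exists.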
The ServiceManager will:

1. Inspect the service's `__init__` signature
2. Create each dependency first
3. Pass the resolved instances into `__init__`

When multiple registration mechanisms provide the same service, config files have highest priority:
1. Entry points: `override=False` (lowest priority)
2. Decorators: `override=True` by default
3. Config files: `override=True` (highest priority)

Example:
```python
# Entry point registers: MyPluginStorage (override=False, won't replace existing)
# Decorator registers: CustomStorageService (override=True, replaces entry point)
# Config file registers: LocalStorageService (override=True, replaces decorator)
# Result: LocalStorageService wins (config file is highest priority)
```
You can control decorator override behavior:
```python
@register_service(ServiceType.STORAGE_SERVICE, override=False)
class MyService(Service):
    ...  # Won't override if already registered (e.g., from config file loaded earlier)
```
Note: In practice, decorators typically run during module import before discover_plugins() is called, but config files are explicitly designed to override decorators for deployment-time flexibility.
```python
import pytest
from lfx.services.manager import get_service_manager
from lfx.services.schema import ServiceType


def test_custom_service():
    # Register your test service
    manager = get_service_manager()
    manager.register_service_class(
        ServiceType.STORAGE_SERVICE,
        MyTestStorageService,
        override=True,
    )

    # Use the service
    from lfx.services.deps import get_storage_service

    storage = get_storage_service()
    assert isinstance(storage, MyTestStorageService)
```
Error: ValueError: Settings service cannot be registered via plugins
Cause: You attempted to register settings_service in a config file or via decorator.
Solution: Remove settings_service from your config. The settings service is foundational and always uses the built-in implementation. Its config_dir is used to discover other plugins.
Error: NoServiceRegisteredError: No factory registered for database_service
Solutions:
- Check the service key matches a `ServiceType` enum value exactly
- Check the import path uses the `"module.path:ClassName"` format
- Check the config file location: `{settings.config_dir}/lfx.toml`, or `./lfx.toml` / `./pyproject.toml` in the current directory
- Enable debug logging: `export LANGFLOW_LOG_LEVEL=DEBUG`

Error: `ModuleNotFoundError: No module named 'langflow'`
Solutions:
- Install the package: `pip install langflow`
- Verify the module imports: `python -c "import langflow.services.database"`

Services should not depend on each other in a circular way:
❌ Bad:
```python
class ServiceA(Service):
    def __init__(self, service_b: ServiceB): ...


class ServiceB(Service):
    def __init__(self, service_a: ServiceA): ...
```
✅ Good:
```python
class ServiceA(Service):
    def __init__(self, settings: SettingsService): ...


class ServiceB(Service):
    def __init__(self, settings: SettingsService, service_a: ServiceA): ...
```
See:
- `lfx.toml.example` - Example configuration file showing Langflow service registration
- `src/lfx/services/` - Minimal built-in service implementations (auth, telemetry, tracing, variable, storage, etc.)
- Auth: `lfx.services.auth.base` (`BaseAuthService`) and `lfx.services.auth.service` (`AuthService`). Use `get_auth_service()` from `lfx.services.deps`. Override with `auth_service = "langflow.services.auth.service:AuthService"` in config for full JWT/API key auth.
- `src/backend/base/langflow/services/` - Full-featured Langflow services

The pluggable service system lets you customize and extend service implementations without modifying core LFX code.