import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem';
The Langflow runtime Helm chart is tailored for deploying applications in a production environment. It is focused on stability, performance, isolation, and security to ensure that applications run reliably and efficiently.
:::warning
For security reasons, the default Langflow runtime Helm chart sets `readOnlyRootFilesystem: true`. This setting prevents modifications to the container's root filesystem at runtime, which is a recommended security measure in production environments.

Disabling `readOnlyRootFilesystem` degrades your deployment's security posture. Only disable it if you understand the security implications and you have implemented other security measures.

For more information, see the Kubernetes documentation.
:::
Add the repository to Helm:

```bash
helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
helm repo update
```
Install the Langflow app with the default options in the `langflow` namespace.

If you have a custom image with packaged flows, you can deploy Langflow by overriding the default `values.yaml` with the `--set` flag:

```bash
helm install my-langflow-app langflow/langflow-runtime \
  -n langflow \
  --create-namespace \
  --set image.repository=myuser/langflow-hello-world \
  --set image.tag=1.0.0
```
Alternatively, install the chart and download flows from a URL with the `--set` flag:

```bash
helm install my-langflow-app-with-flow langflow/langflow-runtime \
  -n langflow \
  --create-namespace \
  --set 'downloadFlows.flows[0].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'
```
If your shell requires escaping square brackets, modify the `--set` path as needed. For example: `--set 'downloadFlows.flows\[0\].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'`.
Check the status of the pods:

```bash
kubectl get pods -n langflow
```
Get your service name:

```bash
kubectl get svc -n langflow
```

The service name is your release name suffixed by `-langflow-runtime`. For example, if you used `helm install my-langflow-app-with-flow`, then the service name is `my-langflow-app-with-flow-langflow-runtime`.
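The naming rule is plain string concatenation, so you can derive the service name in a script. A minimal sketch, assuming the release name used above:

```bash
# Derive the service name from the Helm release name.
# The "-langflow-runtime" suffix comes from the chart's naming convention.
release="my-langflow-app-with-flow"
service="${release}-langflow-runtime"
echo "$service"   # my-langflow-app-with-flow-langflow-runtime
```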
Enable port forwarding to access Langflow from your local machine:

```bash
kubectl port-forward -n langflow svc/my-langflow-app-with-flow-langflow-runtime 7860:7860
```
Confirm you can access the API by calling `http://localhost:7860/api/v1/flows/`:

```bash
curl -v http://localhost:7860/api/v1/flows/
```
A successful request returns a list of flows.
Run a packaged flow. The following example gets the first flow ID from the flows list, and then runs the flow:

```bash
# Get the first flow ID
id=$(curl -s "http://localhost:7860/api/v1/flows/" | jq -r '.[0].id')

# Run the flow
curl -X POST \
  "http://localhost:7860/api/v1/run/$id?stream=false" \
  -H 'Content-Type: application/json' \
  -d '{
    "input_value": "Hello!",
    "output_type": "chat",
    "input_type": "chat"
  }'
```
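The run response nests the chat text several levels deep in JSON. The following sketch extracts it with `jq`; the `.outputs` path is an assumption based on common Langflow response shapes, so inspect your actual response with `curl ... | jq .` before relying on it:

```bash
# Helper that pulls the chat text out of a run response.
# ASSUMPTION: the response shape used in the sample below; adjust the
# jq path after inspecting your own deployment's response.
extract_message() {
  jq -r '.outputs[0].outputs[0].results.message.text'
}

# With the port-forward active, pipe the run response through it:
#   curl -s -X POST "http://localhost:7860/api/v1/run/$id?stream=false" \
#     -H 'Content-Type: application/json' \
#     -d '{"input_value": "Hello!", "output_type": "chat", "input_type": "chat"}' \
#     | extract_message

# Canned response of the assumed shape, for illustration:
sample='{"outputs":[{"outputs":[{"results":{"message":{"text":"Hello there!"}}}]}]}'
printf '%s' "$sample" | extract_message   # Hello there!
```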
Use the `env` section of the Langflow runtime Helm chart's `values.yaml` file to define environment variables for your Langflow deployment. This includes built-in Langflow environment variables, as well as global variables used by your flows.

Langflow can source global variables from your runtime environment, such as Kubernetes secrets referenced in `values.yaml`. For example, the Langflow runtime Helm chart's example flow JSON uses a global variable that is a secret. If you want to run this flow in your Langflow deployment on Kubernetes, you must include the secret in your runtime configuration.
:::tip
When you export flows as JSON files, it's recommended to omit secrets. Whether or not a secret is included depends on how you declare the secret in your flow and whether you use the **Save with my API keys** option. For more information, see Import and export flows.
:::
Kubernetes secrets are the recommended way to store sensitive values and credentials.

Use `secretKeyRef` to reference a Kubernetes secret in `values.yaml`:

```yaml
env:
  - name: OPENAI_API_KEY
    valueFrom:
      secretKeyRef:
        name: openai-credentials
        key: OPENAI_API_KEY
```
You can use `kubectl` and `helm` commands to create and set secrets.

Create a secret:

```bash
kubectl create secret generic openai-credentials \
  --namespace langflow \
  --from-literal=OPENAI_API_KEY=sk...
```

Verify the secret exists:

```bash
kubectl get secrets -n langflow openai-credentials
```
The secret's values aren't shown in plain text. Note that Kubernetes stores secret data base64-encoded; it is encrypted at rest only if your cluster has encryption at rest configured.
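Keep in mind that base64 is a reversible encoding, not encryption. A minimal sketch of the decode step (the `kubectl` command is commented out because it requires cluster access; the secret name matches the one created above, and the key value here is illustrative):

```bash
# With cluster access, read a stored secret value back:
#   kubectl get secret openai-credentials -n langflow \
#     -o jsonpath='{.data.OPENAI_API_KEY}' | base64 --decode

# Local round trip showing that base64 is reversible:
encoded=$(printf 'sk-example-key' | base64)
printf '%s' "$encoded" | base64 --decode   # sk-example-key
```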
Upgrade the Helm release to use the secret:

```bash
helm upgrade my-langflow-app-image langflow/langflow-runtime -n langflow \
  --reuse-values \
  --set "extraEnv[0].name=OPENAI_API_KEY" \
  --set "extraEnv[0].valueFrom.secretKeyRef.name=openai-credentials" \
  --set "extraEnv[0].valueFrom.secretKeyRef.key=OPENAI_API_KEY"
```

Escape square brackets if required by your shell.
For non-sensitive variables, such as `LANGFLOW_LOG_LEVEL`, you can set the value directly in `values.yaml`:

```yaml
env:
  - name: LANGFLOW_LOG_LEVEL
    value: "INFO"
```
Use `replicaCount` and `resources` in the Langflow runtime Helm chart's `values.yaml` file to configure scaling.

Horizontal scaling: Use `replicaCount` to set the number of replicas for your Langflow deployment.

```yaml
replicaCount: 3
```

Vertical scaling: Use the `resources` section to adjust pod resources depending on your application's needs.

```yaml
resources:
  requests:
    memory: "2Gi"
    cpu: "1000m"
```
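In production, requests are typically paired with limits so a single pod can't starve its node. A sketch of a fuller `resources` stanza; the limit values here are illustrative, not chart defaults, so size them to your workload:

```yaml
resources:
  requests:
    memory: "2Gi"
    cpu: "1000m"
  limits:         # illustrative caps, not chart defaults
    memory: "4Gi"
    cpu: "2000m"
```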