docs/integrations/ai-engines/litellm.mdx
This documentation describes the integration of MindsDB with LiteLLM, a framework that provides unified access to models from various providers.
Before proceeding, ensure that MindsDB is installed and that the dependencies required by the LiteLLM handler are set up.
Create an AI engine from the LiteLLM handler.

```sql
CREATE ML_ENGINE litellm_engine
FROM litellm;
```
Create a model that uses this engine.

```sql
CREATE MODEL litellm_model
PREDICT target_column
USING
    engine = 'litellm_engine',
    model = 'gpt-4',
    base_url = 'https://api.openai.com/v1',
    api_key = 'sk-xxx',
    prompt_template = 'answer questions in three bullet points: {question}';
```
The parameters include:

- `engine` is the LiteLLM engine created from the LiteLLM handler with the `CREATE ML_ENGINE` statement.
- `model` is one of the models supported by LiteLLM. See the complete list of supported providers and models here.
- `base_url` is an optional parameter that stores the base URL for accessing models.
- `api_key` stores the API key of the provider whose model is used.
- `prompt_template` stores the instructions to the model.
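Conceptually, when the model is queried, the placeholder in `prompt_template` is filled with the value from the query, and the result is passed to LiteLLM. The sketch below illustrates this idea in Python; it is not MindsDB's actual handler code, and the `build_prompt` helper is hypothetical.

```python
# Hypothetical sketch of how prompt_template is filled before being sent
# to the model; not the actual MindsDB handler implementation.
prompt_template = "answer questions in three bullet points: {question}"

def build_prompt(template: str, **values: str) -> str:
    """Fill {placeholders} in the template with column values from the query."""
    return template.format(**values)

prompt = build_prompt(prompt_template, question="what is ai?")
print(prompt)

# The handler then makes a call roughly equivalent to LiteLLM's completion API
# (requires the litellm package and a valid API key, so it is shown commented out):
#   import litellm
#   response = litellm.completion(
#       model="gpt-4",
#       messages=[{"role": "user", "content": prompt}],
#   )
```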
Here is how to create and use models through LiteLLM in MindsDB.

```sql
CREATE ML_ENGINE litellm
FROM litellm;
```

```sql
CREATE MODEL chat_model
PREDICT answer
USING
    engine = 'litellm',
    model = 'ollama/llama2:latest',
    base_url = 'http://localhost:11434';
```
Query the model, passing the input in the `WHERE` clause.

```sql
SELECT *
FROM chat_model
WHERE question = 'what is ai?';
```
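The same query can also be sent programmatically. The sketch below builds the SQL statement for MindsDB's HTTP SQL endpoint; the URL assumes a default local installation, and the `build_query` helper is hypothetical.

```python
import json

# Assumed default address of a local MindsDB instance.
MINDSDB_URL = "http://127.0.0.1:47334/api/sql/query"

def build_query(question: str) -> str:
    """Embed the question in the SELECT statement, escaping single quotes."""
    escaped = question.replace("'", "''")
    return f"SELECT * FROM chat_model WHERE question = '{escaped}';"

payload = json.dumps({"query": build_query("what is ai?")})
print(payload)

# To actually execute it (requires a running MindsDB instance):
#   import requests
#   response = requests.post(MINDSDB_URL, json={"query": build_query("what is ai?")})
#   print(response.json())
```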