# Text generation
Text generation is the task of producing text. It has various use cases, including code generation, story generation, chatbots, and more.

## Chat

Use this for conversational AI applications or when you need to provide instructions and maintain context.
```postgresql
SELECT pgml.transform(
  task   => '{
    "task": "text-generation",
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct"
  }'::JSONB,
  inputs => ARRAY[
    '{"role": "system", "content": "You are a friendly and helpful chatbot"}'::JSONB,
    '{"role": "user", "content": "Tell me about yourself."}'::JSONB
  ]
) AS answer;
```
_Result_

```json
["I'm so glad you asked! I'm a friendly and helpful chatbot, designed to assist and converse with users like you. I'm a large language model, which means I've been trained on a massive dataset of text from various sources, including books, articles, and conversations. This training enables me to understand and respond to a wide range of topics and questions.\n\nI'm constantly learning and improving my language processing abilities, so I can become more accurate and helpful over time. My primary goal is to provide accurate and relevant information, answer your questions, and engage in productive conversations.\n\nI'm not just limited to answering questions, though! I can also:\n\n1. Generate text on a given topic or subject\n2. Offer suggestions and recommendations\n3. Summarize lengthy texts or articles\n4. Translate text from one language to another\n5. Even create stories, poems, or jokes (if you'd like!)\n\nI'm here to help you with any questions, concerns, or topics you'd like to discuss. Feel free to ask me anything, and I'll do my best to assist you!"]
```
We follow OpenAI's standard for model parameters:

- `frequency_penalty` - Penalizes the frequency of tokens
- `logit_bias` - Modify the likelihood of specified tokens
- `logprobs` - Return logprobs of the most likely token(s)
- `top_logprobs` - The number of most likely tokens to return at each token position
- `max_tokens` - The maximum number of tokens to generate
- `n` - The number of completions to build out
- `presence_penalty` - Control new token penalization
- `response_format` - The format of the response
- `seed` - The seed for randomness
- `stop` - An array of sequences to stop on
- `temperature` - The temperature for sampling
- `top_p` - An alternative sampling method

For more information on these parameters, see OpenAI's docs.
An example with some common parameters:
```postgresql
SELECT pgml.transform(
  task   => '{
    "task": "text-generation",
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct"
  }'::JSONB,
  inputs => ARRAY[
    '{"role": "system", "content": "You are a friendly and helpful chatbot"}'::JSONB,
    '{"role": "user", "content": "Tell me about yourself."}'::JSONB
  ],
  args => '{
    "max_tokens": 10,
    "temperature": 0.75,
    "seed": 10
  }'::JSONB
) AS answer;
```
_Result_

```json
["I'm so glad you asked! I'm a"]
```
## Completion

Use this for simpler text-generation tasks, like completing sentences or generating content based on a prompt.
```postgresql
SELECT pgml.transform(
  task   => '{
    "task": "text-generation",
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct"
  }'::JSONB,
  inputs => ARRAY[
    'Three Rings for the Elven-kings under the sky, Seven for the Dwarf-lords in their halls of stone'
  ]
) AS answer;
```
_Result_

```json
[", Nine for Mortal Men doomed to die, One for the Dark Lord on"]
```
We follow OpenAI's standard for model parameters:

- `best_of` - Generates "best_of" completions
- `echo` - Echo back the prompt
- `frequency_penalty` - Penalizes the frequency of tokens
- `logit_bias` - Modify the likelihood of specified tokens
- `logprobs` - Return logprobs of the most likely token(s)
- `max_tokens` - The maximum number of tokens to generate
- `n` - The number of completions to build out
- `presence_penalty` - Control new token penalization
- `seed` - The seed for randomness
- `stop` - An array of sequences to stop on
- `temperature` - The temperature for sampling
- `top_p` - An alternative sampling method

For more information on these parameters, see OpenAI's docs.
An example with some common parameters:
```postgresql
SELECT pgml.transform(
  task   => '{
    "task": "text-generation",
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct"
  }'::JSONB,
  inputs => ARRAY[
    'Three Rings for the Elven-kings under the sky, Seven for the Dwarf-lords in their halls of stone'
  ],
  args => '{
    "max_tokens": 10,
    "temperature": 0.75,
    "seed": 10
  }'::JSONB
) AS answer;
```
_Result_

```json
[", Nine for Mortal Men doomed to die,"]
```