docs/examples/llm/konko.ipynb
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/Konko.ipynb" target="_parent"></a>
If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.
Konko API is a fully managed Web API designed to help application developers:
Explore Available Models: Start by browsing through the available models on Konko. Each model caters to different use cases and capabilities.
Identify Suitable Endpoints: Determine which endpoint (ChatCompletion or Completion) supports your selected model.
Select a Model: Choose a model based on its metadata and how well it fits your use case.
Follow the Prompting Guidelines: Once a model is selected, refer to its prompting guidelines to communicate with it effectively.
Use the API: Finally, call the appropriate Konko API endpoint to run the model and receive responses.
To run this notebook, you'll need a Konko API key. You can create one by signing up on Konko.
This example goes over how to use LlamaIndex to interact with Konko ChatCompletion models and Completion models.
%pip install llama-index-llms-konko
!pip install llama-index
Call `chat` with a ChatMessage List
You need to set the env var `KONKO_API_KEY`.
import os
os.environ["KONKO_API_KEY"] = "<your-api-key>"
from llama_index.llms.konko import Konko
from llama_index.core.llms import ChatMessage
llm = Konko(model="meta-llama/llama-2-13b-chat")
messages = ChatMessage(role="user", content="Explain Big Bang Theory briefly")
resp = llm.chat([messages])
print(resp)
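A chat request is just an ordered list of role-tagged messages. As a plain-Python sketch (mirroring the shape of `ChatMessage`, which lives in `llama_index.core.llms`; the content strings here are made up for illustration):

```python
# Each message pairs a role ("system", "user", or "assistant") with content.
# A multi-turn conversation is a list of such messages in order.
history = [
    {"role": "system", "content": "You are a concise physics tutor."},
    {"role": "user", "content": "Explain Big Bang Theory briefly"},
]

# After each model reply, append the assistant message so that
# subsequent turns keep the full conversational context.
history.append({"role": "assistant", "content": "The universe began..."})

print([m["role"] for m in history])  # ['system', 'user', 'assistant']
```

Passing the whole history (not just the latest user message) is what lets the model answer follow-up questions in context.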
Call `chat` with OpenAI Models
You need to set the env var `OPENAI_API_KEY`.
import os
os.environ["OPENAI_API_KEY"] = "<your-api-key>"
llm = Konko(model="gpt-3.5-turbo")
message = ChatMessage(role="user", content="Explain Big Bang Theory briefly")
resp = llm.chat([message])
print(resp)
message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message], max_tokens=1000)
for r in resp:
print(r.delta, end="")
Call `complete` with a Prompt
llm = Konko(model="numbersstation/nsql-llama-2-7b", max_tokens=100)
text = """CREATE TABLE stadium (
stadium_id number,
location text,
name text,
capacity number,
highest number,
lowest number,
average number
)
CREATE TABLE singer (
singer_id number,
name text,
country text,
song_name text,
song_release_year text,
age number,
is_male others
)
CREATE TABLE concert (
concert_id number,
concert_name text,
theme text,
stadium_id text,
year text
)
CREATE TABLE singer_in_concert (
concert_id number,
singer_id text
)
-- Using valid SQLite, answer the following questions for the tables provided above.
-- What is the maximum capacity of stadiums ?
SELECT"""
response = llm.complete(text)
print(response)
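A completion that would answer the prompt above is along the lines of `SELECT max(capacity) FROM stadium` (the exact model output may vary). As a sanity check, that query can be run against an in-memory SQLite database built from the same `stadium` schema, with a couple of made-up rows:

```python
import sqlite3

# In-memory database matching the stadium schema from the prompt above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stadium (
    stadium_id number,
    location text,
    name text,
    capacity number,
    highest number,
    lowest number,
    average number
);
""")

# Hypothetical sample rows, just to give max(capacity) something to find.
conn.executemany(
    "INSERT INTO stadium VALUES (?, ?, ?, ?, ?, ?, ?)",
    [
        (1, "Glasgow", "Hampden Park", 52500, 4000, 1000, 2500),
        (2, "London", "Wembley", 90000, 80000, 20000, 50000),
    ],
)

# One plausible model completion for "-- What is the maximum capacity ...":
query = "SELECT max(capacity) FROM stadium"
max_capacity = conn.execute(query).fetchone()[0]
print(max_capacity)  # 90000
```

Running generated SQL against a scratch database like this is a cheap way to validate text-to-SQL output before touching real data.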
llm = Konko(model="phind/phind-codellama-34b-v2", max_tokens=100)
text = """### System Prompt
You are an intelligent programming assistant.
### User Message
Implement a linked list in C++
### Assistant
..."""
resp = llm.stream_complete(text, max_tokens=1000)
for r in resp:
print(r.delta, end="")
llm = Konko(model="meta-llama/llama-2-13b-chat")
resp = llm.stream_complete(
"Show me the c++ code to send requests to HTTP Server", max_tokens=1000
)
for r in resp:
print(r.delta, end="")