The `chat()` method allows you to chat over your data sources using a user-friendly chat API. If `citations=True` is passed, it returns a tuple containing the answer and its citations, respectively.
If you want to get the answer to a question and return both the answer and citations, use the following code snippet:
```python
from embedchain import App

# Initialize app
app = App()

# Add data source
app.add("https://www.forbes.com/profile/elon-musk")

# Get relevant answer for your query
answer, sources = app.chat("What is the net worth of Elon?", citations=True)

print(answer)
# Answer: The net worth of Elon Musk is $221.9 billion.

print(sources)
# [
#     (
#         'Elon Musk PROFILEElon MuskCEO, Tesla$247.1B$2.3B (0.96%)Real Time Net Worthas of 12/7/23 ...',
#         {
#             'url': 'https://www.forbes.com/profile/elon-musk',
#             'score': 0.89,
#             ...
#         }
#     ),
#     (
#         '74% of the company, which is now called X.Wealth HistoryHOVER TO REVEAL NET WORTH BY YEARForbes ...',
#         {
#             'url': 'https://www.forbes.com/profile/elon-musk',
#             'score': 0.81,
#             ...
#         }
#     ),
#     (
#         'founded in 2002, is worth nearly $150 billion after a $750 million tender offer in June 2023 ...',
#         {
#             'url': 'https://www.forbes.com/profile/elon-musk',
#             'score': 0.73,
#             ...
#         }
#     )
# ]
```
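Each citation is a `(context, metadata)` tuple, so you can unpack just the fields you need. Here is a minimal sketch of iterating over the citations, using a hard-coded `sources` list in the shape shown above (running the app itself requires network access and LLM API keys):

```python
# Hard-coded example in the shape returned by `app.chat(..., citations=True)`
sources = [
    ("Elon Musk PROFILEElon MuskCEO, Tesla ...",
     {"url": "https://www.forbes.com/profile/elon-musk", "score": 0.89}),
    ("74% of the company, which is now called X. ...",
     {"url": "https://www.forbes.com/profile/elon-musk", "score": 0.81}),
]

# Print each citation's source URL and similarity score
for context, metadata in sources:
    print(f"{metadata['url']} (score={metadata['score']:.2f})")
```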
If you just want the answer and don't need citations, you can use the following example:
```python
from embedchain import App

# Initialize app
app = App()

# Add data source
app.add("https://www.forbes.com/profile/elon-musk")

# Chat on your data using `.chat()`
answer = app.chat("What is the net worth of Elon?")
print(answer)
# Answer: The net worth of Elon Musk is $221.9 billion.
```
If you want to maintain chat sessions for different users, simply pass the `session_id` keyword argument. See the example below:
```python
from embedchain import App

app = App()
app.add("https://www.forbes.com/profile/elon-musk")

# Chat on your data using `.chat()`
app.chat("What is the net worth of Elon Musk?", session_id="user1")
# 'The net worth of Elon Musk is $250.8 billion.'
app.chat("What is the net worth of Bill Gates?", session_id="user2")
# "I don't know the current net worth of Bill Gates."
app.chat("What was my last question", session_id="user1")
# 'Your last question was "What is the net worth of Elon Musk?"'
```
If you want to customize the context window used during chat (the default context window is 3 document chunks), you can do so using the following code snippet:
```python
from embedchain import App
from embedchain.config import BaseLlmConfig

app = App()
app.add("https://www.forbes.com/profile/elon-musk")

# Retrieve 5 document chunks instead of the default 3
query_config = BaseLlmConfig(number_documents=5)
app.chat("What is the net worth of Elon Musk?", config=query_config)
```
Mem0 is a long-term memory layer for LLMs that enables personalization in the GenAI stack. It lets LLMs remember past interactions and provide more personalized responses.
In order to use Mem0 to enable memory for personalization in your apps:
1. Install the `mem0` package using `pip install mem0ai`.
2. Prepare the config for `memory`; refer to Configurations.

```python
from embedchain import App

config = {
    "memory": {
        "top_k": 5
    }
}

app = App.from_config(config=config)
app.add("https://www.forbes.com/profile/elon-musk")
app.chat("What is the net worth of Elon Musk?")
```
The `top_k` parameter in the memory configuration specifies the number of top memories to consider during retrieval.
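Conceptually, `top_k` caps how many of the highest-scoring memories are injected into the conversation context. The selection can be illustrated in plain Python (this is a simplified sketch with made-up memories and scores, not Mem0's actual implementation):

```python
# Hypothetical scored memories: (memory_text, relevance_score)
memories = [
    ("User asked about Elon Musk's net worth", 0.91),
    ("User prefers concise answers", 0.64),
    ("User asked about Tesla stock", 0.58),
    ("User greeted the assistant", 0.12),
]

def top_k_memories(memories, k):
    """Return the k highest-scoring memories, best first."""
    return sorted(memories, key=lambda m: m[1], reverse=True)[:k]

# With top_k=2, only the two most relevant memories are kept
print(top_k_memories(memories, 2))
```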