AWS Bedrock

embedchain/notebooks/aws-bedrock.ipynb

Cookbook for using AWS Bedrock with Embedchain

Step-1: Install embedchain package

```python
!pip install embedchain
```

Step-2: Set your AWS credentials as environment variables

You can find these env variables in your AWS Management Console.

```python
import os

os.environ["AWS_ACCESS_KEY_ID"] = "AKIAIOSFODNN7EXAMPLE"  # replace with your AWS_ACCESS_KEY_ID
os.environ["AWS_SECRET_ACCESS_KEY"] = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"  # replace with your AWS_SECRET_ACCESS_KEY
os.environ["AWS_SESSION_TOKEN"] = "IQoJb3JpZ2luX2VjEJr...=="  # replace with your AWS_SESSION_TOKEN
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"  # replace with your AWS_DEFAULT_REGION

from embedchain import App
```
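Before creating the app, it can help to confirm the credentials are actually set, since a missing variable only surfaces later as a Bedrock authentication error. A minimal sketch (the helper name `missing_aws_vars` is ours, not part of embedchain):

```python
import os

# Variables the aws_bedrock provider relies on (the session token is
# optional when you use long-lived access keys).
REQUIRED_AWS_VARS = ["AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_DEFAULT_REGION"]

def missing_aws_vars(env=os.environ):
    """Return the names of required AWS variables that are unset or empty."""
    return [name for name in REQUIRED_AWS_VARS if not env.get(name)]
```

Call `missing_aws_vars()` after setting the variables; an empty list means you are ready to continue.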

Step-3: Define your LLM and embedding model config

You may need to install `langchain-anthropic` to try Claude models.

```python
config = """
llm:
  provider: aws_bedrock
  config:
    model: 'amazon.titan-text-express-v1'
    deployment_name: ec_titan_express_v1
    temperature: 0.5
    max_tokens: 1000
    top_p: 1
    stream: false

embedder:
  provider: aws_bedrock
  config:
    model: amazon.titan-embed-text-v2:0
    deployment_name: ec_embeddings_titan_v2
"""

# Write the multi-line string to a YAML file
with open('aws_bedrock.yaml', 'w') as file:
    file.write(config)
```
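A YAML typo in this file only fails at app start-up, so a quick parse of the config before writing it can save a round trip. A sketch assuming PyYAML (`pyyaml`) is installed; the config string is abbreviated here for illustration:

```python
import yaml  # pip install pyyaml

# Parse the config string directly; the same check works on the written
# file via yaml.safe_load(open('aws_bedrock.yaml')).
cfg = yaml.safe_load("""
llm:
  provider: aws_bedrock
  config:
    model: 'amazon.titan-text-express-v1'
embedder:
  provider: aws_bedrock
  config:
    model: amazon.titan-embed-text-v2:0
""")

# Both sections must name a provider and a model.
for section in ("llm", "embedder"):
    assert cfg[section]["provider"] == "aws_bedrock"
    assert cfg[section]["config"]["model"]
```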

Step-4: Create an embedchain app based on the config

```python
app = App.from_config(config_path="aws_bedrock.yaml")
app.reset()  # Reset the app to clear the cache and start fresh
```

Step-5: Add a data source unrelated to the question you are asking

```python
app.add("https://www.lipsum.com/")
```

Step-6: Notice the underlying context changing with the updated data source

```python
question = "Who is Elon Musk?"

context = " ".join([a['context'] for a in app.search(question)])
print("Context:", context)

app.add("https://www.forbes.com/profile/elon-musk")

context = " ".join([a['context'] for a in app.search(question)])
print("Context with updated memory:", context)
```
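The join expression used twice above can be factored into a small helper; `app.search` returns a list of dicts that each carry a `context` key. (`join_contexts` is our name for illustration, not an embedchain API.)

```python
def join_contexts(results):
    """Flatten embedchain search results into one context string."""
    return " ".join(r["context"] for r in results)

# Usage with the app above:
# context = join_contexts(app.search(question))
```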