Upstash Vector Store

docs/examples/vector_stores/UpstashVectorDemo.ipynb


We're going to look at how to use LlamaIndex to interface with Upstash Vector!

```python
!pip install -q llama-index llama-index-vector-stores-upstash upstash-vector
```
```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core import StorageContext
from llama_index.vector_stores.upstash import UpstashVectorStore
import textwrap
import openai
```
```python
# Set the OpenAI API key
openai.api_key = "sk-..."
```
```python
# Download data
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
```

Now, we can load the documents using the LlamaIndex SimpleDirectoryReader

```python
documents = SimpleDirectoryReader("./data/paul_graham/").load_data()

print("# Documents:", len(documents))
```

To create an index on Upstash, visit https://console.upstash.com/vector and create an index with 1536 dimensions (the dimension of the default OpenAI embeddings) and the Cosine distance metric. Copy the index URL and token, and pass them in below.
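The Cosine metric scores a pair of vectors by the angle between them, ignoring their magnitudes, which is why it works well for comparing embeddings. A quick plain-Python illustration (not part of the demo itself):

```python
from math import sqrt


def cosine_similarity(a, b):
    # Dot product divided by the product of the vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # identical direction -> 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # orthogonal -> 0.0
```

Upstash computes this server-side over the 1536-dimensional embedding vectors; the snippet above is just to show what the metric measures.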

```python
vector_store = UpstashVectorStore(url="https://...", token="...")

storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(
    documents, storage_context=storage_context
)
```

Now we've successfully created an index and populated it with vectors from the essay! The data may take a moment to index, and then it's ready for querying.
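Behind the scenes, `from_documents` splits each document into chunks, embeds each chunk, and upserts the resulting vectors into Upstash. A toy sketch of the chunking step, using fixed-size character windows with overlap (purely illustrative — LlamaIndex's actual node parser is token- and sentence-aware):

```python
def chunk_text(text, chunk_size=1024, overlap=128):
    # Slide a fixed-size window over the text, overlapping neighbors so
    # that sentences near a boundary appear in two chunks.
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start : start + chunk_size])
        start += chunk_size - overlap
    return chunks


chunks = chunk_text("a" * 3000, chunk_size=1024, overlap=128)
print(len(chunks))  # 4 chunks for 3000 characters with these settings
```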

```python
query_engine = index.as_query_engine()
res1 = query_engine.query("What did the author learn?")
print(textwrap.fill(str(res1), 100))

print("\n")

res2 = query_engine.query("What is the author's opinion on startups?")
print(textwrap.fill(str(res2), 100))
```

Metadata Filtering

You can pass MetadataFilters with your VectorStoreQuery to filter the nodes returned from the Upstash vector store.

```python
import os

from llama_index.vector_stores.upstash import UpstashVectorStore
from llama_index.core.vector_stores.types import (
    MetadataFilter,
    MetadataFilters,
    FilterOperator,
)

vector_store = UpstashVectorStore(
    url=os.environ.get("UPSTASH_VECTOR_URL") or "",
    token=os.environ.get("UPSTASH_VECTOR_TOKEN") or "",
)

index = VectorStoreIndex.from_vector_store(vector_store=vector_store)

filters = MetadataFilters(
    filters=[
        MetadataFilter(
            key="author", value="Marie Curie", operator=FilterOperator.EQ
        )
    ],
)

retriever = index.as_retriever(filters=filters)

retriever.retrieve("What is inception about?")
```
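Conceptually, an EQ filter keeps only the nodes whose metadata value equals the target; the vector search then runs over that restricted set. A plain-Python sketch of the matching idea (a hypothetical helper, not the library's or Upstash's actual implementation):

```python
def matches_eq(metadata, key, value):
    # A node passes an EQ filter when its metadata value equals the target.
    return metadata.get(key) == value


nodes = [
    {"author": "Marie Curie", "year": 1903},
    {"author": "Paul Graham", "year": 2005},
]
filtered = [n for n in nodes if matches_eq(n, "author", "Marie Curie")]
print(filtered)  # only the Marie Curie node survives the filter
```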

We can also combine multiple MetadataFilters with an AND or OR condition:

```python
from llama_index.core.vector_stores import FilterOperator, FilterCondition

filters = MetadataFilters(
    filters=[
        MetadataFilter(
            key="theme",
            value=["Fiction", "Horror"],
            operator=FilterOperator.IN,
        ),
        MetadataFilter(key="year", value=1997, operator=FilterOperator.GT),
    ],
    condition=FilterCondition.AND,
)

retriever = index.as_retriever(filters=filters)
retriever.retrieve("Harry Potter?")
```
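To see what this combined filter means, here is a plain-Python sketch of how an AND condition over IN and GT clauses evaluates against node metadata (a hypothetical evaluator for illustration, not the library's internals):

```python
def matches(metadata, key, value, op):
    # Hypothetical evaluator for a few filter operators.
    if op == "IN":
        return metadata.get(key) in value
    if op == "GT":
        return metadata.get(key) is not None and metadata.get(key) > value
    if op == "EQ":
        return metadata.get(key) == value
    raise ValueError(f"unsupported operator: {op}")


nodes = [
    {"theme": "Fiction", "year": 1997},
    {"theme": "Horror", "year": 2003},
    {"theme": "Biography", "year": 2010},
]
# AND condition: every clause must hold for a node to pass.
clauses = [("theme", ["Fiction", "Horror"], "IN"), ("year", 1997, "GT")]
result = [n for n in nodes if all(matches(n, k, v, op) for k, v, op in clauses)]
print(result)  # only the 2003 Horror node satisfies both clauses
```

Note that GT is strict: the 1997 Fiction node fails because 1997 > 1997 is false, while the 2010 Biography node fails the IN clause.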