vectorstores/chroma/README.md
You can access Chroma through the included implementation of the
`vectorstores.VectorStore` interface by creating and using a Chroma
`Store` instance with the `New` function.
Until an "in-memory" version is released, only client/server mode is available.
Note: Additional ways to run Chroma locally can be found in the Chroma Cookbook.
Use the `WithChromaURL` option or the `CHROMA_URL` environment
variable to specify the URL of the Chroma server when creating the client instance.
To use the OpenAI LLM with Chroma, supply either the
`WithOpenAIAPIKey` option or the `OPENAI_API_KEY` environment
variable when creating the client.
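Putting the options above together, here is a minimal sketch of creating a Chroma store. The choice of OpenAI embedder and the collection name `example-collection` are illustrative assumptions, not requirements:

```go
package main

import (
	"fmt"
	"log"

	"github.com/tmc/langchaingo/embeddings"
	"github.com/tmc/langchaingo/llms/openai"
	"github.com/tmc/langchaingo/vectorstores/chroma"
)

func main() {
	// openai.New reads OPENAI_API_KEY from the environment by default.
	llm, err := openai.New()
	if err != nil {
		log.Fatal(err)
	}
	e, err := embeddings.NewEmbedder(llm)
	if err != nil {
		log.Fatal(err)
	}

	// WithChromaURL takes precedence over the CHROMA_URL environment variable.
	store, err := chroma.New(
		chroma.WithChromaURL("http://localhost:8000"),
		chroma.WithEmbedder(e),
		chroma.WithNameSpace("example-collection"), // illustrative collection name
	)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("store ready: %T\n", store)
}
```

Note that `chroma.New` connects to a running Chroma server, so this sketch assumes the server from the next section is already up.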
## Simple Docker Server

Running a Chroma server in a local Docker container can be especially useful for testing and development workflows. An example invocation is shown below.

As of this writing, the newest release of the Chroma Docker image is `chroma:0.5.0`. Run it directly, exposing its port to your local machine, with:
```shell
$ docker run -p 8000:8000 ghcr.io/chroma-core/chroma:0.5.0
```
## langchaingo Application

With the "Simple Docker Server" running (see above), running the included
example langchaingo app should produce the following result:
```shell
$ export CHROMA_URL=http://localhost:8000
$ export OPENAI_API_KEY=YourOpenApiKeyGoesHere
$ go run ./examples/chroma-vectorstore-example/chroma_vectorstore_example.go
```
Results:

```
1. case: Up to 5 Cities in Japan
   result: Tokyo, Nagoya, Kyoto, Fukuoka, Hiroshima
2. case: A City in South America
   result: Buenos Aires
3. case: Large Cities in South America
   result: Sao Paulo, Rio de Janeiro
```
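Queries like those above follow the usual langchaingo vectorstore pattern: add documents to the collection, then run a similarity search. A hedged sketch, assuming a `store` created as shown earlier (the document contents, metadata keys, and the helper name `queryCities` are illustrative):

```go
// queryCities seeds the collection and asks for South American cities.
// Imports assumed: "context", "github.com/tmc/langchaingo/schema",
// and "github.com/tmc/langchaingo/vectorstores/chroma".
func queryCities(ctx context.Context, store chroma.Store) ([]schema.Document, error) {
	// Seed the collection with a few documents (illustrative contents).
	if _, err := store.AddDocuments(ctx, []schema.Document{
		{PageContent: "Tokyo", Metadata: map[string]any{"country": "Japan"}},
		{PageContent: "Buenos Aires", Metadata: map[string]any{"country": "Argentina"}},
		{PageContent: "Sao Paulo", Metadata: map[string]any{"country": "Brazil"}},
	}); err != nil {
		return nil, err
	}
	// Return up to 2 matches for the query, ranked by similarity.
	return store.SimilaritySearch(ctx, "A City in South America", 2)
}
```

`AddDocuments` embeds each document with the configured embedder before storing it, which is why creating the store requires an embedder up front.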
The test suite `chroma_test.go` started as a clone of the adjacent `pinecone_test.go`,
and is initially quite sparse. Consider contributing new test cases, or adding
coverage to accompany any changes made to the code.