# Cybertron Embedding Example
Hello there! 👋 This example demonstrates how to use the Cybertron embedding model with LangChain in Go. It's a fun and practical way to explore document embeddings and similarity searches. Let's break down what this example does!
This example showcases two main features: an in-memory embedding comparison (`exampleInMemory`) and a Weaviate vector store integration (`exampleWeaviate`).
The `exampleInMemory` function embeds a handful of words with the Cybertron model and computes the cosine similarity between each pair. This helps you understand how semantically related different words are in the embedding space.
The `exampleWeaviate` function demonstrates how to use the Cybertron embeddings with a Weaviate vector store: documents are embedded and stored, then retrieved with a similarity search. This shows how you can use embeddings for more advanced document retrieval tasks.
Key components:

- **Cybertron Embedder**: The example uses the `BAAI/bge-small-en-v1.5` model to generate embeddings. The model is automatically downloaded and cached.
- **Cosine Similarity**: A custom function calculates the similarity between embeddings.
- **Weaviate Integration**: The example shows how to set up and use a Weaviate vector store with the Cybertron embeddings.
To run this example:
1. Set the `WEAVIATE_SCHEME` and `WEAVIATE_HOST` environment variables to point at your Weaviate instance.
2. Run `go run cybertron-embedding.go`.

Note that the Cybertron model runs locally on your CPU, so larger models might be slow. The example uses a smaller model for better performance.
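A minimal shell session might look like this; the scheme and host values are placeholders for your own Weaviate setup, not values mandated by the example:

```shell
# Point the example at your Weaviate instance (placeholder values).
export WEAVIATE_SCHEME=http
export WEAVIATE_HOST=localhost:8080

# Run the example; the embedding model is downloaded and cached on first run.
go run cybertron-embedding.go
```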
Have fun exploring embeddings and semantic similarity with this example! 🚀🔍