packages/pieces/community/goodmem/README.md
This piece provides a complete integration with Goodmem, a powerful vector-based memory storage and semantic retrieval system for AI applications. Store documents as memories with vector embeddings and perform similarity-based semantic search across your data.
You need a running Goodmem instance. To install one on your VM or local machine, visit https://goodmem.ai/ and follow the installation instructions for your platform (Docker, local installation, or cloud deployment).
Before you can create spaces and memories, you need to set up an embedder model:
This piece uses Custom Authentication:

- **Server URL**: the base URL of your Goodmem instance (e.g., http://localhost:8080, https://api.goodmem.ai)
- **API Key**: your Goodmem API key (starts with `gm_`)

**Create Space**

Create a new space (a container for memories) with configurable settings. If a space with the same name already exists, it is reused instead of creating a duplicate.
Options:
**Add Memory**

Store a document or plain text as a memory in a space. The content is automatically chunked and embedded for semantic search.
Options:
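The chunking step can be pictured with a minimal sketch: fixed-size character windows with overlap. Goodmem's actual chunker is internal to the service and almost certainly more sophisticated; this only illustrates the idea.

```python
def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size character windows with overlap,
    a crude stand-in for the piece's automatic chunking step."""
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap  # step back by `overlap` chars each window
    return chunks

parts = chunk_text("a" * 450, size=200, overlap=50)
print(len(parts))  # 3 windows cover the 450-character input
```

Each chunk is then embedded separately, which is why a search can return a relevant passage from the middle of a long document.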
**Retrieve Memories**

Perform a semantic search across one or more spaces to find relevant memory chunks. Supports advanced post-processing with reranking and LLM-generated contextual responses.
Options:
Advanced Post-Processing:
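Conceptually, retrieval ranks stored chunks by vector similarity between their embeddings and the query embedding. Here is a toy cosine-similarity ranking; the two-dimensional embeddings are made up for illustration, whereas Goodmem computes real ones with the configured embedder model.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy index: chunk text -> pretend embedding (real ones come from the embedder).
index = {
    "cats are mammals": [0.9, 0.1],
    "stocks fell today": [0.1, 0.9],
}
query = [0.8, 0.2]  # pretend embedding of "tell me about cats"

ranked = sorted(index, key=lambda k: cosine(index[k], query), reverse=True)
print(ranked[0])  # the cat chunk ranks first
```

Reranking and LLM post-processing happen after this stage: a reranker reorders the top candidates, and an LLM can synthesize a contextual answer from them.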
**Get Memory**

Retrieve a specific memory by its ID, including metadata, processing status, and optionally the original content.
Options:
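A sketch of consuming the action's output in a downstream step. The key names (`memoryId`, `processingStatus`, `originalContent`) are illustrative assumptions about the response shape, not the documented schema.

```python
def summarize_memory(mem: dict) -> str:
    """Format the fields the Get Memory action exposes.
    Key names are illustrative assumptions, not the documented schema."""
    return (f"id={mem.get('memoryId', '?')} "
            f"status={mem.get('processingStatus', 'unknown')} "
            f"has_content={'originalContent' in mem}")

# Hypothetical response: content omitted because it was not requested.
sample = {"memoryId": "abc-123", "processingStatus": "COMPLETED"}
print(summarize_memory(sample))  # id=abc-123 status=COMPLETED has_content=False
```

Checking the processing status before searching matters because a freshly added memory may still be chunking and embedding.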
**Delete Memory**

Permanently delete a memory and all of its associated chunks and vector embeddings.
Options: