Run the maintained Java quickstart example and retrieve your first Memind memory. This guide uses the default pure Java path: memind-core, Spring AI, SQLite, and a local file-based vector store. You will run a complete memory cycle: extract memory from a prepared conversation, then retrieve relevant memory with a natural-language query.
## Prerequisites

Before you start, make sure you have:

- Java 21
- Maven
- An OpenAI-compatible API key
| Setting | Default |
|---|---|
| `OPENAI_BASE_URL` | `https://api.openai.com` |
| `OPENAI_CHAT_MODEL` | `gpt-4o-mini` |
| `OPENAI_EMBEDDING_MODEL` | `text-embedding-3-small` |
| SQLite database | `target/example-runtime/quickstart/memind.db` |
| Vector store | `target/example-runtime/quickstart/vector-store.json` |
## Clone the repository
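The clone command was not preserved in this copy of the page. A typical invocation would look like the following; the repository URL is an assumption based on the project name, not taken from the source:

```shell
# Hypothetical repository URL -- substitute the real one from the project page.
git clone https://github.com/openmemind/memind.git
cd memind
```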
## Run the quickstart example

Run the maintained quickstart example with your API key set. If your shell already exports `OPENAI_API_KEY`, you can omit it from the command.
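The exact command is not preserved here. A sketch of what a Maven-based run might look like, where the module name and main class are assumptions rather than names taken from the repository:

```shell
# Module and main-class names below are hypothetical.
OPENAI_API_KEY=sk-your-key \
  mvn -q compile exec:java \
  -pl memind-examples \
  -Dexec.mainClass=com.openmemind.examples.Quickstart
```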
## What the example does

The quickstart example runs two steps: it extracts memory from a prepared conversation, then retrieves relevant memory with a natural-language query.

### Step 1: Extract memory
The example loads conversation messages from a prepared input file. Each message is scoped by two identity fields:

| Field | Meaning |
|---|---|
userId | The user whose memory is being written or retrieved. |
agentId | The agent or application that owns this memory namespace. |
### Step 2: Retrieve memory

After extraction, the example asks a natural-language question and retrieves matching memory with the SIMPLE strategy.
SIMPLE retrieval is the low-latency retrieval path. It can combine vector search, keyword search, temporal signals, graph assist, memory-thread assist, and result fusion depending on the runtime options.
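To make the fusion step concrete, here is a self-contained sketch of Reciprocal Rank Fusion, one common way to merge a vector-search ranking with a keyword-search ranking. RRF is an illustrative assumption: the source does not say which fusion method memind-core actually uses, and `RrfFusion` is a hypothetical class, not part of the library.

```java
import java.util.*;

public class RrfFusion {

    // Reciprocal Rank Fusion score for one ranked list: 1 / (k + rank).
    // k dampens the advantage of the very top ranks; 60 is a common default.
    static double rrfScore(int rank, int k) {
        return 1.0 / (k + rank);
    }

    // Fuse several ranked lists of memory ids into a single ranking by
    // summing each id's per-list RRF scores.
    static List<String> fuse(List<List<String>> rankings, int k) {
        Map<String, Double> scores = new HashMap<>();
        for (List<String> ranking : rankings) {
            for (int i = 0; i < ranking.size(); i++) {
                scores.merge(ranking.get(i), rrfScore(i + 1, k), Double::sum);
            }
        }
        List<String> fused = new ArrayList<>(scores.keySet());
        fused.sort(Comparator.comparingDouble((String id) -> scores.get(id)).reversed());
        return fused;
    }

    public static void main(String[] args) {
        List<String> vectorHits  = List.of("m1", "m2", "m3");
        List<String> keywordHits = List.of("m3", "m1", "m4");
        // "m1" ranks first because it appears near the top of both lists.
        System.out.println(fuse(List.of(vectorHits, keywordHits), 60));
    }
}
```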
## Understand the core API

For most applications, the core API starts with three concepts.

### Build a runtime
Memind is assembled through `Memory.builder()`.
The quickstart runtime wires together:
- a structured chat client
- a memory store
- a conversation buffer
- text search
- a vector store
- runtime options
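The shape of that assembly can be sketched as a builder. The interfaces below are stand-ins, not the real memind-core types; only the component names come from the list above (the conversation buffer and text search are elided to keep the sketch short).

```java
// Stub sketch of Memory.builder()-style wiring; not the real memind-core API.
public class MemorySketch {

    interface ChatClient {}    // structured chat client (e.g. Spring AI backed)
    interface MemoryStore {}   // durable store, e.g. SQLite
    interface VectorStore {}   // e.g. local file-based vector store

    record RuntimeOptions(boolean graphAssist) {}

    static final class Memory {
        final ChatClient chat;
        final MemoryStore store;
        final VectorStore vectors;
        final RuntimeOptions options;

        private Memory(Builder b) {
            chat = b.chat;
            store = b.store;
            vectors = b.vectors;
            options = b.options;
        }

        static Builder builder() { return new Builder(); }

        static final class Builder {
            ChatClient chat;
            MemoryStore store;
            VectorStore vectors;
            RuntimeOptions options = new RuntimeOptions(false);

            Builder chatClient(ChatClient c)   { chat = c; return this; }
            Builder memoryStore(MemoryStore s) { store = s; return this; }
            Builder vectorStore(VectorStore v) { vectors = v; return this; }
            Builder options(RuntimeOptions o)  { options = o; return this; }

            // Fail fast if a required component was never supplied.
            Memory build() {
                if (chat == null || store == null || vectors == null) {
                    throw new IllegalStateException("missing required component");
                }
                return new Memory(this);
            }
        }
    }

    public static void main(String[] args) {
        Memory runtime = Memory.builder()
                .chatClient(new ChatClient() {})
                .memoryStore(new MemoryStore() {})
                .vectorStore(new VectorStore() {})
                .build();
        System.out.println(runtime.options.graphAssist()); // prints false
    }
}
```

The design point the builder pattern buys here: every swappable component (chat client, stores) is supplied explicitly, so the same runtime shape works with SQLite locally and other backends elsewhere.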
### Write memory

Use `addMessages()` when you already have a complete conversation segment.
For incremental input, use `addMessage()`, which buffers messages and commits them when a conversation boundary is detected.
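The buffering behavior can be sketched as follows. The boundary heuristic used here (an idle-time gap between messages) is an illustrative assumption, not memind-core's actual rule, and `ConversationBuffer` is a hypothetical class.

```java
import java.util.*;

public class ConversationBuffer {

    record Message(String role, String text, long timestampMillis) {}

    private final long boundaryGapMillis;
    private final List<Message> buffer = new ArrayList<>();
    private final List<List<Message>> committedSegments = new ArrayList<>();

    ConversationBuffer(long boundaryGapMillis) {
        this.boundaryGapMillis = boundaryGapMillis;
    }

    // Buffer one message; if the gap since the previous message exceeds the
    // threshold, treat that as a conversation boundary and commit first.
    void addMessage(Message m) {
        if (!buffer.isEmpty()
                && m.timestampMillis() - buffer.get(buffer.size() - 1).timestampMillis() > boundaryGapMillis) {
            commit();
        }
        buffer.add(m);
    }

    // Commit the buffered messages as one segment. An addMessages()-style
    // call would hand a complete segment straight to this step.
    void commit() {
        if (!buffer.isEmpty()) {
            committedSegments.add(List.copyOf(buffer));
            buffer.clear();
        }
    }

    List<List<Message>> segments() { return committedSegments; }

    public static void main(String[] args) {
        ConversationBuffer cb = new ConversationBuffer(10 * 60 * 1000); // 10-minute gap
        cb.addMessage(new Message("user", "hi", 0));
        cb.addMessage(new Message("assistant", "hello", 1_000));
        cb.addMessage(new Message("user", "new topic", 3_600_000)); // an hour later: boundary
        cb.commit();
        System.out.println(cb.segments().size()); // prints 2
    }
}
```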
### Retrieve memory

Use `retrieve()` when your agent needs relevant context.
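On the vector-search side, retrieval reduces to scoring stored embeddings against a query embedding and keeping the best matches. The sketch below shows that core step with cosine similarity; the toy two-dimensional vectors stand in for real embedding-model output, and `VectorRetrieveSketch` is hypothetical, not a memind-core class.

```java
import java.util.*;

public class VectorRetrieveSketch {

    // Cosine similarity between two equal-length vectors.
    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // Return the ids of the topK stored vectors most similar to the query.
    static List<String> retrieve(double[] query, Map<String, double[]> store, int topK) {
        return store.entrySet().stream()
                .sorted(Comparator.comparingDouble(
                        (Map.Entry<String, double[]> e) -> cosine(query, e.getValue())).reversed())
                .limit(topK)
                .map(Map.Entry::getKey)
                .toList();
    }

    public static void main(String[] args) {
        Map<String, double[]> store = Map.of(
                "likes-espresso", new double[]{0.9, 0.1},
                "lives-in-berlin", new double[]{0.1, 0.9});
        System.out.println(retrieve(new double[]{1.0, 0.0}, store, 1)); // prints [likes-espresso]
    }
}
```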
## Runtime data

The quickstart writes local runtime data under `target/example-runtime/quickstart/`.

## Troubleshooting
### Missing API key

If `OPENAI_API_KEY` is missing, the example cannot create the model client. Set it before running the example.
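For example, in a POSIX shell (the key value below is a placeholder):

```shell
# Placeholder value -- use your real key.
export OPENAI_API_KEY=sk-your-key
```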
### Wrong Java version

Memind requires Java 21. Check your local Java version with `java -version`.

### Maven dependency download is slow

The first run may take longer because Maven downloads dependencies for the multi-module build. Run the same command again after dependencies are cached.

### OpenAI-compatible provider issues
If you use a non-default provider, check that `OPENAI_BASE_URL`, `OPENAI_CHAT_MODEL`, and `OPENAI_EMBEDDING_MODEL` match that provider.
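A sketch of pointing the example at a different provider; the URL and model names are placeholders, not recommendations:

```shell
# All values below are hypothetical placeholders.
export OPENAI_BASE_URL=https://my-provider.example.com
export OPENAI_CHAT_MODEL=my-chat-model
export OPENAI_EMBEDDING_MODEL=my-embedding-model
```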

