
Quickstart

Run the maintained Java quickstart example and retrieve your first Memind memory. This guide uses the default pure Java path: memind-core, Spring AI, SQLite, and a local file-based vector store. You will run a complete memory cycle:
conversation messages
  -> memory.addMessages(...)
  -> memory.retrieve(..., SIMPLE)
  -> retrieved context
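To make the add-then-retrieve shape concrete before running the real example, here is a toy, dependency-free stand-in. This is deliberately not the Memind API: it only mimics the cycle above with an in-memory list and naive keyword matching, where the real `memory.addMessages(...)` and `memory.retrieve(..., SIMPLE)` calls do LLM-based extraction and multi-signal retrieval.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

// Toy stand-in for the add -> retrieve cycle. Not the Memind API:
// it only mirrors the shape of the calls shown in this guide.
public class MemoryCycleSketch {
    record Message(String role, String content) {}

    static final List<String> store = new ArrayList<>();

    // Stands in for memory.addMessages(...): persist each message's content.
    static void addMessages(List<Message> messages) {
        for (Message m : messages) {
            store.add(m.content());
        }
    }

    // Stands in for memory.retrieve(..., SIMPLE): naive keyword match.
    static List<String> retrieve(String query) {
        String q = query.toLowerCase(Locale.ROOT);
        return store.stream()
            .filter(s -> s.toLowerCase(Locale.ROOT).contains(q))
            .toList();
    }

    public static void main(String[] args) {
        addMessages(List.of(
            new Message("user", "I mostly write Java and a bit of Kotlin"),
            new Message("assistant", "Noted!")));
        System.out.println(retrieve("java")); // matching stored contents
    }
}
```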

Prerequisites

Before you start, make sure you have:
  • Java 21
  • Maven
  • An OpenAI-compatible API key
Set your API key as an environment variable:
export OPENAI_API_KEY=your-api-key
By default, the Java examples use:
Setting                   Default
OPENAI_BASE_URL           https://api.openai.com
OPENAI_CHAT_MODEL         gpt-4o-mini
OPENAI_EMBEDDING_MODEL    text-embedding-3-small
SQLite database           target/example-runtime/quickstart/memind.db
Vector store              target/example-runtime/quickstart/vector-store.json
If you use an OpenAI-compatible provider, override the endpoint and model names to match it:
export OPENAI_BASE_URL=https://your-provider.example.com
export OPENAI_CHAT_MODEL=your-chat-model
export OPENAI_EMBEDDING_MODEL=your-embedding-model

Clone the repository

git clone https://github.com/openmemind/memind.git
cd memind

Run the quickstart example

Run the maintained quickstart example:
OPENAI_API_KEY=your-api-key \
mvn -pl memind-examples/memind-example-java -am -DskipTests exec:java \
  -Dexec.mainClass=com.openmemind.ai.memory.example.java.quickstart.QuickStartExample
If you already exported OPENAI_API_KEY, you can omit it from the command:
mvn -pl memind-examples/memind-example-java -am -DskipTests exec:java \
  -Dexec.mainClass=com.openmemind.ai.memory.example.java.quickstart.QuickStartExample

What the example does

The quickstart example runs two steps: it extracts memory from a prepared conversation, then retrieves relevant memory with a natural-language query.

Step 1: Extract memory

The example loads conversation messages from:
memind-examples/data/quickstart/messages.json
Then it creates a memory identity:
var memoryId = DefaultMemoryId.of("user-quickstart", "memind");
A Memind memory identity is made of:
Field      Meaning
userId     The user whose memory is being written or retrieved.
agentId    The agent or application that owns this memory namespace.
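As a purely illustrative sketch of that two-field shape (a plain record, not the real DefaultMemoryId class), an identity value could be modeled like this:

```java
// Illustrative only: a minimal identity value with the two fields
// described above. The quickstart itself uses
// DefaultMemoryId.of("user-quickstart", "memind").
public class MemoryIdSketch {
    record MemoryId(String userId, String agentId) {
        static MemoryId of(String userId, String agentId) {
            return new MemoryId(userId, agentId);
        }
    }

    public static void main(String[] args) {
        var id = MemoryId.of("user-quickstart", "memind");
        // userId scopes whose memory this is; agentId scopes which
        // agent or application owns the namespace.
        System.out.println(id.userId() + "/" + id.agentId());
    }
}
```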
The example writes the conversation messages with:
memory.addMessages(
        memoryId,
        messages,
        ExtractionConfig.defaults().withLanguage("Chinese"))
    .block();
This extracts memory from the message batch. Internally, Memind stores source-level Raw Data, extracts structured Memory Items, and updates insight-related buffers when insight extraction is enabled.

Step 2: Retrieve memory

After extraction, the example asks a natural-language question:
var query = "这个用户的技术背景是什么?"; // "What is this user's technical background?"
Then it retrieves memory with the SIMPLE strategy:
var retrieval = memory.retrieve(
        memoryId,
        query,
        RetrievalConfig.Strategy.SIMPLE).block();
SIMPLE retrieval is the low-latency retrieval path. It can combine vector search, keyword search, temporal signals, graph assist, memory-thread assist, and result fusion depending on the runtime options.
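To give one concrete picture of the result-fusion idea, here is a generic reciprocal-rank-fusion (RRF) sketch that merges ranked lists from two retrievers. This is a standard fusion technique, not Memind's actual internal fusion logic, and the sample result strings are invented:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Generic reciprocal-rank fusion (RRF): merges ranked result lists from
// different retrievers (e.g., vector search and keyword search).
// Illustrative only; Memind's fusion implementation may differ.
public class RrfSketch {
    static List<String> fuse(List<List<String>> rankings, int k) {
        Map<String, Double> scores = new HashMap<>();
        for (List<String> ranking : rankings) {
            for (int rank = 0; rank < ranking.size(); rank++) {
                // Each list contributes 1 / (k + rank + 1) to a result's score,
                // so items ranked highly by several retrievers rise to the top.
                scores.merge(ranking.get(rank), 1.0 / (k + rank + 1), Double::sum);
            }
        }
        List<String> fused = new ArrayList<>(scores.keySet());
        fused.sort((a, b) -> Double.compare(scores.get(b), scores.get(a)));
        return fused;
    }

    public static void main(String[] args) {
        var vectorHits = List.of("likes Java", "uses SQLite", "prefers dark mode");
        var keywordHits = List.of("uses SQLite", "prefers dark mode");
        System.out.println(fuse(List.of(vectorHits, keywordHits), 60));
    }
}
```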

Understand the core API

For most applications, the core API starts with three concepts.

Build a runtime

Memind is assembled through Memory.builder(). The quickstart runtime wires together:
  • a structured chat client
  • a memory store
  • a conversation buffer
  • text search
  • a vector store
  • runtime options
The full builder configuration is handled by the example support code. You do not need to write it manually for this quickstart.

Write memory

Use addMessages() when you already have a complete conversation segment:
memory.addMessages(memoryId, messages).block();
For streaming chat applications, Memind also supports addMessage(), which buffers messages and commits them when a conversation boundary is detected.
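The buffer-then-commit behavior can be pictured with a small stand-in. This is illustrative only: Memind's addMessage() performs its own boundary detection internally, and the "bye" heuristic below is an invented placeholder for it:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Toy conversation buffer: collects messages one at a time and "commits"
// the whole segment once a boundary is detected. Illustrative only;
// Memind's addMessage() applies its own boundary detection internally.
public class BufferSketch {
    private final List<String> buffer = new ArrayList<>();
    private final Consumer<List<String>> onCommit;

    BufferSketch(Consumer<List<String>> onCommit) {
        this.onCommit = onCommit;
    }

    void addMessage(String message) {
        buffer.add(message);
        if (isBoundary(message)) {
            // Commit the buffered segment as one batch, then start fresh.
            onCommit.accept(List.copyOf(buffer));
            buffer.clear();
        }
    }

    // Invented stand-in heuristic; the real library decides this itself.
    private boolean isBoundary(String message) {
        return message.toLowerCase().contains("bye");
    }

    public static void main(String[] args) {
        var committed = new ArrayList<List<String>>();
        var buf = new BufferSketch(committed::add);
        buf.addMessage("hi");
        buf.addMessage("I prefer Java");
        buf.addMessage("bye");
        System.out.println(committed); // one committed segment of three messages
    }
}
```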

Retrieve memory

Use retrieve() when your agent needs relevant context:
var retrieval = memory.retrieve(
        memoryId,
        "What does the user prefer?",
        RetrievalConfig.Strategy.SIMPLE).block();
Retrieval can return memory items, insights, raw-data references, evidence, and debug traces depending on the interface and configuration.

Runtime data

The quickstart writes local runtime data under:
target/example-runtime/quickstart/
The default files are:
target/example-runtime/quickstart/memind.db
target/example-runtime/quickstart/vector-store.json
If you want to rerun the example from a clean state, delete that directory:
rm -rf target/example-runtime/quickstart

Troubleshooting

Missing API key

If OPENAI_API_KEY is missing, the example cannot create the model client. Set it before running the example:
export OPENAI_API_KEY=your-api-key
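In your own application code you can fail fast with a clearer error when the key is missing. The helper below is plain Java with no Memind dependency, and the helper names are our own, not part of the library:

```java
// Plain-Java helpers (not part of Memind) for resolving the environment
// variables this guide relies on.
public class EnvCheck {
    // Throws with a clear message when a required variable is missing,
    // instead of failing later inside model-client creation.
    static String requireEnv(String name) {
        String value = System.getenv(name);
        if (value == null || value.isBlank()) {
            throw new IllegalStateException(
                "Missing required environment variable: " + name);
        }
        return value;
    }

    // Falls back to a default, mirroring how the quickstart treats
    // OPENAI_BASE_URL and the model-name variables.
    static String envOrDefault(String name, String fallback) {
        String value = System.getenv(name);
        return (value == null || value.isBlank()) ? fallback : value;
    }

    public static void main(String[] args) {
        // Resolve the base URL, falling back to the documented default.
        System.out.println(envOrDefault("OPENAI_BASE_URL", "https://api.openai.com"));
    }
}
```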

Wrong Java version

Memind requires Java 21. Check your local Java version:
java -version

Maven dependency download is slow

The first run may take longer because Maven downloads dependencies for the multi-module build. Run the same command again after dependencies are cached.

OpenAI-compatible provider issues

If you use a non-default provider, check that OPENAI_BASE_URL, OPENAI_CHAT_MODEL, and OPENAI_EMBEDDING_MODEL match that provider. For example:
export OPENAI_BASE_URL=https://your-provider.example.com
export OPENAI_CHAT_MODEL=your-chat-model
export OPENAI_EMBEDDING_MODEL=your-embedding-model