Documentation Index
Fetch the complete documentation index at: https://docs.openmemind.com/llms.txt
Use this file to discover all available pages before exploring further.
Memind Open Source can be used in several ways depending on how you want to run memory in your application.
Use this page to choose the right setup path:
| Path | Best for |
|---|---|
| Docker Compose | Fastest path. Start a local self-hosted stack with memind-server and Memind UI. |
| Java Library | Embed Memind directly inside a Java or Spring application. |
| Source Build | Build Memind from source, run examples, start the server locally, run Memind UI, or contribute to the project. |
Requirements
Memind requires Java 21 for all Java runtime paths.
| Requirement | Docker Compose | Java Library | Source Build |
|---|---|---|---|
| Java 21 | Not required locally when using Docker images | Required | Required |
| Maven | Not required locally when using Docker images | Required | Required |
| Docker | Required | Optional | Optional |
| Docker Compose | Required | Optional | Optional |
| Node.js | Not required locally when using Docker images | Not required | Required for memind-ui |
| pnpm | Not required locally when using Docker images | Not required | Required for memind-ui |
| OpenAI-compatible API key | Required | Required | Required for examples and server runtime |
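For the Java Library and Source Build paths, you can sanity-check that your JDK meets the Java 21 requirement. A minimal sketch that parses the `java -version` banner (the helper name is ours, not part of Memind):

```shell
# Returns success when the major version in a `java -version` banner is >= 21.
java_major_ok() {
  local banner="$1"
  local major
  major=$(printf '%s' "$banner" | sed -n 's/.*version "\([0-9]*\).*/\1/p')
  [ -n "$major" ] && [ "$major" -ge 21 ]
}

# Usage against your real JDK:
# java_major_ok "$(java -version 2>&1 | head -n 1)" || echo "Java 21+ required"
```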
For model access, set at least:
```bash
export OPENAI_API_KEY=your-api-key
```
Optional OpenAI-compatible settings:
```bash
export OPENAI_BASE_URL=https://api.openai.com
export OPENAI_CHAT_MODEL=gpt-4o-mini
export OPENAI_EMBEDDING_MODEL=text-embedding-3-small
```
OPENAI_BASE_URL, OPENAI_CHAT_MODEL, and OPENAI_EMBEDDING_MODEL are optional.
The chat and embedding model choices directly affect memory extraction, insight quality,
and retrieval quality.
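Because every path below requires the API key, it can help to fail fast when it is missing. A minimal bash sketch (the function name is ours, not part of Memind):

```shell
# A guard you can paste at the top of your own launch scripts: it fails
# with a message when OPENAI_API_KEY is unset or empty.
check_openai_key() {
  [ -n "${OPENAI_API_KEY:-}" ] || { echo "OPENAI_API_KEY is not set" >&2; return 1; }
}

# Example: with the variable set for a single call, the check passes.
OPENAI_API_KEY=your-api-key check_openai_key && echo "key present"
```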
Docker Compose
Use this path when you want the fastest local self-hosted Memind setup. Docker Compose starts:
- memind-server
- Memind UI
- A persistent Docker volume for local SQLite and vector-store data
Clone the repository:
```bash
git clone https://github.com/openmemind/memind.git
cd memind
```
Create a local .env file in the repository root. docker-compose.yml reads these values
automatically:
```bash
# Required.
OPENAI_API_KEY=your-api-key

# Optional provider and model overrides.
OPENAI_BASE_URL=https://openrouter.ai/api
OPENAI_CHAT_MODEL=openai/gpt-4o-mini
OPENAI_EMBEDDING_MODEL=openai/text-embedding-3-small

# Optional. Required only when you want an external rerank provider for deep retrieval.
MEMIND_RERANK_BASE_URL=https://aihubmix.com
MEMIND_RERANK_API_KEY=
MEMIND_RERANK_MODEL=jina-reranker-v3

# Optional host ports.
MEMIND_SERVER_PORT=8366
MEMIND_UI_PORT=8080
```
OPENAI_BASE_URL, OPENAI_CHAT_MODEL, and OPENAI_EMBEDDING_MODEL are optional.
The chat and embedding model choices directly affect memory extraction, insight quality,
and retrieval quality. If your embedding provider uses a different endpoint or key from chat,
also set EMBEDDING_BASE_URL and EMBEDDING_API_KEY.
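For example, a split setup where chat and embeddings come from different providers might look like this in `.env` (illustrative values only; use your own endpoints and keys):

```shell
# Chat via one OpenAI-compatible provider.
OPENAI_BASE_URL=https://openrouter.ai/api
OPENAI_API_KEY=your-chat-key

# Embeddings via another provider, using the override variables above.
EMBEDDING_BASE_URL=https://api.openai.com
EMBEDDING_API_KEY=your-embedding-key
```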
Start the stack:
```bash
docker compose up -d --build
```
After the images are built and the containers start:
| Service | URL |
|---|---|
| Memind UI | http://localhost:8080 |
| Server health check | http://localhost:8366/open/v1/health |
| Open API base path | http://localhost:8366/open/v1 |
| Admin API base path | http://localhost:8366/admin/v1 |
The UI container proxies /open/* and /admin/* to memind-server, so the browser can use
Memind UI as a same-origin local admin console.
Common commands:
```bash
# View logs
docker compose logs -f memind-server
docker compose logs -f memind-ui

# Stop containers but keep persisted memory data
docker compose down

# Stop containers and remove persisted memory data
docker compose down -v
```
By default, memind-server stores SQLite data and the fallback file vector store in the
named Docker volume memind-data, mounted at /app/data inside the container.
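If you prefer the data in a host directory (for example, to inspect or back up the SQLite file directly), a Compose override can replace the named volume with a bind mount. A sketch, assuming the service is named memind-server as in the logs commands above:

```yaml
# docker-compose.override.yml — mounts ./data on the host over /app/data
# in the container, replacing the named volume for that target path.
services:
  memind-server:
    volumes:
      - ./data:/app/data
```

Compose merges service volumes by container target path, so this override takes precedence over the memind-data mount from the base file.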
Docker Compose supports these common environment variables:
| Variable | Required | Default |
|---|---|---|
| OPENAI_API_KEY | Yes | None |
| OPENAI_BASE_URL | No | https://openrouter.ai/api |
| OPENAI_CHAT_MODEL | No | openai/gpt-4o-mini |
| OPENAI_EMBEDDING_MODEL | No | openai/text-embedding-3-small |
| EMBEDDING_BASE_URL | No | Same as OPENAI_BASE_URL |
| EMBEDDING_API_KEY | No | Same as OPENAI_API_KEY |
| MEMIND_SERVER_PORT | No | 8366 |
| MEMIND_UI_PORT | No | 8080 |
| MEMIND_DATASOURCE_URL | No | jdbc:sqlite:/app/data/memind-server.db |
| MEMIND_VECTOR_STORE_PATH | No | /app/data/vector-store.json |
| MEMIND_RERANK_BASE_URL | No | https://aihubmix.com |
| MEMIND_RERANK_API_KEY | No | Empty |
| MEMIND_RERANK_MODEL | No | jina-reranker-v3 |
Memind UI currently has no login or authorization flow. Keep it on localhost and do not expose the UI or proxied Memind server endpoints to public networks.
For deeper server configuration, see Server. For UI usage, see Admin UI.
Java Library
Use this path when you want to embed Memind directly into a Java application.
Import the Memind BOM first:
```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.openmemind.ai</groupId>
      <artifactId>memind-dependencies</artifactId>
      <version>0.2.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```
Then add the core runtime, Spring AI integration, and one storage plugin.
For a simple SQLite setup:
```xml
<dependencies>
  <dependency>
    <groupId>com.openmemind.ai</groupId>
    <artifactId>memind-core</artifactId>
  </dependency>
  <dependency>
    <groupId>com.openmemind.ai</groupId>
    <artifactId>memind-plugin-ai-spring-ai</artifactId>
  </dependency>
  <dependency>
    <groupId>com.openmemind.ai</groupId>
    <artifactId>memind-plugin-jdbc-sqlite</artifactId>
  </dependency>
</dependencies>
```
For MySQL or PostgreSQL, replace the SQLite plugin with one of:
```xml
<dependency>
  <groupId>com.openmemind.ai</groupId>
  <artifactId>memind-plugin-jdbc-mysql</artifactId>
</dependency>

<dependency>
  <groupId>com.openmemind.ai</groupId>
  <artifactId>memind-plugin-jdbc-postgresql</artifactId>
</dependency>
```
After adding dependencies, assemble the runtime with Memory.builder().
The detailed Java SDK setup is covered in Java SDK. If you want to run a working example first, start with the Quickstart.
Source Build
Use this path when you want to build Memind from source, run examples, start the server locally, run Memind UI, or contribute to the project.
Clone the repository:
```bash
git clone https://github.com/openmemind/memind.git
cd memind
```
Build the Java modules:
```bash
mvn clean install -DskipTests
```
Run the maintained Java quickstart example:
```bash
OPENAI_API_KEY=your-api-key \
mvn -pl memind-examples/memind-example-java -am -DskipTests exec:java \
  -Dexec.mainClass=com.openmemind.ai.memory.example.java.quickstart.QuickStartExample
```
Run memind-server from source
Start memind-server locally:
```bash
OPENAI_API_KEY=your-api-key \
mvn -pl memind-server -am -DskipTests spring-boot:run
```
The server starts on http://localhost:8366 by default.
Check health:
```bash
curl http://localhost:8366/open/v1/health
```
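If you script against the server, a small readiness wait avoids racing the startup. A sketch (the function name is ours; the default URL assumes the health endpoint and port documented on this page):

```shell
# Poll an endpoint until it responds successfully, up to a retry limit.
wait_for_health() {
  local url="${1:-http://localhost:8366/open/v1/health}"
  local tries="${2:-30}"
  local i
  for i in $(seq "$tries"); do
    # -f makes curl fail on HTTP errors; -sS keeps output quiet but shows errors.
    curl -fsS "$url" >/dev/null 2>&1 && return 0
    sleep 1
  done
  return 1
}

# Usage:
# wait_for_health && echo "memind-server is up"
```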
Run Memind UI from source
memind-ui is a standalone Vite React application. It is not bundled into memind-server.
Requirements:
- Node.js `^20.19.0 || ^22.12.0 || >=24.0.0`
- pnpm `>=9.0.0`
- Corepack (recommended)
Start memind-server first, then start the UI:
```bash
cd memind-ui
corepack enable
pnpm install
pnpm dev
```
Open the Vite URL shown in the terminal.
During development, Memind UI proxies same-origin API requests to the local server:
| UI request path | Proxied to |
|---|---|
| /admin | http://127.0.0.1:8366 |
| /open | http://127.0.0.1:8366 |
You can also start the UI with local mock data. Mock mode is frontend-only and is useful when a real memind-server has little or no data.
Memind UI currently has no login or authorization flow. Keep it on localhost and do not expose the dev server, built frontend, or proxied Memind server endpoints to public networks.
Build Memind UI
To build the frontend bundle, run `pnpm build`; to preview the built bundle locally, run `pnpm preview` (the standard Vite scripts).