

Memind Open Source can be used in several ways depending on how you want to run memory in your application. Use this page to choose the right setup path:
Path | Best for
Docker Compose | Fastest path. Start a local self-hosted stack with memind-server and Memind UI.
Java Library | Embed Memind directly inside a Java or Spring application.
Source Build | Build Memind from source, run examples, start the server locally, run Memind UI, or contribute to the project.

Requirements

Memind requires Java 21 for all Java runtime paths.
Requirement | Docker Compose | Java Library | Source Build
Java 21 | Not required locally when using Docker images | Required | Required
Maven | Not required locally when using Docker images | Required | Required
Docker | Required | Optional | Optional
Docker Compose | Required | Optional | Optional
Node.js | Not required locally when using Docker images | Not required | Required for memind-ui
pnpm | Not required locally when using Docker images | Not required | Required for memind-ui
OpenAI-compatible API key | Required | Required | Required for examples and server runtime
For model access, set at least:
export OPENAI_API_KEY=your-api-key
Optional OpenAI-compatible settings:
export OPENAI_BASE_URL=https://api.openai.com
export OPENAI_CHAT_MODEL=gpt-4o-mini
export OPENAI_EMBEDDING_MODEL=text-embedding-3-small
OPENAI_BASE_URL, OPENAI_CHAT_MODEL, and OPENAI_EMBEDDING_MODEL are optional. The chat and embedding model choices directly affect memory extraction, insight quality, and retrieval quality.
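As a small sketch of this setup, the optional settings can be given shell-level fallbacks so that anything left unset picks up the example values above. The `:=` expansions are a generic shell idiom, not something Memind requires:

```shell
# Demo: clear the optional settings first so the fallbacks below are
# exercised deterministically.
unset OPENAI_BASE_URL OPENAI_CHAT_MODEL OPENAI_EMBEDDING_MODEL

# The API key is always required.
export OPENAI_API_KEY="${OPENAI_API_KEY:-your-api-key}"

# Fill in the optional settings only if they are not already set.
: "${OPENAI_BASE_URL:=https://api.openai.com}"
: "${OPENAI_CHAT_MODEL:=gpt-4o-mini}"
: "${OPENAI_EMBEDDING_MODEL:=text-embedding-3-small}"
export OPENAI_BASE_URL OPENAI_CHAT_MODEL OPENAI_EMBEDDING_MODEL

echo "chat model: ${OPENAI_CHAT_MODEL}"
echo "embedding model: ${OPENAI_EMBEDDING_MODEL}"
```

This pattern is convenient in wrapper scripts: a value exported by the caller wins, and the documented example values apply otherwise.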

Docker Compose

Use this path when you want the fastest local self-hosted Memind setup. Docker Compose starts:
  • memind-server
  • Memind UI
  • A persistent Docker volume for local SQLite and vector-store data
Clone the repository:
git clone https://github.com/openmemind/memind.git
cd memind
Create a local .env file in the repository root. docker-compose.yml reads these values automatically:
# Required.
OPENAI_API_KEY=your-api-key

# Optional provider and model overrides.
OPENAI_BASE_URL=https://openrouter.ai/api
OPENAI_CHAT_MODEL=openai/gpt-4o-mini
OPENAI_EMBEDDING_MODEL=openai/text-embedding-3-small

# Optional. Required only when you want an external rerank provider for deep retrieval.
MEMIND_RERANK_BASE_URL=https://aihubmix.com
MEMIND_RERANK_API_KEY=
MEMIND_RERANK_MODEL=jina-reranker-v3

# Optional host ports.
MEMIND_SERVER_PORT=8366
MEMIND_UI_PORT=8080
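A missing API key is the most common reason the stack fails at startup, so a quick pre-flight check on .env can save a failed boot. The sketch below writes a demo .env into a scratch directory so it is self-contained; in the repository you would run the grep against the real file:

```shell
# Illustrative pre-flight check before `docker compose up`:
# confirm the required key is present in .env.
workdir="$(mktemp -d)"
printf 'OPENAI_API_KEY=your-api-key\n' > "$workdir/.env"

if grep -q '^OPENAI_API_KEY=..*' "$workdir/.env"; then
  status="present"
else
  status="missing"
fi
echo "OPENAI_API_KEY $status"

rm -rf "$workdir"
```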
OPENAI_BASE_URL, OPENAI_CHAT_MODEL, and OPENAI_EMBEDDING_MODEL are optional; the chat and embedding model choices directly affect memory extraction, insight quality, and retrieval quality. If your embedding provider uses a different endpoint or key from chat, also set EMBEDDING_BASE_URL and EMBEDDING_API_KEY.

Start the stack:
docker compose up -d --build
After the images are built and the containers start:
Service | URL
Memind UI | http://localhost:8080
Server health check | http://localhost:8366/open/v1/health
Open API base path | http://localhost:8366/open/v1
Admin API base path | http://localhost:8366/admin/v1
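These URLs assume the default ports. If you override MEMIND_SERVER_PORT or MEMIND_UI_PORT in .env, the URLs shift accordingly; a small shell sketch of the mapping:

```shell
# Demo: clear the port overrides so the documented defaults are exercised.
unset MEMIND_SERVER_PORT MEMIND_UI_PORT

# Fall back to the default ports when no override is set.
: "${MEMIND_SERVER_PORT:=8366}"
: "${MEMIND_UI_PORT:=8080}"

echo "Memind UI:      http://localhost:${MEMIND_UI_PORT}"
echo "Health check:   http://localhost:${MEMIND_SERVER_PORT}/open/v1/health"
echo "Open API base:  http://localhost:${MEMIND_SERVER_PORT}/open/v1"
echo "Admin API base: http://localhost:${MEMIND_SERVER_PORT}/admin/v1"
```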
The UI container proxies /open/* and /admin/* to memind-server, so the browser can use Memind UI as a same-origin local admin console.

Common commands:
# View logs
docker compose logs -f memind-server
docker compose logs -f memind-ui

# Stop containers but keep persisted memory data
docker compose down

# Stop containers and remove persisted memory data
docker compose down -v
By default, memind-server stores SQLite data and the fallback file vector store in the named Docker volume memind-data, mounted at /app/data inside the container.

Docker Compose supports these common environment variables:
Variable | Required | Default
OPENAI_API_KEY | Yes | None
OPENAI_BASE_URL | No | https://openrouter.ai/api
OPENAI_CHAT_MODEL | No | openai/gpt-4o-mini
OPENAI_EMBEDDING_MODEL | No | openai/text-embedding-3-small
EMBEDDING_BASE_URL | No | Same as OPENAI_BASE_URL
EMBEDDING_API_KEY | No | Same as OPENAI_API_KEY
MEMIND_SERVER_PORT | No | 8366
MEMIND_UI_PORT | No | 8080
MEMIND_DATASOURCE_URL | No | jdbc:sqlite:/app/data/memind-server.db
MEMIND_VECTOR_STORE_PATH | No | /app/data/vector-store.json
MEMIND_RERANK_BASE_URL | No | https://aihubmix.com
MEMIND_RERANK_API_KEY | No | Empty
MEMIND_RERANK_MODEL | No | jina-reranker-v3
Memind UI currently has no login or authorization flow. Keep it on localhost and do not expose the UI or proxied Memind server endpoints to public networks.
For deeper server configuration, see Server. For UI usage, see Admin UI.

Java Library

Use this path when you want to embed Memind directly into a Java application. Import the Memind BOM first:
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.openmemind.ai</groupId>
      <artifactId>memind-dependencies</artifactId>
      <version>0.2.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
Then add the core runtime, Spring AI integration, and one storage plugin. For a simple SQLite setup:
<dependencies>
  <dependency>
    <groupId>com.openmemind.ai</groupId>
    <artifactId>memind-core</artifactId>
  </dependency>

  <dependency>
    <groupId>com.openmemind.ai</groupId>
    <artifactId>memind-plugin-ai-spring-ai</artifactId>
  </dependency>

  <dependency>
    <groupId>com.openmemind.ai</groupId>
    <artifactId>memind-plugin-jdbc-sqlite</artifactId>
  </dependency>
</dependencies>
For MySQL or PostgreSQL, replace the SQLite plugin with one of:
<dependency>
  <groupId>com.openmemind.ai</groupId>
  <artifactId>memind-plugin-jdbc-mysql</artifactId>
</dependency>
<dependency>
  <groupId>com.openmemind.ai</groupId>
  <artifactId>memind-plugin-jdbc-postgresql</artifactId>
</dependency>
After adding dependencies, assemble the runtime with Memory.builder(). The detailed Java SDK setup is covered in Java SDK. If you want to run a working example first, start with the Quickstart.
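As a rough illustration of the assembly step, a pseudocode-style sketch of what the builder call might look like. Only Memory.builder() comes from this page; every other method and variable name below is an assumption for illustration, not the real API, so follow the Java SDK page for the actual setup:

```java
// Pseudocode sketch only — method and variable names besides
// Memory.builder() are hypothetical.
Memory memory = Memory.builder()
        .aiProvider(springAiChatAndEmbedding) // hypothetical: Spring AI plugin wiring
        .storage(sqliteStorage)               // hypothetical: the JDBC SQLite plugin
        .build();
```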

Source Build

Use this path when you want to build Memind from source, run examples, start the server locally, run Memind UI, or contribute to the project. Clone the repository:
git clone https://github.com/openmemind/memind.git
cd memind
Build the Java modules:
mvn clean install -DskipTests
Run the maintained Java quickstart example:
OPENAI_API_KEY=your-api-key \
mvn -pl memind-examples/memind-example-java -am -DskipTests exec:java \
  -Dexec.mainClass=com.openmemind.ai.memory.example.java.quickstart.QuickStartExample

Run memind-server from source

Start memind-server locally:
OPENAI_API_KEY=your-api-key \
mvn -pl memind-server -am -DskipTests spring-boot:run
The server starts on:
http://localhost:8366
Check health:
curl http://localhost:8366/open/v1/health

Run Memind UI from source

memind-ui is a standalone Vite React application. It is not bundled into memind-server. Requirements:
  • Node.js ^20.19.0 || ^22.12.0 || >=24.0.0
  • pnpm >=9.0.0
  • Corepack is recommended
Start memind-server first, then start the UI:
cd memind-ui
corepack enable
pnpm install
pnpm dev
Open the Vite URL shown in the terminal. During development, Memind UI proxies same-origin API requests to the local server:
UI request path | Proxied to
/admin | http://127.0.0.1:8366
/open | http://127.0.0.1:8366
You can also start the UI with local mock data:
pnpm dev:mock
Mock mode is frontend-only. It is useful when a real memind-server has little or no data.
Memind UI currently has no login or authorization flow. Keep it on localhost and do not expose the dev server, built frontend, or proxied Memind server endpoints to public networks.

Build Memind UI

To build the frontend bundle:
cd memind-ui
pnpm build
To preview the built UI locally:
pnpm preview