# Deployment

Ahnlich consists of two services that work together:

- `ahnlich-db`: in-memory vector store with exact similarity search
- `ahnlich-ai`: AI proxy that transforms raw inputs (text/image) into embeddings

The recommended production setup runs both services using Docker.
## Official Docker Images

Ahnlich provides prebuilt images on GitHub Container Registry:

- DB: `ghcr.io/deven96/ahnlich-db:latest`
- AI: `ghcr.io/deven96/ahnlich-ai:latest`
## Docker Compose Setup

The easiest deployment for local or cloud use:
```yaml
version: "3.8"

services:
  ahnlich_db:
    image: ghcr.io/deven96/ahnlich-db:latest
    command: >
      ahnlich-db run --host 0.0.0.0
      --enable-persistence
      --persist-location /root/.ahnlich/data/db.dat
      --persistence-interval 300
    ports:
      - "1369:1369"
    volumes:
      - ./data:/root/.ahnlich/data

  ahnlich_ai:
    image: ghcr.io/deven96/ahnlich-ai:latest
    command: >
      ahnlich-ai run --host 0.0.0.0
      --db-host ahnlich_db
      --enable-persistence
      --persist-location /root/.ahnlich/data/ai.dat
      --persistence-interval 300
    ports:
      - "1370:1370"
    volumes:
      - ./data:/root/.ahnlich/data
      - ./ahnlich_ai_model_cache:/root/.ahnlich/models
```
This configuration:
- Enables disk persistence (data survives restarts)
- Maps ports 1369 (DB) and 1370 (AI)
- Caches AI models across restarts
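With the file saved as `docker-compose.yml`, the stack can be started and inspected with the standard Compose commands:

```shell
docker compose up -d                 # start both services in the background
docker compose ps                    # confirm both containers are running
docker compose logs -f ahnlich_ai    # follow the AI proxy logs
```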
## Persistence

Without persistence, all data is in-memory and lost on restart. To enable it, pass:

```shell
--enable-persistence
--persist-location /root/.ahnlich/data/db.dat
--persistence-interval 300  # seconds
```
Mount the persist location to a host volume:

```yaml
volumes:
  - ./data:/root/.ahnlich/data
```
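One way to confirm snapshots are actually being written is to check that the persisted file's modification time stays within roughly one persistence interval. A minimal stdlib-only sketch (the path and 300-second interval mirror the configuration above; the 2x grace factor is an arbitrary choice, not an Ahnlich setting):

```python
import os
import time


def snapshot_is_fresh(path: str, interval_s: int = 300, grace: float = 2.0) -> bool:
    """Return True if `path` exists and was modified within grace * interval_s seconds."""
    if not os.path.exists(path):
        return False
    age = time.time() - os.path.getmtime(path)
    return age <= interval_s * grace


if __name__ == "__main__":
    # Host-side path corresponding to /root/.ahnlich/data/db.dat in the container
    print(snapshot_is_fresh("./data/db.dat"))
```

Running this from cron or a monitoring agent gives a cheap alert signal when persistence silently stops working.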
## Cloud Deployments

### AWS EC2
- Launch an EC2 instance
- Install Docker
- Run the DB:

  ```shell
  docker run -d \
    --name ahnlich_db \
    -p 1369:1369 \
    -v /data/ahnlich:/root/.ahnlich/data \
    ghcr.io/deven96/ahnlich-db:latest \
    ahnlich-db run --host 0.0.0.0 \
      --enable-persistence \
      --persist-location /root/.ahnlich/data/db.dat
  ```

- Run the AI proxy:

  ```shell
  docker run -d \
    --name ahnlich_ai \
    -p 1370:1370 \
    --link ahnlich_db \
    -v /data/ahnlich:/root/.ahnlich/data \
    -v /data/models:/root/.ahnlich/models \
    ghcr.io/deven96/ahnlich-ai:latest \
    ahnlich-ai run --host 0.0.0.0 \
      --db-host ahnlich_db \
      --enable-persistence \
      --persist-location /root/.ahnlich/data/ai.dat
  ```
Open ports 1369 and 1370 in your security group.
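Once the containers are running, a quick TCP reachability check of both ports helps verify the security-group rules. A small sketch using only the standard library (host and ports are the defaults from this guide):

```python
import socket


def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within `timeout` seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    for port in (1369, 1370):
        status = "open" if port_open("127.0.0.1", port) else "closed"
        print(f"port {port}: {status}")
```

Run it from outside the instance (replacing `127.0.0.1` with the public address) to test the security group itself rather than the local loopback.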
### GCP Compute Engine

- Create a VM instance
- Install Docker
- Follow the same Docker commands as for AWS EC2
- Create firewall rules for TCP ports 1369 and 1370
- Mount a persistent disk at `/data` for persistence
### Coolify

Coolify is a self-hosted PaaS supporting Docker images.

Steps:

- Create a new app → Docker Image
- Set the images:
  - DB: `ghcr.io/deven96/ahnlich-db:latest`
  - AI: `ghcr.io/deven96/ahnlich-ai:latest`
- Configure the run commands:
  - DB: `ahnlich-db run --host 0.0.0.0 --enable-persistence --persist-location /root/.ahnlich/data/db.dat`
  - AI: `ahnlich-ai run --host 0.0.0.0 --db-host ahnlich_db --enable-persistence --persist-location /root/.ahnlich/data/ai.dat`
- Mount volumes:
  - `/root/.ahnlich/data` (persistence)
  - `/root/.ahnlich/models` (AI model cache)
- Expose ports 1369 and 1370
### Google Cloud Run

Cloud Run supports gRPC containers with these requirements:

- Containers must listen on `$PORT` (use `--port $PORT`)
- Endpoints are exposed over HTTPS (port 443)
- Configure `ahnlich-ai` with `--db-host <Cloud Run URL>`
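As a sketch, deploying the DB service might look like the following. The service name and argument passing are illustrative assumptions (in particular, this assumes the image's entrypoint is the `ahnlich-db` binary); `--use-http2` is required by Cloud Run for end-to-end gRPC, and the container port set with `--port` is the value Cloud Run injects as `$PORT`:

```shell
gcloud run deploy ahnlich-db \
  --image=ghcr.io/deven96/ahnlich-db:latest \
  --use-http2 \
  --port=1369 \
  --args="run,--host,0.0.0.0,--port,1369"
```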
## Production Checklist

| Item | Recommendation |
|---|---|
| Ports | Expose 1369 (DB) and 1370 (AI) |
| DB connection | `ahnlich-ai` must use `--db-host` with a reachable address |
| Persistence | Enable with `--enable-persistence` and bind volumes |
| Model caching | Mount `/root/.ahnlich/models` for the AI service |
| Tracing | Optional: `--enable-tracing --otel-endpoint <collector>` |
| Security | Use TLS via a proxy/load balancer for external exposure |
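For the tracing row, the flags slot into the run command like this (the collector address is a placeholder for your own OpenTelemetry collector endpoint, not an Ahnlich default):

```shell
ahnlich-db run --host 0.0.0.0 \
  --enable-tracing \
  --otel-endpoint http://otel-collector:4317
```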