
Deployment

Ahnlich consists of two services that work together:

  • ahnlich-db: In-memory vector store with exact similarity search
  • ahnlich-ai: AI proxy that transforms raw inputs (text/image) into embeddings

The recommended production setup runs both services using Docker.

Official Docker Images

Ahnlich provides prebuilt images on GitHub Container Registry:

  • DB: ghcr.io/deven96/ahnlich-db:latest
  • AI: ghcr.io/deven96/ahnlich-ai:latest
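For a quick start, both images can be pulled directly from the registry. Pinning a versioned tag instead of latest is advisable for production; the tags below simply mirror the list above:

```shell
# Pull the official Ahnlich images from GitHub Container Registry
docker pull ghcr.io/deven96/ahnlich-db:latest
docker pull ghcr.io/deven96/ahnlich-ai:latest
```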

Docker Compose Setup

The easiest deployment for local or cloud use:

version: "3.8"

services:
  ahnlich_db:
    image: ghcr.io/deven96/ahnlich-db:latest
    command: >
      ahnlich-db run --host 0.0.0.0
      --enable-persistence
      --persist-location /root/.ahnlich/data/db.dat
      --persistence-interval 300
    ports:
      - "1369:1369"
    volumes:
      - ./data:/root/.ahnlich/data

  ahnlich_ai:
    image: ghcr.io/deven96/ahnlich-ai:latest
    command: >
      ahnlich-ai run --host 0.0.0.0
      --db-host ahnlich_db
      --enable-persistence
      --persist-location /root/.ahnlich/data/ai.dat
      --persistence-interval 300
    ports:
      - "1370:1370"
    volumes:
      - ./data:/root/.ahnlich/data
      - ./ahnlich_ai_model_cache:/root/.ahnlich/models

This configuration:

  • Enables disk persistence (data survives restarts)
  • Maps ports 1369 (DB) and 1370 (AI)
  • Caches AI models across restarts
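With the file saved as docker-compose.yml, the stack can be brought up and inspected as follows (a sketch of the usual Compose workflow, not Ahnlich-specific commands):

```shell
docker compose up -d                # start both services in the background
docker compose ps                   # confirm both containers are running
docker compose logs -f ahnlich_ai   # the AI service downloads models on first start
```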

Persistence

Without persistence, all data is in-memory and lost on restart. To enable:

--enable-persistence
--persist-location /root/.ahnlich/data/db.dat
--persistence-interval 300 # seconds

Mount the persist location to a host volume:

volumes:
- ./data:/root/.ahnlich/data

Cloud Deployments

AWS EC2

  1. Launch EC2 instance
  2. Install Docker
  3. Run DB:
    docker run -d \
    --name ahnlich_db \
    -p 1369:1369 \
    -v /data/ahnlich:/root/.ahnlich/data \
    ghcr.io/deven96/ahnlich-db:latest \
    ahnlich-db run --host 0.0.0.0 \
    --enable-persistence \
    --persist-location /root/.ahnlich/data/db.dat
  4. Run AI:
    docker run -d \
    --name ahnlich_ai \
    -p 1370:1370 \
    --link ahnlich_db \
    -v /data/ahnlich:/root/.ahnlich/data \
    -v /data/models:/root/.ahnlich/models \
    ghcr.io/deven96/ahnlich-ai:latest \
    ahnlich-ai run --host 0.0.0.0 \
    --db-host ahnlich_db \
    --enable-persistence \
    --persist-location /root/.ahnlich/data/ai.dat

Note: --link is a legacy Docker flag. On current Docker versions, prefer a user-defined network: run docker network create ahnlich once, then pass --network ahnlich to both docker run commands so the AI container can resolve ahnlich_db by name.

Open ports 1369 and 1370 in your security group.
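Once the security group is open, reachability can be verified from a client machine with a plain TCP probe; <ec2-public-ip> is a placeholder for the instance's address:

```shell
# Check that both service ports accept TCP connections
nc -zv <ec2-public-ip> 1369   # DB
nc -zv <ec2-public-ip> 1370   # AI
```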

GCP Compute Engine

  1. Create VM instance
  2. Install Docker
  3. Follow same Docker commands as AWS EC2
  4. Create firewall rules for TCP ports 1369 and 1370
  5. Mount a persistent disk to /data for persistence
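The firewall rules in step 4 can be created from the gcloud CLI; the rule name and network below are illustrative choices, not fixed values:

```shell
# Allow inbound TCP on the Ahnlich DB and AI ports
gcloud compute firewall-rules create allow-ahnlich \
  --network default \
  --direction INGRESS \
  --allow tcp:1369,tcp:1370
```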

Coolify

Coolify is a self-hosted PaaS that can deploy prebuilt Docker images.

Steps:

  1. Create new app β†’ Docker Image
  2. Set images:
    • DB: ghcr.io/deven96/ahnlich-db:latest
    • AI: ghcr.io/deven96/ahnlich-ai:latest
  3. Configure run commands:
    • DB: ahnlich-db run --host 0.0.0.0 --enable-persistence --persist-location /root/.ahnlich/data/db.dat
    • AI: ahnlich-ai run --host 0.0.0.0 --db-host ahnlich_db --enable-persistence --persist-location /root/.ahnlich/data/ai.dat
  4. Mount volumes:
    • /root/.ahnlich/data (persistence)
    • /root/.ahnlich/models (AI model cache)
  5. Expose ports 1369 and 1370

Google Cloud Run

Cloud Run supports gRPC containers with these requirements:

  • Containers listen on $PORT (use --port $PORT)
  • Expose endpoints over HTTPS (port 443)
  • Configure ahnlich-ai with --db-host <Cloud Run URL>

See the official Cloud Run gRPC guide for details.
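A deployment sketch for the DB service follows. The service name, region, and the command/args mapping are assumptions, not taken from official Ahnlich docs; --use-http2 enables the end-to-end HTTP/2 that gRPC requires on Cloud Run. Cloud Run has no persistent local disk, so the persistence flags from the Docker setup are omitted:

```shell
# Sketch only: adjust service name, region, and entrypoint to your image
gcloud run deploy ahnlich-db \
  --image ghcr.io/deven96/ahnlich-db:latest \
  --use-http2 \
  --port 1369 \
  --region us-central1 \
  --command ahnlich-db \
  --args "run,--host,0.0.0.0,--port,1369"
```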

Production Checklist

  • Ports: Expose 1369 (DB) and 1370 (AI)
  • DB Connection: ahnlich-ai must use --db-host with a reachable address
  • Persistence: Enable with --enable-persistence and bind volumes
  • Model Caching: Mount /root/.ahnlich/models for the AI service
  • Tracing: Optional: --enable-tracing --otel-endpoint <collector>
  • Security: Use TLS via a proxy/load balancer for external exposure
