Deployment

LeanCore runs as a containerized application stack deployed via Docker Compose.

Production Environment

Component        URL
Web Application  app.leancoreai.com
API Server       server.leancoreai.com
Monitoring       ops.leancoreai.com

Architecture

The production deployment includes:

  • Backend API -- Java Spring Boot application
  • Frontend -- Vue 3 SPA served via Cloudflare Pages
  • Database -- MySQL 8 with multi-tenant schemas
  • Vector Store -- Apache Cassandra for semantic search
  • MCP Connectors -- 20+ Node.js MCP servers
  • Messaging -- Evolution API for WhatsApp
  • Observability -- Langfuse for AI tracing
  • Monitoring -- Prometheus + Grafana + Tempo
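A stack like this is typically wired together in a single docker-compose.yml. The fragment below is an illustrative sketch only -- the service names, image tags, ports, and credentials are assumptions, not taken from the LeanCore repository:

```yaml
# Illustrative sketch -- image names, ports, and credentials are
# placeholders, not LeanCore's actual configuration.
services:
  backend:
    image: leancore/backend:latest   # Spring Boot API (assumed image name)
    ports:
      - "8080:8080"
    depends_on:
      - mysql
      - cassandra
  mysql:
    image: mysql:8                   # multi-tenant schemas
    environment:
      MYSQL_ROOT_PASSWORD: change-me # placeholder secret
  cassandra:
    image: cassandra:4               # vector store for semantic search
  langfuse:
    image: langfuse/langfuse:latest  # AI tracing
```

In practice the 20+ MCP connector servers would be additional entries in the same `services:` map, each with its own health check.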

Health Checks

Service      Health Endpoint
Backend API  GET /actuator/health
MCP Servers  GET /health
Monitoring   Grafana dashboards
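In deploy scripts, the backend's health can be checked by parsing the Spring Boot Actuator response, whose top-level shape is the standard `{"status":"UP"}`. A minimal sketch (a canned response stands in for the live call here):

```shell
# Extract the overall status from a Spring Boot Actuator health response.
# A canned response is used for illustration; in practice it would come from:
#   curl -fsS https://server.leancoreai.com/actuator/health
response='{"status":"UP"}'
status=$(printf '%s' "$response" | grep -o '"status":"[A-Z_]*"' | head -n1 | cut -d'"' -f4)
echo "$status"
```

A non-`UP` status (`DOWN`, `OUT_OF_SERVICE`) would signal the deploy pipeline to halt and roll back.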

Deployment Process

  1. Build -- compile backend JAR + build frontend assets
  2. Docker Build -- create container images
  3. Deploy -- update running containers
  4. Verify -- health checks on all services
  5. Post-Deploy -- reseed skills, generate routing cards if needed
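The verify step (step 4) usually polls each service until it reports healthy or a retry budget runs out. A minimal sketch, assuming a hypothetical `wait_healthy` helper (not from the LeanCore repo):

```shell
# Poll a health command until it succeeds or retries are exhausted.
# $@ = any command that exits 0 when the service is healthy, e.g.
#   wait_healthy curl -fsS https://server.leancoreai.com/actuator/health
wait_healthy() {
  retries=5
  while [ "$retries" -gt 0 ]; do
    if "$@"; then
      echo "healthy"
      return 0
    fi
    retries=$((retries - 1))
    sleep 1
  done
  echo "unhealthy"
  return 1
}

# Demonstration with a command that always succeeds:
wait_healthy true
```

Running the same helper against each service's health endpoint from the table above gives the pipeline a single pass/fail gate before post-deploy steps run.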

Post-Deploy Steps

After deploying code changes:

  1. Reseed Default Skills -- ensures new skills are available to all specialists
  2. Generate Routing Cards -- creates tool discovery metadata for any new tools
  3. Audit Routing Cards -- verifies all tools have proper routing metadata
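The three steps above can be scripted. The sketch below is hypothetical -- the admin endpoint paths are illustrative assumptions, not documented LeanCore routes -- and takes a runner command so it can be dry-run:

```shell
# Hypothetical post-deploy runner. The /admin/... paths below are
# illustrative placeholders, not confirmed LeanCore API routes.
post_deploy() {
  run=$1   # command used to issue requests; pass "echo" for a dry run
  api=https://server.leancoreai.com
  $run curl -fsS -X POST "$api/admin/skills/reseed"           # 1. reseed default skills
  $run curl -fsS -X POST "$api/admin/routing-cards/generate"  # 2. generate routing cards
  $run curl -fsS "$api/admin/routing-cards/audit"             # 3. audit routing metadata
}

# Dry run: print the requests instead of sending them.
post_deploy echo
```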

Monitoring

Three pre-built Grafana dashboards provide operational visibility:

System Overview

  • CPU and memory usage
  • Disk utilization
  • Network I/O
  • Container health and restarts

Application Performance

  • JVM heap and garbage collection
  • HTTP request rate and latency (p50, p95, p99)
  • Error rates by status code
  • Database connection pool status
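With Spring Boot's Micrometer Prometheus exporter, the latency percentiles above are typically derived from the `http_server_requests_seconds` histogram. A sketch of a p95 query (the metric name is Micrometer's default; label names may differ per setup):

```promql
histogram_quantile(0.95,
  sum(rate(http_server_requests_seconds_bucket[5m])) by (le))
```

Swapping 0.95 for 0.5 or 0.99 yields the p50 and p99 panels.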

MCP Health Grid

  • Server UP/DOWN status (green/red grid)
  • Health response times
  • Container restarts
  • Memory usage per connector
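The green/red grid is typically driven by Prometheus's built-in `up` metric, which is 1 when a scrape target responds and 0 when it does not. A sketch, assuming the connectors share a scrape job (the job name is an assumption):

```promql
up{job="mcp"}
```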
