# Konstruct
Build your AI workforce. Deploy AI employees that work in the channels your team already uses — Slack, WhatsApp, and the built-in web chat. Zero behavior change required.
## What is Konstruct?
Konstruct is an AI workforce platform where SMBs subscribe to AI employees. Each AI employee has a name, role, persona, and tools — and communicates through familiar messaging channels. Think of it as "hire an AI department" rather than "subscribe to another SaaS dashboard."
## Key Features
- Channel-native AI employees — Agents respond in Slack, WhatsApp, and the portal web chat
- Knowledge base — Upload documents (PDF, DOCX, PPTX, Excel, CSV, TXT, Markdown), URLs, and YouTube videos. Agents search them automatically.
- Google Calendar — Agents check availability, list events, and book meetings via OAuth
- Web search — Agents search the web via Brave Search API
- Real-time streaming — Web chat streams LLM responses word-by-word
- 6 pre-built templates — Customer Support Rep, Sales Assistant, Marketing Manager, Office Manager, Project Coordinator, Finance & Accounting Manager
- Employee wizard — 5-step guided setup or one-click template deployment
- 3-tier RBAC — Platform admin, customer admin, customer operator with email invitation flow
- Multilanguage — English, Spanish, Portuguese (portal UI + agent responses)
- Mobile + PWA — Bottom tab bar, full-screen chat, push notifications, offline support
- Stripe billing — Per-agent monthly pricing with 14-day free trial
- BYO API keys — Tenants can bring their own LLM provider keys (Fernet encrypted)
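The 3-tier RBAC model above can be sketched as a simple role-rank check. This is an illustrative pattern only, not Konstruct's actual implementation; the function and ranking are hypothetical, though the role names come from the feature list.

```python
# Illustrative 3-tier RBAC check (hypothetical names, not Konstruct's code).
from enum import Enum


class Role(str, Enum):
    PLATFORM_ADMIN = "platform_admin"
    CUSTOMER_ADMIN = "customer_admin"
    CUSTOMER_OPERATOR = "customer_operator"


# Higher rank means more privilege.
RANK = {
    Role.CUSTOMER_OPERATOR: 1,
    Role.CUSTOMER_ADMIN: 2,
    Role.PLATFORM_ADMIN: 3,
}


def can_access(user_role: Role, required: Role) -> bool:
    """True if user_role meets or exceeds the required role."""
    return RANK[user_role] >= RANK[required]


print(can_access(Role.CUSTOMER_ADMIN, Role.CUSTOMER_OPERATOR))  # True
print(can_access(Role.CUSTOMER_OPERATOR, Role.PLATFORM_ADMIN))  # False
```

A real deployment would attach roles to portal users (as the `portal_users.role` column in the Quick Start suggests) and enforce the check in API middleware.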
## Quick Start
### Prerequisites
- Docker + Docker Compose
- Ollama running on the host (port 11434)
- Node.js 22+ (for portal development)
- Python 3.12+ with `uv` (for backend development)
### Setup

```bash
# Clone
git clone https://git.oe74.net/adelorenzo/konstruct.git
cd konstruct

# Configure
cp .env.example .env
# Edit .env — set OLLAMA_MODEL, API keys, SMTP, etc.

# Start all services
docker compose up -d

# Create admin user
curl -X POST http://localhost:8001/api/portal/auth/register \
  -H "Content-Type: application/json" \
  -d '{"email": "admin@example.com", "password": "YourPassword123", "name": "Admin"}'

# Set as platform admin
docker exec konstruct-postgres psql -U postgres -d konstruct \
  -c "UPDATE portal_users SET role = 'platform_admin' WHERE email = 'admin@example.com';"
```
Open http://localhost:3000 and sign in.
## Services
| Service | Port | Description |
|---|---|---|
| Portal | 3000 | Next.js admin dashboard |
| Gateway | 8001 | FastAPI API + WebSocket |
| LLM Pool | internal | LiteLLM router (Ollama + commercial) |
| Celery Worker | internal | Background task processing |
| PostgreSQL | internal | Primary database with RLS + pgvector |
| Redis | internal | Cache, sessions, pub-sub, task queue |
## Architecture

```
Client (Slack / WhatsApp / Web Chat)
          │
          ▼
┌─────────────────────┐
│   Channel Gateway   │  Unified ingress, normalizes to KonstructMessage
│   (FastAPI :8001)   │
└─────────┬───────────┘
          │
          ▼
┌─────────────────────┐
│ Agent Orchestrator  │  Memory, tools, escalation, audit
│  (Celery / Direct)  │  Web chat streams directly (no Celery)
└─────────┬───────────┘
          │
          ▼
┌─────────────────────┐
│  LLM Backend Pool   │  LiteLLM → Ollama / Anthropic / OpenAI
└─────────────────────┘
```
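The gateway's job in the diagram above is to normalize every inbound channel payload into a single envelope. The sketch below shows one plausible shape for that envelope; the field names are hypothetical, since the actual `KonstructMessage` definition isn't reproduced in this README.

```python
# Hypothetical shape of the normalized message envelope; the real
# KonstructMessage fields in Konstruct may differ.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class KonstructMessage:
    tenant_id: str
    agent_id: str
    channel: str          # "slack" | "whatsapp" | "webchat"
    sender: str           # channel-specific user identifier
    text: str
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


def from_slack_event(tenant_id: str, agent_id: str, event: dict) -> KonstructMessage:
    """Normalize a (simplified) Slack message event into the envelope."""
    return KonstructMessage(
        tenant_id=tenant_id,
        agent_id=agent_id,
        channel="slack",
        sender=event["user"],
        text=event["text"],
    )


msg = from_slack_event("t1", "a1", {"user": "U123", "text": "hi"})
print(msg.channel, msg.sender)  # slack U123
```

Downstream components (orchestrator, audit log) can then handle one type regardless of which channel the message arrived on.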
## Tech Stack
### Backend
- Python 3.12+ — FastAPI, SQLAlchemy 2.0, Pydantic v2, Celery
- PostgreSQL 16 — RLS multi-tenancy, pgvector for embeddings
- Redis — Cache, pub-sub, task queue, sliding window memory
- LiteLLM — Unified LLM provider routing with fallback
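The "routing with fallback" that LiteLLM provides in the stack above boils down to trying providers in priority order and moving on when one fails. The sketch below shows the pattern in plain Python (it is deliberately not LiteLLM's API; the provider names and stub functions are illustrative).

```python
# Conceptual sketch of provider routing with fallback — the pattern the
# LLM Pool delegates to LiteLLM. Not LiteLLM's actual API.
from typing import Callable


def route_with_fallback(
    prompt: str,
    providers: list[tuple[str, Callable[[str], str]]],
) -> tuple[str, str]:
    """Try providers in order; return (provider_name, response)."""
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:
            errors.append(f"{name}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))


def flaky_ollama(prompt: str) -> str:
    raise ConnectionError("ollama unreachable")


def anthropic_stub(prompt: str) -> str:
    return f"answer to: {prompt}"


name, reply = route_with_fallback("ping", [
    ("ollama/qwen3:32b", flaky_ollama),
    ("anthropic/claude", anthropic_stub),
])
print(name)  # anthropic/claude
```

In production this is where LiteLLM earns its keep: it handles the per-provider request formats so the orchestrator only sees one completion interface.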
### Frontend
- Next.js 16 — App Router, standalone output
- Tailwind CSS v4 — Utility-first styling
- shadcn/ui — Component library (base-nova style)
- next-intl — Internationalization (en/es/pt)
- Serwist — Service worker for PWA
- DM Sans — Primary font
### Infrastructure
- Docker Compose — Development and deployment
- Alembic — Database migrations (14 migrations)
- Playwright — E2E testing (7 flows, 3 browsers)
- Gitea Actions — CI/CD pipeline
## Configuration

All configuration is via environment variables in `.env`:
| Variable | Description | Default |
|---|---|---|
| `OLLAMA_MODEL` | Ollama model for local inference | `qwen3:32b` |
| `OLLAMA_BASE_URL` | Ollama server URL | `http://host.docker.internal:11434` |
| `ANTHROPIC_API_KEY` | Anthropic API key (optional) | — |
| `OPENAI_API_KEY` | OpenAI API key (optional) | — |
| `BRAVE_API_KEY` | Brave Search API key | — |
| `FIRECRAWL_API_KEY` | Firecrawl API key for URL scraping | — |
| `STRIPE_SECRET_KEY` | Stripe billing key | — |
| `AUTH_SECRET` | JWT signing secret | — |
| `PLATFORM_ENCRYPTION_KEY` | Fernet key for BYO API key encryption | — |
See `.env.example` for the complete list.
## Project Structure

```
konstruct/
├── packages/
│   ├── gateway/          # Channel Gateway (FastAPI)
│   ├── orchestrator/     # Agent Orchestrator (Celery tasks)
│   ├── llm-pool/         # LLM Backend Pool (LiteLLM)
│   ├── router/           # Message Router (tenant resolution, rate limiting)
│   ├── shared/           # Shared models, config, API routers
│   └── portal/           # Admin Portal (Next.js 16)
├── migrations/           # Alembic DB migrations
├── tests/                # Backend test suite
├── docker-compose.yml    # Service definitions
├── .planning/            # GSD planning artifacts
└── .env                  # Environment configuration
```
## License
Proprietary. All rights reserved.