Features:
- 20 team modes (brainstorm, debate, consensus, red team, etc.)
- 3 autonomous pipelines (research, model eval, knowledge extraction)
- AutoResearch Lab with ratchet engine (Karpathy-inspired)
- Multi-provider support (Ollama, OpenRouter, OpenAI, Anthropic)
- Admin panel (providers, models, timeouts, OpenRouter browser)
- History panel with copy/iterate/re-pipe workflow
- Context budget system (smart truncation, safe_query, overflow recovery)
- PostgreSQL persistence (team_runs, pipeline_runs, lab_experiments, lab_trials)
- Pure Python + embedded HTML/CSS/JS, no external JS dependencies
- Inline SVG score charts in Lab monitor
- SSE streaming for real-time output
- Systemd service with auto-restart

Stack: Flask + Ollama + PostgreSQL + Bun-compatible
Hardware: RTX A4000 (16GB) + 128GB RAM

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
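Of the features listed, the SSE streaming for real-time output follows a standard Flask pattern. A minimal sketch of that pattern is below; the route name, generator, and chunk source are illustrative assumptions, not the project's actual code.

```python
from flask import Flask, Response

app = Flask(__name__)

def token_stream():
    # Placeholder: the real app would yield model output chunks as they arrive.
    for chunk in ["Hello", " ", "world"]:
        # SSE wire format: each event is "data: <payload>" followed by a blank line.
        yield f"data: {chunk}\n\n"

@app.route("/stream")
def stream():
    # text/event-stream tells the browser's EventSource to keep the connection open.
    return Response(token_stream(), mimetype="text/event-stream")
```

On the client side, a plain `EventSource("/stream")` consumes these events without any external JS dependency, consistent with the pure-Python + embedded JS approach above.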
[Unit]
Description=LLM Team UI - Multi-model team web interface
After=network.target ollama.service

[Service]
Type=simple
User=root
WorkingDirectory=/root
ExecStart=/usr/bin/python3 /root/llm_team_ui.py
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
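A quick way to sanity-check the unit file before installing it is to parse it with Python's `configparser`, which handles systemd's INI-like syntax; the `UNIT` string here just mirrors the file above, and `optionxform = str` preserves systemd's case-sensitive key names.

```python
import configparser

UNIT = """\
[Unit]
Description=LLM Team UI - Multi-model team web interface
After=network.target ollama.service

[Service]
Type=simple
User=root
WorkingDirectory=/root
ExecStart=/usr/bin/python3 /root/llm_team_ui.py
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
"""

parser = configparser.ConfigParser()
parser.optionxform = str  # keep keys like ExecStart case-sensitive
parser.read_string(UNIT)

# Verify the three sections systemd expects are all present.
assert parser.sections() == ["Unit", "Service", "Install"]
exec_start = parser["Service"]["ExecStart"]
```

After copying the file to /etc/systemd/system/, the usual `systemctl daemon-reload` and `systemctl enable --now <name>` activate it; `Restart=on-failure` with `RestartSec=5` gives the auto-restart behavior mentioned in the feature list.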