llm-team-ui/server/llm-team-ui.service
root 6ea457d01d Add server security configs and setup script
- Nginx configs with security headers (X-Frame-Options, CSP, etc.)
- fail2ban jails for nginx (botsearch, bad-request, forbidden)
- Kernel hardening via sysctl (rp_filter, no redirects, log martians)
- SSH hardening (no root, max 3 attempts, no X11)
- UFW rules export
- Idempotent setup.sh to restore all configs on fresh install
- Flask bound to 127.0.0.1 (nginx-only access)
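The "idempotent setup.sh" bullet implies the restore script can be re-run safely on an already-configured host. As a minimal sketch (the function name and paths here are illustrative, not taken from the actual script), one idempotent step might install a config file only when it differs from what is already in place:

```shell
#!/bin/sh
# Hypothetical sketch of one idempotent step from a setup script like the
# one described above. Re-running on an already-configured host is a no-op.
set -eu

install_config() {
    src="$1"; dst="$2"
    # If the destination exists and is byte-identical, do nothing.
    if [ -f "$dst" ] && cmp -s "$src" "$dst"; then
        echo "unchanged: $dst"
    else
        mkdir -p "$(dirname "$dst")"
        cp "$src" "$dst"
        echo "installed: $dst"
    fi
}
```

Repeating this pattern for each nginx, fail2ban, sysctl, and SSH config file is one common way to make the whole restore script safe to run on a fresh install or an existing one.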

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 04:47:54 -05:00


[Unit]
Description=LLM Team UI - Multi-model team web interface
After=network.target ollama.service

[Service]
Type=simple
User=root
WorkingDirectory=/root
ExecStart=/usr/bin/python3 /root/llm_team_ui.py
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
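The kernel-hardening bullet in the commit message (rp_filter, no redirects, log martians) typically maps to a sysctl drop-in along these lines; the file path and exact key selection below are illustrative assumptions, not the repository's actual config:

```ini
# /etc/sysctl.d/99-hardening.conf (illustrative path)
net.ipv4.conf.all.rp_filter = 1          ; reverse-path filtering on all interfaces
net.ipv4.conf.default.rp_filter = 1
net.ipv4.conf.all.accept_redirects = 0   ; ignore incoming ICMP redirects
net.ipv6.conf.all.accept_redirects = 0
net.ipv4.conf.all.send_redirects = 0     ; never emit ICMP redirects
net.ipv4.conf.all.log_martians = 1       ; log packets with impossible source addresses
```

Such a fragment is applied with `sysctl --system`, which an idempotent setup script can call after copying the file into place.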