# Documentation
Welcome to the PenLocal-AI documentation portal, your comprehensive guide to the self-hosted, AI-powered penetration testing platform.
## What is PenLocal-AI?
PenLocal-AI is an offline-first, fully self-hosted penetration testing platform that combines:
- AI-Powered Analysis - Local LLM inference via Ollama
- Automated Workflows - n8n orchestration for pentest execution
- Secure Architecture - Network isolation, encryption at rest, and MFA
- Team Collaboration - Share pentests, templates, and resources
```mermaid
flowchart LR
    subgraph User["User"]
        Pentester["Pentester"]
    end
    subgraph WebApp["Pentest Manager"]
        UI["Web UI"]
    end
    subgraph N8N["n8n Engine"]
        WF["Workflows"]
    end
    subgraph AI["AI Subsystem"]
        OLLAMA["Ollama LLM"]
        QDRANT[(Qdrant Vector DB)]
        KALI["Kali Linux"]
    end
    subgraph Storage["Storage"]
        MINIO[(MinIO)]
        PG[(PostgreSQL)]
    end

    Pentester --> UI
    UI --> WF
    WF --> OLLAMA
    OLLAMA --> QDRANT
    WF --> KALI
    WF --> MINIO
    UI --> PG
```
## Quick Links

## Key Features
### AI-Powered Pentesting
- Local LLM inference via Ollama
- Vector search for pentest knowledge retrieval
- Automated vulnerability analysis and reporting
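In production, knowledge retrieval runs through Ollama embeddings and Qdrant; the core ranking step, though, is just nearest-neighbor search over embedding vectors. A minimal pure-Python sketch (toy 3-dimensional vectors and document names are illustrative — real embeddings come from a model):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query: list[float], corpus: dict[str, list[float]], k: int = 3) -> list[str]:
    """Rank stored documents by similarity to the query vector."""
    ranked = sorted(corpus.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy embeddings standing in for vectors produced by an embedding model.
kb = {
    "sqli-notes": [0.9, 0.1, 0.0],
    "xss-notes":  [0.1, 0.9, 0.0],
    "recon-tips": [0.0, 0.2, 0.9],
}
print(top_k([0.8, 0.2, 0.1], kb, k=1))  # → ['sqli-notes']
```

A vector database like Qdrant performs the same ranking with approximate indexes so it scales past brute-force comparison.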
### Secure by Design
- Network isolation between Kali and internal services
- Fernet encryption for sensitive data at rest
- TOTP-based MFA with brute force protection
- API key authentication for all services
### Team Collaboration
- Share pentests, templates, and Ollama connections
- Team management with role-based access
- Vulnerability edit history and audit trails
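Role-based access boils down to a mapping from roles to permissions checked on each action. A minimal sketch — the role and permission names here are hypothetical, not PenLocal-AI's actual schema:

```python
# Hypothetical role model for illustration; the platform's real role
# names and permission strings may differ.
ROLE_PERMISSIONS = {
    "viewer": {"pentest:read"},
    "member": {"pentest:read", "pentest:edit"},
    "admin":  {"pentest:read", "pentest:edit", "team:manage"},
}

def can(role: str, permission: str) -> bool:
    """Return True if the given role grants the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(can("member", "team:manage"))  # → False
print(can("admin", "team:manage"))   # → True
```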
### Flexible Deployment
- CPU, NVIDIA GPU, or AMD GPU support
- Optional local Ollama installation (or connect to a remote instance instead)
- Cross-platform: Linux, macOS, Windows
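NVIDIA GPU support in Docker Compose is typically enabled with a device reservation on the LLM service. A representative fragment — the service name `ollama` and the exact layout of the project's compose file are assumptions here:

```yaml
# Illustrative compose fragment; adapt to the project's actual compose file.
services:
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

AMD GPU setups instead pass the ROCm device nodes (`/dev/kfd`, `/dev/dri`) through to the container.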
## Service Ports
| Service | Port | Description |
|---------|------|-------------|
| n8n | 443 | Workflow automation (HTTPS) |
| Pentest Manager | 8000 | Web application (HTTPS) |
| Ollama | 11434 | LLM inference (internal) |
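After startup, you can confirm each service is listening with a simple TCP check. A small helper sketch (the host/port values mirror the table above; any connectivity tooling works equally well):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Check whether a TCP service is accepting connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: verify the externally reachable services after installation.
for name, port in [("n8n", 443), ("Pentest Manager", 8000)]:
    status = "up" if port_open("127.0.0.1", port) else "down"
    print(f"{name}: {status}")
```

Note that Ollama's port 11434 is internal, so checking it from outside the Docker network is expected to fail.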
## Getting Started
```bash
# Clone the repository
git clone <repository-url>
cd PenLocal-AI

# Run the installer
./install.sh    # Linux/macOS
.\install.ps1   # Windows PowerShell

# Access the platform:
# 1. Create an n8n account: https://127.0.0.1
# 2. Set up an API key: https://127.0.0.1:8000
# 3. Add an Ollama connection
# 4. Start pentesting!
```
## Technology Stack
| Component | Technology |
|-----------|------------|
| Web Framework | Flask + Gunicorn |
| Database | PostgreSQL 16 |
| Vector DB | Qdrant |
| LLM Server | Ollama |
| Workflow Engine | n8n |
| Object Storage | MinIO |
| Reverse Proxy | Nginx |
| Container Runtime | Docker Compose |
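The Flask + Gunicorn pairing works because Gunicorn serves any WSGI callable, and a Flask application object is one. The contract can be shown with a bare-bones WSGI app (an illustration of the interface, not the platform's code; the module name in the run command is hypothetical):

```python
# Gunicorn serves any WSGI callable; Flask application objects implement
# this same interface (PEP 3333).
def app(environ, start_response):
    body = b"PenLocal-AI: ok\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Run with: gunicorn some_module:app
```

In the real stack, Nginx sits in front of Gunicorn, terminating HTTPS and proxying requests to the workers.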