Your diagnostic data contains sensitive server configurations, IP addresses, and system internals. Here is exactly how we handle it — with full transparency.
When you upload a diagnostic archive, this is what happens:
| Stage | What Happens | Duration |
|---|---|---|
| Upload | Archive saved to temporary storage on the server | Seconds |
| Analysis | Archive is extracted, parsed across 13 diagnostic categories, and a structured report is generated | 10–60 seconds |
| Archive Deletion | The uploaded archive is permanently deleted immediately after analysis completes (even if analysis fails) | Immediate |
| Report Storage | The parsed report (JSON) is stored so you can access the dashboard and export PDFs | 7 days (configurable) |
| Auto-Cleanup | Reports are automatically deleted after the TTL expires. No manual action required. | Automatic |
The application makes absolutely zero outbound HTTP requests. This is verifiable:
- The codebase imports no `requests`, `urllib`, `httpx`, or any other HTTP client.
- The server firewall is default deny outgoing, blocking all outbound connections except DNS, apt updates, and NTP.
- Run `ss -tunp | grep -v '127.0.0.1'` on the hosting server during an upload — you will see only inbound nginx connections and zero outbound connections from the application.
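The first check — no HTTP-client imports anywhere in the codebase — can itself be automated. Here is a sketch that walks a source tree and flags offending imports; the set of module names is illustrative, not an official list:

```python
import ast
from pathlib import Path

# Modules whose presence would indicate outbound HTTP capability (illustrative set)
HTTP_CLIENTS = {"requests", "urllib", "httpx", "aiohttp"}

def find_http_imports(source_root: Path) -> list[tuple[str, str]]:
    """Return (filename, module) pairs for every HTTP-client import found."""
    hits = []
    for py in source_root.rglob("*.py"):
        tree = ast.parse(py.read_text(), filename=str(py))
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                names = [alias.name for alias in node.names]
            elif isinstance(node, ast.ImportFrom):
                names = [node.module or ""]
            else:
                continue
            for name in names:
                # Match on the top-level package, so `urllib.request` is caught too
                if name.split(".")[0] in HTTP_CLIENTS:
                    hits.append((py.name, name))
    return hits
```

An empty result from `find_http_imports` on the application source supports the "zero outbound HTTP" claim at the code level; the firewall and `ss` checks then confirm it at the network level.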
For organizations that require on-premises deployment, the Linux Diagnostic Analyzer can be self-hosted:
- Deploy with a single `docker-compose up`
- Set `REPORT_TTL_DAYS` to control data retention (default: 7 days)
- Set `ENCRYPTION_KEY` to enable at-rest encryption
- Set `DAILY_UPLOAD_LIMIT` per your usage requirements

We built this tool to solve a real problem for SREs. We handle your data the way we would want ours handled — with minimal collection, maximum protection, and full transparency.
Start Analyzing