# ngxstat

Per-domain Nginx log analytics with hybrid static reports and live insights.
## Generating Reports

Use the `generate_reports.py` script to build aggregated JSON and HTML snippet files from `database/ngxstat.db`.
Create a virtual environment and install dependencies:

```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
Then run one or more of the interval commands:

```bash
python scripts/generate_reports.py hourly
python scripts/generate_reports.py daily
python scripts/generate_reports.py weekly
python scripts/generate_reports.py monthly
```
Each command accepts optional flags to generate per-domain reports. Use `--domain <name>` to limit output to a specific domain, or `--all-domains` to generate a subdirectory for every domain found in the database:

```bash
# Hourly reports for example.com only
python scripts/generate_reports.py hourly --domain example.com

# Weekly reports for all domains individually
python scripts/generate_reports.py weekly --all-domains
```
Reports are written under the `output/` directory. Each command updates the corresponding `<interval>.json` file and writes one HTML snippet per report. These snippets are loaded dynamically by the main dashboard using Chart.js and DataTables.
## Configuring Reports

Report queries are defined in `reports.yml`. Each entry specifies a `name`, an optional `label` and `chart` type, and a SQL `query` that must return `bucket` and `value` columns. The special token `{bucket}` is replaced with the appropriate SQLite `strftime` expression for each interval (hourly, daily, weekly, or monthly), so a single definition works across all durations.
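The token substitution can be sketched as a simple mapping from interval to `strftime` expression. This is an illustrative sketch only: the exact format strings and column name (`time`) used by `generate_reports.py` are assumptions, not taken from the source.

```python
# Hypothetical interval -> SQLite strftime expression mapping.
# The actual formats in generate_reports.py may differ.
BUCKET_FORMATS = {
    "hourly": "strftime('%Y-%m-%d %H:00', time)",
    "daily": "strftime('%Y-%m-%d', time)",
    "weekly": "strftime('%Y-%W', time)",
    "monthly": "strftime('%Y-%m', time)",
}

def expand_query(query: str, interval: str) -> str:
    """Replace the {bucket} token with the interval's strftime expression."""
    return query.replace("{bucket}", BUCKET_FORMATS[interval])
```

With this, a single query definition such as `SELECT {bucket} AS bucket, ...` expands to four different time-bucketed queries, one per interval.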
When `generate_reports.py` runs, every definition is executed for the requested interval, producing `output/<interval>/<name>.json` plus a small HTML snippet `output/<interval>/<name>.html` used by the dashboard.
Example snippet:

```yaml
- name: hits
  chart: bar
  query: |
    SELECT {bucket} AS bucket,
           COUNT(*) AS value
    FROM logs
    GROUP BY bucket
    ORDER BY bucket
```
Add or modify entries in `reports.yml` to tailor the generated metrics.
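To see what such a definition produces, the example query can be run against an in-memory SQLite database. The `logs` table here has a single `time` column for illustration; the real schema in `database/ngxstat.db` is not documented in this section and may differ.

```python
import sqlite3

# The 'hits' query above with {bucket} expanded for the daily interval
# (the strftime format is an assumption for illustration).
query = """
SELECT strftime('%Y-%m-%d', time) AS bucket,
       COUNT(*) AS value
FROM logs
GROUP BY bucket
ORDER BY bucket
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logs (time TEXT)")
conn.executemany(
    "INSERT INTO logs (time) VALUES (?)",
    [("2024-01-01 10:00:00",), ("2024-01-01 11:00:00",), ("2024-01-02 09:00:00",)],
)
rows = conn.execute(query).fetchall()
# rows == [('2024-01-01', 2), ('2024-01-02', 1)]
```

Each row pairs a time bucket with a count, which is exactly the `bucket`/`value` shape the report pipeline expects.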
## Importing Logs

Use the `run-import.sh` script to set up the Python environment if needed and import the latest Nginx log entries into `database/ngxstat.db`:

```bash
./run-import.sh
```
This script is suitable for cron jobs: it creates the virtual environment on first run, installs dependencies, and reuses the environment on subsequent runs.

The importer handles rotated logs in order from oldest to newest, so entries are processed exactly once. If you rerun the script, it only ingests records with a timestamp newer than the latest one already stored in the database, preventing duplicates.
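The duplicate-prevention idea can be sketched as follows. The table and column names (`logs`, `time`, `host`) and the dict-shaped entries are assumptions for illustration, not the importer's actual code.

```python
import sqlite3

def newest_timestamp(conn: sqlite3.Connection):
    """Return the latest stored timestamp, or None for an empty table."""
    return conn.execute("SELECT MAX(time) FROM logs").fetchone()[0]

def import_entries(conn: sqlite3.Connection, entries: list[dict]) -> int:
    """Insert only entries newer than anything already stored."""
    latest = newest_timestamp(conn)
    fresh = [e for e in entries if latest is None or e["time"] > latest]
    conn.executemany("INSERT INTO logs (time, host) VALUES (:time, :host)", fresh)
    return len(fresh)
```

Rerunning the import with the same entries inserts nothing, since no entry is newer than the stored maximum.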
## Cron Report Generation

Use the `run-reports.sh` script to run all report intervals in one step. The script sets up the Python environment the same way as `run-import.sh`, making it convenient for automation via cron:

```bash
./run-reports.sh
```
Running this script creates or updates the hourly, daily, weekly, and monthly reports under `output/`. It also detects all unique domains found in the database and writes per-domain reports to `output/domains/<domain>/<interval>` alongside the aggregate data. After generation, open `output/index.html` in your browser to browse the reports.
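For unattended operation, both scripts can be scheduled together. The schedule and paths below are illustrative; adjust them to your checkout location and desired frequency.

```cron
# Example crontab: import new log entries every 15 minutes,
# then regenerate all reports shortly after each hour.
*/15 * * * * /path/to/ngxstat/run-import.sh >> /var/log/ngxstat-import.log 2>&1
5 * * * *    /path/to/ngxstat/run-reports.sh >> /var/log/ngxstat-reports.log 2>&1
```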
## Log Analysis

The `run-analysis.sh` script runs helper routines that inspect the database. It creates or reuses the virtual environment and then executes a set of analysis commands to spot missing domains, suggest cache rules, and detect potential threats:

```bash
./run-analysis.sh
```
## Serving Reports with Nginx

To expose the generated HTML dashboards and JSON files over HTTP, you can use a simple Nginx server block. Point the `root` directive to the repository's `output/` directory and optionally restrict access to your local network.
```nginx
server {
    listen 80;
    server_name example.com;

    # Path to the generated reports
    root /path/to/ngxstat/output;

    location / {
        try_files $uri $uri/ =404;
    }

    # Allow access only from private (RFC 1918) networks
    allow 192.168.0.0/16;
    allow 10.0.0.0/8;
    deny all;
}
```

With this configuration the generated static files are served directly by Nginx, while connections from outside the `192.168.*` and `10.*` ranges are denied.