pytest-resource-mon
A pytest plugin that snapshots system resources (CPU, memory, disk) for every test and ships the metrics to Tinybird or a local file.
Key Features
- Per-test resource snapshots — captures CPU, memory, and disk usage before and after each test.
- Batched delivery — groups test records into configurable batches to reduce network overhead.
- CI context — automatically captures GitHub Actions environment variables so metrics can be correlated with runs, commits, and workflows.
- Flexible output — send metrics to Tinybird over HTTP or write NDJSON to a local file.
How It Works
The plugin emits three event types:
- session_start — records total CPU count, memory, and disk at the beginning of the test session.
- test — records before/after snapshots of CPU %, available memory, and free disk for each test, plus duration.
- session_end — records final resource state and the pytest exit status.
Next Steps
See Getting Started to install and run the plugin.
Getting Started
Installation
pip install pytest-resource-mon
Or with uv:
uv add pytest-resource-mon
The plugin requires Python 3.9+ and depends on pytest>=7.0 and psutil>=5.9.
Quick Start
Write metrics to a local file
pytest --tinybird-file metrics.ndjson
This writes one NDJSON line per event to metrics.ndjson.
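Because each line is a standalone JSON object, the file is easy to post-process. A minimal sketch (the two sample records are hypothetical, trimmed to a few fields for brevity):

```python
import json

def read_metrics(path):
    """Parse an NDJSON metrics file: one JSON object per line."""
    with open(path) as f:
        return [json.loads(line) for line in f if line.strip()]

# Hypothetical two-line sample in the format the plugin writes.
with open("metrics.ndjson", "w") as f:
    f.write('{"event_type": "test", "test_nodeid": "tests/test_a.py::test_x", "duration_s": 0.5}\n')
    f.write('{"event_type": "session_end", "test_nodeid": "__session__", "exit_status": 0}\n')

tests = [r for r in read_metrics("metrics.ndjson") if r["event_type"] == "test"]
print(len(tests))  # 1
```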
Send metrics to Tinybird
export TINYBIRD_WRITE_TOKEN=your-token
pytest
The plugin activates automatically when TINYBIRD_WRITE_TOKEN is set.
Disable the plugin
pytest --tinybird-disable
This flag prevents the plugin from registering, even if a token or file path is configured.
Activation Rules
The plugin activates when either of these conditions is true:
- The TINYBIRD_WRITE_TOKEN environment variable is set (Tinybird HTTP mode).
- The --tinybird-file flag is provided (local file mode).
If TINYBIRD_WRITE_TOKEN is set, it takes priority over --tinybird-file. If neither is configured, the plugin does nothing.
Passing --tinybird-disable always prevents activation regardless of other settings.
Configuration
CLI Flags
| Flag | Default | Description |
|---|---|---|
| --tinybird-batch-size | 50 | Number of test records to buffer before sending a batch. |
| --tinybird-disable | false | Disable the plugin entirely. |
| --tinybird-file | (none) | Write NDJSON metrics to a local file instead of sending over HTTP. |
Environment Variables
| Variable | Required | Default | Description |
|---|---|---|---|
| TINYBIRD_WRITE_TOKEN | No | (none) | Tinybird auth token. When set, the plugin sends metrics over HTTP. |
| TINYBIRD_API_URL | No | https://api.tinybird.co | Tinybird API base URL. Only used when TINYBIRD_WRITE_TOKEN is set. |
Activation Logic
if --tinybird-disable:
plugin does not register
elif TINYBIRD_WRITE_TOKEN is set:
use Tinybird HTTP writer
elif --tinybird-file is set:
use local file writer
else:
plugin does not register
TINYBIRD_WRITE_TOKEN takes priority over --tinybird-file. If both are set, the HTTP writer is used.
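The decision above can be sketched as a small Python function (select_writer is a hypothetical name for illustration, not part of the plugin's API):

```python
import os
from typing import Optional

def select_writer(disable: bool, file_path: Optional[str]) -> Optional[str]:
    """Mirror the activation rules: disable wins, then the token, then the file flag."""
    if disable:
        return None
    if os.environ.get("TINYBIRD_WRITE_TOKEN"):
        return "http"
    if file_path:
        return "file"
    return None
```

For example, with no token in the environment, select_writer(False, "metrics.ndjson") chooses the local file writer; setting TINYBIRD_WRITE_TOKEN flips the same call to the HTTP writer.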
Output Formats
The plugin supports two output backends: Tinybird HTTP and local file.
Tinybird HTTP
Activated when TINYBIRD_WRITE_TOKEN is set.
Sends NDJSON payloads to the Tinybird Events API:
POST {TINYBIRD_API_URL}/v0/events?name=ci_test_metrics
Authorization: Bearer {TINYBIRD_WRITE_TOKEN}
Content-Type: application/x-ndjson
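Using only the standard library, the request above can be built like this (a sketch that constructs the request but does not send it; build_events_request is a hypothetical helper name):

```python
import json
import os
import urllib.request

def build_events_request(records, api_url=None, token=None):
    """Build the NDJSON POST to the Tinybird Events API shown above."""
    api_url = api_url or os.environ.get("TINYBIRD_API_URL", "https://api.tinybird.co")
    token = token or os.environ.get("TINYBIRD_WRITE_TOKEN", "")
    # One JSON object per line, newline-separated (NDJSON).
    body = "\n".join(json.dumps(r) for r in records).encode()
    return urllib.request.Request(
        f"{api_url}/v0/events?name=ci_test_metrics",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/x-ndjson",
        },
        method="POST",
    )
```

Passing the request to urllib.request.urlopen would perform the actual send.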
Batching
Test records are buffered and sent in batches controlled by --tinybird-batch-size (default: 50). Session start and session end events are sent immediately (not batched).
Each record in a batch is stamped with a batch_num field that increments per flush.
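The buffering behavior can be sketched as follows (BatchBuffer is a hypothetical class for illustration; the real plugin's internals may differ):

```python
class BatchBuffer:
    """Buffer test records and flush when batch_size is reached."""

    def __init__(self, send, batch_size=50):
        self.send = send          # callable that delivers a list of records
        self.batch_size = batch_size
        self.buffer = []
        self.batch_num = 0

    def add(self, record):
        self.buffer.append(record)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        self.batch_num += 1
        for record in self.buffer:
            record["batch_num"] = self.batch_num  # stamped per flush
        self.send(self.buffer)
        self.buffer = []
```

A final flush at session end delivers any records left below the batch-size threshold.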
Retries
If a request fails, the plugin retries once. If the retry also fails, the batch is dropped and a warning is logged. Failures never break the test run.
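That retry policy amounts to roughly this (a sketch; send_with_retry is a hypothetical name):

```python
import logging

log = logging.getLogger("pytest-resource-mon")

def send_with_retry(send, batch):
    """Try once, retry once, then drop the batch with a warning.

    Never raises, so a delivery failure cannot break the test run.
    """
    for attempt in (1, 2):
        try:
            send(batch)
            return True
        except Exception as exc:
            if attempt == 2:
                log.warning("dropping batch after failed retry: %s", exc)
    return False
```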
Local File
Activated when --tinybird-file is provided (and TINYBIRD_WRITE_TOKEN is not set).
pytest --tinybird-file metrics.ndjson
Writes one JSON object per line (NDJSON format). The file is opened in append mode, so multiple runs accumulate in the same file.
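The writer side is simple enough to sketch in a few lines (append_event is a hypothetical helper; the demo records are trimmed for brevity):

```python
import json

def append_event(path, record):
    """Write one NDJSON line; append mode lets successive runs accumulate."""
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

# Two appends, as two runs (or two events) would produce.
append_event("out.ndjson", {"event_type": "session_start", "test_nodeid": "__session__"})
append_event("out.ndjson", {"event_type": "session_end", "exit_status": 0})
```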
Example Output
{"event_type": "session_start", "test_nodeid": "__session__", "timestamp": "2025-01-15T10:30:00+00:00", "cpu_count": 8, "mem_total_bytes": 17179869184, "disk_total_bytes": 499963174912, "gh_run_id": "", "gh_sha": "", ...}
{"event_type": "test", "test_nodeid": "tests/test_example.py::test_one", "timestamp": "2025-01-15T10:30:01+00:00", "duration_s": 0.1234, "cpu_before": 12.5, "cpu_after": 15.0, "mem_available_before": 8589934592, "mem_available_after": 8489934592, "batch_num": 1, ...}
{"event_type": "session_end", "test_nodeid": "__session__", "timestamp": "2025-01-15T10:30:05+00:00", "exit_status": 0, "cpu_count": 8, "mem_total_bytes": 17179869184, "disk_total_bytes": 499963174912, ...}
GitHub Actions Integration
When running in GitHub Actions, the plugin automatically captures environment variables and includes them in every event record.
Captured Variables
| GitHub Env Var | Metric Field | Description |
|---|---|---|
| GITHUB_RUN_ID | gh_run_id | Unique ID for the workflow run |
| GITHUB_SHA | gh_sha | Commit SHA that triggered the run |
| GITHUB_REF_NAME | gh_ref_name | Branch or tag name |
| GITHUB_WORKFLOW | gh_workflow | Workflow name |
| GITHUB_JOB | gh_job | Job ID |
| GITHUB_ACTOR | gh_actor | User or app that triggered the run |
| GITHUB_REPOSITORY | gh_repository | Owner/repo (e.g. org/repo) |
| GITHUB_RUN_ATTEMPT | gh_run_attempt | Retry attempt number |
If a variable is not set (e.g. running locally), the corresponding field is an empty string.
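The mapping in the table above boils down to a dictionary lookup with empty-string defaults (a sketch; github_context is a hypothetical function name):

```python
import os

# GITHUB_* environment variable -> metric field, per the table above.
GH_VARS = {
    "GITHUB_RUN_ID": "gh_run_id",
    "GITHUB_SHA": "gh_sha",
    "GITHUB_REF_NAME": "gh_ref_name",
    "GITHUB_WORKFLOW": "gh_workflow",
    "GITHUB_JOB": "gh_job",
    "GITHUB_ACTOR": "gh_actor",
    "GITHUB_REPOSITORY": "gh_repository",
    "GITHUB_RUN_ATTEMPT": "gh_run_attempt",
}

def github_context():
    """Unset variables fall back to empty strings, as when running locally."""
    return {field: os.environ.get(var, "") for var, field in GH_VARS.items()}
```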
Example Workflow
name: Tests
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.12"
- run: pip install pytest-resource-mon
- run: pytest
env:
TINYBIRD_WRITE_TOKEN: ${{ secrets.TINYBIRD_WRITE_TOKEN }}
No extra configuration is needed — the GITHUB_* variables are set automatically by the Actions runner.
Metrics Schema
The ci_test_metrics datasource stores all events. The schema is sorted by (timestamp, test_nodeid).
Event Types
Every record has an event_type field. The plugin emits three types:
| Event Type | test_nodeid | When |
|---|---|---|
| session_start | __session__ | Beginning of the pytest session |
| test | Full node ID (e.g. tests/test_example.py::test_one) | After each test’s teardown |
| session_end | __session__ | End of the pytest session |
Full Schema
| Field | Type | Description | Events |
|---|---|---|---|
| event_type | String | session_start, test, or session_end | all |
| test_nodeid | String | Test node ID or __session__ | all |
| timestamp | DateTime64(3) | UTC timestamp (ISO 8601) | all |
| batch_num | Int32 | Batch sequence number | test |
| duration_s | Float64 | Test wall-clock duration in seconds | test |
| cpu_before | Float64 | CPU usage % before test | test |
| cpu_after | Float64 | CPU usage % after test | test |
| cpu_count | Int32 | Logical CPU count | session_start, session_end |
| mem_total_bytes | Int64 | Total physical memory in bytes | session_start, session_end |
| mem_available_before | Int64 | Available memory before test (bytes) | test |
| mem_available_after | Int64 | Available memory after test (bytes) | test |
| mem_percent_before | Float64 | Memory usage % before test | test |
| mem_percent_after | Float64 | Memory usage % after test | test |
| disk_total_bytes | Int64 | Total disk space in bytes | session_start, session_end |
| disk_free_before | Int64 | Free disk space before test (bytes) | test |
| disk_free_after | Int64 | Free disk space after test (bytes) | test |
| disk_percent_before | Float64 | Disk usage % before test | test |
| disk_percent_after | Float64 | Disk usage % after test | test |
| exit_status | Int32 | Pytest exit code | session_end |
| gh_run_id | String | GitHub Actions run ID | all |
| gh_sha | String | Git commit SHA | all |
| gh_ref_name | String | Branch or tag name | all |
| gh_workflow | String | Workflow name | all |
| gh_job | String | Job ID | all |
| gh_actor | String | Actor who triggered the run | all |
| gh_repository | String | Repository (owner/repo) | all |
| gh_run_attempt | String | Run attempt number | all |
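As a worked example of the schema, this sketch ranks test events by duration_s when metrics were written to a local NDJSON file (slowest_tests is a hypothetical helper; the sample records are trimmed to the relevant fields):

```python
import json

def slowest_tests(path, n=5):
    """Return the n slowest test events from an NDJSON metrics file."""
    with open(path) as f:
        records = [json.loads(line) for line in f if line.strip()]
    tests = [r for r in records if r["event_type"] == "test"]
    return sorted(tests, key=lambda r: r["duration_s"], reverse=True)[:n]

# Hypothetical sample file for the demo.
with open("sample.ndjson", "w") as f:
    f.write('{"event_type": "test", "test_nodeid": "t.py::fast", "duration_s": 0.1}\n')
    f.write('{"event_type": "test", "test_nodeid": "t.py::slow", "duration_s": 2.5}\n')
    f.write('{"event_type": "session_end", "test_nodeid": "__session__", "exit_status": 0}\n')

top = slowest_tests("sample.ndjson", n=1)
```

In Tinybird, the same ranking would be a simple query over the ci_test_metrics datasource.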