pytest-resource-mon

A pytest plugin that snapshots system resources (CPU, memory, disk) for every test and ships the metrics to Tinybird or a local file.

Key Features

  • Per-test resource snapshots — captures CPU, memory, and disk usage before and after each test.
  • Batched delivery — groups test records into configurable batches to reduce network overhead.
  • CI context — automatically captures GitHub Actions environment variables so metrics can be correlated with runs, commits, and workflows.
  • Flexible output — send metrics to Tinybird over HTTP or write NDJSON to a local file.

How It Works

The plugin emits three event types:

  1. session_start — records total CPU count, memory, and disk at the beginning of the test session.
  2. test — records before/after snapshots of CPU %, available memory, and free disk for each test, plus duration.
  3. session_end — records final resource state and the pytest exit status.
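The before/after capture flow for `test` events can be sketched as follows. This is an illustrative sketch, not the plugin's actual source: the snapshot function is injected here so the flow is dependency-free, whereas the real plugin reads these values from psutil.

```python
import time

def run_with_snapshots(test_fn, snapshot):
    """Run test_fn and return a record shaped like a 'test' event.

    `snapshot` is any callable returning the resource fields to capture;
    it is a stand-in for the plugin's psutil-based sampling.
    """
    before = snapshot()
    start = time.monotonic()
    test_fn()
    duration = time.monotonic() - start
    after = snapshot()
    return {
        "event_type": "test",
        "duration_s": round(duration, 4),
        "cpu_before": before["cpu"],
        "cpu_after": after["cpu"],
        "mem_available_before": before["mem_available"],
        "mem_available_after": after["mem_available"],
    }
```

In the real plugin the snapshot runs inside pytest's setup/teardown hooks, so the deltas bracket the test body itself.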

Next Steps

See Getting Started to install and run the plugin.

Getting Started

Installation

pip install pytest-resource-mon

Or with uv:

uv add pytest-resource-mon

The plugin requires Python 3.9+ and depends on pytest>=7.0 and psutil>=5.9.

Quick Start

Write metrics to a local file

pytest --tinybird-file metrics.ndjson

This writes one NDJSON line per event to metrics.ndjson.
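Because the output is plain NDJSON, it is easy to read back for ad-hoc analysis. A minimal sketch (self-contained: it writes two sample `test` lines to a temp file first, so the field names here are examples, not a full record):

```python
import json
import os
import tempfile

# Stand-in for a metrics.ndjson produced by a previous run.
path = os.path.join(tempfile.mkdtemp(), "metrics.ndjson")
with open(path, "w", encoding="utf-8") as f:
    f.write('{"event_type": "test", "test_nodeid": "tests/test_a.py::test_x", "duration_s": 0.12}\n')
    f.write('{"event_type": "test", "test_nodeid": "tests/test_a.py::test_y", "duration_s": 0.01}\n')

# One JSON object per line: parse each non-empty line.
with open(path, encoding="utf-8") as f:
    events = [json.loads(line) for line in f if line.strip()]

# Example query: tests slower than 100 ms.
slow = [e for e in events if e["event_type"] == "test" and e["duration_s"] > 0.1]
print([e["test_nodeid"] for e in slow])  # ['tests/test_a.py::test_x']
```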

Send metrics to Tinybird

export TINYBIRD_WRITE_TOKEN=your-token
pytest

The plugin activates automatically when TINYBIRD_WRITE_TOKEN is set.

Disable the plugin

pytest --tinybird-disable

This flag prevents the plugin from registering, even if a token or file path is configured.

Activation Rules

The plugin activates when either of these conditions is true:

  1. The TINYBIRD_WRITE_TOKEN environment variable is set (Tinybird HTTP mode).
  2. The --tinybird-file flag is provided (local file mode).

If TINYBIRD_WRITE_TOKEN is set, it takes priority over --tinybird-file. If neither is configured, the plugin does nothing.

Passing --tinybird-disable always prevents activation regardless of other settings.

Configuration

CLI Flags

| Flag | Default | Description |
|------|---------|-------------|
| `--tinybird-batch-size` | `50` | Number of test records to buffer before sending a batch. |
| `--tinybird-disable` | `false` | Disable the plugin entirely. |
| `--tinybird-file` | (none) | Write NDJSON metrics to a local file instead of sending over HTTP. |

Environment Variables

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `TINYBIRD_WRITE_TOKEN` | No | (none) | Tinybird auth token. When set, the plugin sends metrics over HTTP. |
| `TINYBIRD_API_URL` | No | `https://api.tinybird.co` | Tinybird API base URL. Only used when `TINYBIRD_WRITE_TOKEN` is set. |

Activation Logic

if --tinybird-disable:
    plugin does not register

elif TINYBIRD_WRITE_TOKEN is set:
    use Tinybird HTTP writer

elif --tinybird-file is set:
    use local file writer

else:
    plugin does not register

TINYBIRD_WRITE_TOKEN takes priority over --tinybird-file. If both are set, the HTTP writer is used.
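The selection logic above can be sketched as a small function (illustrative only; `choose_writer` is a hypothetical name, not the plugin's internal API):

```python
import os

def choose_writer(disable, file_path, env=None):
    """Return which writer the activation rules select, or None."""
    env = os.environ if env is None else env
    if disable:
        return None          # --tinybird-disable always wins
    if env.get("TINYBIRD_WRITE_TOKEN"):
        return "http"        # token takes priority over --tinybird-file
    if file_path:
        return "file"
    return None              # nothing configured: plugin stays inactive
```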

Output Formats

The plugin supports two output backends: Tinybird HTTP and local file.

Tinybird HTTP

Activated when TINYBIRD_WRITE_TOKEN is set.

Sends NDJSON payloads to the Tinybird Events API:

POST {TINYBIRD_API_URL}/v0/events?name=ci_test_metrics
Authorization: Bearer {TINYBIRD_WRITE_TOKEN}
Content-Type: application/x-ndjson
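Building that request in Python might look like the sketch below. This uses stdlib `urllib` for illustration only; the plugin's own HTTP client may differ.

```python
import urllib.request

def build_events_request(api_url, token, ndjson_payload):
    """Construct (but do not send) a POST to the Tinybird Events API."""
    url = f"{api_url}/v0/events?name=ci_test_metrics"
    return urllib.request.Request(
        url,
        data=ndjson_payload.encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/x-ndjson",
        },
        method="POST",
    )
```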

Batching

Test records are buffered and sent in batches controlled by --tinybird-batch-size (default: 50). Session start and session end events are sent immediately (not batched).

Each record in a batch is stamped with a batch_num field that increments per flush.
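A minimal sketch of this buffering behavior (illustrative; `BatchBuffer` is a hypothetical name, not the plugin's internal class):

```python
class BatchBuffer:
    """Buffer records and flush them in numbered batches."""

    def __init__(self, send, batch_size=50):
        self.send = send              # callable that delivers one batch
        self.batch_size = batch_size
        self.records = []
        self.batch_num = 0

    def add(self, record):
        self.records.append(record)
        if len(self.records) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.records:
            return
        self.batch_num += 1           # increments once per flush
        for record in self.records:
            record["batch_num"] = self.batch_num
        self.send(self.records)
        self.records = []
```

A final `flush()` at session end would deliver any partial batch left in the buffer.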

Retries

If a request fails, the plugin retries once. If the retry also fails, the batch is dropped and a warning is logged. Failures never break the test run.
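That retry-once policy can be sketched as follows (illustrative; the plugin's real error handling may differ in detail, but the contract is the same: at most two attempts, then drop and warn):

```python
import logging

log = logging.getLogger("pytest-resource-mon")

def send_with_retry(send, batch):
    """Attempt delivery once, retry once, then drop the batch."""
    for attempt in (1, 2):            # initial attempt + one retry
        try:
            send(batch)
            return True
        except Exception as exc:
            if attempt == 2:
                log.warning("dropping batch after failed retry: %s", exc)
    return False                      # batch dropped; test run unaffected
```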

Local File

Activated when --tinybird-file is provided (and TINYBIRD_WRITE_TOKEN is not set).

pytest --tinybird-file metrics.ndjson

Writes one JSON object per line (NDJSON format). The file is opened in append mode, so multiple runs accumulate in the same file.
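The file backend amounts to an append-mode NDJSON writer, sketched below (illustrative, with a self-contained demo in a temp directory; the plugin's real writer may buffer differently):

```python
import json
import os
import tempfile

def write_event(path, event):
    """Append one event as a single JSON line ("a" mode: runs accumulate)."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")

# Demo: two writes append to the same file.
path = os.path.join(tempfile.mkdtemp(), "metrics.ndjson")
write_event(path, {"event_type": "session_start", "test_nodeid": "__session__"})
write_event(path, {"event_type": "session_end", "exit_status": 0})

with open(path, encoding="utf-8") as f:
    lines = f.read().splitlines()
```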

Example Output

{"event_type": "session_start", "test_nodeid": "__session__", "timestamp": "2025-01-15T10:30:00+00:00", "cpu_count": 8, "mem_total_bytes": 17179869184, "disk_total_bytes": 499963174912, "gh_run_id": "", "gh_sha": "", ...}
{"event_type": "test", "test_nodeid": "tests/test_example.py::test_one", "timestamp": "2025-01-15T10:30:01+00:00", "duration_s": 0.1234, "cpu_before": 12.5, "cpu_after": 15.0, "mem_available_before": 8589934592, "mem_available_after": 8489934592, "batch_num": 1, ...}
{"event_type": "session_end", "test_nodeid": "__session__", "timestamp": "2025-01-15T10:30:05+00:00", "exit_status": 0, "cpu_count": 8, "mem_total_bytes": 17179869184, "disk_total_bytes": 499963174912, ...}

GitHub Actions Integration

When running in GitHub Actions, the plugin automatically captures environment variables and includes them in every event record.

Captured Variables

| GitHub Env Var | Metric Field | Description |
|----------------|--------------|-------------|
| `GITHUB_RUN_ID` | `gh_run_id` | Unique ID for the workflow run |
| `GITHUB_SHA` | `gh_sha` | Commit SHA that triggered the run |
| `GITHUB_REF_NAME` | `gh_ref_name` | Branch or tag name |
| `GITHUB_WORKFLOW` | `gh_workflow` | Workflow name |
| `GITHUB_JOB` | `gh_job` | Job ID |
| `GITHUB_ACTOR` | `gh_actor` | User or app that triggered the run |
| `GITHUB_REPOSITORY` | `gh_repository` | Owner/repo (e.g. `org/repo`) |
| `GITHUB_RUN_ATTEMPT` | `gh_run_attempt` | Retry attempt number |

If a variable is not set (e.g. running locally), the corresponding field is an empty string.
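The mapping above boils down to a dictionary lookup with an empty-string default, sketched here (illustrative; `ci_context` is a hypothetical name, not the plugin's internal function):

```python
import os

# GITHUB_* environment variable -> gh_* metric field.
GH_VARS = {
    "GITHUB_RUN_ID": "gh_run_id",
    "GITHUB_SHA": "gh_sha",
    "GITHUB_REF_NAME": "gh_ref_name",
    "GITHUB_WORKFLOW": "gh_workflow",
    "GITHUB_JOB": "gh_job",
    "GITHUB_ACTOR": "gh_actor",
    "GITHUB_REPOSITORY": "gh_repository",
    "GITHUB_RUN_ATTEMPT": "gh_run_attempt",
}

def ci_context(env=None):
    """Capture CI fields; unset variables become empty strings."""
    env = os.environ if env is None else env
    return {field: env.get(var, "") for var, field in GH_VARS.items()}
```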

Example Workflow

name: Tests
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - run: pip install pytest-resource-mon

      - run: pytest
        env:
          TINYBIRD_WRITE_TOKEN: ${{ secrets.TINYBIRD_WRITE_TOKEN }}

No extra configuration is needed — the GITHUB_* variables are set automatically by the Actions runner.

Metrics Schema

The ci_test_metrics datasource stores all events. The schema is sorted by (timestamp, test_nodeid).
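A matching Tinybird `.datasource` definition might look like the sketch below. This is an assumption for illustration, not the project's shipped schema file: only a handful of columns are shown, and your deployed datasource must list every field in the table that follows.

```
SCHEMA >
    `event_type` String,
    `test_nodeid` String,
    `timestamp` DateTime64(3),
    `duration_s` Float64,
    `exit_status` Int32

ENGINE "MergeTree"
ENGINE_SORTING_KEY "timestamp, test_nodeid"
```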

Event Types

Every record has an event_type field. The plugin emits three types:

| Event Type | test_nodeid | When |
|------------|-------------|------|
| `session_start` | `__session__` | Beginning of the pytest session |
| `test` | Full node ID (e.g. `tests/test_example.py::test_one`) | After each test’s teardown |
| `session_end` | `__session__` | End of the pytest session |

Full Schema

| Field | Type | Description | Events |
|-------|------|-------------|--------|
| `event_type` | String | `session_start`, `test`, or `session_end` | all |
| `test_nodeid` | String | Test node ID or `__session__` | all |
| `timestamp` | DateTime64(3) | UTC timestamp (ISO 8601) | all |
| `batch_num` | Int32 | Batch sequence number | test |
| `duration_s` | Float64 | Test wall-clock duration in seconds | test |
| `cpu_before` | Float64 | CPU usage % before test | test |
| `cpu_after` | Float64 | CPU usage % after test | test |
| `cpu_count` | Int32 | Logical CPU count | session_start, session_end |
| `mem_total_bytes` | Int64 | Total physical memory in bytes | session_start, session_end |
| `mem_available_before` | Int64 | Available memory before test (bytes) | test |
| `mem_available_after` | Int64 | Available memory after test (bytes) | test |
| `mem_percent_before` | Float64 | Memory usage % before test | test |
| `mem_percent_after` | Float64 | Memory usage % after test | test |
| `disk_total_bytes` | Int64 | Total disk space in bytes | session_start, session_end |
| `disk_free_before` | Int64 | Free disk space before test (bytes) | test |
| `disk_free_after` | Int64 | Free disk space after test (bytes) | test |
| `disk_percent_before` | Float64 | Disk usage % before test | test |
| `disk_percent_after` | Float64 | Disk usage % after test | test |
| `exit_status` | Int32 | Pytest exit code | session_end |
| `gh_run_id` | String | GitHub Actions run ID | all |
| `gh_sha` | String | Git commit SHA | all |
| `gh_ref_name` | String | Branch or tag name | all |
| `gh_workflow` | String | Workflow name | all |
| `gh_job` | String | Job ID | all |
| `gh_actor` | String | Actor who triggered the run | all |
| `gh_repository` | String | Repository (owner/repo) | all |
| `gh_run_attempt` | String | Run attempt number | all |