Testing

Running Tests

# Run the full test suite
uv run pytest

# Run a specific test file
uv run pytest tests/metrics/test_calculator.py

# Run a single test by name
uv run pytest tests/metrics/test_calculator.py::test_function_name

# Run with verbose output
uv run pytest -v

# Run tests matching a keyword expression
uv run pytest -k "miso"

Test Structure

The test directory mirrors the source layout:

tests/
    __init__.py
    aws/
        __init__.py
        test_s3.py
    common/
        __init__.py
        test_config.py
        test_registry.py
    metrics/
        conftest.py
        test_calculator_smoke.py
        test_dashboard.py
        test_engine.py
        test_invariants.py
        test_iso_extensibility.py
        test_miso_metrics.py
        test_schema.py
        test_series.py
        test_visualization.py
    utils/
        __init__.py
        test_downloader_batch_skip_head.py
        test_parser.py
    test_enverus_api.py

Each test module corresponds to a source module. New tests should be placed in the matching subdirectory.

Fixtures

Session-Scoped Fixtures (tests/metrics/conftest.py)

The metrics tests use session-scoped fixtures to avoid recomputing expensive operations:

  • trade_df -- Loads the test trade parquet file (tests/metrics/trade.parquet). Scoped to the entire test session.
  • miso_engine -- Creates and computes a MetricsEngine from the trade data. Scoped to the entire test session so that compute() runs only once.

Use Session Scope for Expensive Fixtures

When a fixture involves loading large files or running computations, scope it to session to share the result across all tests in the session.
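The two fixtures described above can be sketched as a conftest.py along these lines. The import path for MetricsEngine, its constructor, and the compute() call are assumptions based on this page, not verified signatures:

```python
# Sketch of tests/metrics/conftest.py (fixture names per this page; the
# engine's import path and API are assumptions).
import pathlib

import pandas as pd
import pytest

TRADE_PARQUET = pathlib.Path("tests/metrics/trade.parquet")


@pytest.fixture(scope="session")
def trade_df() -> pd.DataFrame:
    # Loaded once per test session; every metrics test shares this frame.
    return pd.read_parquet(TRADE_PARQUET)


@pytest.fixture(scope="session")
def miso_engine(trade_df):
    # Deferred import keeps the sketch importable on its own; the real
    # location of MetricsEngine may differ (assumption).
    from metrics.engine import MetricsEngine  # hypothetical import path

    engine = MetricsEngine(trade_df)
    engine.compute()  # runs exactly once thanks to session scope
    return engine
```

Because both fixtures are session-scoped, the parquet load and the compute() call each happen at most once per pytest invocation, no matter how many tests request them.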

Writing New Fixtures

Place fixtures in a conftest.py within the appropriate test subdirectory. Prefer narrow scope (function or module) unless the fixture is genuinely expensive.
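For instance, a minimal narrow-scope fixture in a subdirectory conftest.py might look like this; the fixture name and payload are hypothetical, and the default "function" scope means each test gets a fresh copy:

```python
# Sketch of a cheap, function-scoped fixture (hypothetical name/payload).
import pytest


@pytest.fixture
def sample_config() -> dict:
    # Default scope is "function": rebuilt for every test that requests it,
    # so tests can mutate the dict without affecting each other.
    return {"iso": "MISO", "tz": "US/Central"}
```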

pytest Configuration

pytest settings are defined in pyproject.toml:

[tool.pytest.ini_options]
testpaths = ["tests"]
norecursedirs = ["ref", ".venv", ".ruff_cache"]

  • testpaths: pytest discovers tests only in tests/.
  • norecursedirs: Excludes reference data, the virtual environment, and ruff cache from test discovery.

Testing Patterns

Invariant Tests

The metrics module maintains mathematical invariants that must hold for any input. These are verified in tests/metrics/test_invariants.py:

  1. drawdowns <= 0 (by definition)
  2. max_drawdown <= 0
  3. cvar <= 0 and var <= 0
  4. If returns > 0, then ratios > 0 (unless risk is 0, in which case ratios are NaN)

estimated_risk and Rolling Period Metrics Are Not Clamped

estimated_risk and rolling-period metrics (e.g., worst_90d_loss) can be positive or negative. They are not subject to the sign invariants above.
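An invariant check in this style might be factored as below. The helper and the flat metrics dict are illustrative stand-ins, not the project's actual API:

```python
# Sketch of a sign-invariant check (hypothetical helper; key names mirror
# the invariant list above, not the project's real schema).
import math


def check_sign_invariants(metrics: dict) -> None:
    # Drawdown-family and tail-risk metrics are non-positive by definition.
    assert metrics["max_drawdown"] <= 0
    assert metrics["var"] <= 0
    assert metrics["cvar"] <= 0
    # Positive returns imply positive ratios, except at zero risk,
    # where the ratio is undefined (NaN).
    if metrics["returns"] > 0:
        if metrics["risk"] == 0:
            assert math.isnan(metrics["ratio"])
        else:
            assert metrics["ratio"] > 0
```

Note that estimated_risk and rolling-period metrics would deliberately be left out of such a helper, since they are not clamped.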

Smoke Tests

test_calculator_smoke.py verifies that the full metrics computation pipeline runs without errors on the test dataset, without asserting specific numeric values.

Schema and Extensibility Tests

  • test_schema.py verifies column mapping, dtype enforcement, and validation hooks.
  • test_iso_extensibility.py tests that new ISOs can be registered and their schemas used correctly.

Writing New Tests

  1. Place the test file in the directory matching the source module.
  2. Name the file test_<module>.py.
  3. Use descriptive test function names: test_<what>_<condition>_<expected>.
  4. Prefer direct assertions over custom assertion helpers.
  5. If the test needs external data, add a fixture in the local conftest.py.
  6. Ensure tests are deterministic -- avoid reliance on network calls or wall-clock time.
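Putting the naming convention into practice, with a trivial hypothetical helper as the unit under test:

```python
# Hypothetical helper used only to illustrate test naming; not part of
# the codebase.
def clamp_drawdown(value: float) -> float:
    # Drawdowns are non-positive by definition, so cap at zero.
    return min(value, 0.0)


# test_<what>_<condition>_<expected>, with direct assertions and no
# network, clock, or randomness.
def test_clamp_drawdown_positive_input_returns_zero():
    assert clamp_drawdown(0.3) == 0.0


def test_clamp_drawdown_negative_input_unchanged():
    assert clamp_drawdown(-0.2) == -0.2
```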