Development

Prerequisites

  • Python 3.12+ (required by pyproject.toml)
  • uv -- the project uses uv for dependency management. Do not use pip, conda, or poetry.

Environment Setup

Clone the repository and install all dependencies:

git clone <repo-url>
cd ProgridPy

# Install all dependency groups (dev, docs, etc.)
uv sync --all-groups

# Verify the installation
uv run python -c "import progridpy; print('OK')"

The virtual environment is managed at .venv/ by uv. Always use uv run to execute commands within the project environment, or activate it explicitly with source .venv/bin/activate.

Do Not Use Conda or Pip

The project is managed exclusively by uv. Running pip install or conda install inside the project environment may cause dependency conflicts. CI runs uv sync --locked --group dev.

Branching Strategy

Branch Naming

The repository uses descriptive branch prefixes:

Prefix                  Purpose
datalake/<iso>          ISO-specific data pipeline work (e.g., datalake/spp, datalake/miso)
feat/<description>      New features
fix/<description>       Bug fixes
refactor/<description>  Code restructuring
docs/<description>      Documentation changes

Commit Messages

Follow Conventional Commits:

<type>(<scope>): <description>

[optional body]

Common types: feat, fix, refactor, docs, style, test, chore.

Examples:

feat(spp): add trade clearing, metrics engine, and dashboard scripts
fix(downloader): retry truncated chunks and cover integer-division remainder
refactor(ercot): migrate to iso package and move ercot api under iso/ercot/api
style: ruff formatting, PEP 695 generics, and absolute imports

Linting and Formatting

The project uses ruff for both linting and formatting:

# Check for lint errors
uv run ruff check .

# Auto-fix lint errors where possible
uv run ruff check --fix .

# Format code
uv run ruff format .

Type Checking

The project uses ty for type checking:

uv run ty check

Code Style Guidelines

Imports

  • Use absolute imports everywhere. No relative imports.
  • Group imports in the standard order: stdlib, third-party, local.

Data Processing

  • All ISO data processing uses Polars. The SPP implementation is the reference pattern.
  • The metrics module uses pandas (the metrics pipeline predates the Polars migration and operates on pandas DataFrames).

Type Annotations

  • Use PEP 695 generics (class Foo[T: Bound]) rather than TypeVar where possible.
  • All public functions and methods must have type annotations.
  • Use from __future__ import annotations where needed for forward references.

Pull Request Process

  1. Create a branch from main with an appropriate prefix.
  2. Make changes, ensuring uv run ruff check . and uv run pytest pass.
  3. Write or update tests for any new functionality.
  4. Push the branch and open a pull request against main.
  5. Ensure CI passes (dependency lock check, lint, tests).