https://github.com/tslateman/spec-trace

# SpecTrace

Requirements traceability for Python projects. Connect specs to tests, see what's verified.

## Prerequisites

SpecTrace uses [uv](https://github.com/astral-sh/uv) for fast, reliable package management:

```bash
# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
# or: pip install uv
```

## Quick Start

```bash
# Install with uv (recommended)
make install
# or: uv pip install -e .

# Setup database
make migrate

# Create admin user
make setup

# Import your specs
python spectrace/manage.py parse_specs specs/

# Run development server
make run

# Open http://localhost:8000/admin/
```

## Development Commands

SpecTrace includes a Makefile for common development tasks (uses `uv` for package management):

| Command | Description |
|---------|-------------|
| `make help` | Show all available commands |
| `make install` | Install package in editable mode (uses `uv pip install`) |
| `make install-dev` | Install with dev dependencies (uses `uv pip install`) |
| `make test` | Run tests with pytest |
| `make migrate` | Run Django migrations |
| `make makemigrations` | Create new migrations |
| `make shell` | Open Django shell |
| `make run` | Start development server |
| `make clean` | Remove caches and build artifacts |
| `make setup` | Create admin user (admin/admin) |
| `make demo` | Run the SpecTrace demo |

**Note:** If you don't have `uv` installed, the Makefile commands will fail. Install it first: `pip install uv`

## Workflow Example

```bash
# 1. Import requirements from specs
python spectrace/manage.py parse_specs specs/

# 2. Run tests with JUnit output
make test
# or: pytest --junitxml=test_results.xml

# 3. Extract test-requirement links
python spectrace/manage.py extract_links --output links.json

# 4. Import results and compute status
python spectrace/manage.py import_results test_results.xml --links links.json

# 5. View dashboard
make run
# Open http://localhost:8000/admin/
```
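Step 4 consumes the JUnit XML produced in step 2. If you want to sanity-check that file yourself, here is a stdlib-only sketch of reading outcomes from it (the real `import_results` command may inspect more of the report):

```python
import xml.etree.ElementTree as ET

def junit_outcomes(xml_text):
    """Summarize pytest JUnit XML as {test name: outcome}."""
    outcomes = {}
    for case in ET.fromstring(xml_text).iter("testcase"):
        if case.find("failure") is not None or case.find("error") is not None:
            outcome = "failed"
        elif case.find("skipped") is not None:
            outcome = "skipped"
        else:
            outcome = "passed"
        outcomes[case.get("name")] = outcome
    return outcomes

# Tiny hand-written report, standing in for test_results.xml
sample = """<testsuite>
  <testcase name="test_user_can_login"/>
  <testcase name="test_login_creates_session"><failure/></testcase>
</testsuite>"""
```

Here `junit_outcomes(sample)` maps the first test to `"passed"` and the second to `"failed"`.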

## Examples

See the **[Document Pipeline Example](examples/document-pipeline/)** for a comprehensive demonstration of SpecTrace features:

- Nested requirement hierarchy (3 levels)
- Multiple verification methods (test, inapp, both)
- Passing, failing, and skipped tests
- SLO integration with OpenSLO YAML
- Various pytest patterns (parametrized, async, class-based, xfail)
- CI/CD workflow example

Run the demo:
```bash
make demo
# or: python scripts/demo_pipeline.py
```

## Writing Specs

Create markdown files in `specs/` with frontmatter:

```markdown
---
id: REQ-AUTH-001
title: User Login
priority: high
tags: [authentication, security]
verification_method: test # test, inapp, or both
---

Users must be able to log in with email and password.
```
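SpecTrace's own spec parser isn't shown here, but the frontmatter format above can be read with a short sketch like this (it handles only simple `key: value` pairs and inline lists; a real parser would use a YAML library):

```python
import re

def parse_spec(text):
    """Split a spec file into a frontmatter dict and a body string."""
    m = re.match(r"^---\n(.*?)\n---\n(.*)$", text, re.DOTALL)
    if not m:
        raise ValueError("missing frontmatter")
    meta = {}
    for line in m.group(1).splitlines():
        key, _, value = line.partition(":")
        value = value.split("#", 1)[0].strip()  # drop inline comments
        if value.startswith("[") and value.endswith("]"):
            value = [v.strip() for v in value[1:-1].split(",")]
        meta[key.strip()] = value
    return meta, m.group(2).strip()

spec = """---
id: REQ-AUTH-001
title: User Login
priority: high
tags: [authentication, security]
verification_method: test # test, inapp, or both
---

Users must be able to log in with email and password.
"""
meta, body = parse_spec(spec)
```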

## Linking Tests

Use the `@pytest.mark.requirement` decorator:

```python
import pytest

@pytest.mark.requirement("REQ-AUTH-001")
def test_user_can_login():
    # test implementation
    pass

@pytest.mark.requirement("REQ-AUTH-001", "REQ-AUTH-002")
def test_login_creates_session():
    # test can link to multiple requirements
    pass
```
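For a feel of what link extraction involves, here is a stdlib `ast` sketch that finds these markers statically (illustrative only; the actual `extract_links` command may discover links differently):

```python
import ast

def extract_requirement_links(source):
    """Map test function names to the requirement IDs named in
    @pytest.mark.requirement(...) decorators."""
    links = {}
    for node in ast.walk(ast.parse(source)):
        if not isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            continue
        for dec in node.decorator_list:
            if (isinstance(dec, ast.Call)
                    and isinstance(dec.func, ast.Attribute)
                    and dec.func.attr == "requirement"):
                ids = [a.value for a in dec.args if isinstance(a, ast.Constant)]
                links.setdefault(node.name, []).extend(ids)
    return links

sample = '''
import pytest

@pytest.mark.requirement("REQ-AUTH-001")
def test_user_can_login():
    pass

@pytest.mark.requirement("REQ-AUTH-001", "REQ-AUTH-002")
def test_login_creates_session():
    pass
'''
links = extract_requirement_links(sample)
```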

## Management Commands

SpecTrace provides Django management commands for various operations:

| Command | Description |
|---------|-------------|
| `parse_specs <specs_dir>` | Import requirements from markdown specs |
| `extract_links` | Extract test-requirement links from test files |
| `import_results <junit_xml>` | Import pytest JUnit XML and compute status |
| `validate_links <links_file>` | Validate links for drift detection (CI) |
| `import_slos <slo_dir>` | Import SLOs from OpenSLO YAML files |
| `update_slo_status --from-json <file>` | Update SLO status from observability data |
| `import_inapp_validations <file>` | Import in-app validation results |
| `check_invariants` | Validate data consistency (INV-A through INV-K) |

**Agent Task Commands** (see [docs/agent-tasks.md](docs/agent-tasks.md)):

| Command | Description |
|---------|-------------|
| `agent_register` | Register an agent with role (planner/coder/reviewer) |
| `agent_tasks` | List tasks with filtering |
| `agent_claim` | Claim an unclaimed task with lease |
| `agent_start` | Begin work on claimed task |
| `agent_submit` | Submit work for review |
| `agent_review` | Approve or request changes |
| `agent_merge` | Mark approved task as merged |
| `expire_leases` | Release stale task claims (cron) |

All commands are run via: `python spectrace/manage.py <command>`

## Verification Status

- **Passing** - All linked tests pass
- **Failing** - Any linked test fails
- **Untested** - No tests linked to requirement
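The rules above reduce to a small function. A sketch (SpecTrace's internal model may track additional states, e.g. for skipped tests):

```python
def requirement_status(test_results):
    """Derive a requirement's status from its linked tests' outcomes."""
    if not test_results:
        return "untested"          # no linked tests at all
    if any(r == "failed" for r in test_results):
        return "failing"           # any failure taints the requirement
    return "passing"               # every linked test passed
```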

## Verification Methods

Requirements can specify how they should be verified:

- **test** - Verified by automated tests (default)
- **inapp** - Verified by in-app validation buttons/endpoints
- **both** - Must pass both test and in-app validation
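For illustration, here is how the method could gate an overall pass/fail (`combined_status` is a hypothetical helper, not SpecTrace's actual implementation):

```python
def combined_status(method, test_ok, inapp_ok):
    """Gate overall verification on the requirement's declared method."""
    if method == "test":
        return test_ok
    if method == "inapp":
        return inapp_ok
    if method == "both":
        return test_ok and inapp_ok  # both channels must pass
    raise ValueError(f"unknown verification method: {method}")
```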

## SLO Integration

Link requirements to Service Level Objectives using OpenSLO YAML:

```yaml
apiVersion: openslo/v1
kind: SLO
metadata:
  name: api-availability
  labels:
    requirement: REQ-API-001
spec:
  service: api-gateway
  objectives:
    - target: 0.999
  timeWindow:
    - duration: 30d
```

Import with: `python spectrace/manage.py import_slos slos/`

## REST API

External systems can push status updates:

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/api/slo/status/` | POST | Update SLO status from observability platforms |
| `/api/validation/result/` | POST | Submit in-app validation results |
| `/api/requirement/<id>/status/` | GET | Get requirement verification status |

### Example: Update SLO Status

```bash
curl -X POST http://localhost:8000/api/slo/status/ \
  -H "Content-Type: application/json" \
  -d '{
    "slos": [
      {"name": "api-availability", "status": "met", "current_value": 0.9995}
    ]
  }'
```
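The same request can be made from Python with only the standard library. A sketch (`build_slo_update` is a hypothetical helper; the payload shape simply mirrors the curl example):

```python
import json
from urllib import request

def build_slo_update(base_url, slos):
    """Build the POST request for the /api/slo/status/ endpoint."""
    return request.Request(
        f"{base_url}/api/slo/status/",
        data=json.dumps({"slos": slos}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_slo_update(
    "http://localhost:8000",
    [{"name": "api-availability", "status": "met", "current_value": 0.9995}],
)
# send with: urllib.request.urlopen(req)
```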

### Example: Submit Validation Result

```bash
curl -X POST http://localhost:8000/api/validation/result/ \
  -H "Content-Type: application/json" \
  -d '{
    "source": "production-app",
    "validations": [
      {"requirement_id": "REQ-AUTH-001", "name": "Login Flow", "status": "success"}
    ]
  }'
```

## CI Integration

Validate test-requirement links in CI to catch drift:

```bash
python spectrace/manage.py validate_links links.json --strict
```

- `--strict` - Exit with error on warnings (missing coverage)
- `--format json` - Output JSON for programmatic parsing

Example in CI pipeline:
```yaml
# .github/workflows/test.yml
- name: Run tests
  run: make test

- name: Validate requirements coverage
  run: |
    python spectrace/manage.py extract_links --output links.json
    python spectrace/manage.py validate_links links.json --strict
```

## License

MIT