# Test Catalog

Comprehensive description of every test case in the framework — what each
one does, what it expects, what hardware it needs, and how to run it.
Maintained by hand from the source files; run
`pytest --collect-only -q --no-cov` to see the live list.

## Quick reference

| Category | Files | Tests (incl. parametrize expansions) | Hardware? |
| --- | --- | --- | --- |
| Unit (pure logic) | 6 | 28 | none |
| Mock-loopback smoke | 2 | 6 | none |
| Plugin self-test | 1 | 1 | none |
| Hardware – MUM | 4 | 12 | MUM + ECU |
| Hardware – BabyLIN (legacy) | 4 | 4 | BabyLIN + ECU + Owon PSU |
| Hardware – Owon PSU | 1 | 1 | Owon PSU |
| **Total** | **18** | **52** | mixed |

The numbers count the cases pytest reports at collection time. Some tests are
`@parametrize`-expanded (e.g. `test_linframe_invalid_id_raises[-1]` / `[64]`)
and are listed once below with a note on the parameters.

### How to run a category

```powershell
pytest -m "unit"                   # pure unit tests
pytest -m "not hardware"           # everything except hardware (≈ 35 cases)
pytest -m "hardware and mum"       # MUM-only hardware tests
pytest -m "hardware and babylin"   # legacy BabyLIN hardware tests
pytest -m "hardware and not slow"  # hardware excluding the slow auto-addressing test
```

---

## 1. Unit tests — `tests/unit/`

Pure-Python tests that don't touch hardware or external I/O. Run on every PR.

### 1.1 `test_linframe.py` — `LinFrame` validation

Source: [tests/unit/test_linframe.py](tests/unit/test_linframe.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_linframe_accepts_valid_ranges` | `unit` | Construct a `LinFrame(id=0x3F, data=8 bytes of zero)` and assert id/length round-trip cleanly. Ensures the maximum legal LIN classic ID and 8-byte payload are accepted. |
| `test_linframe_invalid_id_raises[-1]` / `[64]` | `unit` | Parametrized: `LinFrame(id=-1)` and `LinFrame(id=0x40)` must raise `ValueError`. Confirms the 0x00–0x3F clamp on classic LIN IDs. |
| `test_linframe_too_long_raises` | `unit` | `LinFrame(id=0x01, data=9 bytes)` must raise `ValueError`. Confirms the 8-byte payload upper bound. |

**Why it matters:** `LinFrame` is the type every adapter (Mock/MUM/BabyLIN) hands back to tests. If validation drifts, every downstream test silently becomes more permissive.
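
A minimal sketch of the bounds these tests pin down. The real `LinFrame` lives in the framework; `SketchLinFrame` below is a hypothetical stand-in that mirrors only the asserted behaviour:

```python
# Hypothetical stand-in for LinFrame: only the validated bounds are modelled
# (classic LIN IDs 0x00-0x3F, payload of at most 8 bytes).
from dataclasses import dataclass

@dataclass(frozen=True)
class SketchLinFrame:
    id: int
    data: bytes = b""

    def __post_init__(self) -> None:
        if not 0x00 <= self.id <= 0x3F:
            raise ValueError(f"LIN id out of range: {self.id}")
        if len(self.data) > 8:
            raise ValueError(f"payload too long: {len(self.data)} bytes")

frame = SketchLinFrame(0x3F, bytes(8))  # max legal id + 8-byte payload: accepted
```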

---

### 1.2 `test_config_loader.py` — YAML configuration precedence

Source: [tests/unit/test_config_loader.py](tests/unit/test_config_loader.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_config_precedence_env_overrides` | `unit` | Writes a temp YAML with `interface.type: babylin` / `channel: 7`, points `ECU_TESTS_CONFIG` at it, then loads with `overrides={"interface": {"channel": 9}}`. Asserts the YAML's `type` made it through and the in-code override beat the YAML's `channel`. |
| `test_config_defaults_when_no_file` | `unit` | With no `ECU_TESTS_CONFIG` and no workspace root, `load_config()` must return defaults (`type: mock`, `flash.enabled: false`). |

**Precedence order asserted:** in-code `overrides` > `ECU_TESTS_CONFIG` env > `config/test_config.yaml` > built-in defaults.
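
That precedence chain reduces to repeated deep merges; a minimal sketch, assuming the loader merges nested dicts key-by-key (`deep_merge` is an invented name, not the framework's API):

```python
def deep_merge(base: dict, override: dict) -> dict:
    """Return base with override applied key-by-key, recursing into nested dicts."""
    out = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = deep_merge(out[key], value)
        else:
            out[key] = value
    return out

defaults  = {"interface": {"type": "mock", "channel": 0}, "flash": {"enabled": False}}
yaml_cfg  = {"interface": {"type": "babylin", "channel": 7}}  # ECU_TESTS_CONFIG file
overrides = {"interface": {"channel": 9}}                     # in-code overrides

# Apply lowest precedence first; the in-code overrides win last.
cfg = deep_merge(deep_merge(defaults, yaml_cfg), overrides)
assert cfg["interface"] == {"type": "babylin", "channel": 9}
```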

---

### 1.3 `test_babylin_adapter_mocked.py` — BabyLIN adapter error path

Source: [tests/unit/test_babylin_adapter_mocked.py](tests/unit/test_babylin_adapter_mocked.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_connect_sdf_error_raises` | `unit` | Inject a fake BabyLIN wrapper whose `BLC_loadSDF` returns a non-OK code. `BabyLinInterface.connect()` must raise `RuntimeError`. Validates that SDK error codes during SDF download surface as Python exceptions instead of being silently ignored. |
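
The pattern under test — SDK return codes surfacing as Python exceptions — can be sketched like this (both classes are invented stand-ins; the real adapter and `BLC_loadSDF` have richer signatures):

```python
class FakeBabyLinSDK:
    """Stub wrapper whose SDF load always fails with a non-OK code."""
    def BLC_loadSDF(self, channel: int, path: str) -> int:
        return -1  # any non-zero code means the SDK rejected the SDF

class SketchBabyLinAdapter:
    def __init__(self, wrapper):
        self._bl = wrapper  # injected wrapper module/object, as in the unit test

    def connect(self, sdf_path: str = "demo.sdf") -> None:
        rc = self._bl.BLC_loadSDF(0, sdf_path)
        if rc != 0:
            # Surface the SDK error instead of silently ignoring it.
            raise RuntimeError(f"BLC_loadSDF failed with code {rc}")
```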

---

### 1.4 `test_mum_adapter_mocked.py` — MUM adapter plumbing

Source: [tests/unit/test_mum_adapter_mocked.py](tests/unit/test_mum_adapter_mocked.py)

All cases inject fake `pymumclient` and `pylin` modules so the adapter can be exercised with no MUM hardware.

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_connect_opens_mum_and_powers_up` | `unit` | `connect()` calls `MelexisUniversalMaster.open_all(host)`, `linmaster.setup()`, sets `lin_dev.baudrate`, and powers up the ECU exactly once. |
| `test_disconnect_powers_down_and_tears_down` | `unit` | `disconnect()` calls `power_control.power_down()` and `linmaster.teardown()` exactly once each. |
| `test_send_publishes_master_frame` | `unit` | `lin.send(LinFrame(0x0A, 8 bytes))` calls `lin_dev.send_message(master_to_slave=True, frame_id=0x0A, data_length=8, data=[...])`. |
| `test_receive_uses_frame_lengths_default` | `unit` | `lin.receive(id=0x11)` reads the configured length (4) from the default `frame_lengths` map and returns the slave bytes wrapped in a `LinFrame`. |
| `test_receive_returns_none_on_pylin_exception` | `unit` | If pylin raises during `send_message(master_to_slave=False, ...)`, `receive()` must return `None` (treated as timeout). Stops tests from having to wrap every receive in try/except. |
| `test_receive_without_id_raises` | `unit` | `lin.receive(id=None)` must raise `NotImplementedError`. The MUM is master-driven; passive listen is unsupported. |
| `test_send_raw_uses_classic_checksum_path` | `unit` | `lin.send_raw(bytes)` calls `transport_layer.ld_put_raw(data, baudrate=19200)`. This is the path BSM-SNPD diagnostic frames need (Classic checksum). |
| `test_power_cycle_calls_down_then_up` | `unit` | `lin.power_cycle(wait=0)` issues at least one extra `power_down()` and the matching `power_up()` on top of the connect-time power up. |
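
The injection these cases rely on amounts to planting fake modules in `sys.modules` before the adapter imports them. A minimal sketch (`pylin` is the real package name; the fake's contents are invented for illustration):

```python
import sys
import types

# Build a fake module and register it under the real package name.
fake_pylin = types.ModuleType("pylin")
fake_pylin.calls = []
fake_pylin.send_message = lambda **kw: fake_pylin.calls.append(kw)
sys.modules["pylin"] = fake_pylin  # any later `import pylin` resolves to the fake

import pylin  # picks up the fake planted above

pylin.send_message(master_to_slave=True, frame_id=0x0A, data_length=8, data=[0] * 8)
assert pylin.calls[0]["frame_id"] == 0x0A  # assertions inspect the recorded calls
```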

---

### 1.5 `test_ldf_database.py` — LDF parser wrapper

Source: [tests/unit/test_ldf_database.py](tests/unit/test_ldf_database.py)

The module is skipped automatically if `ldfparser` isn't installed. Uses `vendor/4SEVEN_color_lib_test.ldf` as fixture data.

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_loads_metadata` | `unit` | `db.protocol_version` is one of `1.3`/`2.0`/`2.1` and `db.baudrate == 19200` for the 4SEVEN LDF. |
| `test_lookup_by_name_and_id` | `unit` | `db.frame("ALM_Req_A")` and `db.frame(0x0A)` return the same frame; id/name/length match the LDF Frames block. |
| `test_unknown_frame_raises` | `unit` | `db.frame("not_a_real_frame")` raises `FrameNotFound`. |
| `test_signal_layout_matches_ldf` | `unit` | `frame.signal_layout()` for `ALM_Req_A` contains the exact `(start_bit, name, width)` tuples from the LDF (spot-checks `AmbLightColourRed`, `AmbLightUpdate`, `AmbLightMode`, `AmbLightLIDTo`). |
| `test_pack_kwargs_full_payload` | `unit` | `frame.pack(...)` with all signals provided produces an 8-byte payload `ffffffff00000101`. |
| `test_pack_unspecified_signals_use_init_value` | `unit` | `frame.pack()` with no kwargs uses each signal's LDF `init_value`. Verified by decoding the packed output for `ColorConfigFrameRed` (which has non-zero init values like 5665). |
| `test_pack_dict_argument` | `unit` | `frame.pack({...})` and `frame.pack(**{...})` produce identical bytes. |
| `test_pack_rejects_args_and_kwargs_together` | `unit` | `frame.pack({"X": 1}, Y=2)` raises `TypeError`. |
| `test_unpack_round_trip` | `unit` | A non-trivial value set (RGB, intensity, mode bits, LID range) packs and unpacks back to the same dict. |
| `test_alm_status_decode_real_payload` | `unit` | `unpack(b"\x07\x00\x00\x00")` on `ALM_Status` yields `ALMNadNo == 7`. |
| `test_frame_lengths_includes_all_unconditional_frames` | `unit` | `db.frame_lengths()` contains every unconditional frame ID with a positive length (sanity: ALM_Req_A=8, ALM_Status=4, ConfigFrame=3). |
| `test_frames_returns_wrapped_frame_objects` | `unit` | `db.frames()` returns wrapped `Frame` objects whose names cover the expected set (ALM_Req_A, ALM_Status, ConfigFrame…). |
| `test_ldf_repr_does_not_explode` | `unit` | `repr(db)` includes `LdfDatabase` and doesn't raise. |
| `test_missing_file_raises_filenotfounderror` | `unit` | `LdfDatabase(missing_path)` raises `FileNotFoundError`. |
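
The `(start_bit, name, width)` layout drives a plain little-endian bit-pack. A toy round-trip in the spirit of `test_unpack_round_trip` (the offsets and widths here are invented for illustration; the real ones come from the LDF):

```python
# Invented layout: 8-bit red channel, then two 2-bit fields.
LAYOUT = [(0, "AmbLightColourRed", 8), (8, "AmbLightUpdate", 2), (10, "AmbLightMode", 2)]

def pack(values: dict, nbytes: int = 2) -> bytes:
    """OR each masked signal value into place at its start bit."""
    word = 0
    for start, name, width in LAYOUT:
        word |= (values.get(name, 0) & ((1 << width) - 1)) << start
    return word.to_bytes(nbytes, "little")

def unpack(payload: bytes) -> dict:
    """Shift and mask each signal back out of the payload."""
    word = int.from_bytes(payload, "little")
    return {name: (word >> start) & ((1 << width) - 1) for start, name, width in LAYOUT}

vals = {"AmbLightColourRed": 0xFF, "AmbLightUpdate": 1, "AmbLightMode": 2}
assert unpack(pack(vals)) == vals  # round-trips, like the wrapper's pack/unpack
```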

---

### 1.6 `test_hex_flasher.py` — flashing scaffold

Source: [tests/unit/test_hex_flasher.py](tests/unit/test_hex_flasher.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_hex_flasher_sends_basic_sequence` | `unit` | Writes a minimal Intel HEX (EOF-only) and runs `HexFlasher(stub_lin).flash_hex(path)`. Asserts no exception and that `lin.sent` is a list. Placeholder until the flasher gets a real UDS implementation; once that lands, this test gains concrete assertions about the byte sequence. |
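
For reference, the minimal EOF-only Intel HEX file is the single record `:00000001FF`: byte count 0, address 0, record type 01 (EOF), and a checksum byte that brings the record's byte sum to zero.

```python
# Decode the EOF-only Intel HEX record used as fixture data.
record = ":00000001FF"
payload = bytes.fromhex(record[1:])  # 00 00 00 01 FF
count, addr, rtype = payload[0], int.from_bytes(payload[1:3], "big"), payload[3]
assert (count, addr, rtype) == (0, 0, 1)  # no data bytes, address 0, EOF type
assert sum(payload) & 0xFF == 0           # Intel HEX checksum: sum mod 256 == 0
```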

---

## 2. Mock-loopback smoke — `tests/`

Tests that exercise the full LinInterface API (send / receive / request) using either the in-process Mock adapter or the BabyLIN adapter with a mock SDK wrapper.

### 2.1 `test_smoke_mock.py` — Mock adapter end-to-end

Source: [tests/test_smoke_mock.py](tests/test_smoke_mock.py)

A module-local `lin` fixture forces `MockBabyLinInterface` regardless of the central config, so these always run as mock-only tests.

| Test | Markers | Purpose |
| --- | --- | --- |
| `TestMockLinInterface::test_mock_send_receive_echo` | `smoke req_001 req_003` | Send `LinFrame(0x12, [1,2,3])` and receive it back through the mock's loopback. ID and data must match exactly. |
| `TestMockLinInterface::test_mock_request_synthesized_response` | `smoke req_002` | `lin.request(id=0x21, length=4)` returns a deterministic frame where `data[i] == (id + i) & 0xFF`. The mock implements this pattern so request/response logic can be tested without hardware. |
| `TestMockLinInterface::test_mock_receive_timeout_behavior` | `smoke req_004` | `lin.receive(id=0xFF, timeout=0.1)` (no matching frame queued) returns `None` and doesn't block longer than the requested timeout. |
| `TestMockLinInterface::test_mock_frame_validation_boundaries[…]` | `boundary req_001 req_003` | Parametrized 4 ways: `(id, payload)` ∈ `{(0x01, [0x55]), (0x3F, [0xAA,0x55]), (0x20, 5 bytes), (0x15, 8 bytes)}`. Each frame round-trips through send/receive with byte-for-byte integrity. Covers the legal LIN ID and payload-length boundaries. |
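
The mock's two documented behaviours, loopback echo and the deterministic `(id + i) & 0xFF` response, fit in a few lines. `MiniMockLin` below is a toy stand-in, not the framework's `MockBabyLinInterface`:

```python
class MiniMockLin:
    """Toy stand-in for the mock adapter's echo and request behaviour."""
    def __init__(self):
        self._rx = []

    def send(self, frame_id: int, data: bytes) -> None:
        self._rx.append((frame_id, bytes(data)))  # loopback into the rx queue

    def receive(self, frame_id: int, timeout: float = 0.1):
        for i, (fid, _) in enumerate(self._rx):
            if fid == frame_id:
                return self._rx.pop(i)
        return None  # nothing matched: behaves like a timeout

    def request(self, frame_id: int, length: int) -> bytes:
        # The documented deterministic pattern: data[i] == (id + i) & 0xFF.
        return bytes((frame_id + i) & 0xFF for i in range(length))
```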

---

### 2.2 `test_babylin_wrapper_mock.py` — BabyLIN adapter against a mocked SDK

Source: [tests/test_babylin_wrapper_mock.py](tests/test_babylin_wrapper_mock.py)

Constructs `BabyLinInterface(wrapper_module=mock_bl)` so the adapter exercises real code paths without needing the BabyLIN native library.

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_babylin_sdk_adapter_with_mock_wrapper` | `babylin smoke req_001` | Connect (discover port, open, load SDF, start schedule) → `send(LinFrame(0x12, [0xAA,0x55,0x01]))` → `receive(timeout=0.1)`. The mock wrapper echoes the transmitted bytes; the test asserts ID and data round-trip. |
| `test_babylin_master_request_with_mock_wrapper[…]` | `babylin smoke req_001` | Parametrized 2 ways. **`vendor.mock_babylin_wrapper-True`**: full mock with `BLC_sendRawMasterRequest(channel, id, length)` — expects the deterministic pattern. **`_MockBytesOnly-False`**: shim where only the bytes signature is supported; the adapter falls back to sending zeros and the response is asserted to be zeros of the requested length. Together these cover both SDK signatures the adapter must handle. |
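
One way an adapter could bridge the two signatures is a try/fallback dispatch. This is purely illustrative, with invented names and behaviour modelled on the two mocks described above, not the framework's actual dispatch logic:

```python
def send_master_request(wrapper, channel: int, frame_id: int, length: int) -> bytes:
    """Prefer the (channel, id, length) signature; fall back to bytes-only."""
    try:
        return wrapper.BLC_sendRawMasterRequest(channel, frame_id, length)
    except TypeError:
        # Bytes-only shim: the adapter can only hand over a zeroed payload.
        return wrapper.BLC_sendRawMasterRequest(bytes(length))

class FullMock:
    """Supports the full signature and answers the deterministic pattern."""
    def BLC_sendRawMasterRequest(self, channel, frame_id, length):
        return bytes((frame_id + i) & 0xFF for i in range(length))

class BytesOnlyMock:
    """Only accepts a bytes payload; echoes zeros of the requested length."""
    def BLC_sendRawMasterRequest(self, data: bytes):
        return bytes(len(data))
```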

---

## 3. Plugin self-test — `tests/plugin/`

### 3.1 `test_conftest_plugin_artifacts.py`

Source: [tests/plugin/test_conftest_plugin_artifacts.py](tests/plugin/test_conftest_plugin_artifacts.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_plugin_writes_artifacts` | `unit` | Uses pytest's `pytester` to run a synthetic test in a temp dir with the reporting plugin loaded. Asserts `reports/requirements_coverage.json` is created with `REQ-001` mapped, that `reports/summary.md` exists, and that the JSON references the generated `report.html` and `junit.xml`. Validates the plugin's full artifact pipeline end-to-end. |

---

## 4. Hardware – MUM (Melexis Universal Master)

Tests gated on `interface.type == "mum"`. All require:

- A reachable MUM (default `192.168.7.2` over USB-RNDIS)
- Melexis `pylin` and `pymumclient` Python packages installed
- An ECU wired to the MUM's `lin0` and powered through `power_out0`
- `interface.ldf_path` pointing at the LDF that matches the ECU

### 4.1 `test_e2e_mum_led_activate.py`

Source: [tests/hardware/test_e2e_mum_led_activate.py](tests/hardware/test_e2e_mum_led_activate.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_mum_e2e_power_on_then_led_activate` | `hardware mum` | The "smoke + LED on" flow. Reads `ALM_Status`, decodes `ALMNadNo` via the LDF, builds an `ALM_Req_A` payload (full-white RGB at full intensity, immediate setpoint, mode 0) targeting that NAD, sends it, and re-reads `ALM_Status` to confirm the bus is still alive afterward. |

**Notes:**

- Power-up is implicit — the MUM `lin` fixture already calls `power_control.power_up()` on connect.
- Frame layouts come from the `ldf` fixture, not hand-coded byte positions.

### 4.2 `test_mum_alm_animation.py`

Source: [tests/hardware/test_mum_alm_animation.py](tests/hardware/test_mum_alm_animation.py)

Suite of automated checks for the four behaviour buckets in
`vendor/automated_lin_test/test_animation.py`. A module-scoped fixture
reads the ECU's NAD once; an `autouse` fixture forces an OFF baseline
before and after every test so cases don't bleed state into each other.

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_mode0_immediate_setpoint_drives_led_on` | `hardware mum` | `AmbLightMode=0`, bright RGB+I, target single NAD. Polls `ALMLEDState` and asserts it reaches `LED_ON` within ~1 s. |
| `test_mode1_fade_passes_through_animating` | `hardware mum` | `AmbLightMode=1` with `AmbLightDuration=10` (≈ 2 s expected). Asserts `ALMLEDState` enters `ANIMATING` during the fade and reaches `LED_ON` afterward. |
| `test_duration_scales_with_lsb[5-0.6]` / `[10-0.6]` | `hardware mum` | Parametrized: with `Duration=N`, the `ANIMATING` window must be within ±0.6 s of `N × 0.2 s`. Loose tolerance accounts for the 50 ms poll cadence and bus latency. |
| `test_update1_save_does_not_apply_immediately` | `hardware mum` | `AmbLightUpdate=1` (Save) with bright payload — `ALMLEDState` must NOT transition to `ANIMATING` or `LED_ON`. Verifies save-only semantics. |
| `test_update2_apply_runs_saved_command` | `hardware mum` | After a save (Update=1), an apply (Update=2) with throwaway payload should execute the saved command — `ANIMATING` is observed. |
| `test_update3_discard_then_apply_is_noop` | `hardware mum` | Save → Discard (Update=3) → Apply. Apply must be a no-op (no `ANIMATING`, no `LED_ON`). Verifies the discard clears the saved buffer. |
| `test_lid_broadcast_targets_node` | `hardware mum` | `AmbLightLIDFrom=0x00, AmbLightLIDTo=0xFF` with bright RGB. Node must react and reach `LED_ON`, regardless of its actual NAD. |
| `test_lid_invalid_range_is_ignored` | `hardware mum` | `LIDFrom > LIDTo` (e.g. `0x14 > 0x0A`). Node must ignore the frame — `ALMLEDState` stays at OFF baseline. |

**Caveats:**

- Visual properties (color, smoothness of fade) cannot be asserted without a camera. These tests assert only what the LIN bus exposes (`ALMLEDState` transitions, ANIMATING duration). For a human-verified visual run, use the original `vendor/automated_lin_test/test_animation.py`.
- `test_duration_scales_with_lsb` polls every 50 ms; the tolerance is intentionally loose. Tighten it once you've measured your firmware's actual jitter.
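
All of the `ALMLEDState` assertions above boil down to a poll-with-deadline helper at the 50 ms cadence. A sketch, where `read_state` stands in for a hypothetical read-`ALM_Status`-and-decode callable:

```python
import time

def wait_for_state(read_state, target, timeout: float = 1.0, poll_s: float = 0.05) -> bool:
    """Poll read_state() until it returns target or the deadline passes."""
    deadline = time.monotonic() + timeout
    while True:
        if read_state() == target:
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(poll_s)
```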

### 4.3 `test_mum_auto_addressing.py`

Source: [tests/hardware/test_mum_auto_addressing.py](tests/hardware/test_mum_auto_addressing.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_bsm_auto_addressing_changes_nad` | `hardware mum slow` | Drives the full BSM-SNPD sequence (INIT → 16× ASSIGN → STORE → FINALIZE) with a target NAD different from the ECU's current one, then re-reads `ALM_Status` and asserts `ALMNadNo == target`. Always restores the original NAD in a `finally` block (the restore result is recorded as report properties). Uses `lin.send_raw()` so the LIN 1.x **Classic** checksum is used — Enhanced would be silently rejected by the firmware. |

**Notes:**

- Marked `slow` because the full sequence runs in ~3-4 seconds (two BSM cycles plus settle). Skip with `-m "hardware and mum and not slow"`.
- Restore is best-effort: if the second BSM cycle fails, the bench stays at the target NAD. The restore failure is visible as `restore_warning` / `restore_error` in the report properties.
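
The best-effort restore reads roughly like this (all names invented; `record` stands in for the report-property hook, and `set_nad` for the full BSM cycle):

```python
def run_with_nad_restore(read_nad, set_nad, target_nad: int, record) -> None:
    """Change the NAD, assert it took, and always try to restore the original."""
    original = read_nad()
    try:
        set_nad(target_nad)              # full BSM-SNPD cycle in the real test
        assert read_nad() == target_nad
    finally:
        try:
            set_nad(original)            # second BSM cycle: best-effort restore
            record("restored_nad", original)
        except Exception as exc:         # on failure the bench stays at target
            record("restore_error", str(exc))
```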

### 4.4 `test_e2e_power_on_lin_smoke.py` *(legacy, BabyLIN-marked)*

Source: [tests/hardware/test_e2e_power_on_lin_smoke.py](tests/hardware/test_e2e_power_on_lin_smoke.py)

Despite living in `tests/hardware/`, this file targets the **BabyLIN** adapter (it predates the MUM migration). See section 5.4.

---

## 5. Hardware – BabyLIN (legacy)

Tests gated on `interface.type == "babylin"`. Require:

- BabyLIN device + native libraries placed under `vendor/`
- An SDF compiled from your LDF, path supplied via `interface.sdf_path`
- For the E2E test: an Owon PSU on a serial port (the BabyLIN doesn't supply ECU power)

### 5.1 `test_babylin_hardware_smoke.py`

Source: [tests/test_babylin_hardware_smoke.py](tests/test_babylin_hardware_smoke.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_babylin_connect_receive_timeout` | `hardware babylin` | Minimal sanity: open the BabyLIN device via the configured `lin` fixture and call `lin.receive(timeout=0.2)`. Accepts either a `LinFrame` or `None` (timeout) — verifies the adapter is functional and not crashing. |

### 5.2 `test_babylin_hardware_schedule_smoke.py`

Source: [tests/test_babylin_hardware_schedule_smoke.py](tests/test_babylin_hardware_schedule_smoke.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_babylin_sdk_example_flow` | `hardware babylin smoke` | Verifies `interface.type == "babylin"` and an `sdf_path` is set, then exercises the receive path while the configured `schedule_nr` runs. Mirrors the vendor example flow (open / load SDF / start schedule / receive). Accepts either a frame or a timeout. |

### 5.3 `test_hardware_placeholder.py`

Source: [tests/test_hardware_placeholder.py](tests/test_hardware_placeholder.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_babylin_placeholder` | `hardware babylin` | Always passes. Used to confirm the marker filter and CI plumbing for hardware jobs without requiring any specific device behaviour. |

### 5.4 `test_e2e_power_on_lin_smoke.py`

Source: [tests/hardware/test_e2e_power_on_lin_smoke.py](tests/hardware/test_e2e_power_on_lin_smoke.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_e2e_power_on_then_cco_rgb_activate` | `hardware babylin` | Full BabyLIN E2E. Powers the ECU through the Owon PSU, switches to the LDF's `CCO` schedule via `lin.start_schedule("CCO")` (which resolves the schedule name to its index using `BLC_SDF_getScheduleNr`), publishes an `ALM_Req_A` payload with full-white RGB at full intensity, captures bus traffic for ~1 s, and asserts at least one frame was observed. Always disables PSU output in `finally`. |

**Notes:**

- This test was the original E2E target before the MUM migration. It still works as a BabyLIN smoke test if you flip `interface.type: babylin` and provide a valid SDF.
- The Owon PSU section of `config.power_supply` must be enabled (`port`, `set_voltage`, `set_current`, `do_set: true`).

---

## 6. Hardware – Owon PSU only

### 6.1 `test_owon_psu.py`

Source: [tests/hardware/test_owon_psu.py](tests/hardware/test_owon_psu.py)

| Test | Markers | Purpose |
| --- | --- | --- |
| `test_owon_psu_idn_and_optional_set` | `hardware` | Independent of any LIN adapter. Skips unless `power_supply.enabled: true` and `power_supply.port` is set. Opens the configured serial port, queries `*IDN?` (asserts non-empty; optionally checks `idn_substr`), reads `output?`, and — if `do_set: true` — sets V/I, briefly enables output, measures back, then disables. All values are recorded as report properties. |

**Notes:**

- Useful as a pure-PSU bench check before running any LIN E2E test.
- Settings can live in `config/test_config.yaml` (central) or `config/owon_psu.yaml` (per-machine override; the latter wins).

---

## Test naming conventions

When adding new tests, follow these patterns so the catalog stays scannable:

- **Unit tests** live in `tests/unit/` and carry `@pytest.mark.unit`. Filename starts with `test_<thing>_<scope>` (e.g., `test_mum_adapter_mocked.py`).
- **Mock smoke tests** live in `tests/` and use either the in-process Mock adapter (override the `lin` fixture locally) or an injected SDK mock wrapper.
- **Hardware tests** live in `tests/hardware/` (preferred) or `tests/` (legacy) and carry `@pytest.mark.hardware` plus an adapter marker (`mum` / `babylin`).
- **Slow tests** (>5 s) carry `@pytest.mark.slow` so they can be excluded with `-m "not slow"`.
- **Requirement traceability** is via `req_NNN` markers on the test function and a `Requirements:` line in the docstring (parsed by the reporting plugin).
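
Put together, a new unit test following these conventions might look like this (the test name, body, and requirement ID are invented):

```python
import pytest

@pytest.mark.unit
@pytest.mark.req_001
def test_example_widget_rejects_bad_input():
    """
    Title: Example widget input validation

    Requirements: REQ-001

    Expected Result:
    Out-of-range input raises ValueError.
    """
    # Placeholder body; a real test exercises the framework under test.
    with pytest.raises(ValueError):
        int("not a number")
```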

## Docstring format

The reporting plugin extracts these fields from each test's docstring and renders them in the HTML report:

```python
"""
Title: <short title>

Description:
<what the test validates and why>

Requirements: REQ-001, REQ-002

Test Steps:
1. <step one>
2. <step two>

Expected Result:
<succinct expected outcome>
"""
```

See `docs/03_reporting_and_metadata.md` and `docs/15_report_properties_cheatsheet.md` for the full schema.

## Related docs

- `docs/12_using_the_framework.md` — How to actually run the various suites
- `docs/04_lin_interface_call_flow.md` — What `send` / `receive` do per adapter
- `docs/16_mum_internals.md` — MUM adapter implementation details
- `docs/17_ldf_parser.md` — `ldf` fixture and `Frame.pack` / `unpack`
- `docs/06_requirement_traceability.md` — How `req_NNN` markers feed the coverage JSON