add ldf parser

Hosam-Eldin Mostafa 2026-04-29 00:56:07 +02:00
parent a10187844a
commit d268d845ce
6 changed files with 1530 additions and 0 deletions

docs/17_ldf_parser.md (new file, +179)

# LDF Parser & Frame Helpers
The framework parses your LDF (LIN Description File) at session start and
exposes a typed `LdfDatabase` to tests. Tests then build and decode frames
by **signal name**, never by hand-counting bit positions.
## Why
Hard-coded frame layouts (the `ALM_REQ_A_FRAME = {...}` style in
`vendor/automated_lin_test/config.py`) drift the moment the LDF changes.
Loading the LDF directly removes the drift and gives you a pleasant API:
```python
def test_x(lin, ldf):
    req = ldf.frame("ALM_Req_A")
    payload = req.pack(
        AmbLightColourRed=0xFF, AmbLightColourGreen=0xFF,
        AmbLightColourBlue=0xFF, AmbLightIntensity=0xFF,
        AmbLightLIDFrom=nad, AmbLightLIDTo=nad,
    )
    lin.send(LinFrame(id=req.id, data=payload))
    raw = lin.receive(id=ldf.frame("ALM_Status").id, timeout=1.0)
    sig = ldf.frame("ALM_Status").unpack(bytes(raw.data))
    assert sig["ALMNadNo"] == nad
```
## Where it lives
- Parser wrapper: `ecu_framework/lin/ldf.py`
- Test fixture: `ldf` (session-scoped, in `tests/conftest.py`)
- Underlying library: [`ldfparser`](https://pypi.org/project/ldfparser/) (pure-Python, MIT)
- LDF location is read from `interface.ldf_path` in YAML
- Unit tests against `vendor/4SEVEN_color_lib_test.ldf`: `tests/unit/test_ldf_database.py`
## Configuration
Set `interface.ldf_path` (relative paths resolve against the workspace root):
```yaml
interface:
  type: mum
  host: 192.168.7.2
  bitrate: 19200
  ldf_path: ./vendor/4SEVEN_color_lib_test.ldf
  # frame_lengths is optional: any keys here override the LDF on a
  # per-frame-id basis. Leave empty to inherit everything from the LDF.
  frame_lengths: {}
```
When `ldf_path` is set, the `lin` fixture also feeds the LDF's
`{frame_id: length}` map into `MumLinInterface(frame_lengths=...)`, so
`lin.receive(id=...)` knows the right number of bytes to ask for **for
every frame in the LDF** — no per-id bookkeeping required.
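The override semantics are a plain per-key dict update: LDF-derived lengths first, then any YAML keys on top. A minimal sketch (the helper name is illustrative, not the fixture's actual code):

```python
def effective_frame_lengths(ldf_lengths, yaml_overrides):
    """YAML keys win on a per-frame-id basis; everything else comes from the LDF."""
    merged = dict(ldf_lengths)
    merged.update(yaml_overrides or {})
    return merged

ldf_lengths = {0x0A: 8, 0x11: 4, 0x2C: 3}  # e.g. db.frame_lengths()
overrides = {0x11: 2}                      # interface.frame_lengths from YAML
print(effective_frame_lengths(ldf_lengths, overrides))
# → {10: 8, 17: 2, 44: 3}
```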
## API
### `LdfDatabase`
```python
from ecu_framework.lin.ldf import LdfDatabase
db = LdfDatabase("./vendor/4SEVEN_color_lib_test.ldf")
db.protocol_version # "2.1"
db.baudrate # 19200
db.frame("ALM_Req_A") # by name
db.frame(0x0A) # by frame_id
db.frames() # list[Frame]
db.frame_lengths() # {frame_id: length} — drop into MumLinInterface
db.signal_names("ALM_Req_A") # ['AmbLightColourRed', ...]
```
`db.frame(...)` raises `FrameNotFound` (a `KeyError` subclass) if the name
or ID isn't present; missing files raise `FileNotFoundError`.
### `Frame`
```python
frame = db.frame("ALM_Req_A")
frame.id # 0x0A (int)
frame.name # "ALM_Req_A"
frame.length # 8
frame.signal_names() # ['AmbLightColourRed', ...]
frame.signal_layout() # [(start_bit, name, width), ...]
# Raw integer pack/unpack — use this for tests that work in raw values.
payload = frame.pack(AmbLightColourRed=255, AmbLightColourGreen=128)
payload = frame.pack({"AmbLightColourRed": 255}) # dict form is fine too
decoded = frame.unpack(payload) # {'AmbLightColourRed': 255, ...}
# Encoding-aware variant (logical/physical values from the LDF) — use this
# if you want to write `AmbLightUpdate="Immediate color Update"`:
encoded = frame.encode({"AmbLightUpdate": "Immediate color Update", ...})
decoded = frame.decode(encoded)
```
### Default values
`pack()` doesn't require every signal — anything you omit takes the
**`init_value` declared in the LDF**. For example, `ColorConfigFrameRed`'s
`_X` signal has `init_value = 5665`, so `frame.pack()` with no kwargs
produces a payload that decodes back to that value:
```python
db.frame("ColorConfigFrameRed").unpack(db.frame("ColorConfigFrameRed").pack())
# → {'ColorConfigFrameRed_X': 5665, 'ColorConfigFrameRed_Y': 2396, ...}
```
This means you can usually pass only the signals the test cares about and
let the LDF supply sensible defaults for the rest.
## The `ldf` fixture
`tests/conftest.py` provides a session-scoped `ldf` fixture that:
1. Reads `interface.ldf_path` from config.
2. Resolves it against the workspace root if relative.
3. Skips the test cleanly with a clear message if the path is missing,
the file isn't there, or `ldfparser` isn't installed.
4. Returns an `LdfDatabase`.
A test that needs LDF-defined frames simply requests it:
```python
def test_thing(lin, ldf):
    payload = ldf.frame("ALM_Req_A").pack(AmbLightColourRed=0xFF)
    lin.send(LinFrame(id=ldf.frame("ALM_Req_A").id, data=payload))
```
Tests that don't need LDF can ignore the fixture; nothing is loaded
unless the fixture is requested.
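The skip decision in steps 1–3 boils down to a small path-resolution check. A sketch with a hypothetical helper (not the actual conftest code):

```python
from pathlib import Path


def resolve_ldf_path(config: dict, workspace_root: Path):
    """Return the resolved LDF path, or None if the fixture should skip.

    Mirrors steps 1-3 of the `ldf` fixture (illustrative helper only).
    """
    raw = (config.get("interface") or {}).get("ldf_path")
    if not raw:
        return None                    # no ldf_path configured -> skip
    p = Path(raw)
    if not p.is_absolute():
        p = workspace_root / p         # relative paths resolve against root
    return p if p.is_file() else None  # missing file -> skip
```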
## Switching between raw and encoded values
| Use this | When |
| --- | --- |
| `frame.pack(**raw_ints) / frame.unpack(bytes)` | You're writing test logic against numeric signal values (most assertions). |
| `frame.encode(values_dict) / frame.decode(bytes)` | You want LDF logical names (`"Immediate color Update"`) or scaled physical values (e.g. `AmbLightDuration` is `value × 0.2 s`). |
Both round-trip through the same byte representation; the difference is
purely how the values look in Python.
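For instance, the `AmbLightDuration` scaling quoted in the table means raw and physical values relate by a fixed 0.2 s LSB. A small illustration (the factor comes from the table above; the helper names are made up):

```python
DURATION_LSB_S = 0.2  # seconds per raw count, per the LDF encoding


def duration_to_seconds(raw: int) -> float:
    """Physical fade time for a raw AmbLightDuration value."""
    return raw * DURATION_LSB_S


def seconds_to_duration(seconds: float) -> int:
    """Raw value to pack for a desired fade time."""
    return round(seconds / DURATION_LSB_S)


print(duration_to_seconds(10))   # → 2.0 (the fade time a test would expect)
print(seconds_to_duration(1.0))  # → 5
```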
## Common pitfalls
- **Frame ID ranges**: `LinFrame` validates IDs as 0x00..0x3F (LIN classic 6-bit). `ldfparser` returns IDs in this range for normal frames; diagnostic frames (`MasterReq=0x3C`, `SlaveResp=0x3D`) are also accepted. If you ever see an out-of-range ID, you're probably looking at an event-triggered frame's collision resolution table — not a real bus ID.
- **Bit ordering**: LDF and ldfparser both use the LIN-standard little-endian bit ordering within bytes. The framework's `Frame.pack()` matches the existing hand-rolled `vendor/automated_lin_test/config.py:pack_frame()` byte-for-byte for the 4SEVEN file.
- **`encode` vs `encode_raw`**: ldfparser's `encode()` insists on encoded values (`"Immediate color Update"` not `0`). Our `Frame.pack()` uses `encode_raw()` instead, so kwargs are integers. If you need encoded names, use `Frame.encode(dict)` explicitly.
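To make the bit-ordering pitfall concrete, here is a minimal little-endian packer over a `(start_bit, name, width)` layout (an illustrative re-implementation, not the framework's `Frame.pack()`; signal names are made up):

```python
def pack_signals(length, layout, values):
    """Pack signals with LIN little-endian bit ordering.

    `layout` is [(start_bit, name, width), ...] in the shape that
    Frame.signal_layout() returns; bit 0 is the LSB of byte 0.
    Unlisted signals default to 0 here (the real pack() uses init_value).
    """
    word = 0
    for start_bit, name, width in layout:
        v = values.get(name, 0) & ((1 << width) - 1)  # mask to declared width
        word |= v << start_bit
    return word.to_bytes(length, "little")


layout = [(0, "Red", 8), (8, "Green", 8), (16, "Blue", 8)]
print(pack_signals(3, layout, {"Red": 0xFF, "Green": 0x80}).hex())
# → 'ff8000'
```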
## Migration from hardcoded frames
If you have tests that import the dicts in `vendor/automated_lin_test/config.py`
(`ALM_REQ_A_FRAME`, etc.) and call its `pack_frame` / `unpack_frame`, they
keep working — the new system is additive. To migrate a test:
```python
# Before
from config import ALM_REQ_A_FRAME, pack_frame
data = pack_frame(ALM_REQ_A_FRAME, AmbLightColourRed=255, ...)
lin.send_message(master_to_slave=True, frame_id=ALM_REQ_A_FRAME['frame_id'],
                 data_length=ALM_REQ_A_FRAME['length'], data=data)

# After
def test(lin, ldf):
    f = ldf.frame("ALM_Req_A")
    lin.send(LinFrame(id=f.id, data=f.pack(AmbLightColourRed=255, ...)))
```
## Related
- `docs/02_configuration_resolution.md` — `interface.ldf_path` schema
- `docs/04_lin_interface_call_flow.md` — how MUM uses `frame_lengths`
- `docs/16_mum_internals.md` — MUM adapter internals (the `ldf` fixture is the recommended source for `frame_lengths` rather than hand-maintained YAML)
- `vendor/4SEVEN_color_lib_test.ldf` — the LDF used as test fixture

docs/18_test_catalog.md (new file, +347)

# Test Catalog
Comprehensive description of every test case in the framework — what each
one does, what it expects, what hardware it needs, and how to run it.
Generated by hand from the source files; rerun
`pytest --collect-only -q --no-cov` to see the live list.
## Quick reference
| Category | Files | Tests (incl. parametrize expansions) | Hardware? |
| --- | --- | --- | --- |
| Unit (pure logic) | 6 | 28 | none |
| Mock-loopback smoke | 2 | 6 | none |
| Plugin self-test | 1 | 1 | none |
| Hardware MUM | 4 | 12 | MUM + ECU |
| Hardware BabyLIN (legacy) | 4 | 4 | BabyLIN + ECU + Owon PSU |
| Hardware Owon PSU | 1 | 1 | Owon PSU |
| **Total** | **18** | **52** | mixed |
The numbers count the cases pytest reports when collecting. Some tests are
`@parametrize`-expanded (e.g. `test_linframe_invalid_id_raises[-1]`,
`[64]`) and listed once below with a note on the parameters.
### How to run a category
```powershell
pytest -m "unit" # pure unit tests
pytest -m "not hardware" # everything except hardware (≈ 35 cases)
pytest -m "hardware and mum" # MUM-only hardware tests
pytest -m "hardware and babylin" # legacy BabyLIN hardware tests
pytest -m "hardware and not slow" # hardware excluding the slow auto-addressing test
```
---
## 1. Unit tests — `tests/unit/`
Pure-Python tests that don't touch hardware or external I/O. Run on every PR.
### 1.1 `test_linframe.py` — `LinFrame` validation
Source: [tests/unit/test_linframe.py](tests/unit/test_linframe.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_linframe_accepts_valid_ranges` | `unit` | Construct a `LinFrame(id=0x3F, data=8 bytes of zero)` and assert id/length round-trip cleanly. Ensures the maximum legal LIN classic ID and 8-byte payload are accepted. |
| `test_linframe_invalid_id_raises[-1]` / `[64]` | `unit` | Parametrized: `LinFrame(id=-1)` and `LinFrame(id=0x40)` must raise `ValueError`. Confirms the 0x00..0x3F clamp on classic LIN IDs. |
| `test_linframe_too_long_raises` | `unit` | `LinFrame(id=0x01, data=9 bytes)` must raise `ValueError`. Confirms the 8-byte payload upper bound. |
**Why it matters:** `LinFrame` is the type every adapter (Mock/MUM/BabyLIN) hands back to tests. If validation drifts, all downstream tests get more permissive silently.
---
### 1.2 `test_config_loader.py` — YAML configuration precedence
Source: [tests/unit/test_config_loader.py](tests/unit/test_config_loader.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_config_precedence_env_overrides` | `unit` | Writes a temp YAML with `interface.type: babylin` / `channel: 7`, points `ECU_TESTS_CONFIG` at it, then loads with `overrides={"interface": {"channel": 9}}`. Asserts the YAML's `type` made it through and the in-code override beat the YAML's `channel`. |
| `test_config_defaults_when_no_file` | `unit` | With no `ECU_TESTS_CONFIG` and no workspace root, `load_config()` must return defaults (`type: mock`, `flash.enabled: false`). |
**Precedence order asserted:** in-code `overrides` > `ECU_TESTS_CONFIG` env > `config/test_config.yaml` > built-in defaults.
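That precedence is an ordinary nested-dict merge applied from lowest to highest precedence. A sketch (not the loader's actual code; dict contents mirror the test above):

```python
def deep_merge(base: dict, override: dict) -> dict:
    """Later dict wins key-by-key, recursing into nested dicts."""
    out = dict(base)
    for k, v in override.items():
        if isinstance(v, dict) and isinstance(out.get(k), dict):
            out[k] = deep_merge(out[k], v)
        else:
            out[k] = v
    return out


defaults = {"interface": {"type": "mock", "channel": 0}}
yaml_cfg = {"interface": {"type": "babylin", "channel": 7}}   # ECU_TESTS_CONFIG file
overrides = {"interface": {"channel": 9}}                     # in-code overrides

cfg = deep_merge(deep_merge(defaults, yaml_cfg), overrides)
print(cfg)  # → {'interface': {'type': 'babylin', 'channel': 9}}
```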
---
### 1.3 `test_babylin_adapter_mocked.py` — BabyLIN adapter error path
Source: [tests/unit/test_babylin_adapter_mocked.py](tests/unit/test_babylin_adapter_mocked.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_connect_sdf_error_raises` | `unit` | Inject a fake BabyLIN wrapper whose `BLC_loadSDF` returns a non-OK code. `BabyLinInterface.connect()` must raise `RuntimeError`. Validates that SDK error codes during SDF download surface as Python exceptions instead of being silently ignored. |
---
### 1.4 `test_mum_adapter_mocked.py` — MUM adapter plumbing
Source: [tests/unit/test_mum_adapter_mocked.py](tests/unit/test_mum_adapter_mocked.py)
All cases inject fake `pymumclient` and `pylin` modules so the adapter can be exercised with no MUM hardware.
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_connect_opens_mum_and_powers_up` | `unit` | `connect()` calls `MelexisUniversalMaster.open_all(host)`, `linmaster.setup()`, sets `lin_dev.baudrate`, and powers up the ECU exactly once. |
| `test_disconnect_powers_down_and_tears_down` | `unit` | `disconnect()` calls `power_control.power_down()` and `linmaster.teardown()` exactly once each. |
| `test_send_publishes_master_frame` | `unit` | `lin.send(LinFrame(0x0A, 8 bytes))` calls `lin_dev.send_message(master_to_slave=True, frame_id=0x0A, data_length=8, data=[...])`. |
| `test_receive_uses_frame_lengths_default` | `unit` | `lin.receive(id=0x11)` reads the configured length (4) from the default `frame_lengths` map and returns the slave bytes wrapped in a `LinFrame`. |
| `test_receive_returns_none_on_pylin_exception` | `unit` | If pylin raises during `send_message(master_to_slave=False, ...)`, `receive()` must return `None` (treated as timeout). Stops tests from having to wrap every receive in try/except. |
| `test_receive_without_id_raises` | `unit` | `lin.receive(id=None)` must raise `NotImplementedError`. The MUM is master-driven; passive listen is unsupported. |
| `test_send_raw_uses_classic_checksum_path` | `unit` | `lin.send_raw(bytes)` calls `transport_layer.ld_put_raw(data, baudrate=19200)`. This is the path BSM-SNPD diagnostic frames need (Classic checksum). |
| `test_power_cycle_calls_down_then_up` | `unit` | `lin.power_cycle(wait=0)` issues at least one extra `power_down()` and the matching `power_up()` on top of the connect-time power up. |
---
### 1.5 `test_ldf_database.py` — LDF parser wrapper
Source: [tests/unit/test_ldf_database.py](tests/unit/test_ldf_database.py)
Module is skipped automatically if `ldfparser` isn't installed. Uses `vendor/4SEVEN_color_lib_test.ldf` as fixture data.
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_loads_metadata` | `unit` | `db.protocol_version` is one of `1.3`/`2.0`/`2.1` and `db.baudrate == 19200` for the 4SEVEN LDF. |
| `test_lookup_by_name_and_id` | `unit` | `db.frame("ALM_Req_A")` and `db.frame(0x0A)` return the same frame; id/name/length match the LDF Frames block. |
| `test_unknown_frame_raises` | `unit` | `db.frame("not_a_real_frame")` raises `FrameNotFound`. |
| `test_signal_layout_matches_ldf` | `unit` | `frame.signal_layout()` for `ALM_Req_A` contains the exact `(start_bit, name, width)` tuples from the LDF (spot-checks `AmbLightColourRed`, `AmbLightUpdate`, `AmbLightMode`, `AmbLightLIDTo`). |
| `test_pack_kwargs_full_payload` | `unit` | `frame.pack(...)` with all signals provided produces an 8-byte payload `ffffffff00000101`. |
| `test_pack_unspecified_signals_use_init_value` | `unit` | `frame.pack()` with no kwargs uses each signal's LDF `init_value`. Verified by decoding the packed output for `ColorConfigFrameRed` (which has non-zero init values like 5665). |
| `test_pack_dict_argument` | `unit` | `frame.pack({...})` and `frame.pack(**{...})` produce identical bytes. |
| `test_pack_rejects_args_and_kwargs_together` | `unit` | `frame.pack({"X": 1}, Y=2)` raises `TypeError`. |
| `test_unpack_round_trip` | `unit` | A non-trivial value set (RGB, intensity, mode bits, LID range) packs and unpacks back to the same dict. |
| `test_alm_status_decode_real_payload` | `unit` | `unpack(b"\x07\x00\x00\x00")` on `ALM_Status` yields `ALMNadNo == 7`. |
| `test_frame_lengths_includes_all_unconditional_frames` | `unit` | `db.frame_lengths()` contains every unconditional frame ID with a positive length (sanity: ALM_Req_A=8, ALM_Status=4, ConfigFrame=3). |
| `test_frames_returns_wrapped_frame_objects` | `unit` | `db.frames()` returns wrapped `Frame` objects whose names cover the expected set (ALM_Req_A, ALM_Status, ConfigFrame…). |
| `test_ldf_repr_does_not_explode` | `unit` | `repr(db)` includes `LdfDatabase` and doesn't raise. |
| `test_missing_file_raises_filenotfounderror` | `unit` | `LdfDatabase(missing_path)` raises `FileNotFoundError`. |
---
### 1.6 `test_hex_flasher.py` — flashing scaffold
Source: [tests/unit/test_hex_flasher.py](tests/unit/test_hex_flasher.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_hex_flasher_sends_basic_sequence` | `unit` | Writes a minimal Intel HEX (EOF-only) and runs `HexFlasher(stub_lin).flash_hex(path)`. Asserts no exception and that `lin.sent` is a list. Placeholder until the flasher is fleshed out with UDS — once real UDS is wired in, this test gains real assertions about the byte sequence. |
---
## 2. Mock-loopback smoke — `tests/`
Tests that exercise the full LinInterface API (send / receive / request) using either the in-process Mock adapter or the BabyLIN adapter with a mock SDK wrapper.
### 2.1 `test_smoke_mock.py` — Mock adapter end-to-end
Source: [tests/test_smoke_mock.py](tests/test_smoke_mock.py)
Module-local `lin` fixture forces `MockBabyLinInterface` regardless of the central config, so these always run as mock-only tests.
| Test | Markers | Purpose |
| --- | --- | --- |
| `TestMockLinInterface::test_mock_send_receive_echo` | `smoke req_001 req_003` | Send `LinFrame(0x12, [1,2,3])` and receive it back through the mock's loopback. ID and data must match exactly. |
| `TestMockLinInterface::test_mock_request_synthesized_response` | `smoke req_002` | `lin.request(id=0x21, length=4)` returns a deterministic frame where `data[i] == (id + i) & 0xFF`. The mock implements this pattern so request/response logic can be tested without hardware. |
| `TestMockLinInterface::test_mock_receive_timeout_behavior` | `smoke req_004` | `lin.receive(id=0xFF, timeout=0.1)` (no matching frame queued) returns `None` and doesn't block longer than the requested timeout. |
| `TestMockLinInterface::test_mock_frame_validation_boundaries[…]` | `boundary req_001 req_003` | Parametrized 4 ways: `(id, payload)` ∈ `{(0x01, [0x55]), (0x3F, [0xAA,0x55]), (0x20, 5 bytes), (0x15, 8 bytes)}`. Each frame round-trips through send/receive with byte-for-byte integrity. Covers the legal LIN ID and payload-length boundaries. |
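The synthesized-response pattern asserted in `test_mock_request_synthesized_response` is simple enough to restate as code:

```python
def synthesized_response(frame_id: int, length: int) -> bytes:
    """The mock's deterministic reply: data[i] == (id + i) & 0xFF."""
    return bytes((frame_id + i) & 0xFF for i in range(length))


print(synthesized_response(0x21, 4).hex())  # → '21222324'
```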
---
### 2.2 `test_babylin_wrapper_mock.py` — BabyLIN adapter against a mocked SDK
Source: [tests/test_babylin_wrapper_mock.py](tests/test_babylin_wrapper_mock.py)
Constructs `BabyLinInterface(wrapper_module=mock_bl)` so the adapter exercises real code paths without needing the BabyLIN native library.
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_babylin_sdk_adapter_with_mock_wrapper` | `babylin smoke req_001` | Connect (discover port, open, load SDF, start schedule) → `send(LinFrame(0x12, [0xAA,0x55,0x01]))` → `receive(timeout=0.1)`. The mock wrapper echoes the transmitted bytes; the test asserts ID and data round-trip. |
| `test_babylin_master_request_with_mock_wrapper[…]` | `babylin smoke req_001` | Parametrized 2 ways. **`vendor.mock_babylin_wrapper-True`**: full mock with `BLC_sendRawMasterRequest(channel, id, length)` — expects the deterministic pattern. **`_MockBytesOnly-False`**: shim where only the bytes signature is supported; the adapter falls back to sending zeros and the response is asserted to be zeros of the requested length. Together these cover both SDK signatures the adapter must handle. |
---
## 3. Plugin self-test — `tests/plugin/`
### 3.1 `test_conftest_plugin_artifacts.py`
Source: [tests/plugin/test_conftest_plugin_artifacts.py](tests/plugin/test_conftest_plugin_artifacts.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_plugin_writes_artifacts` | `unit` | Uses pytest's `pytester` to run a synthetic test in a temp dir with the reporting plugin loaded. Asserts `reports/requirements_coverage.json` is created with `REQ-001` mapped, that `reports/summary.md` exists, and that the JSON references the generated `report.html` and `junit.xml`. Validates the plugin's full artifact pipeline end-to-end. |
---
## 4. Hardware MUM (Melexis Universal Master)
Tests gated on `interface.type == "mum"`. All require:
- A reachable MUM (default `192.168.7.2` over USB-RNDIS)
- Melexis `pylin` and `pymumclient` Python packages installed
- An ECU wired to the MUM's `lin0` and powered through `power_out0`
- `interface.ldf_path` pointing at the LDF that matches the ECU
### 4.1 `test_e2e_mum_led_activate.py`
Source: [tests/hardware/test_e2e_mum_led_activate.py](tests/hardware/test_e2e_mum_led_activate.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_mum_e2e_power_on_then_led_activate` | `hardware mum` | The "smoke + LED on" flow. Reads `ALM_Status`, decodes `ALMNadNo` via the LDF, builds an `ALM_Req_A` payload (full-white RGB at full intensity, immediate setpoint, mode 0) targeting that NAD, sends it, and re-reads `ALM_Status` to confirm the bus is still alive afterward. |
**Notes:**
- Power-up is implicit — the MUM `lin` fixture already calls `power_control.power_up()` on connect.
- Frame layouts come from the `ldf` fixture, not hand-coded byte positions.
### 4.2 `test_mum_alm_animation.py`
Source: [tests/hardware/test_mum_alm_animation.py](tests/hardware/test_mum_alm_animation.py)
Suite of automated checks for the four behaviour buckets in
`vendor/automated_lin_test/test_animation.py`. A module-scoped fixture
reads the ECU's NAD once; an `autouse` fixture forces an OFF baseline
before and after every test so cases don't bleed state into each other.
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_mode0_immediate_setpoint_drives_led_on` | `hardware mum` | `AmbLightMode=0`, bright RGB+I, target single NAD. Polls `ALMLEDState` and asserts it reaches `LED_ON` within ~1 s. |
| `test_mode1_fade_passes_through_animating` | `hardware mum` | `AmbLightMode=1` with `AmbLightDuration=10` (≈ 2 s expected). Asserts `ALMLEDState` enters `ANIMATING` during the fade and reaches `LED_ON` afterward. |
| `test_duration_scales_with_lsb[5-0.6]` / `[10-0.6]` | `hardware mum` | Parametrized: with `Duration=N`, the `ANIMATING` window must be within ±0.6 s of `N × 0.2 s`. Loose tolerance accounts for the 50 ms poll cadence and bus latency. |
| `test_update1_save_does_not_apply_immediately` | `hardware mum` | `AmbLightUpdate=1` (Save) with bright payload — `ALMLEDState` must NOT transition to `ANIMATING` or `LED_ON`. Verifies save-only semantics. |
| `test_update2_apply_runs_saved_command` | `hardware mum` | After a save (Update=1), an apply (Update=2) with throwaway payload should execute the saved command — `ANIMATING` is observed. |
| `test_update3_discard_then_apply_is_noop` | `hardware mum` | Save → Discard (Update=3) → Apply. Apply must be a no-op (no `ANIMATING`, no `LED_ON`). Verifies the discard clears the saved buffer. |
| `test_lid_broadcast_targets_node` | `hardware mum` | `AmbLightLIDFrom=0x00, AmbLightLIDTo=0xFF` with bright RGB. Node must react and reach `LED_ON`, regardless of its actual NAD. |
| `test_lid_invalid_range_is_ignored` | `hardware mum` | `LIDFrom > LIDTo` (e.g. `0x14 > 0x0A`). Node must ignore the frame — `ALMLEDState` stays at OFF baseline. |
**Caveats:**
- Visual properties (color, smoothness of fade) cannot be asserted without a camera. These tests assert only what the LIN bus exposes (`ALMLEDState` transitions, ANIMATING duration). For a human-verified visual run, use the original `vendor/automated_lin_test/test_animation.py`.
- `test_duration_scales_with_lsb` polls every 50 ms; the tolerance is intentionally loose. Tighten it once you've measured your firmware's actual jitter.
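The poll-until-state pattern these tests rely on can be sketched as a small helper (illustrative only; the real tests read `ALMLEDState` via `ldf.frame("ALM_Status").unpack(...)`):

```python
import time


def wait_for_state(read_state, target, timeout_s=1.0, poll_s=0.05):
    """Poll `read_state()` every `poll_s` seconds until it returns `target`.

    Returns the elapsed time on success, None on timeout. The 50 ms poll
    cadence is why the duration tolerance above is deliberately loose.
    """
    start = time.monotonic()
    deadline = start + timeout_s
    while time.monotonic() < deadline:
        if read_state() == target:
            return time.monotonic() - start
        time.sleep(poll_s)
    return None


# Fake reader that reaches the target on the third poll:
states = iter(["OFF", "ANIMATING", "LED_ON", "LED_ON"])
print(wait_for_state(lambda: next(states), "LED_ON", timeout_s=0.5) is not None)
# → True
```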
### 4.3 `test_mum_auto_addressing.py`
Source: [tests/hardware/test_mum_auto_addressing.py](tests/hardware/test_mum_auto_addressing.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_bsm_auto_addressing_changes_nad` | `hardware mum slow` | Drives the full BSM-SNPD sequence (INIT → 16× ASSIGN → STORE → FINALIZE) with a target NAD different from the ECU's current one, then re-reads `ALM_Status` and asserts `ALMNadNo == target`. Always restores the original NAD in a `finally` block (the restore result is recorded as report properties). Uses `lin.send_raw()` so the LIN 1.x **Classic** checksum is used — Enhanced would be silently rejected by the firmware. |
**Notes:**
- Marked `slow` because the full sequence runs in ~3-4 seconds (two BSM cycles plus settle). Skip with `-m "hardware and mum and not slow"`.
- Restore is best-effort: if the second BSM cycle fails, the bench stays at the target NAD. The restore failure is visible as `restore_warning` / `restore_error` in the report properties.
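The Classic checksum that `send_raw` relies on is the standard LIN inverted sum-with-carry computed over the data bytes only (Enhanced additionally folds the PID into the sum). A minimal reference implementation for orientation:

```python
def lin_classic_checksum(data: bytes) -> int:
    """LIN 1.x 'Classic' checksum: 8-bit sum with carry wraparound, inverted."""
    s = 0
    for b in data:
        s += b
        if s > 0xFF:      # fold the carry back into the low byte
            s -= 0xFF
    return (~s) & 0xFF


# The worked example from the LIN specification:
print(hex(lin_classic_checksum(bytes([0x4A, 0x55, 0x93, 0xE5]))))  # → 0xe6
```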
### 4.4 `test_e2e_power_on_lin_smoke.py` *(legacy, BabyLIN-marked)*
Source: [tests/hardware/test_e2e_power_on_lin_smoke.py](tests/hardware/test_e2e_power_on_lin_smoke.py)
Despite living in `tests/hardware/`, this file targets the **BabyLIN** adapter (it predates the MUM migration). See section 5.4.
---
## 5. Hardware BabyLIN (legacy)
Tests gated on `interface.type == "babylin"`. Require:
- BabyLIN device + native libraries placed under `vendor/`
- An SDF compiled from your LDF, path supplied via `interface.sdf_path`
- For the E2E test: an Owon PSU on a serial port (the BabyLIN doesn't supply ECU power)
### 5.1 `test_babylin_hardware_smoke.py`
Source: [tests/test_babylin_hardware_smoke.py](tests/test_babylin_hardware_smoke.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_babylin_connect_receive_timeout` | `hardware babylin` | Minimal sanity: open the BabyLIN device via the configured `lin` fixture and call `lin.receive(timeout=0.2)`. Accepts either a `LinFrame` or `None` (timeout) — verifies the adapter is functional and not crashing. |
### 5.2 `test_babylin_hardware_schedule_smoke.py`
Source: [tests/test_babylin_hardware_schedule_smoke.py](tests/test_babylin_hardware_schedule_smoke.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_babylin_sdk_example_flow` | `hardware babylin smoke` | Verifies `interface.type == "babylin"` and an `sdf_path` is set, then exercises the receive path while the configured `schedule_nr` runs. Mirrors the vendor example flow (open / load SDF / start schedule / receive). Accepts either a frame or a timeout. |
### 5.3 `test_hardware_placeholder.py`
Source: [tests/test_hardware_placeholder.py](tests/test_hardware_placeholder.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_babylin_placeholder` | `hardware babylin` | Always passes. Used to confirm the marker filter and CI plumbing for hardware jobs without requiring any specific device behaviour. |
### 5.4 `test_e2e_power_on_lin_smoke.py`
Source: [tests/hardware/test_e2e_power_on_lin_smoke.py](tests/hardware/test_e2e_power_on_lin_smoke.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_e2e_power_on_then_cco_rgb_activate` | `hardware babylin` | Full BabyLIN E2E. Powers the ECU through the Owon PSU, switches to the LDF's `CCO` schedule via `lin.start_schedule("CCO")` (which resolves the schedule name to its index using `BLC_SDF_getScheduleNr`), publishes an `ALM_Req_A` payload with full-white RGB at full intensity, captures bus traffic for ~1 s, and asserts at least one frame was observed. Always disables PSU output in `finally`. |
**Notes:**
- This test was the original E2E target before the MUM migration. It still works as a BabyLIN smoke test if you flip `interface.type: babylin` and provide a valid SDF.
- The Owon PSU section of `config.power_supply` must be enabled (`port`, `set_voltage`, `set_current`, `do_set: true`).
---
## 6. Hardware Owon PSU only
### 6.1 `test_owon_psu.py`
Source: [tests/hardware/test_owon_psu.py](tests/hardware/test_owon_psu.py)
| Test | Markers | Purpose |
| --- | --- | --- |
| `test_owon_psu_idn_and_optional_set` | `hardware` | Independent of any LIN adapter. Skips unless `power_supply.enabled: true` and `power_supply.port` is set. Opens the configured serial port, queries `*IDN?` (asserts non-empty; optionally checks `idn_substr`), reads `output?`, and — if `do_set: true` — sets V/I, briefly enables output, measures back, then disables. All values are recorded as report properties. |
**Notes:**
- Useful as a pure-PSU bench check before running any LIN E2E test.
- Settings can live in `config/test_config.yaml` (central) or `config/owon_psu.yaml` (per-machine override; the latter wins).
---
## Test naming conventions
When adding new tests, follow these patterns so the catalog stays scannable:
- **Unit tests** live in `tests/unit/` and carry `@pytest.mark.unit`. Filename starts with `test_<thing>_<scope>` (e.g., `test_mum_adapter_mocked.py`).
- **Mock smoke tests** live in `tests/` and use either the in-process Mock adapter (override the `lin` fixture locally) or an injected SDK mock wrapper.
- **Hardware tests** live in `tests/hardware/` (preferred) or `tests/` (legacy) and carry `@pytest.mark.hardware` plus an adapter marker (`mum` / `babylin`).
- **Slow tests** (>5 s) carry `@pytest.mark.slow` so they can be excluded with `-m "not slow"`.
- **Requirement traceability** is via `req_NNN` markers on the test function and a `Requirements:` line in the docstring (parsed by the reporting plugin).
## Docstring format
The reporting plugin extracts these fields from each test's docstring and renders them in the HTML report:
```python
"""
Title: <short title>
Description:
<what the test validates and why>
Requirements: REQ-001, REQ-002
Test Steps:
1. <step one>
2. <step two>
Expected Result:
<succinct expected outcome>
"""
```
See `docs/03_reporting_and_metadata.md` and `docs/15_report_properties_cheatsheet.md` for the full schema.
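Extracting the `Requirements:` field might look like this (an illustrative parser, not the plugin's actual implementation):

```python
import re


def parse_requirements(docstring: str) -> list:
    """Pull the REQ-IDs off the 'Requirements:' line of a test docstring."""
    m = re.search(r"^\s*Requirements:\s*(.+)$", docstring, re.MULTILINE)
    if not m:
        return []
    return [r.strip() for r in m.group(1).split(",") if r.strip()]


doc = """
Title: LED activate
Requirements: REQ-001, REQ-002
"""
print(parse_requirements(doc))  # → ['REQ-001', 'REQ-002']
```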
## Related docs
- `docs/12_using_the_framework.md` — How to actually run the various suites
- `docs/04_lin_interface_call_flow.md` — What `send` / `receive` do per adapter
- `docs/16_mum_internals.md` — MUM adapter implementation details
- `docs/17_ldf_parser.md` — `ldf` fixture and `Frame.pack` / `unpack`
- `docs/06_requirement_traceability.md` — How `req_NNN` markers feed the coverage JSON

ecu_framework/lin/ldf.py (new file, +173)

"""Thin wrapper over `ldfparser` for use in tests.
Loads an LDF (LIN Description File) and exposes per-frame `pack()` /
`unpack()` helpers plus a `frame_lengths()` map suitable for plugging
into the MUM adapter's `frame_lengths` argument.
Typical usage:
from ecu_framework.lin.ldf import LdfDatabase
db = LdfDatabase("./vendor/4SEVEN_color_lib_test.ldf")
frame = db.frame("ALM_Req_A")
payload = frame.pack(
AmbLightColourRed=0xFF,
AmbLightColourGreen=0xFF,
AmbLightColourBlue=0xFF,
AmbLightIntensity=0xFF,
AmbLightLIDFrom=0x01,
AmbLightLIDTo=0x01,
)
# → bytes(8); unspecified signals fall back to their LDF init_value.
decoded = db.frame("ALM_Status").unpack(b"\\x07\\x00\\x00\\x00")
# → {'ALMNadNo': 7, 'ALMVoltageStatus': 0, ...}
The wrapper uses `encode_raw` / `decode_raw` rather than `encode` / `decode`
so signal *encoding types* (logical/physical conversions) are bypassed
tests work with raw integer values, which is what `LinFrame.data` carries.
If you need encoding-type interpretation, use `Frame.encode()` /
`Frame.decode()` (which delegate to the underlying ldfparser methods).
"""
from __future__ import annotations
from pathlib import Path
from typing import Any, Dict, List, Tuple, Union
class FrameNotFound(KeyError):
"""Raised when a frame name or ID isn't present in the loaded LDF."""
class Frame:
"""Lightweight wrapper around an `ldfparser` frame object.
Exposes the attributes tests actually need (`id`, `name`, `length`,
`signal_layout`) and `pack`/`unpack` helpers that work on raw bytes.
"""
__slots__ = ("_raw",)
def __init__(self, raw_frame: Any) -> None:
self._raw = raw_frame
@property
def name(self) -> str:
return str(self._raw.name)
@property
def id(self) -> int:
return int(self._raw.frame_id)
@property
def length(self) -> int:
return int(self._raw.length)
def signal_layout(self) -> List[Tuple[int, str, int]]:
"""Return [(start_bit, signal_name, width_in_bits), ...]."""
return [(int(off), s.name, int(s.width)) for off, s in self._raw.signal_map]
def signal_names(self) -> List[str]:
return [s.name for _, s in self._raw.signal_map]
# ---- raw (integer) packing ------------------------------------------
def pack(self, *args: Dict[str, int], **kwargs: int) -> bytes:
"""Encode signal values into the raw payload for this frame.
Accepts either a single dict positional argument or keyword args:
frame.pack(AmbLightColourRed=255, AmbLightColourGreen=128)
frame.pack({"AmbLightColourRed": 255, "AmbLightColourGreen": 128})
Signals not provided fall back to the `init_value` declared in the
LDF (handled by ldfparser's `encode_raw`). Returns bytes of length
`self.length`.
"""
if args and kwargs:
raise TypeError("pack() takes either a positional dict or kwargs, not both")
if args:
if len(args) != 1 or not isinstance(args[0], dict):
raise TypeError("pack() positional argument must be a dict")
values = dict(args[0])
else:
values = dict(kwargs)
encoded = self._raw.encode_raw(values)
return bytes(encoded)
def unpack(self, data: Union[bytes, bytearray, list]) -> Dict[str, int]:
"""Decode raw bytes into a `{signal_name: int}` dict."""
return dict(self._raw.decode_raw(bytes(data)))
# ---- encoding-aware (logical/physical values) -----------------------
def encode(self, values: Dict[str, Any]) -> bytes:
"""Encode using LDF encoding types (logical → numeric, physical scaling).
Useful when you want to write 'Immediate color Update' instead of `0`.
Delegates to ldfparser's `encode`.
"""
encoded = self._raw.encode(values)
return bytes(encoded)
def decode(self, data: Union[bytes, bytearray, list]) -> Dict[str, Any]:
"""Decode using LDF encoding types (numeric → logical/physical)."""
return dict(self._raw.decode(bytes(data)))
def __repr__(self) -> str:
return f"Frame(name={self.name!r}, id=0x{self.id:02X}, length={self.length})"
class LdfDatabase:
"""Load an LDF file and expose its frames in a test-friendly form."""
def __init__(self, path: Union[str, Path]) -> None:
# Lazy import keeps the framework importable on machines without ldfparser
# — only `LdfDatabase` instantiation requires it.
try:
from ldfparser import parse_ldf # type: ignore
except Exception as e:
raise RuntimeError(
"ldfparser is not installed. Install it with: pip install ldfparser"
) from e
self.path = Path(path)
if not self.path.is_file():
raise FileNotFoundError(f"LDF not found: {self.path}")
self._raw = parse_ldf(str(self.path))
@property
def baudrate(self) -> int:
return int(self._raw.baudrate)
@property
def protocol_version(self) -> str:
return str(self._raw.protocol_version)
def frame(self, key: Union[str, int]) -> Frame:
"""Look up a frame by name (str) or by frame_id (int)."""
try:
raw = self._raw.get_unconditional_frame(key)
except LookupError as e:
raise FrameNotFound(f"Frame {key!r} not found in {self.path.name}") from e
return Frame(raw)
def frames(self) -> List[Frame]:
"""Return all unconditional frames (excludes diagnostic/event-triggered)."""
return [Frame(rf) for rf in self._raw.frames]
def frame_lengths(self) -> Dict[int, int]:
"""`{frame_id: length}` map suitable for `MumLinInterface(frame_lengths=...)`."""
return {int(rf.frame_id): int(rf.length) for rf in self._raw.frames}
def signal_names(self, frame_key: Union[str, int]) -> List[str]:
"""Convenience: list signal names for a given frame."""
return self.frame(frame_key).signal_names()
def __repr__(self) -> str:
try:
n = sum(1 for _ in self._raw.frames)
except Exception:
n = "?"
return f"LdfDatabase(path={self.path!s}, frames={n})"
__all__ = ["LdfDatabase", "Frame", "FrameNotFound"]


@@ -0,0 +1,485 @@
"""Automated animation / state checks for ALM_Req_A on MUM.
Ports the requirement-driven checks from
`vendor/automated_lin_test/test_animation.py` into pytest cases that don't
require a human in the loop. Visual properties (LED color, smoothness of
fade) cannot be asserted without optical instrumentation, so each check
asserts what *can* be observed over the LIN bus:
- `ALM_Status.ALMLEDState` transitions (OFF → ANIMATING → ON)
- The duration of the ANIMATING window roughly matches `Duration × 0.2s`
- Save / Apply / Discard semantics on `AmbLightUpdate`
- LID-range targeting (single-node, broadcast, invalid From > To)
All frame layouts are read from the LDF (no hand-coded byte positions).
"""
from __future__ import annotations
import time
from typing import Optional
import pytest
from ecu_framework.config import EcuTestConfig
from ecu_framework.lin.base import LinFrame, LinInterface
pytestmark = [pytest.mark.hardware, pytest.mark.mum]
# ALMLEDState values (from LDF Signal_encoding_types: LED_State)
LED_STATE_OFF = 0
LED_STATE_ANIMATING = 1
LED_STATE_ON = 2
# Test pacing
STATE_POLL_INTERVAL = 0.05 # 50 ms — granularity for state-change detection
STATE_TIMEOUT_DEFAULT = 1.0
DURATION_LSB_SECONDS = 0.2 # AmbLightDuration scaling per the ECU spec
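# Worked example of the scaling (illustrative): AmbLightDuration=10 gives
# an expected fade window of 10 x 0.2 s = 2.0 s. The ±0.6 s tolerances
# used further down absorb the 50 ms poll granularity plus bus latency at
# both edges of the window.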
# --- helpers ---------------------------------------------------------------
def _read_alm_status(lin: LinInterface, status_frame, timeout=1.0):
"""Return the decoded ALM_Status dict, or None on timeout."""
rx = lin.receive(id=status_frame.id, timeout=timeout)
if rx is None:
return None
return status_frame.unpack(bytes(rx.data))
def _read_led_state(lin: LinInterface, status_frame) -> int:
decoded = _read_alm_status(lin, status_frame)
if decoded is None:
return -1
return int(decoded.get("ALMLEDState", -1))
def _wait_for_state(
lin: LinInterface, status_frame, target: int, timeout: float
) -> tuple[bool, float, list[int]]:
"""Poll ALMLEDState until it equals `target`, or timeout.
Returns (reached, elapsed_seconds, observed_state_history).
"""
seen = []
deadline = time.monotonic() + timeout
start = time.monotonic()
while time.monotonic() < deadline:
st = _read_led_state(lin, status_frame)
if not seen or seen[-1] != st:
seen.append(st)
if st == target:
return True, time.monotonic() - start, seen
time.sleep(STATE_POLL_INTERVAL)
return False, time.monotonic() - start, seen
def _measure_animating_window(
lin: LinInterface, status_frame, max_wait: float
) -> tuple[Optional[float], list[int]]:
"""Wait for ANIMATING to start, then for it to leave ANIMATING.
Returns (animating_seconds, state_history). If ANIMATING never appears
within `max_wait`, returns (None, history).
"""
seen = []
started_at: Optional[float] = None
deadline = time.monotonic() + max_wait
while time.monotonic() < deadline:
st = _read_led_state(lin, status_frame)
if not seen or seen[-1] != st:
seen.append(st)
if started_at is None and st == LED_STATE_ANIMATING:
started_at = time.monotonic()
elif started_at is not None and st != LED_STATE_ANIMATING:
return time.monotonic() - started_at, seen
time.sleep(STATE_POLL_INTERVAL)
return None, seen
def _send_alm_req(lin: LinInterface, req_frame, **signals):
"""Pack ALM_Req_A from signal kwargs and publish it via lin.send()."""
payload = req_frame.pack(**signals)
lin.send(LinFrame(id=req_frame.id, data=payload))
def _force_off(lin: LinInterface, req_frame, nad: int):
"""Drive the LED to OFF (mode=0, intensity=0) and pause briefly."""
_send_alm_req(
lin, req_frame,
AmbLightColourRed=0, AmbLightColourGreen=0, AmbLightColourBlue=0,
AmbLightIntensity=0,
AmbLightUpdate=0, AmbLightMode=0, AmbLightDuration=0,
AmbLightLIDFrom=nad, AmbLightLIDTo=nad,
)
time.sleep(0.4)
# --- fixtures --------------------------------------------------------------
@pytest.fixture(scope="module")
def _ctx(config: EcuTestConfig, lin: LinInterface, ldf):
"""Bundle the (lin, req_frame, status_frame, nad) values used by every test."""
if config.interface.type != "mum":
pytest.skip("interface.type must be 'mum' for this suite")
req = ldf.frame("ALM_Req_A")
status = ldf.frame("ALM_Status")
rx = lin.receive(id=status.id, timeout=1.0)
if rx is None:
pytest.skip("ECU not responding on ALM_Status — check wiring/power")
decoded = status.unpack(bytes(rx.data))
nad = int(decoded["ALMNadNo"])
if not (0x01 <= nad <= 0xFE):
pytest.skip(f"ECU reports invalid NAD {nad:#x} — run auto-addressing first")
return {"lin": lin, "req": req, "status": status, "nad": nad}
@pytest.fixture(autouse=True)
def _reset_to_off(_ctx):
"""Force LED to OFF before each test in this module so tests don't bleed
state into one another. Tests that need a non-OFF baseline override this
by calling _force_off() themselves at the right moment.
"""
_force_off(_ctx["lin"], _ctx["req"], _ctx["nad"])
yield
_force_off(_ctx["lin"], _ctx["req"], _ctx["nad"])
# --- tests: AmbLightMode behavior ------------------------------------------
def test_mode0_immediate_setpoint_drives_led_on(_ctx, rp):
"""
Title: Mode 0 - Immediate Setpoint reaches LED_ON without animating
Description:
With AmbLightMode=0, the ECU should jump directly to the requested
color/intensity. The bus-observable signal of that is ALMLEDState
transitioning to LED_ON quickly without spending appreciable time
in LED_ANIMATING.
Test Steps:
1. Send ALM_Req_A with bright RGB+I, mode=0, duration=10
2. Poll ALM_Status until ALMLEDState == ON or short timeout
3. Assert ALMLEDState reached ON
Expected Result:
ALMLEDState reaches LED_ON within ~1.0 s.
"""
c = _ctx
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=0, AmbLightColourGreen=180, AmbLightColourBlue=80,
AmbLightIntensity=200,
AmbLightUpdate=0, AmbLightMode=0, AmbLightDuration=10,
AmbLightLIDFrom=c["nad"], AmbLightLIDTo=c["nad"],
)
reached, elapsed, history = _wait_for_state(
c["lin"], c["status"], LED_STATE_ON, timeout=STATE_TIMEOUT_DEFAULT
)
rp("led_state_history", history)
rp("on_elapsed_s", round(elapsed, 3))
assert reached, f"LEDState never reached ON (history: {history})"
def test_mode1_fade_passes_through_animating(_ctx, rp):
"""
Title: Mode 1 - Fade RGB + Intensity passes through LED_ANIMATING
Description:
AmbLightMode=1 should produce a smooth fade. We expect ALMLEDState
to transition OFF → ANIMATING → ON during the fade, with non-zero time
spent in ANIMATING.
Test Steps:
1. Send ALM_Req_A with mode=1, duration=10 (2.0 s expected fade)
2. Measure how long ALMLEDState reports ANIMATING
Expected Result:
- ANIMATING is observed at least once
- ALMLEDState eventually reaches LED_ON
"""
c = _ctx
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=255, AmbLightColourGreen=40, AmbLightColourBlue=0,
AmbLightIntensity=220,
AmbLightUpdate=0, AmbLightMode=1, AmbLightDuration=10,
AmbLightLIDFrom=c["nad"], AmbLightLIDTo=c["nad"],
)
# max_wait must comfortably exceed expected fade (10 * 0.2 = 2.0 s)
animating_s, history = _measure_animating_window(c["lin"], c["status"], max_wait=4.0)
rp("led_state_history", history)
rp("animating_seconds", animating_s)
assert LED_STATE_ANIMATING in history, (
f"ANIMATING never observed during a Mode 1 fade (history: {history})"
)
# After the fade, ECU should reach ON. Allow a little extra slack.
reached_on, _, post_history = _wait_for_state(
c["lin"], c["status"], LED_STATE_ON, timeout=2.0
)
rp("post_history", post_history)
assert reached_on, f"LEDState did not reach ON after Mode 1 fade ({post_history})"
@pytest.mark.parametrize("duration_lsb,tol", [(5, 0.6), (10, 0.6)])
def test_duration_scales_with_lsb(_ctx, rp, duration_lsb, tol):
"""
Title: AmbLightDuration scales the fade window by 0.2 s per LSB
Description:
Mode 1 with AmbLightDuration=N should produce an animation of
N × 0.2 s. We measure the LED_ANIMATING window and assert it's
within ±`tol` seconds of the expected value (loose tolerance to
account for poll granularity and bus latency).
Test Steps:
1. Force OFF baseline
2. Send mode=1 with the requested duration
3. Measure the ANIMATING window
4. Compare to expected = duration_lsb * 0.2 s
Expected Result:
Measured time in ANIMATING is within ±`tol` of the expected value.
"""
c = _ctx
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=0, AmbLightColourGreen=0, AmbLightColourBlue=255,
AmbLightIntensity=200,
AmbLightUpdate=0, AmbLightMode=1, AmbLightDuration=duration_lsb,
AmbLightLIDFrom=c["nad"], AmbLightLIDTo=c["nad"],
)
expected = duration_lsb * DURATION_LSB_SECONDS
measured, history = _measure_animating_window(
c["lin"], c["status"], max_wait=expected + 2.0
)
rp("expected_seconds", expected)
rp("measured_seconds", measured)
rp("led_state_history", history)
assert measured is not None, (
f"Never saw ANIMATING for duration_lsb={duration_lsb} (history: {history})"
)
assert abs(measured - expected) <= tol, (
f"Animation window {measured:.3f}s differs from expected {expected:.3f}s "
f"by more than ±{tol:.2f}s"
)
# --- tests: AmbLightUpdate save / apply / discard --------------------------
def test_update1_save_does_not_apply_immediately(_ctx, rp):
"""
Title: AmbLightUpdate=1 (Save) does not change LED state
Description:
With AmbLightUpdate=1, the ECU should buffer the command without
executing it. ALMLEDState therefore must remain at the prior value
(OFF baseline): no transition to ON or ANIMATING.
Test Steps:
1. Force OFF baseline
2. Send a 'save' frame (update=1) with bright RGB+I, mode=1
3. Observe ALMLEDState briefly
Expected Result:
ALMLEDState stays at OFF.
"""
c = _ctx
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=0, AmbLightColourGreen=255, AmbLightColourBlue=0,
AmbLightIntensity=255,
AmbLightUpdate=1, AmbLightMode=1, AmbLightDuration=10,
AmbLightLIDFrom=c["nad"], AmbLightLIDTo=c["nad"],
)
# Watch for ~1 s; state must NOT enter ANIMATING or ON
deadline = time.monotonic() + 1.0
history = []
while time.monotonic() < deadline:
st = _read_led_state(c["lin"], c["status"])
if not history or history[-1] != st:
history.append(st)
time.sleep(STATE_POLL_INTERVAL)
rp("led_state_history", history)
assert LED_STATE_ANIMATING not in history, (
f"Save (update=1) unexpectedly triggered ANIMATING: {history}"
)
assert LED_STATE_ON not in history, (
f"Save (update=1) unexpectedly drove LED ON: {history}"
)
def test_update2_apply_runs_saved_command(_ctx, rp):
"""
Title: AmbLightUpdate=2 (Apply) runs a previously saved command
Description:
After a save (update=1) of a Mode-1 bright frame, an apply (update=2)
with arbitrary payload should execute the *saved* command: the
ECU should now animate and reach ON.
Test Steps:
1. Force OFF baseline
2. Save a Mode-1 bright frame (update=1)
3. Send apply (update=2) with throwaway payload
4. Expect LEDState to reach ANIMATING then ON
Expected Result:
LEDState transitions OFF → ANIMATING → ON after Apply.
"""
c = _ctx
# Save a fade-to-green at full intensity
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=0, AmbLightColourGreen=255, AmbLightColourBlue=0,
AmbLightIntensity=255,
AmbLightUpdate=1, AmbLightMode=1, AmbLightDuration=5,
AmbLightLIDFrom=c["nad"], AmbLightLIDTo=c["nad"],
)
time.sleep(0.3) # let the save settle
# Apply with throwaway payload — ECU should run the saved fade
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=7, AmbLightColourGreen=7, AmbLightColourBlue=7,
AmbLightIntensity=7,
AmbLightUpdate=2, AmbLightMode=0, AmbLightDuration=0,
AmbLightLIDFrom=c["nad"], AmbLightLIDTo=c["nad"],
)
animating_s, history = _measure_animating_window(c["lin"], c["status"], max_wait=4.0)
rp("animating_seconds", animating_s)
rp("led_state_history", history)
assert LED_STATE_ANIMATING in history, (
f"Apply (update=2) did not animate after a save (history: {history})"
)
def test_update3_discard_then_apply_is_noop(_ctx, rp):
"""
Title: AmbLightUpdate=3 (Discard) clears the saved buffer
Description:
After save → discard, an apply should be a no-op (no animation, no
ON transition).
Test Steps:
1. Force OFF baseline
2. Save a Mode-1 bright frame (update=1)
3. Discard the saved frame (update=3)
4. Apply (update=2)
5. Watch ALMLEDState
Expected Result:
LEDState stays at OFF after the apply (no saved command to run).
"""
c = _ctx
# Save
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=255, AmbLightColourGreen=0, AmbLightColourBlue=0,
AmbLightIntensity=255,
AmbLightUpdate=1, AmbLightMode=1, AmbLightDuration=5,
AmbLightLIDFrom=c["nad"], AmbLightLIDTo=c["nad"],
)
time.sleep(0.3)
# Discard
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=0, AmbLightColourGreen=0, AmbLightColourBlue=0,
AmbLightIntensity=0,
AmbLightUpdate=3, AmbLightMode=0, AmbLightDuration=0,
AmbLightLIDFrom=c["nad"], AmbLightLIDTo=c["nad"],
)
time.sleep(0.3)
# Apply
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=7, AmbLightColourGreen=7, AmbLightColourBlue=7,
AmbLightIntensity=7,
AmbLightUpdate=2, AmbLightMode=0, AmbLightDuration=0,
AmbLightLIDFrom=c["nad"], AmbLightLIDTo=c["nad"],
)
# Watch — must NOT animate
deadline = time.monotonic() + 1.5
history = []
while time.monotonic() < deadline:
st = _read_led_state(c["lin"], c["status"])
if not history or history[-1] != st:
history.append(st)
time.sleep(STATE_POLL_INTERVAL)
rp("led_state_history", history)
assert LED_STATE_ANIMATING not in history, (
f"Apply after discard unexpectedly animated: {history}"
)
# --- tests: LID range targeting --------------------------------------------
def test_lid_broadcast_targets_node(_ctx, rp):
"""
Title: LIDFrom=0x00, LIDTo=0xFF (broadcast) reaches this node
Description:
A broadcast LID range should include any NAD, so this node should
react and drive the LED ON.
Expected Result: LEDState reaches ON.
"""
c = _ctx
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=120, AmbLightColourGreen=0, AmbLightColourBlue=255,
AmbLightIntensity=180,
AmbLightUpdate=0, AmbLightMode=0, AmbLightDuration=0,
AmbLightLIDFrom=0x00, AmbLightLIDTo=0xFF,
)
reached, elapsed, history = _wait_for_state(
c["lin"], c["status"], LED_STATE_ON, timeout=STATE_TIMEOUT_DEFAULT
)
rp("led_state_history", history)
rp("on_elapsed_s", round(elapsed, 3))
assert reached, f"Broadcast LID range failed to drive node ON: {history}"
def test_lid_invalid_range_is_ignored(_ctx, rp):
"""
Title: LIDFrom > LIDTo is rejected (no LED change)
Description:
An ill-formed LID range (From > To) should be ignored by the node;
ALMLEDState must remain at the OFF baseline.
Expected Result: LEDState stays OFF.
"""
c = _ctx
_send_alm_req(
c["lin"], c["req"],
AmbLightColourRed=255, AmbLightColourGreen=255, AmbLightColourBlue=255,
AmbLightIntensity=255,
AmbLightUpdate=0, AmbLightMode=0, AmbLightDuration=0,
AmbLightLIDFrom=0x14, AmbLightLIDTo=0x0A, # From > To
)
deadline = time.monotonic() + 1.0
history = []
while time.monotonic() < deadline:
st = _read_led_state(c["lin"], c["status"])
if not history or history[-1] != st:
history.append(st)
time.sleep(STATE_POLL_INTERVAL)
rp("led_state_history", history)
assert LED_STATE_ANIMATING not in history, (
f"Invalid LID range animated unexpectedly: {history}"
)
assert LED_STATE_ON not in history, (
f"Invalid LID range drove LED ON unexpectedly: {history}"
)


@@ -0,0 +1,188 @@
"""LIN auto-addressing (BSM-SNPD) test on the MUM.
Ports the BSM-SNPD sequence from `vendor/automated_lin_test/test_auto_addressing.py`
into pytest. The flow:
1. INIT:     subf=0x01, params=(0x02, 0xFF), wait 50 ms
2. ASSIGN:   subf=0x02, params=(0x02, target_nad), 16 frames, 20 ms apart
(target_nad placed first, then NADs 0x01..0x10 cycle)
3. STORE:    subf=0x03, params=(0x02, 0xFF), wait 20 ms
4. FINALIZE: subf=0x04, params=(0x02, 0xFF), wait 20 ms
Each frame is 8 bytes:
byte 0: NAD             = 0x7F (broadcast)
byte 1: PCI             = 0x06 (6 data bytes)
byte 2: SID             = 0xB5 (BSM-SNPD)
byte 3: Supplier ID LSB = 0xFF
byte 4: Supplier ID MSB = 0x7F
byte 5: subfunction
byte 6: param 1
byte 7: param 2
Critically, BSM frames must be sent with **LIN 1.x Classic checksum**, which
the ECU firmware checks. `MumLinInterface.send_raw()` routes through the
transport layer's `ld_put_raw`, which uses Classic; `lin.send()` would use
Enhanced and frames would be silently rejected.
The test changes the ECU's NAD, asserts the change, and restores the original
NAD in `finally` so it leaves the bench in the state it found it.
"""
from __future__ import annotations
import time
from typing import Iterable
import pytest
from ecu_framework.config import EcuTestConfig
from ecu_framework.lin.base import LinInterface
pytestmark = [pytest.mark.hardware, pytest.mark.mum, pytest.mark.slow]
# BSM-SNPD constants
BSM_NAD_BROADCAST = 0x7F
BSM_PCI = 0x06
BSM_SID = 0xB5
BSM_SUPPLIER_ID_LSB = 0xFF
BSM_SUPPLIER_ID_MSB = 0x7F
BSM_SUBF_INIT = 0x01
BSM_SUBF_ASSIGN = 0x02
BSM_SUBF_STORE = 0x03
BSM_SUBF_FINALIZE = 0x04
BSM_INIT_DELAY = 0.050
BSM_FRAME_DELAY = 0.020
VALID_NAD_RANGE: Iterable[int] = range(0x01, 0x11) # 0x01..0x10 inclusive
# Time to wait after FINALIZE for the ECU to commit and resume normal traffic
POST_FINALIZE_SETTLE = 1.0
def _bsm_frame(subfunction: int, param1: int, param2: int) -> bytes:
"""Build the 8-byte BSM-SNPD raw payload."""
return bytes([
BSM_NAD_BROADCAST,
BSM_PCI,
BSM_SID,
BSM_SUPPLIER_ID_LSB,
BSM_SUPPLIER_ID_MSB,
subfunction & 0xFF,
param1 & 0xFF,
param2 & 0xFF,
])
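# Worked example (sanity check against the layout documented above): the
# INIT frame of step 1, _bsm_frame(BSM_SUBF_INIT, 0x02, 0xFF), serialises
# byte-for-byte to 7F 06 B5 FF 7F 01 02 FF: broadcast NAD, PCI=0x06,
# SID=0xB5, supplier ID 0x7FFF (LSB first), then subfunction/param1/param2.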
def _read_nad(lin: LinInterface, status_frame, attempts: int = 5) -> int | None:
"""Read ALM_Status a few times, return ALMNadNo or None if no response."""
for _ in range(attempts):
rx = lin.receive(id=status_frame.id, timeout=0.5)
if rx is not None:
decoded = status_frame.unpack(bytes(rx.data))
return int(decoded["ALMNadNo"])
time.sleep(0.1)
return None
def _run_bsm_sequence(lin: LinInterface, target_nad: int) -> None:
"""Drive one full INIT→ASSIGN×16→STORE→FINALIZE cycle, target NAD first."""
# 1. INIT
lin.send_raw(_bsm_frame(BSM_SUBF_INIT, 0x02, 0xFF))
time.sleep(BSM_INIT_DELAY)
# 2. 16x ASSIGN, target_nad placed first
nad_sequence = list(VALID_NAD_RANGE)
if target_nad in nad_sequence:
nad_sequence.remove(target_nad)
nad_sequence.insert(0, target_nad)
for nad in nad_sequence:
lin.send_raw(_bsm_frame(BSM_SUBF_ASSIGN, 0x02, nad))
time.sleep(BSM_FRAME_DELAY)
# 3. STORE
lin.send_raw(_bsm_frame(BSM_SUBF_STORE, 0x02, 0xFF))
time.sleep(BSM_FRAME_DELAY)
# 4. FINALIZE
lin.send_raw(_bsm_frame(BSM_SUBF_FINALIZE, 0x02, 0xFF))
time.sleep(BSM_FRAME_DELAY)
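# Back-of-envelope timing for one full sequence: 0.05 (INIT settle)
# + 16 × 0.02 (ASSIGN) + 0.02 (STORE) + 0.02 (FINALIZE) ≈ 0.41 s on the
# wire, before the 1.0 s POST_FINALIZE_SETTLE waited by callers.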
def test_bsm_auto_addressing_changes_nad(
config: EcuTestConfig, lin: LinInterface, ldf, rp
):
"""
Title: BSM-SNPD auto-addressing assigns a new NAD and ALM_Status reflects it
Description:
Runs the full BSM-SNPD sequence (INIT, 16x ASSIGN, STORE, FINALIZE)
with a target NAD different from the ECU's current NAD, then reads
ALM_Status and asserts ALMNadNo equals the target. Restores the
original NAD in a finally block to leave the bench unchanged.
Requirements: REQ-MUM-BSM-AUTOADDR
Test Steps:
1. Skip unless interface.type == 'mum'
2. Read initial NAD from ALM_Status
3. Pick a target NAD in 0x01..0x10 different from initial
4. Run BSM sequence with target_nad first
5. Read ALM_Status; assert ALMNadNo == target_nad
6. Run BSM sequence again to restore initial NAD
7. Read ALM_Status; record the final NAD
Expected Result:
- Initial NAD is in 0x01..0xFE
- After BSM sequence, ALM_Status.ALMNadNo == target_nad
- After restore sequence, ALM_Status.ALMNadNo == initial_nad
"""
if config.interface.type != "mum":
pytest.skip("interface.type must be 'mum' for this test")
# send_raw is MUM-only; gate on capability so the failure mode is clean
if not hasattr(lin, "send_raw"):
pytest.skip("LIN adapter does not expose send_raw() (need MumLinInterface)")
status = ldf.frame("ALM_Status")
rp("ldf_path", str(ldf.path))
# Step 2: read current NAD
initial_nad = _read_nad(lin, status)
assert initial_nad is not None, "ECU not responding on ALM_Status — wiring/power?"
rp("initial_nad", f"0x{initial_nad:02X}")
assert 0x01 <= initial_nad <= 0xFE, f"ECU initial NAD {initial_nad:#x} is out of range"
# Step 3: pick a target NAD different from current
candidates = [n for n in VALID_NAD_RANGE if n != initial_nad]
target_nad = candidates[0]
rp("target_nad", f"0x{target_nad:02X}")
try:
# Step 4: run the BSM sequence
_run_bsm_sequence(lin, target_nad)
time.sleep(POST_FINALIZE_SETTLE)
# Step 5: verify
new_nad = _read_nad(lin, status)
rp("post_bsm_nad", f"0x{new_nad:02X}" if new_nad is not None else "no_response")
assert new_nad == target_nad, (
f"NAD did not change to target: expected 0x{target_nad:02X}, "
f"got {'no response' if new_nad is None else f'0x{new_nad:02X}'}"
)
finally:
# Step 6 + 7: restore the original NAD so the bench is left as we found it
try:
_run_bsm_sequence(lin, initial_nad)
time.sleep(POST_FINALIZE_SETTLE)
restored_nad = _read_nad(lin, status)
rp("restored_nad", f"0x{restored_nad:02X}" if restored_nad is not None else "no_response")
if restored_nad != initial_nad:
# Don't fail the test on restore failure (the original assertion is
# what we care about), but make it visible.
rp("restore_warning", f"failed to restore initial NAD ({restored_nad})")
except Exception as e:
rp("restore_error", repr(e))


@@ -0,0 +1,158 @@
"""Unit tests for LdfDatabase / Frame using the 4SEVEN LDF as fixture data."""
from __future__ import annotations
import pathlib
import pytest
# Skip the whole module if ldfparser isn't installed.
pytest.importorskip("ldfparser", reason="ldfparser is required for LDF unit tests")
from ecu_framework.lin.ldf import Frame, FrameNotFound, LdfDatabase
WORKSPACE_ROOT = pathlib.Path(__file__).resolve().parents[2]
LDF_PATH = WORKSPACE_ROOT / "vendor" / "4SEVEN_color_lib_test.ldf"
@pytest.fixture(scope="module")
def db() -> LdfDatabase:
return LdfDatabase(LDF_PATH)
@pytest.mark.unit
def test_loads_metadata(db: LdfDatabase):
assert db.protocol_version in ("2.1", "2.0", "1.3")
assert db.baudrate == 19200
@pytest.mark.unit
def test_lookup_by_name_and_id(db: LdfDatabase):
by_name = db.frame("ALM_Req_A")
by_id = db.frame(0x0A)
assert by_name.id == 0x0A == by_id.id
assert by_name.name == "ALM_Req_A" == by_id.name
assert by_name.length == 8
@pytest.mark.unit
def test_unknown_frame_raises(db: LdfDatabase):
with pytest.raises(FrameNotFound):
db.frame("not_a_real_frame")
@pytest.mark.unit
def test_signal_layout_matches_ldf(db: LdfDatabase):
layout = db.frame("ALM_Req_A").signal_layout()
# spot-check a couple of entries from the LDF Frames block
assert (0, "AmbLightColourRed", 8) in layout
assert (32, "AmbLightUpdate", 2) in layout
assert (34, "AmbLightMode", 6) in layout
assert (56, "AmbLightLIDTo", 8) in layout
@pytest.mark.unit
def test_pack_kwargs_full_payload(db: LdfDatabase):
frame = db.frame("ALM_Req_A")
payload = frame.pack(
AmbLightColourRed=0xFF,
AmbLightColourGreen=0xFF,
AmbLightColourBlue=0xFF,
AmbLightIntensity=0xFF,
AmbLightUpdate=0,
AmbLightMode=0,
AmbLightDuration=0,
AmbLightLIDFrom=0x01,
AmbLightLIDTo=0x01,
)
assert isinstance(payload, bytes)
assert len(payload) == 8
assert payload == bytes.fromhex("ffffffff00000101")
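# Byte map of that payload, matching the offsets checked in
# test_signal_layout_matches_ldf: bytes 0..3 carry Red/Green/Blue/
# Intensity (0xFF each); byte 4 packs AmbLightUpdate (bits 0-1) with
# AmbLightMode (bits 2-7); byte 5 carries AmbLightDuration; bytes 6..7
# are LIDFrom/LIDTo (0x01 each).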
@pytest.mark.unit
def test_pack_unspecified_signals_use_init_value(db: LdfDatabase):
"""LDF defines non-zero init_values for ColorConfigFrameRed signals;
pack() with no kwargs should fall back to those defaults."""
frame = db.frame("ColorConfigFrameRed")
payload = frame.pack()
decoded = frame.unpack(payload)
# ColorConfigFrameRed_X init_value is 5665, _Y is 2396, _Z is 0, _Vf_Cal is 2031
assert decoded["ColorConfigFrameRed_X"] == 5665
assert decoded["ColorConfigFrameRed_Y"] == 2396
assert decoded["ColorConfigFrameRed_Z"] == 0
assert decoded["ColorConfigFrameRed_Vf_Cal"] == 2031
@pytest.mark.unit
def test_pack_dict_argument(db: LdfDatabase):
frame = db.frame("ALM_Req_A")
a = frame.pack(AmbLightColourRed=0x12, AmbLightColourBlue=0x34)
b = frame.pack({"AmbLightColourRed": 0x12, "AmbLightColourBlue": 0x34})
assert a == b
@pytest.mark.unit
def test_pack_rejects_args_and_kwargs_together(db: LdfDatabase):
frame = db.frame("ALM_Req_A")
with pytest.raises(TypeError):
frame.pack({"AmbLightColourRed": 1}, AmbLightColourGreen=2)
@pytest.mark.unit
def test_unpack_round_trip(db: LdfDatabase):
frame = db.frame("ALM_Req_A")
values = {
"AmbLightColourRed": 0xAB,
"AmbLightColourGreen": 0xCD,
"AmbLightColourBlue": 0x12,
"AmbLightIntensity": 0x80,
"AmbLightUpdate": 2, # 2 bits
"AmbLightMode": 0x15, # 6 bits
"AmbLightDuration": 0x40,
"AmbLightLIDFrom": 0x01,
"AmbLightLIDTo": 0xFE,
}
payload = frame.pack(**values)
decoded = frame.unpack(payload)
for k, v in values.items():
assert decoded[k] == v, f"signal {k} mismatch: {decoded[k]} vs {v}"
@pytest.mark.unit
def test_alm_status_decode_real_payload(db: LdfDatabase):
"""ALM_Status: byte 0 carries ALMNadNo (8 bits at offset 0)."""
frame = db.frame("ALM_Status")
assert frame.length == 4
decoded = frame.unpack(b"\x07\x00\x00\x00")
assert decoded["ALMNadNo"] == 7
@pytest.mark.unit
def test_frame_lengths_includes_all_unconditional_frames(db: LdfDatabase):
lengths = db.frame_lengths()
assert lengths[0x0A] == 8 # ALM_Req_A
assert lengths[0x11] == 4 # ALM_Status
assert lengths[0x06] == 3 # ConfigFrame
# Every entry should map to a positive length
assert all(length >= 1 for length in lengths.values())
@pytest.mark.unit
def test_frames_returns_wrapped_frame_objects(db: LdfDatabase):
frames = db.frames()
assert all(isinstance(f, Frame) for f in frames)
names = {f.name for f in frames}
assert {"ALM_Req_A", "ALM_Status", "ConfigFrame"}.issubset(names)
@pytest.mark.unit
def test_ldf_repr_does_not_explode(db: LdfDatabase):
s = repr(db)
assert "LdfDatabase" in s
@pytest.mark.unit
def test_missing_file_raises_filenotfounderror(tmp_path):
with pytest.raises(FileNotFoundError):
LdfDatabase(tmp_path / "nope.ldf")