# Test Catalog

Comprehensive description of every test case in the framework — what each one does, what it expects, what hardware it needs, and how to run it.

Generated by hand from the source files; rerun `pytest --collect-only -q --no-cov` to see the live list.
## Quick reference
| Category | Files | Tests (incl. parametrize expansions) | Hardware? |
|---|---|---|---|
| Unit (pure logic) | 6 | 28 | none |
| Mock-loopback smoke | 2 | 6 | none |
| Plugin self-test | 1 | 1 | none |
| Hardware – MUM | 4 | 12 | MUM + ECU |
| Hardware – Voltage tolerance | 1 | 5 | MUM + ECU + Owon PSU |
| Hardware – Owon PSU | 1 | 1 | Owon PSU |
| Hardware – PSU settling (opt-in) | 1 | 4 | Owon PSU |
| Hardware – BabyLIN (DEPRECATED) | 4 | 4 | BabyLIN + ECU + Owon PSU |
| Total | 20 | 61 | mixed |
Infrastructure (not collected as tests):

| File | Role |
|---|---|
| `tests/hardware/conftest.py` | Session-scoped autouse PSU fixture (powers the ECU once at session start) + the public `psu` fixture |
| `tests/hardware/frame_io.py` | `FrameIO` class — generic LDF-driven I/O |
| `tests/hardware/alm_helpers.py` | `AlmTester` class + ALM constants and tolerance utilities |
| `tests/hardware/_test_case_template.py` | ALM-only test starting point (leading underscore → not collected) |
| `tests/hardware/_test_case_template_psu_lin.py` | PSU + LIN test starting point (leading underscore → not collected) |
The numbers count the cases pytest reports when collecting. Some tests are `@parametrize`-expanded (e.g. `test_linframe_invalid_id_raises[-1]`, `[64]`) and listed once below with a note on the parameters.
## How to run a category

```bash
pytest -m "unit"                   # pure unit tests
pytest -m "not hardware"           # everything except hardware (≈ 35 cases)
pytest -m "hardware and mum"       # MUM-only hardware tests
pytest -m "hardware and babylin"   # DEPRECATED BabyLIN hardware tests (legacy rigs only)
pytest -m "hardware and not slow"  # hardware excluding the slow tests
pytest -m psu_settling             # PSU voltage-settling characterization (opt-in)
```
## 1. Unit tests — tests/unit/
Pure-Python tests that don't touch hardware or external I/O. Run on every PR.
### 1.1 test_linframe.py — LinFrame validation

Source: `tests/unit/test_linframe.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_linframe_accepts_valid_ranges` | unit | Construct a `LinFrame(id=0x3F, data=8 bytes of zero)` and assert id/length round-trip cleanly. Ensures the maximum legal LIN classic ID and 8-byte payload are accepted. |
| `test_linframe_invalid_id_raises[-1]` / `[64]` | unit | Parametrized: `LinFrame(id=-1)` and `LinFrame(id=0x40)` must raise `ValueError`. Confirms the 0x00–0x3F clamp on classic LIN IDs. |
| `test_linframe_too_long_raises` | unit | `LinFrame(id=0x01, data=9 bytes)` must raise `ValueError`. Confirms the 8-byte payload upper bound. |
Why it matters: `LinFrame` is the type every adapter (Mock/MUM/BabyLIN) hands back to tests. If validation drifts, every downstream test silently becomes more permissive.
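For orientation, the contract those three tests pin down fits in a few lines. The class below is a hypothetical stand-in written for this catalog, not the framework's actual `LinFrame` implementation:

```python
# Minimal stand-in implementing the documented contract:
# classic LIN IDs 0x00-0x3F, payloads of at most 8 bytes.
from dataclasses import dataclass


@dataclass(frozen=True)
class LinFrame:
    id: int
    data: bytes = b""

    def __post_init__(self):
        if not 0x00 <= self.id <= 0x3F:
            raise ValueError(f"classic LIN id out of range: {self.id}")
        if len(self.data) > 8:
            raise ValueError(f"payload too long: {len(self.data)} > 8 bytes")


# The boundary cases the unit tests exercise:
ok = LinFrame(0x3F, bytes(8))      # max legal ID, max payload -> accepted
```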
### 1.2 test_config_loader.py — YAML configuration precedence

Source: `tests/unit/test_config_loader.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_config_precedence_env_overrides` | unit | Writes a temp YAML with `interface.type: babylin` / `channel: 7`, points `ECU_TESTS_CONFIG` at it, then loads with `overrides={"interface": {"channel": 9}}`. Asserts the YAML's type made it through and the in-code override beat the YAML's channel. |
| `test_config_defaults_when_no_file` | unit | With no `ECU_TESTS_CONFIG` and no workspace root, `load_config()` must return defaults (`type: mock`, `flash.enabled: false`). |
Precedence order asserted: in-code overrides > `ECU_TESTS_CONFIG` env > `config/test_config.yaml` > built-in defaults.
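That precedence chain can be illustrated with a toy deep-merge. This is a hypothetical re-implementation for the example — the real `load_config()` also reads files and the environment; here the two YAML layers are inlined as dicts:

```python
# Later layers win; nested dicts merge key-by-key rather than replace wholesale.
def deep_merge(base: dict, extra: dict) -> dict:
    out = dict(base)
    for key, value in extra.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = deep_merge(out[key], value)
        else:
            out[key] = value
    return out


DEFAULTS = {"interface": {"type": "mock", "channel": 1}, "flash": {"enabled": False}}
env_yaml = {"interface": {"type": "babylin", "channel": 7}}   # via ECU_TESTS_CONFIG
overrides = {"interface": {"channel": 9}}                      # in-code overrides

cfg = deep_merge(deep_merge(DEFAULTS, env_yaml), overrides)
# cfg["interface"] == {"type": "babylin", "channel": 9}
```

The YAML's `type` survives while the in-code `channel` wins — exactly what `test_config_precedence_env_overrides` asserts.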
### 1.3 test_babylin_adapter_mocked.py — BabyLIN adapter error path

Source: `tests/unit/test_babylin_adapter_mocked.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_connect_sdf_error_raises` | unit | Inject a fake BabyLIN wrapper whose `BLC_loadSDF` returns a non-OK code. `BabyLinInterface.connect()` must raise `RuntimeError`. Validates that SDK error codes during SDF download surface as Python exceptions instead of being silently ignored. |
### 1.4 test_mum_adapter_mocked.py — MUM adapter plumbing

Source: `tests/unit/test_mum_adapter_mocked.py`

All cases inject fake `pymumclient` and `pylin` modules so the adapter can be exercised with no MUM hardware.

| Test | Markers | Purpose |
|---|---|---|
| `test_connect_opens_mum_and_powers_up` | unit | `connect()` calls `MelexisUniversalMaster.open_all(host)`, `linmaster.setup()`, sets `lin_dev.baudrate`, and powers up the ECU exactly once. |
| `test_disconnect_powers_down_and_tears_down` | unit | `disconnect()` calls `power_control.power_down()` and `linmaster.teardown()` exactly once each. |
| `test_send_publishes_master_frame` | unit | `lin.send(LinFrame(0x0A, 8 bytes))` calls `lin_dev.send_message(master_to_slave=True, frame_id=0x0A, data_length=8, data=[...])`. |
| `test_receive_uses_frame_lengths_default` | unit | `lin.receive(id=0x11)` reads the configured length (4) from the default `frame_lengths` map and returns the slave bytes wrapped in a `LinFrame`. |
| `test_receive_returns_none_on_pylin_exception` | unit | If `pylin` raises during `send_message(master_to_slave=False, ...)`, `receive()` must return `None` (treated as timeout). Stops tests from having to wrap every receive in try/except. |
| `test_receive_without_id_raises` | unit | `lin.receive(id=None)` must raise `NotImplementedError`. The MUM is master-driven; passive listen is unsupported. |
| `test_send_raw_uses_classic_checksum_path` | unit | `lin.send_raw(bytes)` calls `transport_layer.ld_put_raw(data, baudrate=19200)`. This is the path BSM-SNPD diagnostic frames need (Classic checksum). |
| `test_power_cycle_calls_down_then_up` | unit | `lin.power_cycle(wait=0)` issues at least one extra `power_down()` and the matching `power_up()` on top of the connect-time power-up. |
### 1.5 test_ldf_database.py — LDF parser wrapper

Source: `tests/unit/test_ldf_database.py`

Module is skipped automatically if `ldfparser` isn't installed. Uses `vendor/4SEVEN_color_lib_test.ldf` as fixture data.

| Test | Markers | Purpose |
|---|---|---|
| `test_loads_metadata` | unit | `db.protocol_version` is one of 1.3/2.0/2.1 and `db.baudrate == 19200` for the 4SEVEN LDF. |
| `test_lookup_by_name_and_id` | unit | `db.frame("ALM_Req_A")` and `db.frame(0x0A)` return the same frame; id/name/length match the LDF Frames block. |
| `test_unknown_frame_raises` | unit | `db.frame("not_a_real_frame")` raises `FrameNotFound`. |
| `test_signal_layout_matches_ldf` | unit | `frame.signal_layout()` for ALM_Req_A contains the exact `(start_bit, name, width)` tuples from the LDF (spot-checks AmbLightColourRed, AmbLightUpdate, AmbLightMode, AmbLightLIDTo). |
| `test_pack_kwargs_full_payload` | unit | `frame.pack(...)` with all signals provided produces an 8-byte payload `ffffffff00000101`. |
| `test_pack_unspecified_signals_use_init_value` | unit | `frame.pack()` with no kwargs uses each signal's LDF `init_value`. Verified by decoding the packed output for ColorConfigFrameRed (which has non-zero init values like 5665). |
| `test_pack_dict_argument` | unit | `frame.pack({...})` and `frame.pack(**{...})` produce identical bytes. |
| `test_pack_rejects_args_and_kwargs_together` | unit | `frame.pack({"X": 1}, Y=2)` raises `TypeError`. |
| `test_unpack_round_trip` | unit | A non-trivial value set (RGB, intensity, mode bits, LID range) packs and unpacks back to the same dict. |
| `test_alm_status_decode_real_payload` | unit | `unpack(b"\x07\x00\x00\x00")` on ALM_Status yields `ALMNadNo == 7`. |
| `test_frame_lengths_includes_all_unconditional_frames` | unit | `db.frame_lengths()` contains every unconditional frame ID with a positive length (sanity: ALM_Req_A=8, ALM_Status=4, ConfigFrame=3). |
| `test_frames_returns_wrapped_frame_objects` | unit | `db.frames()` returns wrapped `Frame` objects whose names cover the expected set (ALM_Req_A, ALM_Status, ConfigFrame…). |
| `test_ldf_repr_does_not_explode` | unit | `repr(db)` includes `LdfDatabase` and doesn't raise. |
| `test_missing_file_raises_filenotfounderror` | unit | `LdfDatabase(missing_path)` raises `FileNotFoundError`. |
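The `(start_bit, name, width)` tuples that `test_signal_layout_matches_ldf` checks are enough to reconstruct the pack step by hand. The sketch below illustrates the idea with a made-up layout — it is NOT the real ALM_Req_A map, and the real wrapper substitutes each signal's LDF `init_value` (not zero) for unspecified signals:

```python
# Place each signal value LSB-first at its start bit, then serialize
# the accumulated word as a little-endian payload (LIN convention).
def pack_signals(layout, values, frame_len):
    word = 0
    for start_bit, name, width in layout:
        v = values.get(name, 0)  # sketch only: real code uses init_value here
        if v >= 1 << width:
            raise ValueError(f"{name}={v} does not fit in {width} bits")
        word |= v << start_bit
    return word.to_bytes(frame_len, "little")


# Hypothetical 3-byte frame: two 8-bit colors plus two 2-bit control fields.
layout = [(0, "Red", 8), (8, "Green", 8), (16, "Update", 2), (18, "Mode", 2)]
payload = pack_signals(layout, {"Red": 0xFF, "Update": 2}, 3)
# payload == bytes([0xFF, 0x00, 0x02])
```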
### 1.6 test_hex_flasher.py — flashing scaffold

Source: `tests/unit/test_hex_flasher.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_hex_flasher_sends_basic_sequence` | unit | Writes a minimal Intel HEX (EOF-only) and runs `HexFlasher(stub_lin).flash_hex(path)`. Asserts no exception and that `lin.sent` is a list. Placeholder until the flasher is fleshed out with UDS — once real UDS is wired in, this test gains real assertions about the byte sequence. |
## 2. Mock-loopback smoke — tests/

Tests that exercise the full `LinInterface` API (send / receive / request) using either the in-process Mock adapter or the BabyLIN adapter with a mock SDK wrapper.
### 2.1 test_smoke_mock.py — Mock adapter end-to-end

Source: `tests/test_smoke_mock.py`

A module-local `lin` fixture forces `MockBabyLinInterface` regardless of the central config, so these always run as mock-only tests.

| Test | Markers | Purpose |
|---|---|---|
| `TestMockLinInterface::test_mock_send_receive_echo` | smoke req_001 req_003 | Send `LinFrame(0x12, [1,2,3])` and receive it back through the mock's loopback. ID and data must match exactly. |
| `TestMockLinInterface::test_mock_request_synthesized_response` | smoke req_002 | `lin.request(id=0x21, length=4)` returns a deterministic frame where `data[i] == (id + i) & 0xFF`. The mock implements this pattern so request/response logic can be tested without hardware. |
| `TestMockLinInterface::test_mock_receive_timeout_behavior` | smoke req_004 | `lin.receive(id=0xFF, timeout=0.1)` (no matching frame queued) returns `None` and doesn't block longer than the requested timeout. |
| `TestMockLinInterface::test_mock_frame_validation_boundaries[…]` | boundary req_001 req_003 | Parametrized 4 ways: `(id, payload)` ∈ {(0x01, [0x55]), (0x3F, [0xAA,0x55]), (0x20, 5 bytes), (0x15, 8 bytes)}. Each frame round-trips through send/receive with byte-for-byte integrity. Covers the legal LIN ID and payload-length boundaries. |
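The deterministic reply pattern the mock's `request()` implements is easy to reproduce standalone, which makes expected payloads simple to eyeball when writing new mock-based tests:

```python
# data[i] == (id + i) & 0xFF -- the pattern the mock adapter synthesizes.
def synthesize_response(frame_id: int, length: int) -> bytes:
    return bytes((frame_id + i) & 0xFF for i in range(length))


resp = synthesize_response(0x21, 4)
# resp == bytes([0x21, 0x22, 0x23, 0x24])
```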
### 2.2 test_babylin_wrapper_mock.py — BabyLIN adapter against a mocked SDK

Source: `tests/test_babylin_wrapper_mock.py`

Constructs `BabyLinInterface(wrapper_module=mock_bl)` so the adapter exercises real code paths without needing the BabyLIN native library.

| Test | Markers | Purpose |
|---|---|---|
| `test_babylin_sdk_adapter_with_mock_wrapper` | babylin smoke req_001 | Connect (discover port, open, load SDF, start schedule) → `send(LinFrame(0x12, [0xAA,0x55,0x01]))` → `receive(timeout=0.1)`. The mock wrapper echoes the transmitted bytes; the test asserts ID and data round-trip. |
| `test_babylin_master_request_with_mock_wrapper[…]` | babylin smoke req_001 | Parametrized 2 ways. `vendor.mock_babylin_wrapper-True`: full mock with `BLC_sendRawMasterRequest(channel, id, length)` — expects the deterministic pattern. `_MockBytesOnly-False`: shim where only the bytes signature is supported; the adapter falls back to sending zeros and the response is asserted to be zeros of the requested length. Together these cover both SDK signatures the adapter must handle. |
## 3. Plugin self-test — tests/plugin/

### 3.1 test_conftest_plugin_artifacts.py

Source: `tests/plugin/test_conftest_plugin_artifacts.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_plugin_writes_artifacts` | unit | Uses pytest's `pytester` to run a synthetic test in a temp dir with the reporting plugin loaded. Asserts `reports/requirements_coverage.json` is created with REQ-001 mapped, that `reports/summary.md` exists, and that the JSON references the generated `report.html` and `junit.xml`. Validates the plugin's full artifact pipeline end-to-end. |
## 4. Hardware – MUM (Melexis Universal Master)

Tests gated on `interface.type == "mum"`. All require:

- A reachable MUM (default `192.168.7.2` over USB-RNDIS)
- Melexis `pylin` and `pymumclient` Python packages installed
- An ECU wired to the MUM's `lin0` and powered through `power_out0`
- `interface.ldf_path` pointing at the LDF that matches the ECU
### 4.1 test_e2e_mum_led_activate.py

Source: `tests/hardware/test_e2e_mum_led_activate.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_mum_e2e_power_on_then_led_activate` | hardware mum | The "smoke + LED on" flow. Reads ALM_Status, decodes ALMNadNo via the LDF, builds an ALM_Req_A payload (full-white RGB at full intensity, immediate setpoint, mode 0) targeting that NAD, sends it, and re-reads ALM_Status to confirm the bus is still alive afterward. |

Notes:

- Power-up is implicit — the MUM `lin` fixture already calls `power_control.power_up()` on connect.
- Frame layouts come from the `ldf` fixture, not hand-coded byte positions.
### 4.2 test_mum_alm_animation.py

Source: `tests/hardware/test_mum_alm_animation.py`

Suite of automated checks for the four behaviour buckets in `vendor/automated_lin_test/test_animation.py`. A module-scoped fixture reads the ECU's NAD once; an autouse fixture forces an OFF baseline before and after every test so cases don't bleed state into each other.

| Test | Markers | Purpose |
|---|---|---|
| `test_mode0_immediate_setpoint_drives_led_on` | hardware mum | AmbLightMode=0, bright RGB+I, target single NAD. Polls ALMLEDState and asserts it reaches LED_ON within ~1 s. |
| `test_mode1_fade_passes_through_animating` | hardware mum | AmbLightMode=1 with AmbLightDuration=10 (≈ 2 s expected). Asserts ALMLEDState enters ANIMATING during the fade and reaches LED_ON afterward. |
| `test_duration_scales_with_lsb[5-0.6]` / `[10-0.6]` | hardware mum | Parametrized: with Duration=N, the ANIMATING window must be within ±0.6 s of N × 0.2 s. Loose tolerance accounts for the 50 ms poll cadence and bus latency. |
| `test_update1_save_does_not_apply_immediately` | hardware mum | AmbLightUpdate=1 (Save) with bright payload — ALMLEDState must NOT transition to ANIMATING or LED_ON. Verifies save-only semantics. |
| `test_update2_apply_runs_saved_command` | hardware mum | After a save (Update=1), an apply (Update=2) with throwaway payload should execute the saved command — ANIMATING is observed. |
| `test_update3_discard_then_apply_is_noop` | hardware mum | Save → Discard (Update=3) → Apply. Apply must be a no-op (no ANIMATING, no LED_ON). Verifies the discard clears the saved buffer. |
| `test_lid_broadcast_targets_node` | hardware mum | AmbLightLIDFrom=0x00, AmbLightLIDTo=0xFF with bright RGB. Node must react and reach LED_ON, regardless of its actual NAD. |
| `test_lid_invalid_range_is_ignored` | hardware mum | LIDFrom > LIDTo (e.g. 0x14 > 0x0A). Node must ignore the frame — ALMLEDState stays at OFF baseline. |

Caveats:

- Visual properties (color, smoothness of fade) cannot be asserted without a camera. These tests assert only what the LIN bus exposes (`ALMLEDState` transitions, ANIMATING duration). For a human-verified visual run, use the original `vendor/automated_lin_test/test_animation.py`.
- `test_duration_scales_with_lsb` polls every 50 ms; the tolerance is intentionally loose. Tighten it once you've measured your firmware's actual jitter.
### 4.3 test_mum_auto_addressing.py

Source: `tests/hardware/test_mum_auto_addressing.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_bsm_auto_addressing_changes_nad` | hardware mum slow | Drives the full BSM-SNPD sequence (INIT → 16× ASSIGN → STORE → FINALIZE) with a target NAD different from the ECU's current one, then re-reads ALM_Status and asserts `ALMNadNo == target`. Always restores the original NAD in a finally block (the restore result is recorded as report properties). Uses `lin.send_raw()` so the LIN 1.x Classic checksum is used — Enhanced would be silently rejected by the firmware. |

Notes:

- Marked `slow` because the full sequence runs in ~3-4 seconds (two BSM cycles plus settle). Skip with `-m "hardware and mum and not slow"`.
- Restore is best-effort: if the second BSM cycle fails, the bench stays at the target NAD. The restore failure is visible as `restore_warning` / `restore_error` in the report properties.
### 4.4 test_e2e_power_on_lin_smoke.py (DEPRECATED, BabyLIN-marked)

Source: `tests/hardware/test_e2e_power_on_lin_smoke.py`

Despite living in `tests/hardware/`, this file targets the deprecated BabyLIN adapter (it predates the MUM migration). See section 5.4.
## 5. Hardware – BabyLIN (DEPRECATED)

Retained only so existing BabyLIN rigs can keep running. New work should add tests under section 4 (Hardware – MUM).

Tests gated on `interface.type == "babylin"` (deprecated). Require:

- BabyLIN device + native libraries placed under `vendor/`
- An SDF compiled from your LDF, path supplied via `interface.sdf_path`
- For the E2E test: an Owon PSU on a serial port (the BabyLIN doesn't supply ECU power)
### 5.1 test_babylin_hardware_smoke.py

Source: `tests/test_babylin_hardware_smoke.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_babylin_connect_receive_timeout` | hardware babylin | Minimal sanity: open the BabyLIN device via the configured `lin` fixture and call `lin.receive(timeout=0.2)`. Accepts either a `LinFrame` or `None` (timeout) — verifies the adapter is functional and not crashing. |
### 5.2 test_babylin_hardware_schedule_smoke.py

Source: `tests/test_babylin_hardware_schedule_smoke.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_babylin_sdk_example_flow` | hardware babylin smoke | Verifies `interface.type == "babylin"` and an `sdf_path` is set, then exercises the receive path while the configured `schedule_nr` runs. Mirrors the vendor example flow (open / load SDF / start schedule / receive). Accepts either a frame or a timeout. |
### 5.3 test_hardware_placeholder.py

Source: `tests/test_hardware_placeholder.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_babylin_placeholder` | hardware babylin | Always passes. Used to confirm the marker filter and CI plumbing for hardware jobs without requiring any specific device behaviour. |
### 5.4 test_e2e_power_on_lin_smoke.py

Source: `tests/hardware/test_e2e_power_on_lin_smoke.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_e2e_power_on_then_cco_rgb_activate` | hardware babylin | Full BabyLIN E2E. Powers the ECU through the Owon PSU, switches to the LDF's CCO schedule via `lin.start_schedule("CCO")` (which resolves the schedule name to its index using `BLC_SDF_getScheduleNr`), publishes an ALM_Req_A payload with full-white RGB at full intensity, captures bus traffic for ~1 s, and asserts at least one frame was observed. Always disables PSU output in finally. |

Notes:

- This test was the original E2E target before the MUM migration. It still works as a BabyLIN smoke test if you flip `interface.type: babylin` and provide a valid SDF.
- The Owon PSU section of `config.power_supply` must be enabled (`port`, `set_voltage`, `set_current`, `do_set: true`).
## 6. Hardware – Owon PSU only

### 6.1 test_owon_psu.py

Source: `tests/hardware/test_owon_psu.py`

| Test | Markers | Purpose |
|---|---|---|
| `test_owon_psu_idn_and_measurements` | hardware | Read-only smoke against the session-managed PSU (opened by `tests/hardware/conftest.py`). Queries `*IDN?` (asserts non-empty; checks `idn_substr` if configured), `output?` (asserts ON — the session fixture parked it that way), and the parsed-numeric helpers `measure_voltage_v()` / `measure_current_a()`. Verifies measured voltage is within ±10% of `cfg.set_voltage`. |

Notes:

- Does not toggle the output — that would brown out the ECU and break every test that follows in the same session. The toggle path is exercised once at session start by the conftest fixture.
- Settings can live in `config/test_config.yaml` (central) or `config/owon_psu.yaml` (per-machine override; the latter wins).
## 7. Hardware – Voltage tolerance (PSU + LIN)

### 7.1 test_overvolt.py

Source: `tests/hardware/test_overvolt.py`

Drives the bench supply through known thresholds and observes `ALM_Status.ALMVoltageStatus` on the LIN bus. All cases use the SETUP / PROCEDURE / ASSERT / TEARDOWN four-phase pattern with a try/finally that restores nominal voltage. The session-scoped PSU stays open across every case; voltage is perturbed but output is never toggled.

Pattern (settle then validate). Each PROCEDURE goes through `apply_voltage_and_settle()` from `psu_helpers`: set the target, poll the PSU meter until the rail is actually there, then hold for `ECU_VALIDATION_TIME_S` so the firmware can detect and republish status. After that, a single deterministic read of `ALMVoltageStatus` gives the answer — no polling-the-bus race. See docs/14_power_supply.md for the full pattern reference.
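The settle loop at the heart of this pattern is small enough to sketch. `FakePsu` below is a stand-in bench supply invented for the example (it slews exponentially toward its setpoint); the real helper lives in `tests/hardware/psu_helpers.py` with the contract described in §8.4:

```python
import time


def wait_until_settled(psu, target_v, *, tol=0.10, interval=0.05, timeout=10.0):
    """Poll the meter until |measured - target| <= tol.
    Returns (elapsed_s, trace) on success or (None, trace) on timeout."""
    t0 = time.monotonic()
    trace = []
    while (elapsed := time.monotonic() - t0) < timeout:
        v = psu.measure_voltage_v()
        trace.append((elapsed, v))
        if abs(v - target_v) <= tol:
            return elapsed, trace
        time.sleep(interval)
    return None, trace


class FakePsu:
    """Stand-in supply whose rail slews toward the last setpoint."""
    def __init__(self, v=13.0):
        self.v = self.setpoint = v
    def set_voltage(self, v):
        self.setpoint = v
    def measure_voltage_v(self):
        self.v += 0.5 * (self.setpoint - self.v)
        return self.v


psu = FakePsu()
psu.set_voltage(18.0)                         # PROCEDURE: raise the rail
settled_s, trace = wait_until_settled(psu, 18.0)
assert settled_s is not None                  # rail actually reached ~18 V
time.sleep(0.1)   # stands in for the ECU_VALIDATION_TIME_S firmware hold
# ...then a single deterministic read of ALMVoltageStatus would follow.
```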
| Test | Markers | Purpose |
|---|---|---|
| `test_template_overvoltage_status` | hardware mum | Confirm baseline ALMVoltageStatus == Normal, then `apply_voltage_and_settle(OVERVOLTAGE_V, ECU_VALIDATION_TIME_S)`, single read of status, assert OverVoltage (0x02). Restore nominal and verify recovery to Normal. |
| `test_template_undervoltage_status` | hardware mum | Symmetric: apply UNDERVOLTAGE_V, settle + validation hold, assert UnderVoltage (0x01), restore. Failure message hints when the slave browns out before tripping the UV flag. |
| `test_template_voltage_status_parametrized[nominal\|overvoltage\|undervoltage]` | hardware mum | One parametrized walk over (voltage, expected_status, label). Each row runs SETUP/PROCEDURE/ASSERT/TEARDOWN independently via the autouse `_park_at_nominal` fixture. |

Report properties recorded per case:

- `psu_setpoint_v` — requested voltage
- `psu_settled_s` — measured PSU slew time (bench-dependent)
- `psu_final_v` — last measured voltage
- `validation_time_s` — firmware-side hold (ECU_VALIDATION_TIME_S)
- `voltage_status_after` — single status read used for the assertion
- `voltage_trace` — downsampled `(elapsed_s, v)` trace from the settle phase

Notes:

- Tune the constants at the top of the file to your firmware spec: `NOMINAL_VOLTAGE`, `OVERVOLTAGE_V`, `UNDERVOLTAGE_V`, `ECU_VALIDATION_TIME_S`.
- The autouse `_park_at_nominal` fixture also uses `apply_voltage_and_settle`, so the rail is measurably back at nominal before AND after every test — voltage cannot leak between cases.
- `cfg.power_supply.do_set` is no longer required (the session fixture owns the PSU lifecycle); `enabled: true` and a reachable port are sufficient.
### 7.2 test_psu_voltage_settling.py (opt-in: `-m psu_settling`)

Source: `tests/hardware/test_psu_voltage_settling.py`

Characterization test — measures how long the bench Owon PSU takes to actually deliver a new voltage at its terminals after a setpoint change. Other voltage-tolerance tests use the result to budget their detect timeouts. Marked `psu_settling` + `slow` so it stays out of default `-m hardware` runs unless explicitly selected.

| Test | Markers | Purpose |
|---|---|---|
| `test_psu_voltage_settling_time[13_to_18_OV]` | hardware psu_settling slow | Park PSU at 13 V (un-timed), then `set_voltage(18)` and poll `measure_voltage_v()` every 50 ms until within ±0.10 V of target or 10 s timeout. Records `settling_time_s` and a downsampled voltage trace. |
| `test_psu_voltage_settling_time[18_to_13_back]` | same | The return path: 18 V → 13 V. Slewing down often differs from slewing up; both numbers are useful for budgeting. |
| `test_psu_voltage_settling_time[13_to_7_UV]` | same | Nominal → undervoltage. |
| `test_psu_voltage_settling_time[7_to_13_back]` | same | Undervoltage → nominal. |

Notes:

- Run via `pytest -m psu_settling -s` to see the per-case timing in stdout.
- Per-case report properties: `settling_time_s`, `final_voltage_v`, `sample_count`, `voltage_trace` (downsampled to ~30 entries), plus the inputs (`start_voltage_v`, `target_voltage_v`, `voltage_tol_v`).
- Each case ends by restoring `NOMINAL_V` (13 V) so subsequent tests don't inherit a perturbed setpoint.
- Tune the four module-level constants (`VOLTAGE_TOL_V`, `POLL_INTERVAL_S`, `MAX_SETTLE_TIME_S`, `NOMINAL_V`) to your bench if defaults don't fit.
## 8. Hardware-test infrastructure (not collected as tests)

These files support the suite but are not test bodies:

### 8.1 tests/hardware/conftest.py

Session-scoped fixtures:

- `_psu_or_none` — opens the Owon PSU once via `resolve_port()` (cross-platform), parks at `cfg.power_supply.set_voltage` / `set_current`, enables output. Yields `OwonPSU` or `None` (tolerant: never raises out of fixture).
- `_psu_powers_bench` — `autouse=True`. Realizes `_psu_or_none` so even tests that don't request `psu` by name benefit from the session power-up.
- `psu` — public; skips cleanly when the PSU isn't available.

Tests must not call `psu.set_output(False)` or `psu.close()` — the conftest owns the lifecycle. See docs/14_power_supply.md §5.
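The fixture wiring described above has this shape. `try_open_psu()` is a placeholder invented for the sketch (standing in for the real `resolve_port()` + `OwonPSU` open); the fixture names and scopes mirror the ones this section documents:

```python
import pytest


def try_open_psu():
    """Placeholder: return an open OwonPSU, or None if unavailable."""
    return None


@pytest.fixture(scope="session")
def _psu_or_none():
    psu = try_open_psu()          # tolerant: never raises out of the fixture
    yield psu
    if psu is not None:           # conftest owns the lifecycle; tests never
        psu.close()               # call close() or set_output(False)


@pytest.fixture(scope="session", autouse=True)
def _psu_powers_bench(_psu_or_none):
    # Merely realizing _psu_or_none powers the bench, even for tests
    # that never request `psu` by name.
    return _psu_or_none


@pytest.fixture
def psu(_psu_or_none):
    if _psu_or_none is None:
        pytest.skip("Owon PSU not available")
    return _psu_or_none
```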
### 8.2 tests/hardware/frame_io.py — FrameIO

Generic LDF-driven I/O. Three layers (`send`/`receive`/`read_signal`, `pack`/`unpack`, `send_raw`/`receive_raw`) plus introspection (`frame_id`, `frame_length`). Reusable for any frame in any LDF — no ALM-specific knowledge.
### 8.3 tests/hardware/alm_helpers.py — AlmTester + constants

ALM_Node domain helpers built on FrameIO: `force_off`, `wait_for_state`, `measure_animating_window`, `read_led_state`, `assert_pwm_matches_rgb`, `assert_pwm_wo_comp_matches_rgb`. Plus pure utilities (`ntc_kelvin_to_celsius`, `pwm_within_tol`) and the LED-state / pacing / PWM-tolerance constants.
### 8.4 tests/hardware/psu_helpers.py — settle-then-validate primitives

Shared PSU helpers used by every test that changes the supply voltage:

- `wait_until_settled(psu, target_v, *, tol, interval, timeout)` — polls `psu.measure_voltage_v()` until within `tol` of `target_v`; returns `(elapsed_s, trace)`, or `(None, trace)` on timeout.
- `apply_voltage_and_settle(psu, target_v, *, validation_time, ...)` — composite: issues the setpoint, calls `wait_until_settled`, then sleeps `validation_time` so the firmware-side observer can detect and republish. Returns `{settled_s, validation_s, final_v, trace}`. Raises `AssertionError` if the PSU can't reach the target.
- `downsample_trace(trace, max_samples=30)` — utility to keep poll traces in report properties readable.

Module-level defaults: `DEFAULT_VOLTAGE_TOL_V = 0.10`, `DEFAULT_POLL_INTERVAL_S = 0.05`, `DEFAULT_SETTLE_TIMEOUT_S = 10.0`, `DEFAULT_VALIDATION_TIME_S = 1.0`.

Used by test_overvolt.py, test_psu_voltage_settling.py, and the _test_case_template_psu_lin.py template.
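The downsampling utility's contract is simple enough to sketch in full. This is an assumption about the sampling strategy — the real implementation may pick points differently, but the cap at `max_samples` and the pass-through of short traces match the behaviour documented above:

```python
# Keep at most max_samples roughly evenly-spaced points from a poll trace,
# so (elapsed_s, v) traces stay readable in report properties.
def downsample_trace(trace, max_samples=30):
    if len(trace) <= max_samples:
        return list(trace)
    step = len(trace) / max_samples
    return [trace[int(i * step)] for i in range(max_samples)]


# A 200-sample trace (one poll every 50 ms) shrinks to 30 entries:
trace = [(i * 0.05, 13.0 + i * 0.01) for i in range(200)]
short = downsample_trace(trace)
```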
### 8.5 Test starting points (leading underscore → not collected)

- `tests/hardware/_test_case_template.py` — three flavors (minimal / with isolation / single-signal probe) for ALM-touching MUM tests.
- `tests/hardware/_test_case_template_psu_lin.py` — three flavors (overvoltage / undervoltage / parametrized sweep) for tests that drive the PSU and observe the LIN bus.

Both contain pedagogical inline comments explaining fixture scopes, autouse, yield, the four-phase test pattern, and per-flavor when-to-use guidance. Copy to `test_<feature>.py` and edit.
## Test naming conventions

When adding new tests, follow these patterns so the catalog stays scannable:

- Unit tests live in `tests/unit/` and carry `@pytest.mark.unit`. Filename starts with `test_<thing>_<scope>` (e.g., `test_mum_adapter_mocked.py`).
- Mock smoke tests live in `tests/` and use either the in-process Mock adapter (override the `lin` fixture locally) or an injected SDK mock wrapper.
- Hardware tests live in `tests/hardware/` (preferred) or `tests/` (legacy) and carry `@pytest.mark.hardware` plus an adapter marker (`mum` for current work, `babylin` for the deprecated path).
- Slow tests (>5 s) carry `@pytest.mark.slow` so they can be excluded with `-m "not slow"`.
- Requirement traceability is via `req_NNN` markers on the test function and a `Requirements:` line in the docstring (parsed by the reporting plugin).
## Docstring format

The reporting plugin extracts these fields from each test's docstring and renders them in the HTML report:

```text
"""
Title: <short title>
Description:
    <what the test validates and why>
Requirements: REQ-001, REQ-002
Test Steps:
    1. <step one>
    2. <step two>
Expected Result:
    <succinct expected outcome>
"""
```

See docs/03_reporting_and_metadata.md and docs/15_report_properties_cheatsheet.md for the full schema.
## Related docs

- docs/12_using_the_framework.md — How to actually run the various suites
- docs/04_lin_interface_call_flow.md — What `send`/`receive` do per adapter
- docs/16_mum_internals.md — MUM adapter implementation details
- docs/17_ldf_parser.md — `ldf` fixture and `Frame.pack`/`unpack`
- docs/06_requirement_traceability.md — How `req_NNN` markers feed the coverage JSON