
ECU Tests Framework

Python-based ECU testing framework built on pytest, with a pluggable LIN communication layer (Mock and BabyLIN), configuration via YAML, and enhanced HTML/XML reporting with rich test metadata.

Highlights

  • Mock LIN adapter for fast, hardware-free development
  • Real BabyLIN adapter using the SDK's official Python wrapper (BabyLIN_library.py)
  • Hex flashing scaffold you can wire to UDS/XCP
  • Rich pytest fixtures and example tests
  • Self-contained HTML report with Title, Requirements, Steps, and Expected Results extracted from test docstrings
  • JUnit XML report for CI/CD
  • Using the framework (common runs, markers, CI, Pi): docs/12_using_the_framework.md
  • Plugin overview (reporting, hooks, artifacts): docs/11_conftest_plugin_overview.md

TL;DR quick start (copy/paste)

Mock (no hardware):

python -m venv .venv; .\.venv\Scripts\Activate.ps1; pip install -r requirements.txt; pytest -m "not hardware" -v

Hardware (BabyLIN SDK):

# Place BabyLIN_library.py and native libs under .\vendor per vendor/README.md first
$env:ECU_TESTS_CONFIG = ".\config\babylin.example.yaml"; pytest -m "hardware and babylin" -v

Quick start (Windows PowerShell)

  1. Create a virtual environment and install dependencies
python -m venv .venv
.\.venv\Scripts\Activate.ps1
pip install -r requirements.txt
  2. Run the mock test suite (default interface)
python -m pytest -m "not hardware" -v
  3. View the reports
  • HTML: reports/report.html
  • JUnit XML: reports/junit.xml

Tip: You can change the output paths via the --html and --junitxml CLI options.
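
The default report locations and the reporting plugin are wired through pytest.ini. A minimal sketch of what the relevant entries might look like (exact values and marker descriptions are assumptions; check pytest.ini in the repo):

[pytest]
addopts = -p conftest_plugin --html=reports/report.html --self-contained-html --junitxml=reports/junit.xml
markers =
    smoke: basic functionality checks
    hardware: requires connected BabyLIN hardware
    babylin: exercises the BabyLIN SDK adapter
    req_001: covers requirement REQ-001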

Reporting: Metadata in HTML

We extract these fields from each test's docstring and render them in the HTML report:

  • Title
  • Description
  • Requirements (e.g., REQ-001)
  • Test Steps
  • Expected Result

Markers like smoke, hardware, and req_00x are also displayed.

Example docstring format used by the plugin:

"""
Title: Mock LIN Interface - Send/Receive Echo Test

Description: Validates basic send/receive functionality using the mock LIN interface with echo behavior.

Requirements: REQ-001, REQ-003

Test Steps:
1. Connect to mock interface
2. Send frame ID 0x01 with data [0x55]
3. Receive the echo within 100ms
4. Assert ID and data integrity

Expected Result:
- Echoed frame matches sent frame
"""

Configuration

Default config is config/test_config.yaml. Override via the ECU_TESTS_CONFIG environment variable.

$env:ECU_TESTS_CONFIG = (Resolve-Path .\config\test_config.yaml)
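
A minimal, illustrative sketch of a loader that follows this resolution order (the real implementation lives in ecu_framework/config.py; the function name and the explicit-path parameter are assumptions, and PyYAML is assumed to be installed via requirements.txt):

import os
from pathlib import Path
from typing import Optional

import yaml  # PyYAML (assumed dependency)

DEFAULT_CONFIG = Path("config") / "test_config.yaml"

def load_config(path: Optional[str] = None) -> dict:
    # Honor an explicit path first, then ECU_TESTS_CONFIG, then the default file.
    config_path = Path(path or os.environ.get("ECU_TESTS_CONFIG") or DEFAULT_CONFIG)
    with config_path.open("r", encoding="utf-8") as fh:
        return yaml.safe_load(fh) or {}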

BabyLIN configuration template: config/babylin.example.yaml

interface:
  type: babylin          # or "mock"
  channel: 0             # Channel index used by the SDK wrapper
  bitrate: 19200         # Usually determined by SDF
  sdf_path: ./vendor/Example.sdf
  schedule_nr: 0         # Start this schedule on connect

Switch to hardware profile and run only hardware tests:

$env:ECU_TESTS_CONFIG = (Resolve-Path .\config\babylin.example.yaml)
python -m pytest -m hardware -v

Project structure

ecu_tests/
├── ecu_framework/
│   ├── config.py                # YAML config loader
│   ├── lin/
│   │   ├── base.py             # LinInterface + LinFrame
│   │   ├── mock.py             # Mock LIN adapter
│   │   └── babylin.py          # BabyLIN SDK-wrapper adapter (uses BabyLIN_library.py)
│   └── flashing/
│       └── hex_flasher.py      # Hex flashing scaffold
├── tests/
│   ├── conftest.py             # Shared fixtures
│   ├── test_smoke_mock.py      # Mock interface smoke and boundary tests
│   ├── test_babylin_hardware_smoke.py   # Hardware smoke tests
│   └── test_hardware_placeholder.py     # Future integration tests
├── config/
│   ├── test_config.yaml        # Default config
│   └── babylin.example.yaml    # Hardware template
├── vendor/                     # Place SDK wrapper and platform libs here
│   ├── BabyLIN_library.py      # Official SDK Python wrapper
│   └── BabyLIN library/        # Platform-specific binaries from SDK (DLL/.so)
├── reports/                    # Generated reports
│   ├── report.html
│   └── junit.xml
├── conftest_plugin.py          # HTML metadata extraction & rendering
├── pytest.ini                  # Markers and default addopts
├── requirements.txt
└── README.md
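
The mock and BabyLIN adapters both implement the common contract defined in ecu_framework/lin/base.py. A rough sketch of that shape, for orientation (actual attribute and method names may differ):

from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class LinFrame:
    # One LIN frame: identifier plus up to 8 data bytes.
    frame_id: int
    data: List[int] = field(default_factory=list)

class LinInterface(ABC):
    # Contract implemented by both mock.py and babylin.py.
    @abstractmethod
    def connect(self) -> None: ...

    @abstractmethod
    def disconnect(self) -> None: ...

    @abstractmethod
    def send_frame(self, frame: LinFrame) -> None: ...

    @abstractmethod
    def receive_frame(self, timeout: float = 0.1) -> Optional[LinFrame]: ...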

Usage recipes

  • Run everything (mock and any non-hardware tests):
python -m pytest -v
  • Run by marker:
python -m pytest -m "smoke" -v
python -m pytest -m "req_001" -v
  • Run in parallel (requires pytest-xdist):
python -m pytest -n auto -v
  • Run the plugin self-test (verifies reporting artifacts under reports/):
python -m pytest tests\plugin\test_conftest_plugin_artifacts.py -q
  • Generate separate HTML/JUnit reports for unit vs non-unit tests:
./scripts/run_two_reports.ps1
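
Putting the pieces together, a new mock-interface test combines markers, a fixture from tests/conftest.py, and the docstring format shown earlier. The sketch below assumes a fixture named lin_interface and the LinFrame/method names from the project-structure section; adapt them to the real fixtures in conftest.py:

import pytest
from ecu_framework.lin.base import LinFrame

@pytest.mark.smoke
@pytest.mark.req_001
def test_echo_roundtrip(lin_interface):
    """
    Title: Mock LIN Interface - Echo Roundtrip

    Description: Sends one frame and expects the mock to echo it back.

    Requirements: REQ-001

    Test Steps:
    1. Send frame ID 0x01 with data [0x55]
    2. Receive the echo within 100ms

    Expected Result:
    - Echoed frame matches the sent frame
    """
    sent = LinFrame(frame_id=0x01, data=[0x55])
    lin_interface.send_frame(sent)
    echoed = lin_interface.receive_frame(timeout=0.1)
    assert echoed is not None
    assert echoed.frame_id == sent.frame_id and echoed.data == sent.data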

BabyLIN adapter notes

The ecu_framework/lin/babylin.py implementation uses the official BabyLIN_library.py wrapper from the SDK. Put BabyLIN_library.py under vendor/ (or on PYTHONPATH) along with the SDK's platform-specific libraries. Configure sdf_path and schedule_nr to load an SDF and start a schedule during connect. The adapter sends frames via BLC_mon_set_xmit and receives via BLC_getNextFrameTimeout.
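
Because both adapters share the same interface contract, helpers used in tests can stay hardware-agnostic: the mock echoes frames immediately, while the BabyLIN adapter ultimately polls BLC_getNextFrameTimeout under the hood. An illustrative helper along those lines (names follow the sketch in the project-structure section and are assumptions):

import time
from typing import Optional

def wait_for_frame(lin, frame_id: int, timeout: float = 1.0) -> Optional["LinFrame"]:
    # Poll the interface until a frame with the given ID arrives or the deadline passes.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        frame = lin.receive_frame(timeout=0.1)
        if frame is not None and frame.frame_id == frame_id:
            return frame
    return None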

Docs and references

  • Guide: TESTING_FRAMEWORK_GUIDE.md (deep dive with examples and step-by-step flows)
  • Reports: reports/report.html and reports/junit.xml (generated on each run)
  • CI summary: reports/summary.md (machine-friendly run summary)
  • Requirements coverage: reports/requirements_coverage.json (requirement → tests mapping)
    • Tip: Open the HTML report on Windows with: start .\reports\report.html
  • Configs: config/test_config.yaml, config/babylin.example.yaml (copy and modify for your environment)
  • BabyLIN SDK placement and notes: vendor/README.md
  • Docs index: docs/README.md (run sequence, config resolution, reporting, call flows)
  • Raspberry Pi deployment: docs/09_raspberry_pi_deployment.md
  • Build custom Pi image: docs/10_build_custom_image.md
  • Pi scripts: scripts/pi_install.sh, scripts/ecu-tests.service, scripts/ecu-tests.timer, scripts/run_tests.sh

Troubleshooting

  • HTML report missing columns: ensure pytest.ini includes -p conftest_plugin in addopts.
  • ImportError for BabyLIN_library: verify vendor/BabyLIN_library.py placement and that required native libraries (DLL/.so) from the SDK are available on PATH/LD_LIBRARY_PATH.
  • Permission errors in PowerShell: run the venv's full Python path or adjust ExecutionPolicy for scripts.
  • Import errors: activate the venv and reinstall dependencies with pip install -r requirements.txt.

Next steps

  • Plug in the actual BabyLIN DLL and verify the hardware smoke tests
  • Replace HexFlasher with a production flashing routine (UDS/XCP)
  • Expand tests for end-to-end ECU workflows and requirement coverage