Verification at Runtime

Runtime verification (RV) is a key technology for ensuring the safety and reliability of Autonomous Driving Systems (ADS), which operate under uncertain and dynamic conditions. By evaluating system data in real time, RV detects potential faults or inconsistencies as they arise, offering a balance between safety and performance. We plan to integrate RV into our ISS platform to further ensure the safety of ADS; this is one of our ongoing projects. RV is briefly introduced below.

Core Concepts and Key Technologies

Formal Specification and Modeling

The foundation of runtime verification lies in formal specifications that define the safety and functional constraints that autonomous driving systems must adhere to. Using formal languages such as Linear Temporal Logic (LTL) and Probabilistic Computation Tree Logic (PCTL), researchers can accurately describe complex system behaviors.
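To illustrate how such a specification can be checked online, the sketch below monitors an LTL-style response property, G(obstacle_detected → F≤N brake_applied): whenever an obstacle is detected, braking must follow within N steps. The signal names, the deadline, and the trace are illustrative assumptions, not part of any cited system.

```python
# Minimal sketch of a runtime monitor for a bounded-response property:
# every obstacle detection must be answered by braking within `deadline` steps.
class ResponseMonitor:
    def __init__(self, deadline: int):
        self.deadline = deadline  # max steps allowed between trigger and response
        self.pending = None       # step index of an unanswered trigger, if any
        self.step = 0
        self.violated = False

    def observe(self, obstacle_detected: bool, brake_applied: bool) -> bool:
        """Feed one observation; returns True while the property still holds."""
        if self.pending is not None and brake_applied:
            self.pending = None                         # obligation discharged
        if obstacle_detected and self.pending is None and not brake_applied:
            self.pending = self.step                    # new obligation opened
        if self.pending is not None and self.step - self.pending >= self.deadline:
            self.violated = True                        # deadline missed
        self.step += 1
        return not self.violated

# Example trace: obstacle at step 0, braking at step 2 (within deadline 3).
monitor = ResponseMonitor(deadline=3)
trace = [(True, False), (False, False), (False, True), (False, False)]
ok = all(monitor.observe(o, b) for o, b in trace)
```

Encoding the property as a small state machine keeps per-step cost constant, which is what makes this style of monitor feasible on embedded targets.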

Moreover, modeling tools like UPPAAL-SMC and TeSSLa offer robust technical support for runtime verification [1]. UPPAAL-SMC efficiently validates systems against probabilistic temporal logic, while TeSSLa monitors temporal properties of real-time data streams. In a study of the AUTOSAR Timing Extensions [2], researchers used TeSSLa to model and monitor timing constraints, including response times and signal-processing delays.

Through formal specification and modeling, runtime verification can identify potential risks in complex and dynamic environments, providing robust assurances for system safety.

Data-Driven Anomaly Detection

Data-driven anomaly detection is central to runtime verification. Modern autonomous driving systems rely on deep learning models for environmental perception and decision-making, making it essential to monitor these models’ behavior to ensure system reliability. Studies show that monitoring activation patterns in neural networks significantly enhances anomaly detection capabilities.

For example, in 2D object detection tasks, researchers introduced activation pattern preprocessing techniques that enhance anomaly detection by removing redundant information and noise [3]. Experiments on the BDD and KITTI datasets show that this preprocessing reduced the false negative rate by 14% and improved overall performance by 3%. Furthermore, in 3D point cloud detection, combining activation patterns from multiple network layers not only improved detection performance but also enhanced robustness in complex driving scenarios [4].
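The core idea behind activation-pattern monitoring can be sketched as follows: abstract a layer's activations into a binary on/off pattern during nominal runs, then flag inputs whose pattern lies far (in Hamming distance) from every recorded one. This is a simplified illustration with assumed thresholds, not the exact method of the cited papers.

```python
# Hedged sketch of activation-pattern anomaly detection: a layer's activation
# vector is abstracted to an on/off pattern; inputs whose pattern was never
# seen (up to a small Hamming distance) during nominal operation are flagged.
def binarize(activations, tau=0.0):
    """Abstract an activation vector to a binary on/off pattern."""
    return tuple(1 if a > tau else 0 for a in activations)

def hamming(p, q):
    """Number of positions where two patterns differ."""
    return sum(a != b for a, b in zip(p, q))

class ActivationMonitor:
    def __init__(self, max_distance=1):
        self.known = set()               # patterns observed on nominal data
        self.max_distance = max_distance # tolerance for near-matches

    def fit(self, nominal_activations):
        for vec in nominal_activations:
            self.known.add(binarize(vec))

    def is_anomalous(self, activations):
        pattern = binarize(activations)
        return all(hamming(pattern, k) > self.max_distance for k in self.known)

# Illustrative nominal activations (hypothetical values for a 3-unit layer).
monitor = ActivationMonitor(max_distance=1)
monitor.fit([[0.9, 0.0, 0.4], [0.8, 0.1, 0.0]])
```

In practice the binarization threshold, the distance tolerance, and which layers to tap would all be tuned on held-out nominal data; the cited work additionally preprocesses the patterns before comparison.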

Algorithm Optimization and Resource Requirements

Runtime verification often operates on resource-constrained embedded devices, demanding lightweight algorithm designs. For example, the TeSSLa language reduces computational demands through stream processing while achieving efficient timing constraint verification.
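A stream-style timing check of the kind TeSSLa expresses can be sketched in a few lines (written here in plain Python, not TeSSLa syntax): events are consumed one at a time in constant memory, and each request whose response misses its deadline is reported. Event kinds and the deadline are illustrative assumptions.

```python
# Illustrative stream monitor for a response-time constraint: report the
# latency of every request whose response arrives after `deadline_ms`,
# processing the event stream incrementally in constant memory.
def check_response_times(events, deadline_ms):
    """events: iterable of (timestamp_ms, kind) with kind in {'req', 'resp'};
    yields the latency of each deadline-violating request."""
    pending = None  # timestamp of the oldest unanswered request
    for t, kind in events:
        if kind == 'req' and pending is None:
            pending = t
        elif kind == 'resp' and pending is not None:
            latency = t - pending
            pending = None
            if latency > deadline_ms:
                yield latency

# Example stream: first request answered in 5 ms, second in 30 ms.
violations = list(check_response_times(
    [(0, 'req'), (5, 'resp'), (10, 'req'), (40, 'resp')], deadline_ms=20))
```

Because the monitor keeps only one pending timestamp rather than the full event history, its memory footprint is independent of stream length, which is the property that makes stream-based verification attractive on embedded hardware.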

Moreover, while multi-layer activation pattern fusion improves detection performance, it increases computational complexity. Future directions include hardware acceleration (e.g., FPGA) or developing more efficient feature extraction algorithms to balance performance and resource demands.

Challenges and Future Directions

Data Complexity and Robustness

Autonomous driving systems must handle highly dynamic and complex data environments. Future research should develop more robust verification algorithms to adapt to various dynamic scenarios.

Algorithm Optimization and Real-Time Performance

Runtime verification must operate under low latency, making computational complexity a critical challenge. Future directions include hardware acceleration technologies, such as FPGA and heterogeneous computing.

References

  1. An D, Liu J, Zhang M, et al. Uncertainty modeling and runtime verification for autonomous vehicles driving control: A machine learning-based approach. Journal of Systems and Software 167 (2020): 110617. 

  2. Friese M J, Kallwies H, Leucker M, et al. Runtime verification of AUTOSAR timing extensions. 30th International Conference on Real-Time Networks and Systems (RTNS). ACM, 2022: 173-183. 

  3. Yatbaz H Y, Dianati M, Koufos K, et al. Introspection of 2d object detection using processed neural activation patterns in automated driving systems. IEEE/CVF International Conference on Computer Vision Workshops (ICCVW). IEEE, 2023: 4049-4056. 

  4. Yatbaz H Y, Dianati M, Koufos K, et al. Run-time monitoring of 3D object detection in automated driving systems using early layer neural activation patterns. IEEE/CVF International Conference on Computer Vision Workshops (ICCVW). IEEE, 2024: 3522-3531.