Authors: M. Zhang, S. Ali, T. Yue and R. Norgren
Title: Interactively Evolving Test Ready Models with Uncertainty Developed for Testing Cyber-Physical Systems
Affiliation: Software Engineering
Status: Published
Publication Type: Technical reports
Year of Publication: 2016
Publisher: Simula Research Laboratory
Keywords: Belief State Machine, Cyber-Physical System, Model Evolution, Model-based Testing, Uncertainty
Abstract

Context: Cyber-Physical Systems (CPSs), when deployed for operation, are inherently prone to uncertainty. Considering their applications in critical domains (e.g., healthcare), it is important that such CPSs are tested sufficiently, with explicit consideration of uncertainty. In the context of model-based testing (MBT), test ready models are developed to capture the expected behavior of a CPS and its operating environment. As a critical component of an MBT methodology, test ready models are used for generating executable test cases. It is therefore necessary to develop methods that can continuously evolve test ready models and the uncertainty captured in them (together termed belief test ready models), based on real operational data collected during the operation of CPSs.

Objective: Our objective is to propose a model evolution framework that can interactively improve the quality of belief test ready models based on real operational data. Such belief test ready models are developed by one or more test engineers/modelers (belief agents), based on their assumptions about the expected behavior of a CPS, its expected physical environment, and potential future deployments. These models therefore explicitly capture the subjective uncertainty of the test modelers.

Method: We propose a framework (named UncerTolve) for interactively evolving belief test ready models of CPSs (specified with extended UML notations) that contain the subjective uncertainty of test modelers. The key inputs of UncerTolve are the initial belief test ready models of a CPS with known subjective uncertainty and real data collected from the operation of the CPS. UncerTolve has three key features: 1) validating the syntactic correctness and conformance of belief test ready models against real operational data via model execution, 2) evolving objective uncertainty measurements of belief test ready models via model execution, and 3) evolving state invariants (modeling test oracles) and guards of transitions (modeling constraints for test data generation) of belief test ready models with a machine learning technique.
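To illustrate the second feature, the following is a minimal, hypothetical sketch of how objective uncertainty measurements could be re-estimated from operational traces. The report specifies belief test ready models with extended UML notations, so the Python classes, field names, and the frequency-based estimate below are illustrative assumptions, not UncerTolve's actual implementation.

```python
# Hypothetical sketch: a belief state machine whose transitions carry the
# modelers' subjective uncertainty values, plus a routine that replays real
# operational traces to derive objective uncertainty measurements.
# All names and structures are illustrative assumptions.
from collections import Counter
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class BeliefTransition:
    source: str
    target: str
    event: str
    subjective_uncertainty: float                   # modeler's belief, e.g. 0.7
    objective_uncertainty: Optional[float] = None   # to be learned from data


@dataclass
class BeliefStateMachine:
    initial: str
    transitions: list[BeliefTransition] = field(default_factory=list)

    def evolve_objective_uncertainty(self, traces):
        """Re-estimate each transition's uncertainty as the relative frequency
        of its event among all events observed from its source state."""
        fired = Counter()    # (state, event) -> observed occurrences
        visits = Counter()   # state -> total events observed from it
        for trace in traces:             # a trace is a list of event names
            state = self.initial         # each trace restarts from the start
            for event in trace:
                nxt = next((t for t in self.transitions
                            if t.source == state and t.event == event), None)
                if nxt is None:          # non-conforming observation: skip
                    continue
                fired[(state, event)] += 1
                visits[state] += 1
                state = nxt.target
        for t in self.transitions:
            if visits[t.source]:
                t.objective_uncertainty = fired[(t.source, t.event)] / visits[t.source]
```

Under these assumptions, a modeler could compare the subjective and the measured values to decide which parts of the belief test ready model need re-inspection.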

Results: As a proof of concept, we evaluated UncerTolve with one industrial CPS case study, GeoSports, from the healthcare domain. Using UncerTolve, we evolved 51% of the belief elements, 18% of the states, and 21% of the transitions of the initial belief test ready model developed in an industrial setting.

Conclusion: Based on the results, we conclude that UncerTolve can successfully evolve both the model elements of the initial belief test ready model and its objective uncertainty measurements using real operational data. The evolved model can be used to generate a new set of test cases covering the evolved model elements and objective uncertainty, in addition to the test cases generated from the initial belief test ready model. These additional test cases can be used to test current and future deployments of a CPS to ensure that it handles uncertainty gracefully during operation.
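As a purely illustrative follow-up to the sketch above (again an assumption, not the report's method), transitions whose measured uncertainty deviates noticeably from the modeler's belief could be selected as additional coverage targets when generating the new test cases:

```python
# Hypothetical helper building on the sketch above: pick transitions whose
# objective uncertainty deviates from the subjective belief, to serve as
# additional coverage targets for test generation.
def evolved_transitions(machine: BeliefStateMachine, tolerance: float = 0.1):
    return [t for t in machine.transitions
            if t.objective_uncertainty is not None
            and abs(t.objective_uncertainty - t.subjective_uncertainty) > tolerance]
```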

Notes

This work is supported by the U-Test project (Testing Cyber-Physical Systems under Uncertainty).

Citation Key: 24660