
Camilli, M., Gargantini, A. M., Scandurra, P., & Trubiani, C. (2021). Uncertainty-aware Exploration in Model-based Testing. Retrieved from http://hdl.handle.net/10446/190968

Uncertainty-aware Exploration in Model-based Testing

Camilli, Matteo; Gargantini, Angelo Michele; Scandurra, Patrizia; Trubiani, Catia
2021-01-01

Abstract

Modern software systems operate in complex and changing environments and are exposed to multiple sources of uncertainty. Testing methods should treat uncertainty as a first-class concern in order to quantify it and deliver increased confidence in the level of assurance of the final product. In this paper, we introduce novel model-based exploration strategies that generate test cases targeting uncertain components of the system under test. Our testing framework leverages Markov Decision Processes as the modeling formalism of choice. The tester explicitly specifies uncertainty by means of beliefs attached to transition probabilities. The structural properties of the model and the uncertainty specification are then exploited to drive the test case generation process. Bayesian inference is used to achieve this objective by updating the initial beliefs with the evidence collected by testing. The proposed uncertainty-aware test selection strategies have been systematically evaluated on three realistic benchmarks and nine synthetic systems with up to 10k model transitions. We demonstrate the effectiveness of the novel strategies with well-established metrics. Results show that they outperform existing testing methods, with a gain of up to 2.65× in the accuracy of the inference process.
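
As a rough illustration of the mechanism sketched in the abstract, the Python snippet below shows one way beliefs over MDP transition probabilities could be represented as Dirichlet distributions, updated with observed test evidence, and then used to pick the most uncertain state-action pair as the next testing target. All names, priors, and counts here are hypothetical; the paper's actual uncertainty specification and selection strategies may differ.

```python
import numpy as np

def dirichlet_update(prior_alphas, observed_counts):
    """Posterior Dirichlet parameters after observing transition counts."""
    return np.asarray(prior_alphas, dtype=float) + np.asarray(observed_counts, dtype=float)

def posterior_mean_and_var(alphas):
    """Mean and variance of each transition probability under a Dirichlet belief."""
    a0 = alphas.sum()
    mean = alphas / a0
    var = alphas * (a0 - alphas) / (a0**2 * (a0 + 1))
    return mean, var

# Beliefs for two hypothetical state-action pairs of the MDP model:
# each entry holds Dirichlet parameters over the successor states and
# the transition counts observed during test execution so far.
beliefs = {
    ("s0", "a"): {"alphas": np.array([1.0, 1.0]),      "counts": np.array([14, 6])},
    ("s1", "b"): {"alphas": np.array([1.0, 1.0, 1.0]), "counts": np.array([1, 1, 0])},
}

# Bayesian update: fold the collected evidence into each belief.
for key, b in beliefs.items():
    b["alphas"] = dirichlet_update(b["alphas"], b["counts"])

# A naive uncertainty-aware choice: steer the next test case towards the
# state-action pair whose estimated transition probabilities are still the
# most uncertain (here, largest total posterior variance).
def most_uncertain(beliefs):
    def total_var(b):
        _, var = posterior_mean_and_var(b["alphas"])
        return var.sum()
    return max(beliefs, key=lambda k: total_var(beliefs[k]))

print("next target:", most_uncertain(beliefs))   # -> ('s1', 'b'), barely observed
```

The conjugate Dirichlet-Multinomial pairing keeps the posterior update to a simple addition of counts, which is why it is a natural, though here merely assumed, choice for beliefs over categorical transition outcomes.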
File attached to the record:
2021-ICST-uncertainty-aware-MBT.pdf
Access: archive administrators only
Version: postprint (accepted/refereed version)
License: Aisberg default license
File size: 442.04 kB
Format: Adobe PDF


Use this identifier to cite or link to this document: https://hdl.handle.net/10446/190968
Citations
  • Scopus 10
  • ISI Web of Science 9