Practical Test Dependency Detection

Alessio Gambi, Jonathan Bell, Andreas Zeller

Publication: Chapter in book/report/conference proceedings › Conference contribution › Peer-reviewed

Abstract

Regression tests should consistently produce the same outcome when executed against the same version of the system under test. Recent studies, however, paint a different picture: in many cases, simply changing the order in which tests execute is enough to produce different test outcomes. These studies identify dependencies between tests as one likely cause of this behavior. Test dependencies degrade the quality of tests and of related development activities, such as regression test selection, prioritization, and parallelization, which assume that tests are independent. Developers must therefore promptly identify and resolve problematic test dependencies. This paper presents PRADET, a novel approach for detecting problematic dependencies that is both effective and efficient. PRADET uses a systematic, data-driven process to detect problematic test dependencies significantly faster and more precisely than prior work. PRADET scales to large projects with thousands of tests that existing tools cannot analyze in a reasonable amount of time, and it found 27 previously unknown dependencies.
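The kind of order-dependent behavior the abstract describes can be sketched with a minimal, hypothetical example (not taken from the paper): two tests share mutable static state, so swapping their execution order flips the outcome of one of them.

```java
// Hypothetical illustration of a test dependency: testIsZero implicitly
// assumes it runs before testIncrement, because both touch shared state.
public class OrderDependentTests {
    static int counter = 0;          // shared mutable state = hidden dependency

    static boolean testIncrement() { // writer: bumps the counter
        counter++;
        return counter == 1;         // passes only on a fresh counter
    }

    static boolean testIsZero() {    // reader: assumes an untouched counter
        return counter == 0;         // fails if testIncrement ran first
    }

    public static void main(String[] args) {
        // Order A: reader first -> both tests pass
        System.out.println(testIsZero() && testIncrement()); // prints: true
        counter = 0;                                         // reset the state
        // Order B: writer first -> reader now fails
        System.out.println(testIncrement() && testIsZero()); // prints: false
    }
}
```

Tools like PRADET aim to surface exactly this kind of hidden dependency automatically, rather than leaving it to be discovered when a test runner happens to reorder the suite.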

Original language: English
Title: Proceedings - 2018 IEEE 11th International Conference on Software Testing, Verification and Validation, ICST 2018
Publisher: IEEE Computer Society
Pages: 1-11
Number of pages: 11
ISBN (electronic): 9781538650127
DOIs
Publication status: Published - 25 May 2018
Published externally: Yes

Publication series

Name: Proceedings - 2018 IEEE 11th International Conference on Software Testing, Verification and Validation, ICST 2018
