DOI: 10.1145/2642937.2643019

An empirical evaluation and comparison of manual and automated test selection

Published: 15 September 2014
  Abstract

    Regression test selection speeds up regression testing by re-running only the tests that can be affected by the most recent code changes. Research on automated test selection has made much progress over the last three decades, but it has not translated into practical tools that are widely adopted. Developers therefore either re-run all tests after each change or perform manual test selection. Re-running all tests is expensive, while manual test selection is tedious and error-prone. Despite this significant trade-off, no prior study has assessed how developers perform manual test selection or compared it to automated test selection.
    This paper reports on our study of manual test selection in practice and our comparison of manual and automated test selection. We are the first to conduct a study that (1) analyzes data from manual test selection, collected in real time from 14 developers during a three-month study and (2) compares manual test selection with an automated state-of-the-research test-selection tool for 450 test sessions.
    Almost all developers in our study performed manual test selection, and they did so in mostly ad-hoc ways. Comparing manual and automated test selection, we found the two approaches to select different tests in each and every one of the 450 test sessions investigated. Manual selection chose more tests than automated selection 73% of the time (potentially wasting time) and chose fewer tests 27% of the time (potentially missing bugs). These results show the need for better automated test-selection techniques that integrate well with developers' programming environments.
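
    The core selection idea described above — re-running only the tests that can be affected by the most recent changes — can be sketched as follows. This is a minimal illustration, not the paper's tool: the test names, code units, and dependency map are all hypothetical, and a real tool would compute such dependencies from coverage data or static analysis rather than hard-coding them.

    ```python
    # Minimal sketch of dependency-based regression test selection.
    # All names below (tests, code units, the dependency map) are hypothetical.

    def select_tests(changed_units, test_dependencies):
        """Return tests whose tracked dependencies overlap the changed code units."""
        changed = set(changed_units)
        return sorted(test for test, deps in test_dependencies.items()
                      if changed & set(deps))

    # Hypothetical map from each test to the code units it exercises.
    deps = {
        "CartTest":    {"Cart", "Item"},
        "PaymentTest": {"Payment", "Cart"},
        "UserTest":    {"User"},
    }

    print(select_tests({"Cart"}, deps))  # only the tests touching Cart are selected
    ```

    A safe technique must over-approximate this dependency map (select every test that *could* be affected); the study's comparison hinges on how far manual choices deviate from that safe set in both directions.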



    Published In

    ASE '14: Proceedings of the 29th ACM/IEEE International Conference on Automated Software Engineering
    September 2014
    934 pages
    ISBN:9781450330138
    DOI:10.1145/2642937
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. regression testing
    2. software quality
    3. test selection

    Qualifiers

    • Research-article

    Conference

    ASE '14

    Acceptance Rates

    ASE '14 paper acceptance rate: 82 of 337 submissions (24%)
    Overall acceptance rate: 82 of 337 submissions (24%)


    Cited By

    • (2023) HybridCISave: A Combined Build and Test Selection Approach in Continuous Integration. ACM Transactions on Software Engineering and Methodology 32(4), 1-39. DOI: 10.1145/3576038. Online publication date: 26-May-2023.
    • (2023) Inline Tests. 37th IEEE/ACM International Conference on Automated Software Engineering, 1-13. DOI: 10.1145/3551349.3556952. Online publication date: 5-Jan-2023.
    • (2023) A Case Study on the “Jungle” Search for Industry-Relevant Regression Testing. 2023 IEEE 23rd International Conference on Software Quality, Reliability, and Security (QRS), 382-393. DOI: 10.1109/QRS60937.2023.00045. Online publication date: 22-Oct-2023.
    • (2023) On factors that impact the relationship between code coverage and test suite effectiveness: a survey. 2023 IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), 381-388. DOI: 10.1109/ICSTW58534.2023.00071. Online publication date: Apr-2023.
    • (2023) An information retrieval-based regression test selection technique. Iran Journal of Computer Science 6(4), 365-373. DOI: 10.1007/s42044-023-00145-w. Online publication date: 15-May-2023.
    • (2022) Comparing and combining file-based selection and similarity-based prioritization towards regression test orchestration. Proceedings of the 3rd ACM/IEEE International Conference on Automation of Software Test, 115-125. DOI: 10.1145/3524481.3527223. Online publication date: 17-May-2022.
    • (2022) Software Regression Testing in Industrial Settings: Preliminary Findings from a Literature Review. Trends in Artificial Intelligence and Computer Engineering, 227-237. DOI: 10.1007/978-3-030-96147-3_18. Online publication date: 10-Feb-2022.
    • (2020) Using Relative Lines of Code to Guide Automated Test Generation for Python. ACM Transactions on Software Engineering and Methodology 29(4), 1-38. DOI: 10.1145/3408896. Online publication date: 26-Sep-2020.
    • (2020) Risk-Based Test Case Prioritization by Correlating System Methods and Their Associated Risks. Arabian Journal for Science and Engineering 45(8), 6125-6138. DOI: 10.1007/s13369-020-04472-z. Online publication date: 2-Apr-2020.
    • (2019) Developer Testing in the IDE. IEEE Transactions on Software Engineering 45(3), 261-284. DOI: 10.1109/TSE.2017.2776152. Online publication date: 1-Mar-2019.
