PH.D. DEFENCE - PUBLIC SEMINAR

Analyzing the Behavior of Deployed Software

Speaker
Mr Marc Brunink (NGS PhD student)
Advisor
Dr David S. Rosenblum, Provost's Chair Professor, School of Computing


Thursday, 25 Jan 2018, 09:30 AM to 11:00 AM

Executive Classroom, COM2-04-02

Abstract:

Software is routinely tested before release. Testing aims to prevent functional and non-functional failures during deployment, and a thorough testing process instills confidence in the software artifact. However, to cope with the complexity of software systems as well as budget and time constraints, developers make assumptions and approximations about real-world usage during development and testing. Often these assumptions are made implicitly and unintentionally, without detailed analysis or documentation. Whether they hold during deployment can have a profound impact on the validity of any testing activity.

We address this challenge along multiple dimensions. First, we present a method to collect operational field data in deployed systems. The data is gathered from real executions with low overhead and therefore reflects actual usage, free of the assumptions and approximations made during development. Such field data can serve many different purposes. Second, we use the gathered field data to mine performance specifications in the form of performance models and performance assertions. The mined models can aid in comprehension, verification, monitoring, and performance regression testing. Third, we demonstrate a method to evaluate the quality of a test suite by comparing field executions with runs observed during testing. Our technique increases confidence in the test suite and the testing process, and its results can also guide test suite augmentation and aid comprehension.