Abstract: Performance regression is an important type of performance issue in software systems. It indicates that the performance of the same features in a new version of the system becomes worse than in previous versions, such as increased response time or higher resource utilization. To prevent performance regressions, current practices often rely on conducting extensive system performance testing and deciding, based on the testing results, whether to release the system into production. However, system performance testing demands a great deal of resources and time, making it challenging to adopt such approaches in fast-paced development and release practices, e.g., DevOps. This thesis focuses on addressing software performance regressions in DevOps without relying on expensive system performance tests. More specifically, I first propose a series of approaches to help developers detect performance regressions and locate their root causes by only utilizing the readily available operational data generated while the software system is running in the field and used by real end users. I then leverage small-scale performance testing and architectural modeling to estimate the impact of source code changes on the end-to-end performance of the system, in order to detect performance regressions early in the software development phase. Through various case studies on open-source projects and successful adoptions by our industrial research collaborator, we expect that our study will provide helpful insights for researchers and practitioners who are interested in addressing performance regressions in DevOps without expensive system performance testing.