Abstract: In model-driven engineering, metamodels are central artifacts used to capture domain concepts and build domain-specific languages. However, bad design decisions, continuous changes, and evolving requirements may introduce bad smells and deteriorate the quality of metamodels. Refactoring metamodels is a complex task, as it must balance many conflicting quality attributes while maximizing the removal of smells. In this paper, we propose a generic automated approach based on multi-objective heuristic search to refactor metamodels. The process generates a set of refactoring recommendations with various quality trade-offs, from which the modeler can choose the most appropriate one for her context. We evaluate the efficiency of our approach with a user-based experiment, measuring the time needed to perform understandability and extendibility tasks as well as the correctness of the task outputs. Our results show that, overall, considering trade-offs between quality and smell removal is significantly better than focusing on smell removal alone. The observed difference is statistically significant for extendibility but only partially so for understandability.