Project Details

Model Transformation Performance Engineering

Subject Area Software Engineering and Programming Languages
Term from 2017 to 2021
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 358569332
 
Final Report Year 2022

Final Report Abstract

Model-Driven Engineering advocates using models and model transformations as primary artifacts in the development process. The models used, e.g., in the automotive domain or for the performance prediction of information systems, can become huge, leading to long and unpredictable execution times when model transformations are run. The starting point of the project was existing research as well as personal experience showing that model transformations can be refactored to improve their execution time. However, current model transformation languages and tools do not assist software engineers in this endeavour: the execution of a model transformation is a black box to the engineer, and such optimizations therefore require expert knowledge of how the transformation engine interacts with the transformation script.

The MICE project opens up this black box for the first time, enabling software engineers to understand how model transformations are executed and assisting them in improving their performance. We initially conducted an extensive empirical study to capture the current state of performance engineering for model transformations, i.e., how relevant performance is for transformation engineers, which impediments they face, and which strategies they use to improve performance. The study confirmed that performance is highly relevant and that dedicated support is needed. Based on these results, we developed the first performance engineering approach for model transformations. It supports (1) in-depth monitoring of performance characteristics, (2) analysis of the monitored data at different levels of granularity, and (3) visualization of the analysis results, thereby opening up the black box of model transformation execution. In addition, we developed a machine-learning-based performance prediction approach that enables developers to predict the likely performance of a model transformation from the size and structure of potential future input models. The approach has been applied to two different classes of model transformation languages: the operational model transformation language QVT-o and the declarative model transformation languages Henshin and ATL.

The quantitative and qualitative evaluation showed that the monitoring incurs only a small overhead and that engineers were able to understand the visualized analysis results, identify root causes of performance problems, and propose solutions for them.
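To illustrate the prediction idea, the following minimal sketch trains a regression model on simple size and structure metrics of input models and uses it to estimate the transformation runtime for a new model. The feature names, the example data, and the choice of a random-forest regressor are illustrative assumptions and not the project's actual implementation.

    # Illustrative sketch (hypothetical data and features), Python with scikit-learn:
    # predict transformation runtime from size/structure metrics of input models.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    # Each row describes one input model: [num_elements, num_references, containment_depth]
    X = np.array([
        [1200,  3400, 5],
        [5800, 16100, 7],
        [ 300,   750, 4],
        [9900, 31000, 9],
        [2100,  6200, 6],
        [ 450,  1100, 4],
        [7600, 22800, 8],
        [3300,  9800, 6],
    ])
    # Measured transformation runtimes in seconds (made-up values).
    y = np.array([0.8, 4.2, 0.2, 9.5, 1.5, 0.3, 6.1, 2.4])

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

    print("MAE on held-out models:", mean_absolute_error(y_test, model.predict(X_test)))
    print("Predicted runtime for a new model:", model.predict([[4500, 13000, 7]]))

In practice, such a predictor would be trained on runtimes collected by the monitoring infrastructure described above, with features derived from the metamodel and the structure of the input models.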

Publications

  • Monitoring the Execution of Declarative Model Transformations, 9th Symposium on Software Performance (SSP 2018), Hildesheim, p. 1–3, 2018
    Raffaela Groner, Sophie Gylstorff, and Matthias Tichy
  • Towards Performance Engineering of Model Transformation, Companion of the 2018 ACM/SPEC International Conference on Performance Engineering (ICPE 2018), Berlin, p. 33–36, 2018
    Raffaela Groner, Matthias Tichy, and Steffen Becker
    (See online at https://doi.org/10.1145/3185768.3186305)
  • User-Centered Performance Engineering of Model Transformations, ACM/IEEE 22nd International Conference on Model Driven Engineering Languages and Systems Companion (MODELS-C 2019), Munich, p. 635–641, 2019
    Raffaela Groner
    (See online at https://doi.org/10.1109/MODELS-C.2019.00097)
  • A Profiler for the Matching Process of Henshin, Proceedings of the 23rd ACM/IEEE International Conference on Model Driven Engineering Languages and Systems: Companion Proceedings (MODELS 2020), Virtual Event – Montreal, p. 1–5, 2020
    Raffaela Groner, Sophie Gylstorff, and Matthias Tichy
    (See online at https://doi.org/10.1145/3417990.3422000)
  • An Exploratory Study on Performance Engineering in Model Transformations, Proceedings of the 23rd ACM/IEEE International Conference on Model Driven Engineering Languages and Systems (MODELS 2020), Virtual Event – Montreal, p. 308–319, 2020
    Raffaela Groner, Luis Beaucamp, Matthias Tichy, and Steffen Becker
    (See online at https://doi.org/10.1145/3365438.3410950)
  • Extended Abstract of Performance Analysis and Prediction of Model Transformation, Companion of the ACM/SPEC International Conference on Performance Engineering (WOSP-C 2020), Virtual Event – Edmonton, p. 8–9, 2020
    Vijayshree Vijayshree, Markus Frank, and Steffen Becker
    (See online at https://doi.org/10.1145/3375555.3384935)
  • A Survey on the Relevance of the Performance of Model Transformations, Journal of Object Technology, p. 1–27, 2021
    Raffaela Groner, Katharina Juhnke, Stefan Götz, Matthias Tichy, Steffen Becker, Vijayshree Vijayshree, and Sebastian Frank
    (See online at https://doi.org/10.5381/jot.2021.20.2.a5)
 
 
