Empirical validation of a quality framework for evaluating modelling languages in MDE environments

Faber D. Giraldo, Angela J. Chicaiza*, Sergio España, Oscar Pastor

*Corresponding author for this work

    Research output: Contribution to journal › Article › Academic › peer-review

    Abstract

    In previous research, we proposed the multiple modelling quality evaluation framework (MMQEF), a method and tool for evaluating modelling languages in model-driven engineering (MDE) environments. Rather than being exclusive, MMQEF aims to complement other quality evaluation methods, such as SEQUAL. However, to date, MMQEF has not been validated beyond some proofs of concept. This paper evaluates the applicability of the MMQEF method in comparison with other existing methods. We performed an evaluation in which the subjects had to detect quality issues in modelling languages. A group of expert professionals and two experimental objects (i.e. two combinations of different modelling languages based on real industrial practices) were used. To analyse the results, we applied quantitative approaches, i.e. statistical tests on the performance measures and on the subjects' perceptions. We ran four replications of the experiment in Colombia between 2016 and 2019, with a total of 50 professionals. The results of the quantitative analysis show low performance for all of the methods, but a positive perception of MMQEF. Conclusions: Applying modelling language quality evaluation methods within MDE settings is indeed tricky, and subjects did not succeed in identifying all quality problems. This experiment paves the way for additional investigation of the trade-offs between the methods and of potential situational guidelines (i.e. circumstances under which each method is convenient). We encourage further inquiries into industrial applications to incrementally improve the method and tailor it to the needs of professionals working in real industrial environments.
    Original language: English
    Pages (from-to): 275-307
    Number of pages: 33
    Journal: Software Quality Journal
    Volume: 29
    Issue number: 2
    Early online date: 2021
    DOIs
    Publication status: Published - Jun 2021

    Bibliographical note

    Publisher Copyright:
    © 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.

    Keywords

    • Empirical evaluation
    • Model-driven engineering
    • Quality
    • Quality frameworks
    • The MMQEF method
