Article ID: 6885453
Journal: Journal of Systems and Software
Published Year: 2016
Pages: 38
File Type: PDF
Abstract
The scalability of model-related operations (e.g., model transformations) becomes an important issue when they are applied in industrial model-driven engineering. However, there is a lack of an automated performance testing framework for such operations, because existing frameworks for ordinary programs are ill-suited. Such a framework must provide facilities for creating and organizing test cases, as well as the ability to generate large test inputs automatically, because large-scale models are not widely available, which makes it hard to test the performance and coverage of these operations without bias. This paper proposes a performance testing framework for model-related operations, integrated with a random model generation algorithm. The framework, based on a test model, can be used to specify test cases and arrange them into test suites, and the model generation algorithm can generate a random model correctly and efficiently, according to the metamodel and user-defined constraints. Finally, we present two case studies, one experiment on randomness, and two experiments on generation efficiency to evaluate the framework and the algorithm. The results show that the framework is adequate to support performance testing of model-related operations, and that the algorithm is sufficiently random and efficient to generate test data for performance testing.
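The abstract does not specify how the generation algorithm works internally; as a rough illustration only, the following Python sketch shows one way a random model instance could be generated from a toy metamodel under a user-defined size constraint. The metamodel encoding, the names (METAMODEL, generate, count), and the budget logic are all assumptions for illustration, not the paper's actual algorithm.

import random

# Hypothetical toy "metamodel": each class maps reference names to
# (target class, min multiplicity, max multiplicity). This encoding is
# an assumption made for this sketch, not the paper's representation.
METAMODEL = {
    "Package":   {"classes": ("Class", 1, 3)},
    "Class":     {"attrs": ("Attribute", 0, 4)},
    "Attribute": {},
}

def count(node):
    """Total number of objects in a (sub)model."""
    return 1 + sum(count(c) for c in node["children"])

def generate(cls, size_budget):
    """Create a random instance of `cls`, spending at most `size_budget` objects."""
    node = {"type": cls, "children": []}
    budget = size_budget - 1  # this object consumes one unit of the budget
    for _ref, (target, lo, hi) in METAMODEL[cls].items():
        n = random.randint(lo, hi)  # random multiplicity within metamodel bounds
        for _ in range(n):
            if budget <= 0:         # stop when the size constraint is exhausted
                break
            child = generate(target, budget)
            node["children"].append(child)
            budget -= count(child)
    return node

model = generate("Package", size_budget=50)  # user-defined size constraint
print(count(model), "objects generated")

Each run yields a structurally different instance that still conforms to the multiplicity bounds, which is the kind of conformance-plus-randomness the abstract attributes to the proposed algorithm.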
Related Topics
Physical Sciences and Engineering › Computer Science › Computer Networks and Communications