Article ID Journal Published Year Pages File Type
6856144 Information Sciences 2018 43 Pages PDF
Abstract
Testing and performance comparison are essential for demonstrating that an optimization algorithm or method behaves correctly. Such tests rely on numerical and graphical presentations of the results produced by the proposed algorithms. To highlight the advantages and disadvantages of a proposed approach, a set of problems with known solutions and well-understood properties is needed; the behavior of an algorithm as it approaches the solution set can then be explained in terms of those known properties. Accordingly, researchers have proposed a collection of well-known benchmark problems, a portion of which is specifically designed for testing multi-objective optimization algorithms. Although these problems suffice for assessing the performance of optimization algorithms themselves, no problem set exists for investigating their distributed performance. Hence, a method for comparing the performance of distribution schemes for multi-objective optimization algorithms is needed. In this study, a set of new test problems, called hybrid problems, is defined by aligning two different well-known test functions for parallelization models. These novel problems are solved using the distributed models. Lastly, a set of approaches is proposed to improve the performance of any similar distributed model.
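The abstract does not specify how the two well-known test functions are aligned to form a hybrid problem. As a minimal sketch under that caveat, the example below assumes a hybrid that splits the decision vector between two standard multi-objective benchmarks (ZDT1 and Schaffer N.1) and concatenates their objective vectors; the names `zdt1`, `schaffer_n1`, and `hybrid` are illustrative, not the paper's definitions.

```python
import math

def zdt1(x):
    """ZDT1 bi-objective benchmark; expects x in [0, 1]^n with n >= 2."""
    f1 = x[0]
    g = 1 + 9 * sum(x[1:]) / (len(x) - 1)
    f2 = g * (1 - math.sqrt(f1 / g))
    return (f1, f2)

def schaffer_n1(x):
    """Schaffer N.1 bi-objective benchmark; expects a single scalar x."""
    return (x ** 2, (x - 2) ** 2)

def hybrid(x):
    """Hypothetical hybrid problem: the first half of the decision vector
    is evaluated with ZDT1, the first variable of the second half with
    Schaffer N.1, and the two objective vectors are concatenated."""
    mid = len(x) // 2
    return zdt1(x[:mid]) + schaffer_n1(x[mid])

# A four-objective hybrid evaluation of one candidate solution.
print(hybrid([0.5, 0.5, 0.5, 1.0]))
```

Because each constituent benchmark has a known Pareto front, a construction along these lines lets the distributed solver's behavior on each sub-problem be analyzed separately, which is the kind of property the abstract attributes to the proposed hybrid problems.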
Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence