Article ID: 477002
Journal: European Journal of Operational Research
Published Year: 2011
Pages: 8
File Type: PDF
Abstract

In this paper we investigate to what extent random search methods, equipped with an archive of bounded size that stores a limited number of solutions and other data, are able to obtain good Pareto front approximations. We propose and analyze two archiving schemes that maintain a sequence of solution sets of given cardinality which converges with probability one to an ϵ-Pareto set of a certain quality, under very mild assumptions on the process used to sample new solutions. The first algorithm uses a hierarchical grid to define a family of approximate dominance relations for comparing solutions and solution sets. Acceptance of a new solution is based on a potential function that counts the number of occupied boxes (on the various grid levels) and thus ensures strictly monotone progress towards a limit set that covers the Pareto front with non-overlapping boxes at the finest possible resolution. The second algorithm uses an adaptation scheme that modifies the current value of ϵ based on information gathered during the run. In this way it is possible to converge to the best (smallest) value of ϵ, and to a corresponding set of k solutions that ϵ-dominate all other solutions, which is arguably the best possible result regarding the limit behavior of random search methods or metaheuristics for obtaining Pareto front approximations.
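
To make the box-counting idea concrete, the following is a minimal Python sketch of a bounded archive driven by a multi-level grid potential. Everything in it (the names MultiLevelGridArchive, box_index, potential, offer; the assumption that objectives are pre-scaled to [0, 1); the dyadic choice of grid levels) is an illustrative assumption rather than the paper's actual MGA scheme; it only mirrors the acceptance rule described above, in which a candidate is kept only if it strictly increases the number of occupied boxes summed over the grid levels.

```python
# Hypothetical sketch of a grid-potential archive; names and grid choices
# are illustrative assumptions, not the authors' MGA algorithm.
from typing import List, Sequence, Tuple


def box_index(point: Sequence[float], level: int) -> Tuple[int, ...]:
    """Index of the grid box containing `point` at the given level.

    Assumes all objectives are scaled to [0, 1); level l uses boxes
    of side length 2**-l.
    """
    scale = 2 ** level
    return tuple(min(int(x * scale), scale - 1) for x in point)


def potential(points: List[Sequence[float]], levels: int) -> int:
    """Count the distinct boxes occupied by `points`, summed over all levels."""
    total = 0
    for level in range(1, levels + 1):
        total += len({box_index(p, level) for p in points})
    return total


class MultiLevelGridArchive:
    """Bounded archive that accepts a candidate only if swapping it in
    strictly increases the multi-level box-count potential."""

    def __init__(self, capacity: int, levels: int = 6):
        self.capacity = capacity
        self.levels = levels
        self.points: List[Sequence[float]] = []

    def offer(self, candidate: Sequence[float]) -> bool:
        # Fill the archive up to its capacity unconditionally.
        if len(self.points) < self.capacity:
            self.points.append(candidate)
            return True
        current = potential(self.points, self.levels)
        # Try replacing each archived point; keep the best strict improvement.
        best_gain, best_idx = 0, None
        for i in range(len(self.points)):
            trial = self.points[:i] + self.points[i + 1:] + [candidate]
            gain = potential(trial, self.levels) - current
            if gain > best_gain:
                best_gain, best_idx = gain, i
        if best_idx is not None:
            self.points[best_idx] = candidate
            return True
        return False
```

A random search loop would sample a new solution, evaluate its objective vector, and pass it to offer(); because a swap is accepted only when it strictly increases the potential, the box count is monotone over the run, which is the kind of monotonicity property the convergence argument sketched in the abstract relies on.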

► We address the problem of finding Pareto front approximations of given size.
► Two algorithms are proposed and their convergence properties analyzed.
► The first algorithm uses a new multi-level grid archiving (MGA) strategy.
► The second algorithm uses epsilon-adaptation to find an optimal k-subset.
► Both algorithms converge almost surely to a subset of the Pareto front.
