Article ID: 4387205
Journal: Biological Conservation
Published Year: 2007
Pages: 10 Pages
File Type: PDF
Abstract

Systematic approaches to efficient reserve network design often make use of one of two types of site selection algorithm: linear programs or heuristic algorithms. Unlike linear programs, heuristic algorithms have been shown to yield suboptimal networks, selecting more sites than necessary to meet conservation goals or capturing fewer features than is possible. Although the degree of suboptimality is not known when using heuristics, some researchers have suggested that it is not significant in most cases and that heuristics are preferable because they are more flexible and can yield a solution more quickly. Using eight binary datasets, we demonstrate that suboptimality in both the number of sites selected and the number of biodiversity features protected can occur to varying degrees depending on the dataset, the model design, and the type of heuristic applied, and that processing time does not differ dramatically between optimal and heuristic algorithms. In choosing an algorithm, the degree of suboptimality may not always be as important to planners as the perception that optimal solvers have feasibility issues, and heuristic algorithms might therefore remain a popular tool for conservation planning. We conclude that, for many datasets, the feasibility of optimal algorithms should not be a concern, and that the value of heuristic results can be greatly improved by using optimal algorithms to determine their degree of suboptimality.
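The suboptimality described above is easy to reproduce on toy data. The sketch below is our own illustration, not the authors' datasets or software: the site and species labels are hypothetical, and the greedy "richness" rule stands in for a typical heuristic. It contrasts that heuristic with an exhaustive optimal search on a small binary site-by-species matrix, where the heuristic selects three sites although two suffice to represent every species.

```python
# Minimal sketch: greedy heuristic vs exact search for minimum-set reserve
# selection on a hypothetical binary presence/absence matrix.
from itertools import combinations

# Hypothetical data: site -> set of species present at that site.
sites = {
    "A": {1, 2, 3, 4},   # species-rich site that tempts the greedy rule
    "B": {1, 2, 5},
    "C": {3, 4, 6},
}
all_species = set().union(*sites.values())

def greedy_cover(sites, targets):
    """Repeatedly add the site that covers the most still-unrepresented species."""
    chosen, uncovered = [], set(targets)
    while uncovered:
        best = max(sites, key=lambda s: len(sites[s] & uncovered))
        if not sites[best] & uncovered:
            break  # remaining species occur at no site
        chosen.append(best)
        uncovered -= sites[best]
    return chosen

def optimal_cover(sites, targets):
    """Exhaustive search for the smallest site set representing every species."""
    names = list(sites)
    for k in range(1, len(names) + 1):
        for combo in combinations(names, k):
            if set().union(*(sites[s] for s in combo)) >= targets:
                return list(combo)
    return None

print("greedy :", greedy_cover(sites, all_species))   # ['A', 'B', 'C'] -> 3 sites
print("optimal:", optimal_cover(sites, all_species))  # ['B', 'C']      -> 2 sites
```

On realistic datasets the exact problem is an integer linear program solved with a dedicated solver rather than by enumeration, but the comparison above captures the gap the abstract quantifies: the heuristic's answer is feasible, just not guaranteed minimal.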

Related Topics
Life Sciences; Agricultural and Biological Sciences; Ecology, Evolution, Behavior and Systematics
Authors