Article ID Journal Published Year Pages File Type
4981035 Process Safety and Environmental Protection 2017 24 Pages PDF
Abstract
Although the main objective of gas detection systems is risk reduction through the detection of high-risk scenarios, the majority of published placement procedures do not address risk quantitatively. To incorporate this concept, a risk-based methodology is proposed that consists of four key steps: input data, dispersion analysis, risk analysis, and optimization. In the first step, a set of release scenarios is defined and the required data, including release frequency, wind rose, and grid set, are provided. In the dispersion analysis step, the scenarios are simulated using a dispersion simulation tool, and the ability of each grid point to detect each scenario is stored in a binary matrix called the detection matrix (DI). In the risk analysis step, the risk of each scenario is calculated from four factors: release frequency, damage to personnel, asset loss, and probability of delayed ignition. Once the DI and the scenario risks are available, the last step determines the optimum placement using a risk-based objective function, defined as the sum of the risks of undetected scenarios. The optimization formulation, called maximum risk reduction, is solved by a greedy approach (referred to as dynamic programming), in which detector locations are determined iteratively: at each step, the grid point covering the maximum undetected risk is selected. The applicability of the methodology is demonstrated in a case study, and the results are compared with a coverage-based formulation.
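The greedy placement step described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function name `greedy_placement` and the representation of DI as a NumPy 0/1 matrix of shape (grid points × scenarios) are assumptions made here for clarity.

```python
import numpy as np

def greedy_placement(detection, risk, n_detectors):
    """Greedy detector placement sketch.

    detection : (n_points, n_scenarios) binary matrix (the DI matrix);
                entry [p, s] = 1 if a detector at grid point p sees scenario s.
    risk      : (n_scenarios,) risk of each scenario.

    At each iteration the grid point covering the maximum still-undetected
    risk is selected, matching the iterative procedure in the abstract.
    Returns the chosen point indices and the residual undetected risk
    (the risk-based objective: sum of risks of undetected scenarios).
    """
    undetected = np.ones(detection.shape[1], dtype=bool)
    chosen = []
    for _ in range(n_detectors):
        # Risk that each candidate point would newly cover.
        gain = detection @ (risk * undetected)
        best = int(np.argmax(gain))
        if gain[best] == 0:  # no remaining scenario can be detected
            break
        chosen.append(best)
        undetected &= ~detection[best].astype(bool)
    return chosen, float(risk[undetected].sum())
```

For example, with three grid points, three scenarios, and risks [5, 1, 3], two detectors suffice to drive the undetected risk to zero: the first pick covers the two highest-gain scenarios, the second covers the remainder.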
Related Topics
Physical Sciences and Engineering Chemical Engineering Chemical Health and Safety