Article ID: 5000979
Journal: Electric Power Systems Research
Published Year: 2017
Pages: 6 Pages
File Type: PDF
Abstract

• A critical review of CIGRE and IEEE methodologies is presented.
• The quality of traditional methodologies to determine BFOR is assessed.
• The application of an elaborate computational approach (HEM-DE) is taken as reference.
• The CIGRE methodology leads to a 98-21% larger BFOR in relation to the HEM-DE methodology.
• The IEEE methodology leads to a 47-23% lower BFOR in relation to the HEM-DE methodology.

This paper presents a critical review of the traditional CIGRE and IEEE methodologies for calculating the backflashover outage rate (BFOR) of transmission lines, considering the simplifications each adopts. These methodologies, along with an advanced computational approach named HEM-DE, were applied to a real 138-kV line to determine its backflashover rate. The quality of the results produced by each methodology at each step of its calculation procedure was assessed and discussed, taking as reference the accurate results provided by the advanced approach, which is based on the hybrid electromagnetic model (HEM) and the disruptive effect model (DE). According to this assessment, over the 40-to-10-Ω range of tower footing grounding resistance, the CIGRE methodology overestimates the outage rate of the line relative to the reference values, yielding rates 98-21% larger. The IEEE methodology underestimates the BFOR, yielding rates 47-23% lower over the same resistance range.
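The percentage figures above express each methodology's deviation from the HEM-DE reference as a relative difference. A minimal sketch of that calculation is shown below; the numeric values in the example are hypothetical placeholders, not results taken from the paper.

```python
# Sketch of the relative-deviation calculation behind the quoted percentages.
# All numeric inputs below are hypothetical, for illustration only.

def percent_deviation(bfor: float, bfor_reference: float) -> float:
    """Deviation of a BFOR estimate from a reference value, in percent.

    Positive values mean the methodology overestimates the reference
    (CIGRE-like behavior); negative values mean it underestimates
    (IEEE-like behavior).
    """
    return 100.0 * (bfor - bfor_reference) / bfor_reference

# Hypothetical example: with a reference BFOR of 1.0 outage/100 km/yr,
# an estimate of 1.98 corresponds to a +98% deviation and an estimate
# of 0.53 to a -47% deviation.
print(percent_deviation(1.98, 1.0))
print(percent_deviation(0.53, 1.0))
```

A signed relative difference is used so that over- and underestimation are distinguishable in one number, matching how the abstract reports "larger" and "lower" rates.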

Related Topics
Physical Sciences and Engineering Energy Energy Engineering and Power Technology
Authors