- A critical review of CIGRE and IEEE methodologies is presented.
- The quality of traditional methodologies to determine BFOR is assessed.
- The application of an elaborate computational approach (HEM-DE) is taken as reference.
- The CIGRE methodology leads to a BFOR 98% to 21% larger than the HEM-DE result.
- The IEEE methodology leads to a BFOR 47% to 23% lower than the HEM-DE result.
This paper presents a critical review of the traditional CIGRE and IEEE methodologies for calculating the backflashover outage rate (BFOR) of transmission lines, considering their adopted simplifications. These methodologies, along with an advanced computational approach named HEM-DE, were applied to a real 138-kV line to determine its backflashover rate. The quality of the results produced by each methodology at every step of its calculation procedure was assessed and discussed, taking as reference the accurate results provided by the advanced approach, which is based on the hybrid electromagnetic model (HEM) and the disruptive effect model (DE). According to this assessment, over the 40-to-10-Ω range of tower footing grounding resistance, the CIGRE methodology overestimates the outage rate of the line, yielding values 98% to 21% larger than the reference, while the IEEE methodology underestimates the BFOR, yielding rates 47% to 23% lower in the same resistance range.
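The reported deviations are signed percentage differences taken with respect to the HEM-DE reference. A minimal sketch of that comparison, using purely hypothetical BFOR values (the paper's actual rates for the 138-kV line are not reproduced here):

```python
# Illustrative only: hypothetical BFOR values (outages per 100 km per year)
# chosen to match the deviation magnitudes quoted in the abstract.
def percent_deviation(value, reference):
    """Signed deviation of `value` from `reference`, in percent."""
    return round(100.0 * (value - reference) / reference, 1)

bfor_hem_de = 1.0    # assumed reference result (hypothetical)
bfor_cigre = 1.98    # a value 98% above the reference
bfor_ieee = 0.53     # a value 47% below the reference

print(percent_deviation(bfor_cigre, bfor_hem_de))  # 98.0
print(percent_deviation(bfor_ieee, bfor_hem_de))   # -47.0
```

A positive deviation corresponds to the overestimation attributed to the CIGRE methodology, a negative one to the underestimation attributed to the IEEE methodology.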
Journal: Electric Power Systems Research - Volume 153, December 2017, Pages 60-65