Article code: 8124510
Journal code: 1522771
Publication year: 2018
English article: 34 pages, PDF
Full text: Free download
English title (ISI article)
Data-driven inverse modeling with a pre-trained neural network at heterogeneous channel reservoirs
Persian title translation
مدل‌سازی معکوس داده‌محور با شبکه عصبی پیش‌آموزش‌دیده در مخازن کانالی ناهمگن
Related subjects
Engineering and Basic Sciences › Earth and Planetary Sciences › Economic Geology
English abstract
This paper develops a reliable and efficient data-integration method based on artificial neural networks (ANN) combined with a stacked autoencoder (SAE) within a deep neural network framework. To handle static and dynamic data of different scales in heterogeneous channel reservoirs, the workflow couples an unsupervised pre-training process with ANN-based inverse modeling. The performance of the proposed neural network, i.e. its training efficiency, its predictability of future production rates, and its computing time, is compared to that of an optimal ANN, and the impact of the number of hidden neurons is discussed. The pre-trained neural network reliably estimates reservoir properties with the spatial characteristics of the true channel reservoir, whereas the ANN fails to capture the spatial heterogeneity. The pre-trained neural network decreases the mean absolute error of future oil production rates to 9.1%, less than 25% of the level of the comparison case, i.e. the optimal ANN model. Its efficiency is validated by a computing time 14 times shorter than that of the optimal ANN workflow. The pre-trained neural network evaluates the spatial characteristics of reservoir properties and facies models in a reasonable manner and is thus able to predict the water production rates and the breakthrough time accurately. This pre-trained neural network demonstrates robust applicability as an efficient method for integrating static and dynamic data.
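The workflow the abstract describes (unsupervised, greedy layer-wise pre-training of a stacked autoencoder, whose encoder weights then initialize the supervised inverse-modeling network) can be sketched in NumPy as follows. This is an illustrative sketch, not the authors' implementation: the layer sizes, learning rate, synthetic input data, and tied-weight design are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_autoencoder(X, n_hidden, lr=0.3, epochs=2000):
    """Train one tied-weight autoencoder layer by plain gradient descent.

    Minimizes the mean squared reconstruction error of X through a
    sigmoid encoder and a linear decoder sharing the same weights W.
    """
    n, d = X.shape
    W = rng.normal(0.0, 0.1, (d, n_hidden))
    b = np.zeros(n_hidden)  # encoder bias
    c = np.zeros(d)         # decoder bias
    for _ in range(epochs):
        H = sigmoid(X @ W + b)          # encode
        R = H @ W.T + c                 # decode (tied weights)
        err = (R - X) / n               # gradient of MSE/2 w.r.t. R
        dZ = (err @ W) * H * (1.0 - H)  # backprop to encoder pre-activation
        W -= lr * (X.T @ dZ + err.T @ H)
        b -= lr * dZ.sum(axis=0)
        c -= lr * err.sum(axis=0)
    return W, b, c

# Synthetic stand-in for the mixed static/dynamic reservoir inputs;
# normalization puts the scale-different measurements on equal footing.
X = rng.normal(size=(200, 12))
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Greedy layer-wise pre-training: each autoencoder learns to reconstruct
# the code produced by the layer below it, as in a stacked autoencoder.
W1, b1, c1 = train_autoencoder(X, 8)
H1 = sigmoid(X @ W1 + b1)
W2, b2, c2 = train_autoencoder(H1, 4)

# (W1, b1) and (W2, b2) would then initialize the hidden layers of the
# supervised network mapping observations to reservoir properties,
# followed by fine-tuning on the production data (not shown).
```

The design choice illustrated here is the one the abstract credits for the improvement: the unsupervised stage gives the deep network an initialization that already encodes the structure of the heterogeneous inputs, so the supervised inverse-modeling stage starts closer to a good solution than a randomly initialized ANN.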
Publisher
Database: Elsevier - ScienceDirect
Journal: Journal of Petroleum Science and Engineering - Volume 170, November 2018, Pages 785-796
Authors