
Permeability is a key parameter in the characterization of any hydrocarbon reservoir; accurate solutions to many petroleum engineering problems are impossible without accurate permeability values. The conventional methods for permeability determination, core analysis and well testing, are expensive and time consuming. Therefore, artificial neural networks have often been used to identify the relationship between well log data and core permeability. Along these lines, recent work on artificial intelligence techniques has led to the introduction of a robust machine learning methodology called the support vector machine (SVM). This paper uses the SVM to predict the permeability of three gas wells in the Southern Pars field. The results show a correlation coefficient of 0.97 between core and predicted permeability for the testing dataset. Comparing the results of the SVM with those of a general regression neural network (GRNN) revealed that the SVM approach is faster and more accurate than the GRNN in predicting hydrocarbon reservoir permeability.

From the standpoint of reservoir engineering, reservoir management, and enhanced recovery design, permeability is the most important rock parameter affecting fluid flow in a reservoir, and knowledge of its spatial distribution throughout the reservoir is of utmost importance; indeed, the key parameter for reservoir characterization is the permeability distribution. In most reservoirs, however, permeability measurements are rare, so permeability must be predicted from the available data, which makes its accurate estimation a difficult task. Permeability is generally measured in the laboratory on cores taken from the reservoir or determined by well test techniques; both coring and well testing are very expensive and time consuming compared to wire-line logging techniques [

Alternatively, neural networks have been increasingly applied to predict reservoir properties using well log data [

The Iranian South Pars field is the northern extension of Qatar's giant North Field. It covers an area of 500 square miles and lies 3,000 m below the seabed at a water depth of 65 m. The Iranian side accounts for 10% of the world's and 60% of Iran's total gas reserves; Iran's portion of the field contains an estimated 436 trillion cubic feet. The field consists of two independent gas-bearing formations, Kangan (Triassic) and Dalan (Permian), each divided into two reservoir layers separated by impermeable barriers. The field is part of the N-trending Qatar Arch structural feature, bounded by the Zagros fold belt to the north and northeast. Gas accumulation in the field is mostly limited to the Permian-Triassic stratigraphic units. These units, known as the "Kangan-Dalan Formations," constitute very extensive natural gas reservoirs in the Persian Gulf area and consist of a carbonate-evaporite series also known as the Khuff Formation [

Geographical position of Southern Pars gas field.

The main objective of this study is to predict the permeability of the gas reservoirs by incorporating well logs and core data of three gas wells in the Southern Pars field. The well logs serve as the inputs, whereas horizontal permeability (

Independent component analysis (ICA) is a suitable method for the feature extraction process. Unlike PCA, this method both decorrelates the input signals and reduces higher-order statistical dependencies [

PC analysis is an orthogonal decomposition based on covariance matrix analysis and a Gaussian assumption, whereas IC analysis is based on the assumption of non-Gaussian independent sources;

PC analysis uses only second-order statistics, while IC analysis uses higher-order statistics. Higher-order statistics impose a stronger statistical assumption, revealing interesting features in typically non-Gaussian datasets [
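This contrast between the two decompositions can be sketched with scikit-learn; the synthetic mixing setup and variable names below are illustrative, not the paper's data.

```python
# Sketch contrasting PCA (second-order statistics, decorrelation) with
# ICA (higher-order statistics, statistical independence).
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)

# Two non-Gaussian (uniform) sources, linearly mixed as a stand-in
# for correlated well-log responses.
sources = rng.uniform(-1, 1, size=(1000, 2))
mixing = np.array([[1.0, 0.5], [0.5, 1.0]])
observed = sources @ mixing.T

# PCA: orthogonal decomposition of the covariance matrix;
# the components are merely uncorrelated.
pca_components = PCA(n_components=2).fit_transform(observed)

# ICA: exploits non-Gaussianity to recover statistically
# independent source components.
ica_components = FastICA(n_components=2, random_state=0).fit_transform(observed)

# Both outputs are (approximately) decorrelated; only ICA additionally
# targets independence of the recovered sources.
print(np.corrcoef(pca_components.T).round(2))
print(np.corrcoef(ica_components.T).round(2))
```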

Independent Component Analysis of the random vector

Mechanism of data transformation via ICA algorithm.

Notice that the nonlinear function

In pattern recognition, the SVM algorithm constructs nonlinear decision functions by training a classifier to perform a linear separation in a high-dimensional space that is nonlinearly related to the input space. To generalize the SVM algorithm to regression analysis, an analogue of the margin is constructed in the space of the target values (
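The epsilon-insensitive regression idea can be illustrated with scikit-learn's SVR; the 1-D synthetic data and parameter values below are placeholders, not the study's configuration.

```python
# Minimal sketch of epsilon-insensitive support vector regression.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(0, 5, size=(80, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=80)

# C is the capacity (regularization) parameter; epsilon is the width of the
# insensitive tube around the regression function. Points inside the tube
# incur no loss; points on or outside it become support vectors.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)

# Only a subset of the training points end up as support vectors.
print(len(model.support_), "support vectors out of", len(X))
```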

Concept of

By introducing Lagrange multipliers (

Polynomial, normalized polynomial, and radial basis function (Gaussian) kernels [

Kernel function | Type of classifier |
---|---|
K(x_i, x_j) = (x_i · x_j + 1)^d | Complete polynomial of degree d |
K(x_i, x_j) = ((x_i · x_j + 1) / √((x_i · x_i + 1)(x_j · x_j + 1)))^d | Normalized polynomial kernel of degree d |
K(x_i, x_j) = exp(−‖x_i − x_j‖² / (2σ²)) | Gaussian (RBF) kernel with parameter σ |
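These three standard kernels can be written out explicitly; the example vectors below are arbitrary.

```python
# The three kernels commonly used with SVM, written as plain functions.
import numpy as np

def polynomial_kernel(x, y, degree=2):
    """Complete polynomial kernel of the given degree."""
    return (np.dot(x, y) + 1.0) ** degree

def normalized_polynomial_kernel(x, y, degree=2):
    """Polynomial kernel normalized so that K(x, x) = 1."""
    num = polynomial_kernel(x, y, degree)
    return num / np.sqrt(polynomial_kernel(x, x, degree)
                         * polynomial_kernel(y, y, degree))

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel with width parameter sigma."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

x = np.array([1.0, 0.0])
y = np.array([0.5, 0.5])
print(polynomial_kernel(x, y))             # (0.5 + 1)^2 = 2.25
print(normalized_polynomial_kernel(x, y))  # 2.25 / sqrt(4 * 2.25) = 0.75
print(gaussian_kernel(x, y))
```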

Then, the nonlinear regression estimate takes the following form:

The general regression neural network (GRNN) was proposed by [

GRNN is a memory-based network that provides estimates of continuous variables and converges to the underlying regression surface. It is based on the estimation of probability density functions, features fast training times, and can model nonlinear functions. GRNN is a one-pass learning algorithm with a highly parallel structure, and it provides smooth transitions from one observed value to another even with sparse data in a multidimensional measurement space. The algorithm can be used for any regression problem in which an assumption of linearity is not justified.

GRNN can be thought of as a normalized radial basis function (RBF) network in which there is a hidden unit centred at every training case. These RBF units are usually probability density functions such as the Gaussian, and the only weights that need to be learned are the widths of the RBF units, called "smoothing parameters." The main drawback of GRNN is that it suffers badly from the curse of dimensionality: it cannot ignore irrelevant inputs without major modifications to the basic algorithm, so it is not likely to be the top choice if there are more than 5 or 6 nonredundant inputs. The regression of a dependent variable,

The method does not need to assume a specific functional form. A Euclidean distance

The estimate
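A GRNN estimator of this kind reduces to a normalized-RBF (distance-weighted) average of the training targets, with a single Gaussian smoothing parameter. The sketch below is a minimal illustrative implementation on synthetic data, not the network used in the study.

```python
# Minimal GRNN sketch: each training case is an RBF unit; the prediction
# is a Gaussian-weighted average of the training targets (one-pass, no
# iterative training; only the smoothing parameter sigma must be chosen).
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    # Squared Euclidean distances between every query and training point.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    # Gaussian RBF weights with smoothing parameter sigma.
    weights = np.exp(-d2 / (2.0 * sigma ** 2))
    # Normalized weighted average of the observed targets.
    return (weights @ y_train) / weights.sum(axis=1)

rng = np.random.default_rng(2)
X = rng.uniform(0, 3, size=(60, 1))
y = X.ravel() ** 2
pred = grnn_predict(X, y, np.array([[1.5]]), sigma=0.2)
print(pred)  # roughly 1.5**2 = 2.25
```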

As mentioned, ICA is a suitable method for extracting the most important and relevant features of a particular dataset. Hence, in this paper, we used this method to identify the well logs that have a good relationship with permeability. Table

Correlation matrix of the well logs and permeability after applying ICA.

 | | | | GR | DT | RHOB | NPHI | PEF | MSFL | LLD | LLS | KH |
---|---|---|---|---|---|---|---|---|---|---|---|---|
 | 1.000 | | | | | | | | | | | |
 | −.999 | 1.000 | | | | | | | | | | |
 | .655 | −.655 | 1.000 | | | | | | | | | |
GR | .419 | −.216 | .490 | 1.000 | | | | | | | | |
DT | .525 | .026 | .447 | .488 | 1.000 | | | | | | | |
RHOB | .611 | .010 | .597 | .582 | .915 | 1.000 | | | | | | |
NPHI | .710 | .107 | .440 | .567 | .746 | .615 | 1.000 | | | | | |
PEF | .125 | .619 | .161 | .210 | .150 | .176 | .009 | 1.000 | | | | |
MSFL | .772 | −.070 | .564 | .459 | .627 | .401 | −.318 | −.236 | 1.000 | | | |
LLD | .607 | −.510 | .506 | .541 | .553 | .469 | −.101 | .278 | .574 | 1.000 | | |
LLS | .577 | −.678 | .523 | .602 | .539 | .648 | −.203 | .215 | .635 | .767 | 1.000 | |
KH | .607 | .052 | .676 | .689 | .829 | .534 | .836 | .045 | .642 | .720 | .603 | 1.000 |

As seen in Table
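A correlation matrix like the one above can be produced directly from the log curves with pandas; the data frame below is random placeholder data, and only the column names follow the table.

```python
# Sketch of building a well-log correlation matrix and ranking the logs
# by their correlation with horizontal permeability (KH).
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
logs = pd.DataFrame(
    rng.normal(size=(200, 9)),
    columns=["GR", "DT", "RHOB", "NPHI", "PEF", "MSFL", "LLD", "LLS", "KH"],
)

# Pearson correlation matrix of all curves against each other.
corr = logs.corr().round(3)

# Logs ranked by correlation with permeability.
print(corr["KH"].sort_values(ascending=False))
```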

Rotated component matrix of the parameters.

Parameters | Component 1 | Component 2 |
---|---|---|
 | .971 | −.065 |
 | −.065 | .865 |
 | .768 | .129 |
GR | .653 | −.018 |
DT | .808 | .100 |
RHOB | .630 | −.357 |
NPHI | .826 | .092 |
PEF | .094 | .802 |
MSFL | .677 | −.218 |
LLD | .779 | −.014 |
LLS | .608 | −.132 |
KH | .816 | .074 |

Regarding Table

A graphical form of representation for showing the relationship of well logs and permeability.

As it is also shown in Figure

Like other multivariate statistical models, the performance of SVM for regression depends on the combination of several parameters: the capacity parameter

The optimal value for

Since a nonlinear SVM is applied in this study, a suitable kernel function must be selected. The results of previously published research [

In order to find the optimum values of two parameters (
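A parameter search of this kind can be sketched with scikit-learn's grid search under leave-one-out (LOO) cross-validation; the candidate grids, C value, and synthetic data below are illustrative assumptions, not the values used in the study (note scikit-learn parameterizes the RBF width as gamma = 1 / (2·sigma²) rather than sigma directly).

```python
# Sketch of an LOO cross-validated grid search over the RBF width and
# the epsilon tube of an SVR model.
import numpy as np
from sklearn.model_selection import GridSearchCV, LeaveOneOut
from sklearn.svm import SVR

rng = np.random.default_rng(4)
X = rng.uniform(0, 5, size=(40, 2))
y = X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.05, size=40)

param_grid = {
    "gamma": [0.01, 0.1, 1.0],    # RBF width: gamma = 1 / (2 * sigma**2)
    "epsilon": [0.01, 0.1, 0.5],  # width of the insensitive tube
}

# Each LOO fold trains on n-1 samples and tests on the held-out one;
# the pair minimizing the cross-validated RMSE is selected.
search = GridSearchCV(
    SVR(kernel="rbf", C=10.0),
    param_grid,
    cv=LeaveOneOut(),
    scoring="neg_root_mean_squared_error",
).fit(X, y)

print(search.best_params_)
```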

The detailed process of selecting the parameters and the effect of each parameter on the generalization performance of the corresponding model are shown in Figures

Sigma versus RMS error on LOO cross-validation.

Epsilon versus RMS error on LOO cross-validation.

In order to find an optimal

From the above discussion, the

Schematic diagram of optimum SVM for prediction of permeability.

To check the accuracy of the SVM in predicting permeability, its results are compared with those of the general regression neural network (GRNN). The GRNN constructed in this study was a multilayer neural network with one hidden radial basis function layer consisting of 49 neurons and an output layer containing a single neuron. Multiple layers of neurons with nonlinear transfer functions allow the network to learn both nonlinear and linear relationships between the input and output vectors.

In addition, the performance of GRNN depends mostly on the choice of the smoothing factor (SF), which is in a sense equivalent to the choice of the ANN structure. To manage this issue, the LOO cross-validation technique was used, and the optimal SF was found to be 0.23. Figure

SF versus RMS error on LOO cross-validation.

After building an optimum SVM based on the training dataset, the performance of the constructed SVM was evaluated in the testing process. Figure

Relationship between the measured and predicted permeability obtained by SVM (a); estimation capability of SVM (b).

As it is illustrated in Figure

As already mentioned, to check the accuracy of the SVM in the prediction of permeability, the results of the SVM are compared with those of the general regression neural network (GRNN). Figure

Relationship between the measured and predicted permeability obtained by GRNN (a); estimation capability of GRNN (b).

As it is shown in Figure

In this research work, we have demonstrated one application of artificial intelligence techniques in forecasting hydrocarbon reservoir permeability. First, independent component analysis (ICA) was used to determine the relationship between the well log data and permeability. In that section, we found that

Comparing the performance of the SVM and GRNN methods in the training and testing process.

Model | R (train) | R (test) | RMSE (train) | RMSE (test) |
---|---|---|---|---|
GRNN | 0.998 | 0.94 | 0.16 | 0.35 |
SVM | 0.998 | 0.96 | 0.16 | 0.28 |

According to this table, the RMS error of the SVM is smaller than that of the GRNN. In terms of running time, the SVM requires considerably less time for the prediction (3 seconds) than the GRNN (6 seconds). Together, these results establish the SVM as a robust algorithm for the prediction of permeability.
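The R and RMSE figures reported above can be computed as follows; the measured/predicted arrays here are small placeholders, not the study's data.

```python
# Correlation coefficient (R) and root-mean-square error (RMSE) between
# measured and predicted permeability values.
import numpy as np

def correlation_and_rmse(measured, predicted):
    r = np.corrcoef(measured, predicted)[0, 1]
    rmse = np.sqrt(np.mean((measured - predicted) ** 2))
    return r, rmse

measured = np.array([1.0, 2.0, 3.0, 4.0])
predicted = np.array([1.1, 1.9, 3.2, 3.8])
r, rmse = correlation_and_rmse(measured, predicted)
print(round(r, 3), round(rmse, 3))
```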

The support vector machine (SVM) is a novel machine learning methodology based on statistical learning theory (SLT). Its notable features include an optimization problem whose kernel requirements and convex nature yield a unique global optimum, high generalization performance, and immunity to convergence on locally optimal solutions. In this research work, we have compared the application of SVM with a GRNN model for predicting the permeability of three gas wells in the Kangan and Dalan reservoirs of the Southern Pars field, based on digital well log data. Although both methods are data-driven models, the SVM was found to run considerably faster with higher accuracy; it reduced the RMS error relative to the GRNN model (Table

In future work, we plan to test the trained SVM for predicting the permeability of other reservoirs in southern Iran. Meaningful results from other reservoirs using well log data would further demonstrate the ability of the SVM to predict petrophysical parameters, including permeability.

The authors thank the anonymous reviewers for their constructive comments and contributions to improving the paper.