
9th INTERNATIONAL FORUM ON RESERVOIR SIMULATION, December 9-13, 2007, Abu Dhabi, United Arab Emirates

Modern Techniques for History Matching


Ralf Schulze-Riegert Shawket Ghedan

Abstract

Traditional History Matching techniques applied to dynamic reservoir simulation models are cumbersome and time consuming. Typically these techniques modify a limited set of reservoir parameters within given uncertainty ranges. The underlying concepts deliver a single, non-unique history-matched simulation model without any verifiable uncertainty envelope. Modern History Matching techniques aim at generating a variety of validated models. These models should be capable of reproducing historical reservoir performance data and quantifying uncertainties related to reservoir performance predictions. This paper presents an introduction to concepts and trends in modern History Matching. Special attention is paid to stochastic optimisation techniques. Uncertainties in reservoir data are summarized and provide the motivation for introducing new workflows in probabilistic forecasting. A distributed computing framework is described which facilitates deploying algorithms for an increasing number of complex simulation problems that require solutions in a short time.

Keywords: Reservoir Simulation, History Matching, Uncertainties in Reservoir Data, Uncertainty Quantification, Optimisation, Evolutionary Algorithms, Distributed Computing


TABLE OF CONTENTS
1 INTRODUCTION
2 UNCERTAINTIES IN RESERVOIR DATA
  2.1 Reservoir Heterogeneity
  2.2 Data for Integrated Reservoir Studies
  2.3 Uncertainty in Reservoir Data
    2.3.1 Uncertainty in Geophysical Data
    2.3.2 Uncertainty of Geological Data
    2.3.3 Uncertainty of Dynamic Reservoir Data
    2.3.4 Uncertainty of Reservoir Fluids Data
  2.4 Management of Reservoir Uncertainties
    2.4.1 Uncertainty Assessment of Static Reservoir Parameters
    2.4.2 Uncertainty Assessment of Dynamic Reservoir Parameters
3 MODEL VALIDATION
  3.1 Qualification
  3.2 Optimisation Techniques
    3.2.1 Evolutionary Algorithms
    3.2.2 Multi-Objective Optimisation Techniques
    3.2.3 Optimisation on the Response Surface
4 WORKFLOW DESCRIPTION / EXAMPLES
  4.1 Reference Workflow and Best Practices
  4.2 Uncertainty Assessment using Experimental Design
  4.3 Reducing Uncertainty Ranges using EDM
  4.4 Example for Top-Down Reservoir Modelling Approach
  4.5 Production Data and Uncertainty Quantification
  4.6 What distinguishes modern from classical History Matching?
5 DISTRIBUTED COMPUTING IN RESERVOIR SIMULATION
6 REFERENCES


1 INTRODUCTION

The main objective of geo-science, and reservoir simulation in particular, is to optimise the value of a project, an asset, or a reservoir portfolio. In this context a paradigm shift from deterministic thinking to a philosophy that recognises subsurface uncertainties has greatly influenced recent workflow developments in the oil and gas industry. In reservoir management, increasing effort is being made to develop structured approaches for assessing the impact of uncertainties on investment decision-making. Documented approaches mostly build on simplified component models for each decision domain, e.g. G&G models, production scenarios, drilling models, processing facilities, economics and related costs. Because of its complexity, the integration of dynamic modelling is only gradually entering this domain of decision-making processes.

This trend has initiated a revision of the conventional deterministic bottom-up modelling paradigm, which implicitly assumed that more model precision would result in higher forecasting accuracy. New workflows in uncertainty assessment and probabilistic forecasting suggest a top-down approach. Within that framework the model precision should depend on the business decision to be made, and not the other way around.

Top-down or bottom-up, in reservoir simulation it is generally accepted that any model realistically predicting unknown future quantities should reproduce known history data. This requires a model validation process called History Matching, which is traditionally cumbersome and time consuming. Advances in computer technology and the availability of low-cost, high-performance computers have generated increasing interest in the application of automated optimisation algorithms to the problem of History Matching. This paper reviews selected recent developments in History Matching techniques.
Special focus is given to stochastic optimisation techniques, since these are easy to understand, cover a broad application area, and highlight the potential of emerging technologies in optimisation and uncertainty assessment. The number of published works in this field has increased significantly in the last decade. It is not within the scope of this paper to give a historical overview or to discuss advantages and limitations of alternative techniques; an extended reference list is added for this purpose. The scope of this paper is to discuss recent developments which have entered the industry arena and are now used to support business decision processes.

In the first chapter a review of uncertainties in reservoir simulation introduces the discussion of History Matching techniques. The following chapter gives an overview of selected optimisation techniques. Published workflow examples are summarized in order to document some real application scenarios. The closing chapter presents a technical focus on the computing framework which facilitates modern History Matching for industrial applications.


2 UNCERTAINTIES IN RESERVOIR DATA

This chapter is dedicated to a review of the uncertainties inherent in reservoir simulation models. It will start with a discussion of the subject of reservoir heterogeneity which determines the quantity and quality of data needed to characterize oil and gas reservoirs properly. The chapter will then cover the data needed for integrated reservoir studies and the uncertainties associated with these data. Finally, the chapter will present current industry practice in quantifying and assessing these uncertainties for the purpose of generating representative dynamic simulation models to be used for field production optimisation and maximizing oil and gas reserves.

2.1 Reservoir Heterogeneity


Reservoir heterogeneity is defined as the degree of variation in reservoir properties as a function of space and scale, from the pore level up to the field level. This means that the geological and petrophysical properties of the reservoir rock change with location within the reservoir. It is safe to state that all reservoirs are heterogeneous, varying only in their degree of heterogeneity. Describing a homogeneous reservoir is a very simple task, as measuring reservoir properties at any single location permits a full description of the reservoir. It is not so simple for heterogeneous reservoirs, where the reservoir properties vary as a function of spatial location. For a proper heterogeneous reservoir description we need to predict the variation of rock facies, porosity, permeability, saturation, and faults and fractures as a function of spatial location (Kelkar 2002).

Reservoir heterogeneity occurs at all scales, from pore-scale variation to major reservoir units within a field, and every scale in between. Scales of heterogeneity are important because different heterogeneities have a different impact on reservoir performance and oil recovery. Proper identification and knowledge of reservoir heterogeneities at various scales is necessary for optimum production performance. The scales of heterogeneity are defined at four levels of complexity as follows (Kelkar 2002).

Scale of Reservoir Heterogeneities

- Microscopic (Pore Level), scale 10-100 μm. Measurements: pore and throat distribution, grain size. Effect on performance: displacement efficiency (trapped oil).
- Macroscopic (Core Level), scale 1-100 cm. Measurements: permeability, porosity, wettability, saturation. Effect on performance: sweep efficiency (bypassed oil).
- Megascopic (Simulation Grid Level), scale 10-100 m. Measurements: log properties, residual oil, seismic. Effect on performance: sweep efficiency (bypassed oil).
- Gigascopic (Reservoir Level), scale >1000 m. Measurements: well test, geological description. Effect on performance: extraction efficiency (untrapped oil).

Table 1: The four levels of reservoir heterogeneity, the types of measurement, and their effect on reservoir performance.


Microscopic heterogeneities are measured at the micro level. Laboratory measurements physically investigated at the microscopic level include pore and pore throat size distributions, grain shape and size distributions, throat openings, rock lithology, packing arrangements, pore wall roughness, clay lining of pore throats, etc. The major controls on these parameters are the deposition of sediments and the subsequent processes of compaction, cementation, and dissolution. Due to micro-scale pore-level heterogeneities, displacing fluids may take preferential paths, leaving behind residual hydrocarbons. Poor displacement efficiency results in higher residual or trapped hydrocarbon volumes, which directly impacts the amount of oil recovered.

Macroscopic heterogeneity is measured at the level of laboratory cores. Laboratory measurements physically investigated at the macroscopic level include porosity, permeability, fluid saturation, capillary pressure, and rock wettability. Macroscopic-scale rock and fluid properties are employed to calibrate logs and well tests for input into reservoir simulation models. The macro-scale heterogeneities define the shape of the flood front of the displacing fluids, which in turn determines the amount of bypassed oil.

Megascopic heterogeneity represents flow units, and is usually investigated through reservoir simulation. Examples of megascopic heterogeneities include lateral discontinuity of individual strata, porosity pinch-outs, reservoir fluid contacts, vertical and lateral permeability trends, and reservoir compartmentalization. Reservoirs are engineered and managed at this scale of interwell spacing. These heterogeneities are commonly inferred from transient pressure well test analysis, tracer tests, well log correlations, and high-resolution seismic. Megascopic heterogeneity determines the well-to-well recovery variation which can result from the primary stratification of reservoir units and internal permeability trends.
One of the most important heterogeneities in reservoir engineering calculations is stratification. Many reservoirs contain layers of productive rock that can be either communicating or non-communicating. These layers can vary considerably in permeability and thickness. A good description of the layers and their respective properties is critical in planning many EOR operations. Since the mega-scale heterogeneities may be an extension of the macro-scale heterogeneities, the mega-scale heterogeneities have the same effect as the macro-scale heterogeneities, but on a larger reservoir scale. Areal heterogeneities affect areal sweep efficiency, while vertical heterogeneities affect vertical sweep efficiency. Macro- and mega-scale heterogeneities may result in variations in the areal and vertical reservoir properties which may cause the displacing fluids to reach the producing well without reaching all parts of the reservoir. This may leave behind large quantities of bypassed oil.

The whole field (depositional basin) is encompassed in the gigascopic scale of heterogeneities. Reservoirs are explored for, discovered, and delineated at this level. Characterization at this level begins from inter-well spacing and extends up to the field dimensions. Field-wide regional variation in reservoir architecture is caused by either the original depositional settings or subsequent structural deformation and modification due to tectonic activity. Lack of understanding of


reservoir heterogeneities at the gigascopic scale means that some of the oil resource remains uncontacted.

It is well established that understanding and capturing reservoir heterogeneity in our reservoir characterization, and in turn in our reservoir simulation models, helps us devise the proper field development to optimise production performance and maximize the hydrocarbon reserves from oil and gas reservoirs. This depends, however, on the availability of reservoir data, as the availability of reservoir data determines our understanding of reservoir heterogeneity, since:

- Well log and core data help to define facies and assess diagenesis, thus defining factors affecting reservoir quality and, in turn, oil displacement efficiency.
- Well test and production data help to define dynamic conditions in a reservoir and assess the contrast between effective and core-measured permeabilities and, in turn, the connectivity of the permeable and impermeable zones, which has a direct impact on the reservoir areal and vertical sweep efficiencies.
- Seismic data are required at larger scales to define reservoir faults and assess the continuity of major reservoir stratigraphic units, which again has a direct impact on the reservoir areal and vertical sweep efficiencies.

2.2 Data for Integrated Reservoir Studies


Integrated reservoir modelling studies aim at generating the most accurate description of the reservoir, leading to the most accurate estimation of reserves and production profiles for a given recovery mechanism and a given development scheme.

Components and Main Tasks of an Integrated Reservoir Study

- Reservoir Characterization: Regional Geological Model; Reservoir Depositional Model; Reservoir Pore System Model; Reservoir Rock Type Scheme.
- Static Geological Model: Reservoir Structural Model; Reservoir Facies Model; Reservoir Rock Typing Model; Reservoir Flow Units Model; Reservoir Property Model; Defining Fluids Contacts.
- Dynamic Simulation Model: Upscaling of the Static Geological Model; Design of Wells/Facilities; Initial Reservoir Fluids Distribution; Near Wellbore Performance; Validation of the Simulation Model; Optimisation of Development Plans; Economic Model & Risk Analysis.

Table 2: Integrated reservoir studies are performed in three main stages.

Integrated reservoir studies combine several types of data, including geological, seismic, petrophysical, well, and production data. Some of these data are considered to be static and some to be dynamic. Static data provide information on the reservoir framework and fluid saturation at well positions, but no information on the way the fluids will move during production. Dynamic reservoir data, on the other hand, provide information on fluid movement within the reservoir. Reservoir data are either point-source or volume-source. Point-source data are volumetrically insignificant, while volume-source data are volumetrically significant. Most reservoir static data are considered to be point-source data, while some dynamic data, such as production and well testing data as well as field and well performance data, are considered to be volume-source data. These dynamic data are considered to be highly diffuse and interpretive, as a great deal of averaging goes on during their acquisition. Reservoir data may be determined from core plugs, whole cores, well logging, well testing, borehole geophysics, outcrop, and 3D-seismic data.

Source of Static and Dynamic Reservoir Data
Reservoir Structural Data
- Structure: geophysical data; well tops data; well log data.
- Thickness: well logging data; 3D seismic data; geology and geostatistics.
- Reservoir Geometry: 3D seismic data; facies analysis; well correlation.
- Fluid Contacts: well logging data; well testing and pressure data; seismic data.

Reservoir Geological Data
- Facies: geophysical data; core data; well log data.
- Grain Size Distribution: thin section analysis; NMR; X-ray, SEM and CAT scan.
- PTS Distribution: thin section analysis; special core analysis; well log data.

Reservoir Rock Properties
- Fluid composition: core data.
- Fluid PVT properties: core data; well log data; well test data.
- Fluid viscosity: core data; well test data.
- Log-derived permeability.
- Fluids IFT data: core data; well log data; well test data.
- Pore compressibility: special core analysis; correlation; field data.

Reservoir Fluid Properties
- Fluid Composition: PVT samples; production testing.
- Fluid PVT Properties: PVT experiments; correlations; equation of state.
- Fluid Viscosity: lab experiments; correlations.
- Fluids IFT Data: lab experiments; correlation.

Rock-Fluid Properties
- Fluid Saturation: core data; OH and CH logs; well test data.
- Wettability: special core analysis.
- Capillary Pressure: special core analysis; well log data.
- Relative Permeability: special core analysis; well test data.

Table 3: Required simulation data and their sources.

2.3 Uncertainty in Reservoir Data


Except for outcrop and 2-D and 3-D seismic data, most reservoir data are determined at the reservoir well points. Even for fully developed and mature fields these well points account for less than 1% of the reservoir volume. Combined with the fact that most reservoirs are highly heterogeneous, this makes the reservoir data, even in the best case, highly uncertain, especially at reservoir locations between wells. Uncertainty is defined as lack of assurance about the truth of a statement or about the exact magnitude of an unknown measurement or number (Ballin 1993, Olea 1991). The degree of uncertainty may vary from one variable to another. The uncertainty of a parameter may result from difficulty in directly and accurately measuring the quantity. This is particularly true of the physical reservoir parameters which, at best, can only be sampled at various points, and


are subject to errors caused by the presence of the borehole and borehole fluid, or by changes that occur during the transfer of rock and its fluids to laboratory temperature and pressure conditions (Walstrom 1967). Accordingly, a large number of uncertainties can be identified in the integrated reservoir modelling process. These uncertainties can be classified into five groups:

- Uncertainty in Geophysical Data: affects the reservoir envelope and its fault system.
- Uncertainty in Geological Data: results in uncertainties in the reservoir sedimentary and petrophysical model that influence the hydrocarbon volume in place and the dynamics of fluid flow.
- Uncertainty in Dynamic Data: has a substantial impact on the determination of reserves and production profiles.
- Uncertainty in Reservoir Fluids Data: has a substantial impact on the optimisation of the processing capacities of oil and gas.
- Uncertainty in Reservoir Performance Data: affects the tolerance needed for the model validation process.

2.3.1 Uncertainty in Geophysical Data

In the geophysical domain there are uncertainties linked to the acquisition, processing, and interpretation of seismic data (Vincent 1999, Corre 2000). Some of these uncertainties are:

- Uncertainties and errors in picking.
- Differences between several interpretations.
- Uncertainties and errors in depth conversion.
- Uncertainties in the seismic-to-well tie.
- Uncertainties in pre-processing and migration.
- Uncertainties in the amplitude map of the top reservoir.

2.3.2 Uncertainty of Geological Data

In the geological domain, the uncertainties are linked to the geological scheme, the sedimentary concept, the nature of the reservoir rocks, their extent, and their properties, that is, their heterogeneities (Corre 2000).
Ignoring the uncertainties in the reservoir lithology would lead to underestimating the complexity of the reservoir between wells, resulting in over- or under-estimating the connected reservoir pore volumes. Uncertainty in any geological model is unavoidable given the sparse well data and the difficulty in accurately relating geophysical measurements to reservoir-scale heterogeneities. Some of the known geological uncertainties are:

- Uncertainties in gross rock volume.
- Uncertainties in the extension and orientation of sedimentary bodies.

Modern Techniques for History Matching

R. Schulze-Riegert / S. Ghedan

- Uncertainties in the distribution, shape, and limits of reservoir rock types.
- Uncertainties in the porosity values and their distribution.
- Uncertainties in the horizontal permeability values and their distribution.
- Uncertainties in the layers' Net-to-Gross Ratio.
- Uncertainties in the reservoir fluids contacts.

2.3.3 Uncertainty of Dynamic Reservoir Data

In the production area, there are uncertainties linked to any parameter that affects flow within the reservoir, such as absolute and relative permeability, fault transmissibilities, horizontal barriers, thermodynamics, injectivity and productivity indexes, well skin, and so on (Corre 2000). Some of these uncertainties are:

- Horizontal permeability.
- Permeability anisotropy.
- Vertical-to-horizontal permeability ratio. Due to limited core data, the associated uncertainty in Kv/Kh is frequently quite large (Verbruggen 2002).
- Water/oil and gas/oil relative permeability parameters.
- Capillary pressure curves.
- Aquifer behaviour.
- Fault sealing/transmissibility.
- Extension of horizontal and vertical barriers. RFT, TDT/RST, and production data have to be used to highlight areas of the reservoir where vertical flow barriers are present. These areas are then studied more closely in cores and logs for further evidence of low-Kv units.
- Well productivity and injectivity indexes.
- Well skin.

2.3.4 Uncertainty of Reservoir Fluids Data

Uncertainties in the description of reservoir fluid composition and properties contribute to the total uncertainty in the reservoir description, and are of special importance for the optimisation of the processing capacities of oil and gas, as well as for planning the transport and marketing of the products from the field (Meisingset 1999). Some of the reservoir fluids uncertainties are:

- Uncertainty in reservoir fluid samples: lack of representative samples from the reservoir is considered one of the dominant uncertainty factors in characterizing reservoir fluids. It has a direct impact on the evaluation of EOR methods in reservoir development.
- Uncertainty in reservoir samples from different reservoir zones: possible variations in fluid properties in different parts of the field may introduce uncertainty in the reservoir fluids description.


- Uncertainty in the compositional analyses.
- Uncertainty in volumetric measurements in the PVT laboratory: this is considered to be of less importance compared with the representativity of the reservoir fluid samples (Meisingset 1999).
- Uncertainties in the reservoir fluids' interfacial tension: this may be of importance due to its effect on the capillary pressure, and/or compositional effects such as revaporization of oil into injection gas (Meisingset 1999).

2.4 Management of Reservoir Uncertainties


The decision to sanction an oil or gas field development should be taken with consideration of the possible upside and downside outcomes (Smith 1993). Apart from economic factors such as oil price and capital cost, the critical factors affecting the economics of an oil or gas field are reservoir uncertainties. These directly affect reservoir recovery and ultimate reserves estimates. Reservoir recovery and performance estimates depend on many reservoir parameters, e.g.:

- Reservoir structural architecture
- Reservoir stratigraphic architecture
- Reservoir permeability architecture
- Reservoir rock properties
- Reservoir fluid properties
- Reservoir rock-fluid properties
- Reservoir drive mechanisms
- Spacing/orientation of producing and injecting wells

Estimation of recovery uncertainty is a complicated process and requires an understanding of the uncertainty both of the reservoir's static architecture and of its dynamic behaviour during production (Friedmann 2003). To progress from the discovery of an oil or gas reservoir to an economically viable project means to assess and reduce reservoir uncertainties, i.e.:

- Reduce reservoir uncertainties: this can be achieved by carefully planning a timely data acquisition campaign. Efforts should be made to guarantee that the data are acquired following best-practice procedures in the measurement and interpretation of experimental and/or in-situ measured data.
- Assess reservoir uncertainties: this allows asset teams to analyse the risk associated with development projects, improve reservoir appraisals, and manage developments more efficiently.

A single representation of the reservoir is often biased towards a special focus on a small number of uncertainty parameters. Consequently, the underlying concept of the uncertainty assessment process is consideration of multiple representations of the reservoir which reflect uncertainties in static and dynamic reservoir data.
This includes the assessment of different reservoir characteristics:


- Distribution of reservoir rock volume. This is achieved by quantifying uncertainties linked to reservoir faults and structural horizons. These uncertainties are modelled by generating variable structural scenarios or reservoir envelopes.
- Reservoir geological, sedimentological, and petrophysical uncertainties. This is achieved by generating variable realizations of the reservoir facies, rock types, and petrophysical rock properties in the reservoir areas between the wells.
- Reservoir dynamic uncertainties. This is achieved by model validation techniques for accurately quantifying the movement of fluids in the reservoir. This process is further discussed in the next section.

2.4.1 Uncertainty Assessment of Static Reservoir Parameters

Static geological modelling based on the results of the reservoir characterization process, including geological and geophysical data, may establish one or multiple detailed geological models. This includes uncertainties in reservoir geometry and heterogeneities that have a major impact on reservoir bulk and pore volume, as well as on fluid flow in these reservoirs. Geostatistical algorithms provide different ways of defining the uncertainties of estimated reservoir parameters at unsampled locations or reservoir areas between the wells. The two largest static uncertainties, the reservoir geometry as defined by reservoir faults and horizons and the petrophysical property distributions, may be assessed and quantified by geostatistical methods (da Cruz 2004). One of the most important aspects of geological reservoir modelling is the connectivity of permeable and impermeable zones. Geostatistical methods provide means for incorporating connectivity through indicator variograms, and help in describing the probability of lithologies at different locations resulting in different connectivity (Srivastava 1990).

2.4.2 Uncertainty Assessment of Dynamic Reservoir Parameters

After building the most representative geological model, a number of geostatistical realizations are generated by sampling from uncertainty distributions in geophysical and geological parameters. These multiple geostatistical realizations can be ranked to reduce the number of models for further processing through a flow simulator used for decision-making. A representative combination of these geostatistical realizations, along with reservoir rock and fluid flow parameters, reservoir schedule data on injectors and producers, and pressure and production performance, constitutes the base-case dynamic simulation models (Friedmann 2003).
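The sampling-and-ranking step described above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the uncertainty distributions, the two static multipliers, and the scoring proxy are all made up for the example; a real workflow would rank full geostatistical realizations by a flow-relevant measure.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20  # number of geostatistical realizations (illustrative)

# Hypothetical uncertainty distributions for two static parameters.
realizations = {
    "porosity_mult": rng.uniform(0.8, 1.2, n),
    "perm_mult": rng.lognormal(mean=0.0, sigma=0.3, size=n),
}

# Stand-in ranking measure (e.g. a connected-pore-volume proxy).
score = realizations["porosity_mult"] * realizations["perm_mult"]

# Rank realizations and keep, e.g., the P10/P50/P90 cases for flow simulation.
order = np.argsort(score)
picks = [int(order[int(q * (n - 1))]) for q in (0.1, 0.5, 0.9)]
print("realizations selected for flow simulation:", picks)
```

Ranking by a cheap proxy before full flow simulation is what makes the reduction from many realizations to a handful of simulation models tractable.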
Traditionally, a one-parameter-at-a-time approach is used as a sensitivity analysis to assess the effects of uncertainties in various dynamic reservoir parameters on simulation responses. The dynamic simulation model is validated by matching production history data of existing wells to simulation data by varying selected dynamic reservoir parameters such as aquifer pore volume and connectivity, relative permeability data, fault connectivity, or Kv/Kh data. Typically, one or two parameters are varied at a time (Vincent 1999). This


process is tedious, and for large fields it becomes close to impossible to investigate relationships between the model responses and variations of different reservoir input parameters. Uncertainty in the production forecast results from interactions of uncertainties in all sources of data, as well as from model assumptions in the numerical flow model. A procedure transferring uncertainties through the numerical flow model up to the production forecasts is highly desirable (Ballin 1993). Documented approaches mostly build on simplified component models using proxy modelling (Begg 2001). The following chapter focuses on model validation and prepares the ground for a probabilistic forecasting process using dynamic reservoir simulation.
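The one-parameter-at-a-time procedure described above can be sketched as follows. The parameter names, ranges, and the quadratic `simulate` proxy are assumptions standing in for a real reservoir flow simulator and its mismatch evaluation:

```python
# Base-case dynamic parameters (illustrative values only).
base_case = {"kv_kh": 0.1, "aquifer_pv": 1.0e9, "fault_trans": 0.5}

def simulate(params):
    """Stand-in for a flow simulator run; returns a history-match mismatch."""
    true = {"kv_kh": 0.3, "aquifer_pv": 1.5e9, "fault_trans": 0.2}  # assumed
    return sum(((params[k] - true[k]) / true[k]) ** 2 for k in params)

def oat_sensitivity(base, spans):
    """Vary one parameter at a time over (low, high); all others stay at base."""
    results = {}
    for name, (low, high) in spans.items():
        row = []
        for value in (low, base[name], high):
            case = dict(base)   # copy the base case
            case[name] = value  # perturb exactly one parameter
            row.append((value, simulate(case)))
        results[name] = row
    return results

spans = {"kv_kh": (0.01, 1.0), "aquifer_pv": (5.0e8, 3.0e9)}
results = oat_sensitivity(base_case, spans)
for name, row in results.items():
    print(name, row)
```

The nested loop makes the scaling problem visible: the number of runs grows with the number of parameters times the number of levels, and parameter interactions are never sampled, which is exactly the limitation the text raises for large fields.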


3 MODEL VALIDATION

This chapter starts with a discussion of the objective function definition, which qualifies the reservoir model against the production data to be matched. Discussions in this section mainly focus on stochastic optimisation techniques, to develop a basis for describing the underlying principles of modern workflows in History Matching. Experimental Design and Response Surface modelling techniques, as input to an optimisation workflow, are covered briefly.

3.1 Qualification
A typical initial verification of a reservoir simulation model is the reproduction of initial RFT pressures. The plot below shows an RFT plot, i.e., pressure vs. depth. The black squares indicate the measured pressure points and the blue solid line gives the simulation data. The mismatch is obvious: in zones 1 and 3 the simulated pressure is too low, while in zones 2 and 4 the simulated pressure is too high compared with the measured pressure data. Different parameter adjustments may be applied to history match the case, e.g., decrease or increase the permeability in different zones, increase or decrease the z-transmissibility at zone boundaries, or adjust pore volumes by zone until the match is achieved.
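The zone-by-zone diagnosis described above can be expressed as a simple per-zone pressure bias. All zone boundaries, depths, and pressures below are made-up illustrative numbers, not data from the paper:

```python
# Zone boundaries (top, bottom) in depth units; illustrative values.
zones = {1: (5000, 5150), 2: (5150, 5300), 3: (5300, 5450), 4: (5450, 5600)}
observed = [(5050, 2200), (5200, 2300), (5350, 2450), (5500, 2600)]   # (depth, p_obs)
simulated = [(5050, 2150), (5200, 2350), (5350, 2400), (5500, 2650)]  # (depth, p_calc)

def zone_bias(zones, observed, simulated):
    """Mean (simulated - observed) pressure per zone; the sign hints at the fix:
    negative -> simulated pressure too low, positive -> too high."""
    bias = {}
    for z, (top, bottom) in zones.items():
        diffs = [p_calc - p_obs
                 for (depth, p_obs), (_, p_calc) in zip(observed, simulated)
                 if top <= depth < bottom]
        bias[z] = sum(diffs) / len(diffs) if diffs else None
    return bias

print(zone_bias(zones, observed, simulated))
```

With these illustrative numbers the bias is negative in zones 1 and 3 and positive in zones 2 and 4, reproducing the mismatch pattern described in the text; such a per-zone indicator is one way to turn a visual plot inspection into a numerical input for an optimisation scheme.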
Figure 1: RFT measurement data (black squares) and simulation data (line).

In manual History Matching the plot gives an immediate indication of the mismatch, and possible actions can be worked out. Any numerical optimisation scheme requires an indicator, i.e., one or more numerical values indicating a mismatch. This indicator is typically captured in the objective function definition given below. The difference between the observed values y_obs and calculated values y_calc defines the quality of the History Match, and is described by the objective function Q_Likelihood. In addition, prior information may be available from logs, cores, seismic, or geological interpretations. This information is included in the prior objective function Q_Prior, and is typically used as a constraint in the History

Matching process. Under the assumption of a quadratic error model (Tarantola 1996), both contributions are described by a least square formula and are simply related. (3.1) Q = (1 )Q Likelihood + Q Prior From a practical point of view, an additional weight factor is included to prioritise either contribution in a global objective function definition. The objective function most often used in the literature is defined by
Q_Likelihood = Σ_i ω_i (y_i^calc − y_i^obs)² / σ_i²    (3.2)

y_i^calc(x) are the calculated values and depend on the model parameters x_n. Observed values y_i^obs are taken from the production history, while σ_i is the standard deviation of the observed values. Measurement errors are assumed to be independent. In an alternative formulation a full covariance matrix may be included to connect measurement data in time and space (Aanonsen 2002, Schulze-Riegert 2003). The summation index i refers in general to all measurement data types, e.g. well, field or region data, measurement values and time steps. In the case of RFT pressure data, the objective contribution may relate to well and pressure data as a function of depth at a particular point in time. In order to prioritise the importance of the observed values, a weighting and normalization factor ω_i is included.
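As a minimal sketch, the weighted least-squares mismatch of Equation 3.2 can be evaluated as follows. The function name and the pressure values are purely illustrative, not part of the original workflow:

```python
import numpy as np

def q_likelihood(y_calc, y_obs, sigma, weights=None):
    """Weighted least-squares mismatch of Equation 3.2:
    sum_i w_i * (y_i_calc - y_i_obs)^2 / sigma_i^2."""
    y_calc, y_obs, sigma = (np.asarray(a, dtype=float)
                            for a in (y_calc, y_obs, sigma))
    w = np.ones_like(y_obs) if weights is None else np.asarray(weights, dtype=float)
    return float(np.sum(w * (y_calc - y_obs) ** 2 / sigma ** 2))

# Hypothetical example: four RFT pressures with a measurement error of 5
q = q_likelihood(y_calc=[2110.0, 2225.0, 2340.0, 2455.0],
                 y_obs=[2100.0, 2230.0, 2330.0, 2460.0],
                 sigma=[5.0, 5.0, 5.0, 5.0])
```

A perfect match gives Q_Likelihood = 0; larger values indicate a poorer match, in line with the minimisation objective.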

Figure 2: Contributions to the objective function

The global objective function value as well as individual contributions become an important measure for analysing the quality of a History Match. The figure above shows the individual contributions (M1, M2, M3) to the global objective value (Global) for 40 different simulation runs (left-hand plot). Each dot represents an objective contribution, and is related to the full simulation run according to Equation 3.2. The same database is used for the cross plot (right-hand side), where individual contributions are plotted as a function of the global objective value. This view of the data shows that mismatch contributions M1


and M3 are correlated with the global objective function value, whereas mismatch contribution M2 becomes worse as the global value improves. In the case of the RFT pressure measurement data from the above example, the objective contribution could be separated into contributions from different zones. This makes it possible to analyse possible correlations among objective contributions, and gives valuable information on the effect of parameter dependencies. The prior objective function is independent of the dynamic data, and is given by
Q_Prior = Σ_{n,m} (x_n − x̄_n) (C_x^prior)⁻¹_{nm} (x_m − x̄_m)    (3.3)

The prior objective function includes additional information on the expected average values x̄_n and the correlation matrix C_x^prior. The summation indices n, m run over all model parameters. For a detailed discussion of including prior information cf. Brun 2001.
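The prior term of Equation 3.3 and the combination of Equation 3.1 can be sketched in a few lines. Function names and the two-parameter example (permeability and porosity multipliers with an assumed diagonal prior covariance) are illustrative only:

```python
import numpy as np

def q_prior(x, x_mean, c_prior):
    """Prior mismatch of Equation 3.3: (x - x_mean)^T (C_prior)^-1 (x - x_mean)."""
    d = np.asarray(x, dtype=float) - np.asarray(x_mean, dtype=float)
    return float(d @ np.linalg.inv(np.asarray(c_prior, dtype=float)) @ d)

def q_global(q_like, q_pri, omega=0.5):
    """Global objective of Equation 3.1 with weight factor omega."""
    return (1.0 - omega) * q_like + omega * q_pri

# Hypothetical: two multipliers around a prior mean of 1.0, uncorrelated prior
q_p = q_prior([1.2, 0.9], [1.0, 1.0], [[0.04, 0.0], [0.0, 0.01]])
```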

3.2 Optimisation Techniques


Optimisation methods applied in reservoir model simulation, and History Matching projects in particular, are manifold. Commercial and proprietary optimisation frameworks and techniques used for real full-field reservoir simulation studies include the following most common techniques:

- Evolutionary Algorithms (Soleng 1999, Romero 2000, Schulze-Riegert 2001, Williams 2004)
- Gradient techniques (Anterion 1989, Bissell 1994, Lepine 1998, Roggero 1998)
- Response surface modelling and optimisation on the response surface (Eide 1994, Little 2006, Cullick 2006)
- Hybrid schemes which couple different optimisation techniques (Gómez 1999, Landa 2003, Castellini 2006)

In addition, new techniques are continuously being tested for History Matching as part of research or joint industry projects. The technique currently receiving the highest level of attention in the research community is the Ensemble Kalman Filter (Naevdal 2003, Evensen 2007). Each method has advantages and limitations. It is not the intention of this paper to describe all methods in detail. An extensive review, first presented in the mid 90s, is given by Wen et al., cf. Wen 2005. From the reservoir engineering point of view any optimisation method must be transparent, and results must directly relate to verifiable workflows, in order to establish trust and confidence. Reservoir simulation models become more complex and need to be capable of delivering results for decision gates in shorter time periods. At the same time, reservoir simulation deals with substantial modelling uncertainties. Recent workflows in uncertainty quantification therefore aim at generating alternative


simulation models rather than producing one unique reservoir model. Hence, optimisation techniques should:

- have a broad application area with little introduction and customisation effort
- deliver good solutions within the framework of uncertainty
- be robust with satisfactory performance
- be simple to understand
- follow a transparent workflow and deliver reproducible results

Evolutionary Algorithms satisfy all these requirements, and therefore provide a methodology for the challenging deployment of algorithms to an increasing number of problems which become more and more complex and require solutions in less and less time. This section therefore gives a more detailed description of Evolutionary Algorithms, which are easy to understand and cover a broad application area.

3.2.1 Evolutionary Algorithms

Evolutionary Algorithms belong to the class of direct search methods. They use only the objective function value to determine new search steps, and do not require any gradient information from the optimisation problem. Therefore, they can be used in cases for which gradient information is not available and where traditional algorithms fail because of significant non-linearities or discontinuities in the search space. Evolutionary Algorithms have proven to be robust and easy to adapt to different engineering problems. The nature of Evolutionary Algorithms is to use parallel structures in generating parent-to-child sequences. This principal feature can easily be transferred to parallel structures of an optimisation program, allowing parallel computing to be used. The scalability of this methodology has an important effect on the applicability of numerical optimisation in the case of very time-consuming simulations.

What are EAs? This section presents general schemes that define the basic concept of Evolutionary Algorithms. Terminology and main operators are introduced. The literature reports multiple implementations of Evolutionary Algorithms.
For extended reviews see Goldberg (1989), Bäck (1996), Gen (2000), or Eiben (2003). The underlying idea behind all these algorithms is derived from the following observation: given a population of individuals, environmental pressure causes natural selection (survival of the fittest), which leads to an increase in the fitness of the population. The objective function in History Matching, describing the mismatch between simulated and observed production data (Equation 3.2), is related to their fitness, and should be minimised. An initial population, i.e., sets of reservoir uncertainty parameters, may be randomly generated to create a set of candidate solutions. Each candidate is assigned an objective value which quantifies the mismatch; the smaller, the better. Based on the abstract fitness


measure, some better candidates are chosen to be carried over into the next generation by applying recombination and mutation operators. The application of recombination and mutation operators generates a new set of competing individuals (offspring). Based on fitness and possibly age, individuals from previous generations are taken into the next generation. This process is repeated until individuals with sufficient fitness (quality) are found. The convergence criterion may also be coupled with a CPU time limit. In this process two fundamental operations are applied:

- variation operators (recombination, mutation)
- selection operators

The following diagram shows the principal workflow of an Evolutionary Algorithm. Parents and children refer to the generational sequence of individuals. Each evaluation requires the calculation of the objective function value, and since Evolutionary Algorithms are population based, the evaluation of individuals can be processed simultaneously.
Figure 3: Workflow diagram of an Evolutionary Algorithm
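The workflow of Figure 3 can be sketched as a generic loop. This is an illustrative skeleton, not a specific published implementation: the operators are passed in as functions, `evaluate` returns the objective value (lower is better), and survivor selection here keeps the best individuals from parents and children combined:

```python
import random

def evolutionary_algorithm(evaluate, random_individual, recombine, mutate,
                           pop_size=20, n_generations=30):
    """Generic EA loop: initialise and evaluate a population, then iterate
    parent selection, recombination, mutation and survivor selection."""
    population = [random_individual() for _ in range(pop_size)]
    fitness = [evaluate(ind) for ind in population]
    for _ in range(n_generations):
        children = [mutate(recombine(*random.sample(population, 2)))
                    for _ in range(pop_size)]                 # variation
        child_fitness = [evaluate(c) for c in children]       # parallelisable
        # survivor selection: best pop_size of parents and children combined
        ranked = sorted(zip(fitness + child_fitness, population + children),
                        key=lambda pair: pair[0])[:pop_size]
        fitness = [f for f, _ in ranked]
        population = [ind for _, ind in ranked]
    return population[0], fitness[0]

# Toy usage: minimise a quadratic mismatch over two "uncertainty parameters"
random.seed(42)
best, best_q = evolutionary_algorithm(
    evaluate=lambda x: sum(v * v for v in x),
    random_individual=lambda: [random.uniform(-1.0, 1.0) for _ in range(2)],
    recombine=lambda a, b: [(u + v) / 2.0 for u, v in zip(a, b)],
    mutate=lambda x: [v + random.gauss(0.0, 0.1) for v in x])
```

In a History Matching setting, `evaluate` would launch a full reservoir simulation and return the objective value of Equation 3.2; the independent child evaluations are exactly the step that can be distributed over a compute cluster.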


Typical behaviour of the EA Evolutionary Algorithms follow a general pattern of behaviour. The following diagrams show the course of an optimisation process in three steps. Each dot represents an objective function value which qualifies a simulation case. For simplicity a one-dimensional objective function is assumed; the function value should be minimised. Initially, results are randomly distributed and cover the entire search region. In the second phase the better individuals survive, the average objective function value is reduced, and the overall fitness of the population increases. In the final phase only the best individuals survive, and multiple optima are found.

Figure 4: Progress of an Evolutionary Algorithm

Depending on the implementation of the Evolutionary Algorithm, the initial candidate sets are randomly distributed over the whole search space. In the course of the optimisation the application of selection and variation operators improves the fitness of the population; less qualified parameter sets are discarded. In principle it is possible for the algorithm to be trapped in suboptimal low-fitness regimes. Therefore two distinct features characterise the Evolutionary Algorithm: Exploration and Exploitation. Recombination operators typically explore the entire search space, whereas mutation operators are used to exploit the vicinity of good solutions. There is a trade-off between exploration and exploitation: a strong focus on the former may lead to inefficient search behaviour, while too much focus on the latter is likely to deliver sub-optimal solutions in the vicinity of the starting point. Standard implementations of Evolutionary Algorithms include both features, with the option of prioritising either mechanism. This is discussed below in greater detail when describing Genetic Algorithms and Evolution Strategies. The convergence behaviour of Evolutionary Algorithms can typically be divided into two phases. The figure below describes a characteristic curve for Evolutionary Algorithms, showing rapid progress at the beginning and flattening out at the end. This is often called anytime behaviour, meaning that the algorithm can be stopped at any time: iterative improvement guarantees that the best solution found so far is retained, although it might be suboptimal.


Figure 5: Typical convergence behaviour of an Evolutionary Algorithm (objective value vs. iteration: early and rapid convergence at the beginning, slow convergence towards the end)

We conclude the introduction of Evolutionary Algorithms with a global perspective. Evolutionary Algorithms are generally accepted as robust and generalized problem solvers. They are applicable to a wide range of problems, including cases with discontinuities and non-linearities in the search space. Evolutionary Algorithms deliver approximately equally good performance over a wide range of problem statements, and are therefore well suited for History Matching projects with a wide diversity of specific problem statements. Empirical evidence and theoretical results published in this field in recent years (Wolpert 1997) support the opinion that generalized problem solvers that are successful and efficient for all problem statements do not exist, i.e., algorithms tailored to a specific problem will show better performance than Evolutionary Algorithms. One proven approach to increasing the performance of Evolutionary Algorithms is to include problem-specific heuristics. In this case the Evolutionary Algorithm is customized to a specific problem statement. It will outperform any other standard algorithm in application to this problem, but it may lose its effectiveness for other problem areas. This view is shown in the diagram from Michalewicz (1996).


Figure 6: Heuristics and domain knowledge. The search for an all-purpose optimiser may be unrealistic. Specialised optimisers (EA3) with implemented domain knowledge may be better for particular problems (P), but are outperformed on different problem statements (P*) by generally applicable optimisers without heuristics (EA1).

The introduction of domain knowledge and hard-won user experience (heuristics) will increase the performance of the Evolutionary Algorithm. This information is typically introduced in specialist operators, rules or constraints which guide the progression of the Evolutionary Algorithm. This combination of an evolutionary and a heuristic method is referred to as a Hybrid Evolutionary Algorithm (Axmann 1999, Eiben 2003).

Implementation In this section we present two different implementations of an Evolutionary Algorithm which are used for History Matching projects:

- Genetic Algorithms
- Evolution Strategies

Both implementations are defined by an iterative sequence of variation and selection operations. The type of parameter changed by the original implementation of a Genetic Algorithm is binary; variations are performed by so-called cross-over operations. Later implementations of Genetic Algorithms also use real parameters. Evolution Strategies usually vary real or integer parameters by mutation operations which can be directly mapped to the model parameters of the underlying optimisation problem. A wide variety of different mutation operators are available and have been tested. Any solution obtained by the simulation is evaluated by the objective function. The selection is based on ranking objective values, and allows only the best individuals to proliferate.


Genetic Algorithm Genetic Algorithms are the most widely known type of Evolutionary Algorithm (Holland 1975, Goldberg 1989, Bäck 1996, Gen 2000, Eiben 2003). For the sake of simplicity we describe an implementation using real-valued vectors, since this relates directly to the uncertainty parameters of the underlying model.

Representation:      Real-valued vectors
Recombination:       Simple arithmetic recombination
Mutation:            Draw uniformly randomly from range
Parent selection:    Fitness proportional
Survivor selection:  Generational

Table 4: GA Overview

Real-value representation The most obvious way to present a candidate solution to a problem is as a string of real values, e.g. x = (k, φ, S_wcr, ..., OWC). In this case the uncertainty parameters are derived from a continuous rather than a discrete distribution.

Mutation In the case of real-valued parameter representations it is common to change the vector values randomly within the domain given by the lower limit L_i and upper limit U_i, i.e. (x_1, ..., x_n) → (x'_1, ..., x'_n) where x_i, x'_i ∈ [L_i, U_i]. The values x'_i are drawn uniformly randomly from [L_i, U_i].

Recombination For real-valued parameter representations a recombination point k is picked randomly among the vector positions. The first k values of parent 1 are moved to the first offspring; the remaining values are a weighted arithmetic average of parents 1 and 2, (x_1, ..., x_n) and (y_1, ..., y_n), with weight α:

Offspring 1: (x_1, ..., x_k, α y_{k+1} + (1 − α) x_{k+1}, ..., α y_n + (1 − α) x_n)
Offspring 2: (y_1, ..., y_k, α x_{k+1} + (1 − α) y_{k+1}, ..., α x_n + (1 − α) y_n)

Selection The principle of fitness-proportional selection is used to select a number of individuals out of a pool of candidates. The probability P_i that an individual with fitness value f_i is selected into the mating pool, i.e., for the recombination and mutation process, is given by
P_i = f_i / Σ_{j=1}^{N} f_j    (3.4)
The following figure shows the selection process. The size of each segment shows the proportional probability P_i of each candidate being selected into the mating pool.
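The two GA variation operators described above can be sketched as follows. The parameter names and bounds are hypothetical, chosen only to mirror typical History Matching uncertainty parameters:

```python
import random

# Hypothetical uncertainty parameters with lower/upper bounds [L_i, U_i]
BOUNDS = {"permx_mult": (0.5, 2.0), "poro_mult": (0.8, 1.2), "owc": (5300.0, 5450.0)}
NAMES = list(BOUNDS)

def mutate(x, p_mut=0.2):
    """Uniform mutation: redraw each value from [L_i, U_i] with probability p_mut."""
    return [random.uniform(*BOUNDS[n]) if random.random() < p_mut else v
            for n, v in zip(NAMES, x)]

def recombine(parent1, parent2, alpha=0.5):
    """Simple arithmetic recombination (offspring 1): copy the first k values
    of parent 1, blend the rest as alpha*y_i + (1 - alpha)*x_i."""
    k = random.randrange(1, len(parent1))
    return parent1[:k] + [alpha * y + (1.0 - alpha) * x
                          for x, y in zip(parent1[k:], parent2[k:])]
```

Offspring 2 is obtained by exchanging the roles of the two parents; with alpha = 0.5 both blended values coincide with the plain arithmetic average.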


Figure 7: Stochastic Selection Process

Typically a scaling operation is applied to the proportionate selection probability in order to prevent premature convergence, which may happen if outstanding candidates with a high fitness value dominate all other candidates. Sigma scaling (Goldberg 1989) incorporates information about the mean ⟨f⟩ and the standard deviation σ_f of the fitnesses in the population:
f'(x) = max( f(x) − (⟨f⟩ − 2 σ_f), 0.0 )    (3.5)
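Sigma scaling and fitness-proportional selection can be sketched together. One subtlety is worth noting: proportional selection assumes that larger fitness is better, so with a mismatch objective Q one would select on a transformed fitness such as 1/(1+Q). Function names are illustrative:

```python
import random
import statistics

def sigma_scaled(fitnesses, c=2.0):
    """Sigma scaling of Equation 3.5: f'(x) = max(f(x) - (mean - c*stdev), 0)."""
    mean = statistics.mean(fitnesses)
    stdev = statistics.pstdev(fitnesses)
    return [max(f - (mean - c * stdev), 0.0) for f in fitnesses]

def roulette_select(candidates, fitnesses, m):
    """Fitness-proportional selection (Equation 3.4) into a mating pool of size m."""
    scaled = sigma_scaled(fitnesses)
    if sum(scaled) == 0.0:                 # all fitnesses equal: select uniformly
        return random.choices(candidates, k=m)
    return random.choices(candidates, weights=scaled, k=m)
```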

The following figure shows a typical progression of a genetic algorithm. In the first iteration there is a large spread of uncertainty parameters (here PERMX), and the match quality spreads accordingly. In the late phase of the optimisation, with an improved objective function value, few distinct candidates remain, i.e., an optimum is found.

Figure 8: Parameter progression of a GA


Evolution Strategy The Evolution Strategy is another implementation of an Evolutionary Algorithm, with a very useful feature in evolutionary computing: Self-Adaptation of strategy parameters (Rechenberg 1972, Schwefel 1977). In general, self-adaptation means that some parameters, such as step sizes, co-evolve with the solutions of the algorithm. The following table gives a short summary.

Representation:      Real-valued vectors
Recombination:       Discrete or intermediary
Mutation:            Gaussian perturbation
Parent selection:    Uniform random
Survivor selection:  (μ+λ) or (μ,λ)
Specialty:           Self-adaptation of mutation step sizes

Table 5: ES Overview

In Evolution Strategies mutation of uncertainty parameters is the dominant operation. The algorithm is therefore preferentially used for the exploitation phase, whereas Genetic Algorithms, with a strong focus on the recombination operation, are best used for the exploration phase. For a theoretical overview see Bäck (1996) and Eiben (2003).

Mutation By analogy to evolution in biology, minor changes in model parameters occur more frequently than major ones. In the simplest case, the mutation of a continuous parameter can be achieved by multiplying initial parameters by normally distributed random numbers. Initial parameters are predefined and may vary between user-defined limits. Variations of the parameters are performed on the basis of individual step sizes.

Step Size Special attention to the size of search steps is required for Evolutionary Algorithms, as for traditional optimisation methods. On the one hand, any predefined range of the search space limits variation of the parameters. On the other hand, an adjustment to the local topology of the search space is necessary to reach an optimum, and helps to improve convergence. If the step sizes are too large, an optimum is found only by coincidence. If the step sizes are too small, the optimisation converges poorly, and only the nearest local optimum will be found. At the beginning of the optimisation the rate of variation should be set relatively high; in the course of approaching an optimum the rate should be reduced. Evolution Strategies modify the design parameters with their individual step sizes. These step sizes are varied in each iteration based on normally distributed random numbers. In combination with the selection process, an adaptive step-size adjustment can be established which maintains an efficient approach towards an optimum. In Evolution Strategies the mutation is the main operator (Bäck 1996). Assume that x_i, with i = 1, ..., n, is an n-dimensional parameter vector.
The mutation is performed by adding a random number drawn from a normal distribution N(0, σ_i²), where σ_i is the standard deviation:

σ'_i = σ_i exp( τ' N(0,1) + τ N_i(0,1) )
x'_i = x_i + σ'_i N_i(0,1)    (3.6)

with an overall learning rate τ' and a coordinate-wise learning rate τ. They are typically given by

τ' = 1 / √(2n)    (3.7)

τ = 1 / √(2√n)    (3.8)
In Evolution Strategies the σ_i are also called step sizes. They evolve together with the uncertainty parameters x_i. This process is described as Self-adaptation, since uncertainty parameters as well as strategy parameters, i.e., the step sizes, evolve in the course of the optimisation process and are adapted to the topology of the search space. In addition, a destabilisation mechanism is often included to avoid the algorithm being trapped in a local optimum.

Selection Mutations change individual model parameter sets and therefore increase the variability of a population. The selection process aims at reducing the number of individuals on the basis of an objective function which qualifies the result of each simulation run. Following the notation used for Evolution Strategies, a competition between an initial parameter set (parent) and a modified vector (child), with selection of the successor, is called a (1+1) strategy. If the (1+1) selection is generalized, this leads to a (μ+λ) strategy, in which μ parents compete with λ children. The (μ+λ) selection guarantees the preservation of the best sets of model parameters over many generations, until they are replaced by further control mechanisms. So-called (μ,λ) strategies sometimes remove even the best parameter sets of earlier generations, since the selection procedure is focused on the children only. These selection strategies allow leaving local optima. In addition to these basic strategies, mixed forms allow changing from one selection strategy to another during the optimisation, by means of further control parameters. Another method is to destabilize stagnating improvements during the course of the optimisation: a reduction of the quality is accepted, allowing randomly selected parameter sets with bad qualities to proliferate. This mechanism will generally decrease the convergence rate, but it offers the possibility of leaving a region near a local optimum.
Depending on the choice of step sizes, strategies and destabilization, the solution space can be explored to find global solutions. The following figure shows a typical progression of an Evolution Strategy. Compared to the Genetic Algorithm, the first iteration requires fewer simulations. The uncertainty parameters cover only part of the search space and gradually evolve to better regions until convergence is reached.


Figure 9: Parameter progression of an ES

3.2.2 Multi-Objective Optimisation Techniques

History Matching in Reservoir Simulation is generally a multi-objective optimisation problem (Schulze-Riegert 2007). The problem statement of History Matching includes many field and well measurements of different types over time, e.g. pressure measurements, fluid rates, and events such as water and gas breakthroughs. Uncertainty parameters modified as part of the History Matching process have varying impact on the improvement of the match criteria. Competing match criteria often reduce the likelihood of finding an acceptable History Match. It is an engineering challenge in manual History Matching processes to identify competing objectives and to implement the changes required in the simulation model. The diagram below sketches a hypothetical problem statement in History Matching: well measurements for two separate wells are available, and a History Matching project is launched by modifying identified uncertainty parameters.
(Sketch: two wells with BHP, OPR, GPR and WCUT measurements; an uncertain oil-water contact and uncertain rock properties φ, k.)
Figure 10: Multi-objective criteria in History Matching


          BHP   WCUT
Match 1   ++    --
Match 2   --    ++

The table shows that both match criteria are negatively correlated, i.e., either BHP is well matched or Water Cut. For the given model assumptions it is not possible to match both objectives simultaneously.

Multi-objective Optimisation Optimisation problems involving multiple, conflicting objectives are often addressed by aggregating the objectives into a scalar function and solving the resulting single-objective optimisation problem. In contrast, the focus of this section is on finding a set of optimal trade-offs, the so-called Pareto-optimal set. In the following we formalize this established concept. A multi-objective search space is partially ordered, in the sense that two arbitrary solutions are related to each other in one of two possible ways: either one dominates the other, or neither dominates.

Definition 1: Let us consider, without loss of generality, a multi-objective minimization problem with m decision variables (uncertainty parameters) and n objectives:
Minimize y = f(x) = (f_1(x), ..., f_n(x))
where x = (x_1, ..., x_m) ∈ X and y = (y_1, ..., y_n) ∈ Y    (3.9)

x is called a decision vector, X the parameter space, y an objective vector and Y the objective space. A decision vector a ∈ X is said to dominate a decision vector b ∈ X (also written a ≻ b) if and only if

∀ i ∈ {1, ..., n}: f_i(a) ≤ f_i(b)  and  ∃ j ∈ {1, ..., n}: f_j(a) < f_j(b)    (3.10)
Based on the above relationship, we can define non-dominated and Pareto-optimal solutions.

Definition 2: Let a ∈ X be an arbitrary decision vector. The decision vector a is said to be non-dominated regarding a set X' ⊆ X if and only if there is no vector in X' which dominates a; formally

¬∃ a' ∈ X': a' ≻ a    (3.11)

The reference to X' is omitted when it is clear from context. The decision vector a is Pareto-optimal if and only if it is non-dominated regarding X. Pareto-optimal decision vectors cannot be improved in any objective without causing degradation in at least one other objective; they represent, in our terminology, globally optimal solutions. However, analogous to single-objective optimisation problems, there may also be local optima which constitute a non-dominated set within a certain neighbourhood.
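The dominance relation of Equation 3.10 and the non-dominated set of Equation 3.11 translate directly into code (minimisation; the example objective vectors are purely illustrative):

```python
def dominates(a, b):
    """Equation 3.10 (minimisation): a dominates b if it is no worse in every
    objective and strictly better in at least one."""
    return (all(ai <= bi for ai, bi in zip(a, b))
            and any(ai < bi for ai, bi in zip(a, b)))

def non_dominated(points):
    """Equation 3.11: the subset of objective vectors not dominated by any other."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Example with two match criteria, e.g. (BHP mismatch, WCUT mismatch)
front = non_dominated([(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (5.0, 5.0)])
```

Here (3, 3) and (5, 5) are dominated by (2, 2), so the remaining three vectors approximate the Pareto Front.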


Figure 11: A number of solutions in objective space (f1, f2). All solutions shown as red dots are non-dominated and belong to the Pareto solution set; together they trace the Pareto Front separating the feasible from the infeasible region. Once the Pareto Front or an approximation of it is known, valuable information can be derived from it. The following figure shows two different examples of a Pareto Front.
Figure 12: Pareto Front

The Pareto Front in the left-hand figure indicates that any improvement in criterion I is achieved at the cost of worsening criterion II; the loss in one criterion is about the same size as the gain in the other. In contrast, the right-hand plot shows an asymmetric Pareto Front with respect to criteria I and II, i.e., a small improvement in criterion II causes a drastic increase in the objective value of criterion I. Alternatively, one can argue that it is not worth improving criterion I further, since there will be only little change in criterion II.


Evolutionary Multi-objective Optimisation In dealing with multi-objective optimisation problems, classical search and optimisation methods are not sufficient, for several reasons:

1. Most of them cannot find multiple solutions in a single run, and therefore have to be applied as many times as the number of desired Pareto-optimal solutions.
2. Repeated application of these methods does not guarantee that widely different Pareto-optimal solutions will be found.
3. Most of them cannot efficiently handle problems with discrete variables or with multiple optimal solutions.

In contrast, studies of Evolutionary Algorithms over the past few years have shown that these methods can efficiently eliminate most of the above-mentioned difficulties of classical methods (Deb 2001, Coello 2002). Since they use a population of solutions in their search process, multiple Pareto-optimal solutions can, in principle, be found in one single search and optimisation process.

3.2.3 Optimisation on the Response Surface

Experimental Designs and Response Surface Modelling (RSM) are often used in diverse workflows to quantify uncertainties and to investigate the propagation of parameter uncertainties by using Monte Carlo sampling techniques. At the same time, the technique can be used to approximate complex simulation models for a number of simulated responses. The whole suite of available optimisation methods can then be used to optimise responses based on surrogate models, also called proxy or response surface models. Full-field reservoir simulation results are then captured in a proxy model with a number of chosen design (uncertainty) parameters. There are numerous response modelling techniques described in the literature (Damsleth 1994, Manceau 2001, Montgomery 2001). This section presents key principles of using response surface modelling for History Matching.
Most available tools generate 2nd-order polynomials. In History Matching workflows a response is identified with any contribution to the objective function definition, e.g. the pressure mismatch contribution of a particular well; the global objective function value is another response. For each response an independent response surface model can be created, i.e.,

F(x_1, x_2, ..., x_N) = β_0 + Σ_i β_i x_i + Σ_i β_ii x_i² + Σ_{i<j} β_ij x_i x_j    (3.12)
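Building the terms of Equation 3.12 and fitting the coefficients by least squares can be sketched as follows (function names illustrative; for N = 20 the construction yields 1 + 20 + 20 + 190 = 231 columns including the constant):

```python
import numpy as np

def design_matrix(X):
    """Terms of Equation 3.12: a constant, x_i, x_i^2 and x_i*x_j (i < j)."""
    X = np.asarray(X, dtype=float)
    n = X.shape[1]
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(n)]                                     # linear
    cols += [X[:, i] ** 2 for i in range(n)]                                # quadratic
    cols += [X[:, i] * X[:, j] for i in range(n) for j in range(i + 1, n)]  # cross
    return np.column_stack(cols)

def fit_proxy(X, y):
    """Least-squares fit of the 2nd-order polynomial; returns the proxy model."""
    beta, *_ = np.linalg.lstsq(design_matrix(X), np.asarray(y, dtype=float),
                               rcond=None)
    return lambda x: float(design_matrix([x]) @ beta)

# Toy check: recover F(x) = 1 + 2*x0 + 3*x1^2 from a handful of design points
X = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 0), (0, 2), (2, 2)]
y = [1 + 2 * x0 + 3 * x1 ** 2 for x0, x1 in X]
proxy = fit_proxy(X, y)
```

In practice each row of X would be one experimental design run of the full-field simulator, and y the corresponding objective contribution.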

Depending on the number of parameters, the number of terms increases rapidly. For 20 uncertainty parameters a 2nd-order polynomial consists of 20 linear, 20 quadratic and 190 cross terms, i.e., a total of 230 contributions in addition to the constant. A response model is high dimensional and may not represent a non-linear system well. In order to handle a large number of uncertainty parameters, the polynomial is typically reduced by identifying less sensitive terms, which are excluded from the calculation. Once a response surface of acceptable quality is obtained, any optimisation technique can be used to identify optimal solutions on the response surface. Identified solutions are verified through full-field


simulations, and update the database for recalculating an improved proxy model.

Workflow The following diagram shows the principal workflow using Experimental Designs and Response Surface Modelling as input to a History Matching process. In all practical applications this is an iterative procedure, as suggested by the diagram below. In addition, prior information may be included in a Bayesian formulation which is used in the updating process (cf. Craig 1998).
(Workflow steps: define the optimisation objective and its contributions; identify uncertainty parameters, e.g. rock properties; run an experimental design with the full-field simulator to explore the uncertainty space; generate a proxy model by response surface modelling, e.g. using regression techniques; verify the quality of the proxy model, adding new candidate sets to the design until it is sufficient; optimise on the response surface with any optimisation method, typically GAs or conjugate methods; verify the quality of the History Match with a full-field simulation of the optimal parameter set; repeat until the termination criterion is met.)
Figure 13: Principal workflow of an Optimisation on the Response Surface Global optimisation? Optimisation on the response surface can be very efficient. The following example, however, also shows limitations of the approach. One key goal of History Matching is to reduce uncertainty and deliver reliable results for production forecasts. The following figure shows the response surface of the global objective function value, depending on two selected uncertainty

parameters. The example is taken from a real reservoir simulation study with a complicated faulted structure. The multi-modal solution space is clearly visible, i.e., there are multiple optima.

Figure 14: Response surface of the global objective function value

Assume that one starts with a classical Experimental Design using 4 design points to resolve linear and quadratic contributions. The left-hand figure below shows the design points as red dots. It is easy to see that any response model derived from a simple Experimental Design cannot resolve the full structure of the true multi-modal response surface shown in the right-hand figure. Optimisation results may therefore be misleading and require a thorough verification process.

Figure 15: Resolution of the solution space including multiple optima using 10 simulation runs (left-hand figure) and 100 simulation runs (right-hand figure).

Advanced techniques in response surface modelling have recently been tested and applied to History Matching, e.g. neural network models, which can handle high-dimensional response spaces where classical regression models tend to fail (cf. Cullick 2006). Response Surface Modelling is also used in hybrid optimisation techniques: optimisation results from RSM are used to guide a stochastic optimisation scheme, which can significantly improve convergence behaviour (Castellini 2006).

Whether RSM techniques are appropriate depends on the objective of the History Matching project. Depending on the business case, response surface modelling could be used initially for identifying global effects, and a matched model may be sufficient for a field-wide projection. At the same time, full physics models may be required for modelling detailed flow behaviour with high sensitivities to parameter adjustments, e.g. History Matches and prediction runs on a well-by-well basis as input to a decision process on infill drilling campaigns. In other words, RSM techniques may be insufficient for individual well projections. The applicability of RSM techniques must therefore be decided on a case-by-case basis.
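The proxy construction and "optimisation on the response surface" loop described above can be sketched in a few lines. This is a minimal illustration with NumPy, not any specific commercial tool; the two-parameter mismatch function and its optimum at (0.3, -0.2) are invented for the example, and a cheap random search stands in for a full optimiser.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Expand N-parameter samples into the constant, linear, quadratic and
    cross terms of Eq. (3.12); for N=20 this gives 1+20+20+190 = 231 columns."""
    X = np.atleast_2d(X)
    n, N = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(N)]                  # linear terms
    cols += [X[:, i] ** 2 for i in range(N)]             # quadratic terms
    cols += [X[:, i] * X[:, j]                           # cross terms
             for i in range(N) for j in range(i + 1, N)]
    return np.column_stack(cols)

def fit_proxy(X, y):
    """Least-squares fit of the 2nd-order polynomial proxy."""
    beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
    return lambda Xnew: quadratic_design_matrix(Xnew) @ beta

# Hypothetical two-parameter mismatch response with its optimum at (0.3, -0.2):
rng = np.random.default_rng(42)
X = rng.uniform(-1, 1, size=(30, 2))                     # 30 "simulation runs"
y = (X[:, 0] - 0.3) ** 2 + 2.0 * (X[:, 1] + 0.2) ** 2

proxy = fit_proxy(X, y)

# Optimisation on the response surface: a cheap random search on the proxy
# replaces expensive full-field simulations; best lands near (0.3, -0.2).
cand = rng.uniform(-1, 1, size=(100_000, 2))
best = cand[np.argmin(proxy(cand))]
```

Identified optima would then be verified with full field simulations, and the new responses fed back into the regression database, as described above.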

4 WORKFLOW DESCRIPTION / EXAMPLES

History Matching workflows are characterised by an enormous diversity of problem statements, reflecting the experience that every reservoir may be different. At the same time it is possible to investigate common features, applied methodologies and best practices. This section investigates and describes selected workflow processes used in the oil and gas industry to quantify the impact of subsurface uncertainties on reservoir performance. All reference cases are based on published papers. In part they can be regarded as examples of best practices applied within individual companies. The section starts with an example of a workflow description and introduces a notation for a workflow diagram. The primary idea is to reduce each workflow to a few key events, decision points, methodologies and algorithms. With this methodology several published cases are analysed and briefly summarized. Workflow charts then make it possible to identify common principles and features. Whether the respective engineering groups managed to meet their business objectives is not assessed; that would go far beyond the scope of this review. Each study is briefly summarized by:
- The scope of the study
- Uncertainty parameter definition
- Methods and algorithms
- Workflow description
- Workflow diagram
With this information at hand it is easy to understand and compare approaches among different studies.

4.1 Reference Workflow and Best Practices


The following workflow diagram shows key elements of a History Matching project, including a choice of algorithms used to support the optimisation process. Assisted History Matching relies on engineering support: there is continuous interaction between the reservoir engineer and the progress of the project. It is important to emphasise that the key elements of traditional History Matching remain viable, i.e., the integration of domain knowledge and heuristics. Best practices (Williams 1998, Selberg 2007) are integrated into modern workflows, and the tools cannot do without them. The use of modern tools allows structuring of the History Matching workflow, and methods are adopted to interpret the reservoir simulation response and help guide the decision process. In this respect History Matching becomes a manageable project, with planned deliverables in time, quality and budget.

Start
- Definition of Business Objective: define the objective function.
- Identify Uncertainty Parameters: geological uncertainties, rock properties, ...
- Sensitivity Run: e.g. Plackett-Burman 2-level design.
- Identify key parameters: investigate sensitivities.
- Explore Uncertainty Range: credit parameter distributions, e.g. a Latin Hypercube Design of Experiments; define the starting point for the optimisation.
- Optimisation Run: explore the optimisation potential, using e.g. a Genetic Algorithm.
- Check Optimisation: revise strategy? (Review: revision of strategy; revise model if necessary.)
- Analysis: identify one or multiple matches.
- Optimisation Run: fine-tune multiple matches, using e.g. an Evolution Strategy.
- Production Forecast: use selected matches for prediction runs; compute uncertainty based on alternative matches.
Stop

Engineering interaction accompanies all steps: include domain knowledge and heuristics and use strategies, e.g. a stratigraphic approach (global or field-wide, flow units or layer groups, individual layers, individual wells), building up complexity, and event targeting for model calibration. An audit trail documents all simulation runs and decision processes.

Figure 16: Abstract workflow example.

4.2 Uncertainty Assessment using Experimental Design


Reference: Peake, W.T., Abadah, M., Skander, L.: "Uncertainty Assessment using Experimental Design: Minagish Oolite Reservoir," paper SPE 91280 presented at the SPE Reservoir Simulation Symposium, Houston, Texas, January 31 - February 2, 2005

Scope
- Systematic way to assess reservoir performance using Experimental Designs (ED) combined with a History Matching approach.
- Estimate uncertainties on recoverable reserves and cumulative oil production for a mature oil field.
- Identify key subsurface uncertainties impacting water flood performance and identify P10/P50/P90 uncertainties for oil forecasts.

Uncertainty
- Geological uncertainties (top structure, geologic contrast, OWC, tar zone thickness, ...)
- Rock properties (porosity, permeability)
- Fluid properties (viscosity, ...)
- Saturation, relative permeability

Method
Plackett-Burman Experimental Design and Monte Carlo sampling

Workflow
- Use the traditional vary-one-factor-at-a-time approach to identify the factors causing the greatest mismatch. Adjust those factors to guarantee preservation of the history match.
- Plackett-Burman 2-level Experimental Designs for screening a large number of variables. A centre point is added to investigate curvature. This step is designed to identify the major variables influencing dependent variables (recovery, ...).
- Apply Analysis of Variance (ANOVA) to identify the most significant uncertainties. Display results in Pareto charts (tornado charts) reflecting the hierarchy of variables based on ANOVA.
- Generate a response surface model based on multivariate analysis. Proxy models are used as input to Monte Carlo simulation to generate cumulative distribution functions (CDF). Recovery factors corresponding to different probabilities are then derived from the cumulative distributions.

Workflow diagram:
- Start
- Definition of Business/Optimisation Objective. Scope: quantify uncertainties in recoverable reserves. Match criteria for HM: pressure, WCT.
- Uncertainty Variables: geologic uncertainties, rock properties, oil properties, ...
- Sensitivity Run: use the traditional vary-one-factor-at-a-time approach; identify factors which cause the greatest mismatch; adjust those factors and fix them for subsequent runs.
- Design of Experiments: use a Plackett-Burman 2-level DoE (add a centre point to evaluate curvature); identify the uncertainty parameters which most impact recoverable reserves.
- Analysis: use tornado charts (ANOVA); identify the uncertainty parameters which most impact recoverable reserves; use a reduced number of uncertainty parameters in a repeated ED run, including adjustments to the parameters with the largest impact on water production.
- Response Surface Modelling: use RSM techniques to generate the proxy model; use the results from the ANOVA to select the most important parameters as input to MC sampling.
- Monte Carlo Simulation: use MC sampling to generate the CDF; calculate the uncertainty range of recoverable reserves.
- Stop
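The final Monte Carlo step of this workflow, i.e. sampling the uncertainty distributions, evaluating the proxy and reading percentiles off the CDF, can be sketched as follows. The distributions and the proxy formula are invented placeholders for what the screening and RSM steps would actually deliver for the Minagish case.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Hypothetical uncertainty distributions (stand-ins for porosity, a
# permeability multiplier and an OWC depth shift):
poro  = rng.triangular(0.18, 0.22, 0.27, n)
permx = rng.lognormal(0.0, 0.3, n)        # multiplier centred around 1
owc   = rng.uniform(-5.0, 5.0, n)         # contact shift in metres

# Hypothetical proxy for recoverable reserves (MMstb), standing in for the
# response surface model delivered by the RSM step:
reserves = 120.0 * (poro / 0.22) * permx ** 0.4 + 0.8 * owc

# Read P10/P50/P90 off the CDF (percentile convention: P10 = low case).
p10, p50, p90 = np.percentile(reserves, [10, 50, 90])
```

Because the Monte Carlo loop runs on the proxy rather than the simulator, hundreds of thousands of samples are cheap once the design runs are paid for.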

4.3 Reducing Uncertainty Ranges using EDM


Reference: Al Salhi, M.S., Van Rijen, M.F., Alias, Z.A., Visser, F., Dijk, H., Lee, H.M., Timmerman, R.H., Upadhyaya, A.A., Wei, L.: "Reservoir Modelling for Redevelopment of a Giant Fractured Carbonate Field, Oman: Experimental Design for Uncertainty Management and Production Forecasting," paper IPTC 10537 presented at the IPTC 2005, Doha, Qatar, 21-23 November 2005

Scope
- A systematic way of assessing reservoir uncertainties using an Experimental Design approach.

- Application to a complex brown-field redevelopment (giant fractured carbonate field in Oman).
- Establishment of a systematic approach with three stages:
  1. Initial framing of the opportunity
  2. Identification and assessment of development options
  3. Selection and refinement of the optimum concept for the field development plan

Uncertainty
- Static uncertainties: matrix, fracture
- Dynamic uncertainties: saturation height function, residual oil saturation, imbibition capillary pressure curves, relative permeability, aquifer properties
- Non-quantified large production uncertainties, which probably do not allow a well-by-well resolution

Method
Experimental Designs (Plackett-Burman, Box-Behnken); ANOVA (Analysis of Variance), Pareto plots

Workflow
- Three-stage process (see above). The paper describes details of stage 2, i.e. the application of ED and RSM.
- Uncertainty definition: inclusion of different static reservoir models representing the full range of possible realisations. Uncertainties relate to matrix and fractures. Identification of dynamic reservoir uncertainties.
- Generation of sector models to capture full geological detail in fine-grid models.
- Application of ED (Plackett-Burman) for screening and pre-selection of different static realisations and dynamic parameters. Scope: identify the parameters with the highest impact on historical reservoir performance.
- Revision of uncertainty ranges for screening runs which fall outside the acceptable range of historical data. Repeat screening runs until the historical data lie within the simulated data.
- Reduction of the number of uncertainty parameters; keep those with the largest impact. Use a Pareto chart (tornado diagram) to identify the parameters with the largest impact. The basis for this analysis can be an Analysis of Variance (ANOVA) or linear response surface modelling.
- Application of an alternative design including curvature information (three-level design). Use of the Box-Behnken ED. Consideration of 6 uncertainty parameters and 50+ simulation runs.
- Use of additional runs for quality checks, i.e. compare reservoir simulation data with values generated through the response surface model, e.g. cumulative oil production at time X.
- Application of an objective function definition to compare the response surface model with a number of balanced sets of different responses.

- The full uncertainty space is scanned to discard realisations in cases where the match is not acceptable. This allows significant reductions in the uncertainty space; the authors indicate a factor of 50%.
- Application of all history matched cases to different development options (scenarios). A range of forecasts is constructed for each development option.
- Construction of response surface models for prediction parameters.
- Application of a Monte Carlo sampling process to assess the propagation of uncertainties.
- Presentation of CDFs of oil reserves for various development options. Selection of representative high (P15), mid (P50) and low (P85) reserves.

Workflow diagram:
- Start
- Definition of Business/Optimisation Objective. Scope: identify uncertainties related to different redevelopment options of a brown-field reservoir.
- Uncertainty Variables: static uncertainties (matrix, fracture); dynamic uncertainties (SHF, SOR, Pc, kr, aquifer).
- Design of Experiments: use a Plackett-Burman 2-level DoE; include all static and dynamic uncertainties.
- Reduce Uncertainty Range: check whether simulated responses mismatch the historical data; reduce the uncertainty range and repeat the simulations.
- Analysis: use tornado charts (ANOVA); identify the uncertainty parameters which most impact recoverable reserves; use a reduced number of uncertainty parameters in the following step.
- Design of Experiments: use a 3-level factorial DoE (Box-Behnken); analyse linear and non-linear effects.
- Response Surface Modelling: use RSM techniques to generate the proxy model.
- Check Quality: use additional runs not included in the RSM.
- Screening: use the RSM for screening the full uncertainty space and reduce the parameter ranges; an objective function definition is used; uncertainty ranges are claimed to be reduced by one half; rerun the history match with the narrowed-down uncertainty ranges.
- Production Forecast: run all models for the different development options.
- Response Surface Modelling: use RSM techniques to generate the proxy model.
- Monte Carlo Simulation: use MC sampling to generate the CDF; calculate the uncertainty range for the different development options.
- Stop
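The screening analysis that both this study and the previous one rely on boils down to ranking main effects from a coded two-level design. A minimal sketch follows; a full 2^3 factorial and a synthetic response stand in for the Plackett-Burman design and the simulator output, which is an assumption made only for illustration.

```python
import numpy as np

def main_effects(design, response):
    """Main effect of each factor in a coded (+/-1) two-level design: the mean
    response at the high level minus the mean at the low level."""
    return {j: response[design[:, j] == 1].mean()
               - response[design[:, j] == -1].mean()
            for j in range(design.shape[1])}

# A full 2^3 factorial in coded units; a Plackett-Burman design plays the
# same role when many factors must be screened with few runs.
design = np.array([[i, j, k] for i in (-1, 1)
                             for j in (-1, 1)
                             for k in (-1, 1)])

# Hypothetical simulated responses: factor 0 dominates, factor 1 is weak.
y = 5.0 * design[:, 0] + 0.5 * design[:, 1] + 2.0 * design[:, 2] + 10.0

# Tornado-chart ordering: rank factors by absolute effect.
ranked = sorted(main_effects(design, y).items(),
                key=lambda kv: abs(kv[1]), reverse=True)
```

The ranked list is exactly what a tornado chart visualises; factors at its tail are candidates for exclusion before the more expensive three-level design.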

4.4 Example for Top-Down Reservoir Modelling Approach


TDRM is one of the prominent examples favouring the use of a modern History Matching approach. TDRM defines a concept rather than a specific methodology. The following case gives a representative example.

Reference: Kromah, M.J., Liou, J., MacDonald, D.G.: "Step Change in Reservoir Simulation Breathes Life Into Mature Oil Field," paper SPE 94940 presented at the Latin American and Caribbean Petroleum Engineering Conference, Rio de Janeiro, Brazil, 20-23 June 2005

Scope
- Understanding uncertainties and optimising depletion plans
- Optimising infill well locations for a mature oil field
- The optimisation objective was to match associated water rates and pressure data.

Uncertainty
- Location of the remaining oil
- Thickness of the oil column
- Current depth of the OWC
- Geological uncertainties (pore volume, permeability, fault transmissibility, shale transmissibility, aquifer strength)

Method
Genetic Algorithm

Workflow
- TDRM was set up to ensure that the unknowns defining the uncertainties were investigated.
- Run 500 cases (25 models in each generation).
- Identify mismatches, e.g. two wells were not matched.
- Modify the geological model, i.e., include a fault.
- Rerun the optimisation and find better convergence.
- Analyse the full uncertainty range of the parameters. Find that some values concentrate for the best models.
- All best models were able to match oil rates. There were varying match qualities for the water rates. Significant deviations between measured and simulated data were observed for two wells.
- 16 models were identified and used for prediction runs to estimate uncertainties in the performance of new infill wells. Alternative infill locations were considered in the prediction process.
- Prediction runs yield different potential: an alternative infill well location with a 16% recovery increase was identified compared to the planned infill location.
- Analysis of model parameters against recovery factors and cumulative production rates showed no correlation. No single model parameter was responsible for the success or failure of an infill well.

Workflow Diagram
- Start
- Definition of Business/Optimisation Objective: History Match; uncertainty quantification for infill drilling wells.
- Identify Uncertainty Variables: geological uncertainties, GWC, oil column, etc.
- Global Search: use a Genetic Algorithm (50; 0.5), 20 iterations.
- Revision: check the optimisation; revise the simulation model; include the new fault.
- Global Search: use a Genetic Algorithm (50; 0.5), <20 iterations.
- Prediction Runs: identify the 16 best matches used for prediction runs.
- Analyse Results: identify the optimal infill well location.
- Stop
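As a rough illustration of the global-search step, a minimal genetic algorithm of the generational kind TDRM employs can be written as below. The population size, operators and the quadratic stand-in mismatch function are all invented for the sketch; a real run would dispatch each generation's 25 simulations to compute hosts instead of evaluating a formula.

```python
import numpy as np

rng = np.random.default_rng(0)

def mismatch(x):
    """Hypothetical stand-in for the objective a full-field simulation
    would return; its minimum plays the role of the history match."""
    return float(np.sum((x - np.array([0.6, -0.4, 0.1])) ** 2))

def genetic_algorithm(pop_size=25, generations=20, lo=-1.0, hi=1.0):
    pop = rng.uniform(lo, hi, size=(pop_size, 3))
    for _ in range(generations):
        # All evaluations within one generation are independent and could
        # be distributed over compute hosts.
        fit = np.array([mismatch(x) for x in pop])
        parents = pop[np.argsort(fit)[: pop_size // 2]]   # keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.random(3)
            child = w * a + (1.0 - w) * b                 # blend crossover
            child += rng.normal(0.0, 0.05, 3)             # mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, children])
    fit = np.array([mismatch(x) for x in pop])
    return pop[np.argmin(fit)]

best = genetic_algorithm()
```

Keeping the best half of each generation (elitism) guarantees the best match never degrades, which is why the study could harvest its 16 best models directly from the final populations.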

4.5 Production Data and Uncertainty Quantification


Reference: Nicotra, G., Godi, A., Cominelli, A., Christie, M.: "Production Data and Uncertainty Quantification: A Real Case Study," paper SPE 93280 presented at the SPE Reservoir Simulation Symposium, Houston, Texas, January 31 - February 2, 2005

Scope
- Apply a consistent workflow for quantifying uncertainties related to production forecasts. Under the assumption of a realistic error model, this paper presents a consistent evaluation of the uncertainties related to production forecasts.
- Uncertainty ranges for future field oil production and water production rates.

Uncertainty
- Transmissibility multipliers for 11 faults

Method
Neighbourhood Algorithm (NA)

Workflow
- A full Bayesian framework is used for setting up an error model.
- A manual History Match (HM) was available as a reference case. The same uncertainty parameters as in the manual HM were used.

- 640 simulations were calculated based on the NA; 400 cases were identified to be better than the manual match.
- A misfit error model was constructed for a consistent treatment of uncertainties.
- The posterior probability was calculated for each model. Calculation of the posterior probability function is accomplished by a Markov Chain Monte Carlo method.
- An uncertainty envelope with a P10 to P90 confidence bound is calculated.

Workflow diagram:
- Start
- Definition of Business/Optimisation Objective. Scope: uncertainty ranges for FOPT, FWPT. Match parameters: BHP (producers), THP (injectors), WCT, incl. well-dependent standard deviations for the misfit function. Assumption: errors are stationary and Gaussian.
- Identify Uncertainty Variables: 11 transmissibility multipliers; assume initial uniform distributions.
- Global Search: Neighbourhood Algorithm (ns=40), 15 iterations; coverage of the global search space; next iteration while it < 15.
- NAB: Neighbourhood Algorithm Bayes: calculate the posterior probability for each Voronoi cell after each iteration. Scope: calculate an approximate PPD everywhere in the model space.
- Analyse Results: verify that the entire search region is covered; use parameter distributions for sensitivity analysis, i.e. regional concentration indicates strong sensitivity; verify all results for geological consistency; analyse dynamic parameters as a function of time, e.g. saturation and pressure maps.
- Uncertainty Analysis: identify multiple solutions to the HM problem; calculate marginal pdfs for the uncertainty parameters compared to the initial uniform distributions; calculate the uncertainty for every key parameter of interest; calculate the uncertainty envelope.
- Stop
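The core of the Bayesian post-processing, i.e. weighting each accepted model by its posterior probability and reading a P10-P90 envelope off the weighted forecast distribution, can be sketched as follows. The misfit and forecast arrays are synthetic stand-ins; the real study obtains them from the simulator and the NAB/MCMC machinery described above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: misfit values for 400 accepted models and the
# forecast (e.g. FOPT in MMstb) that each model predicts.
misfit   = rng.gamma(shape=5.0, scale=2.0, size=400)
forecast = 50.0 + 4.0 * rng.standard_normal(400)

# Under stationary Gaussian errors the posterior probability of model i is
# proportional to exp(-misfit_i); shift by the minimum for numerical
# stability and normalise over the ensemble.
w = np.exp(-(misfit - misfit.min()))
w /= w.sum()

def weighted_percentile(values, weights, q):
    """Percentile of a discrete, weighted distribution via its CDF."""
    order = np.argsort(values)
    cdf = np.cumsum(weights[order])
    return float(np.interp(q / 100.0, cdf, values[order]))

p10 = weighted_percentile(forecast, w, 10)
p90 = weighted_percentile(forecast, w, 90)
```

Because the weights concentrate on the lowest-misfit models, the envelope reflects the posterior rather than the raw ensemble spread.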

4.6 What distinguishes modern from classical History Matching?


We will address this question with some practical remarks based on a workflow example. The diagram below gives an overview of a complete History Matching project, including several optimisation cycles and interaction points. The reservoir model had proved difficult to match manually and was therefore chosen as a candidate for an assisted History Matching project.

Details of the reservoir simulation project are not important at this stage. Instead, we prefer to emphasize the progress of the project and the relevant interactions by the responsible engineer. The diagram below documents the improvement of the match quality based on an objective function definition (red line). The initial and best matches in each optimisation cycle are shown by blue dots at the beginning and the end of each cycle. At the end of each cycle, the responsible engineer evaluated and analysed the results and decided how to continue. Decisions are briefly summarized in the annotations linked to each decision point.
- Day 1: Launch optimisation with 54 design parameters (PERMH, PERMV, PORVM, FLTMLT)
- Day 4: Introduction of a global aquifer parameter
- Day 8: Introduction of 5 aquifer parameters and WPI; fixing of most of the other design parameters
- Day 10: Modification of the objective definition; selected pressure points removed
- Day 17: New PVT table included; introduction of 4 Carter-Tracy aquifers and critical gas saturation parameters
- Day 22: Introduction of a barrier at well X
- Day 29: WPI fixed
- Day 31: Horizontal barrier introduced in layer N; saturation end points changed near the wells
- Day 35: Match accepted

Figure 17: Progress of a History Matching Project, plotting the global mismatch against cumulative iterations together with the engineering decisions listed above.

The optimisation process was supported by a (2+4) Evolution Strategy, i.e. each iteration includes 4 independent full field reservoir simulation runs which could potentially run in parallel. Due to hardware and licensing limitations the overall elapsed time for the History Matching project was five weeks. About 20 days of single-CPU time was used; one or at most two simulations were run at the same time. The initial progress of the optimisation followed a standard pattern, with a gradual increase of complexity in terms of uncertainty parameter adjustments. It is worth noting that during cycle 4 the engineer had to acknowledge that incorrect PVT tables had been used. In terms of match quality this took the project almost back to the starting point, cf. day 17. In manual History Matching this would have meant three weeks of wasted work. Using assisted History Matching techniques together with the understanding acquired in the first optimisation cycles, it was possible to re-establish the match quality within one optimisation cycle. This example demonstrates the strength of using optimisation tools in reservoir simulation: reservoir simulation becomes a tool set for testing ideas and concepts. Modelling mistakes are not encouraged; however, they can be resolved much more quickly. In addition, the workflow becomes transparent, since individual decision steps are monitored.
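A (2+4) Evolution Strategy of the kind driving this project can be sketched as follows. The quadratic stand-in objective and the simple step-size control are illustrative assumptions, not the implementation actually used in the study; in practice each of the 4 children per iteration is a full field simulation.

```python
import numpy as np

rng = np.random.default_rng(3)

def mismatch(x):
    """Hypothetical global objective a reservoir simulator would return."""
    return float(np.sum((x - 0.5) ** 2))

def evolution_strategy(mu=2, lam=4, iterations=50, dim=5, sigma=0.2):
    """(mu+lam) Evolution Strategy: each iteration creates lam children
    (independent runs that could execute in parallel) and the best mu of
    parents plus children survive."""
    parents = rng.uniform(0.0, 1.0, size=(mu, dim))
    best = min(mismatch(p) for p in parents)
    for _ in range(iterations):
        idx = rng.integers(mu, size=lam)
        children = parents[idx] + rng.normal(0.0, sigma, size=(lam, dim))
        pool = np.vstack([parents, children])
        fit = np.array([mismatch(x) for x in pool])
        parents = pool[np.argsort(fit)[:mu]]       # elitist (2+4) selection
        if fit.min() >= best:
            sigma *= 0.8                           # simple step-size control
        best = min(best, float(fit.min()))
    return parents[0], best

x_best, f_best = evolution_strategy()
```

The "+" in (2+4) is what makes the scheme elitist: the best match found so far always survives into the next iteration, which matches the monotone improvement within each cycle of Figure 17.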

The project workflow presented here required 8 interactions by the reservoir engineer. This number should be compared with the number of simulations typically run in manual History Matching projects, not the overall number of simulations. Each of the above simulation cycles could have been processed within one day. That means the History Matching project could have been performed in well under two weeks.

In summary, selected key distinguishing features of modern History Matching are:

- New technologies applied to assisted History Matching facilitate manageable workflows towards decision gates, and support reservoir simulation becoming a reliable and indispensable part of the decision process in reservoir management.
  o The reservoir engineer is the key factor guiding the process. An iterative process of running optimisations, stopping and revising input strategies allows the identification of potential modelling conflicts at an early stage, instead of manually adjusting model parameters for several weeks before giving up.
  o Tools primarily take over cumbersome parameter tuning, launching simulation decks and delivering response data for analysis and process revision by the reservoir engineer.
- Application of data mining and statistics to a larger database of simulation responses adds significant value to the understanding of the reservoir model.
  o Optimisation techniques and Experimental Designs applied to History Matching support probabilistic forecasting scenarios. Identification of alternative reservoir models becomes part of the process.
  o Modern workflows for assisted History Matching allow testing of ideas and modelling concepts. This supports closer interaction with other G&G disciplines.
- History Matching becomes transparent, and review processes are easily supported (for requirements cf. Rietz 2005).
  o Audit trail capabilities and transparent workflows allow new engineers to catch up and become productive quickly. The workflow becomes less sensitive to personnel fluctuations.
  o A History Matching project can be repeated within a short time to integrate updated production data, new wells, etc. This facilitates regular updates of reservoir models as part of yearly revision plans.
- Advances in computer technologies significantly reduce the cycle time for History Matching projects.
  o Distributed computing contributes to saving time and resources.
  o More engineering work can be accomplished within the project time.

This is the basis of the next chapter on distributed computing in reservoir engineering.

5 DISTRIBUTED COMPUTING IN RESERVOIR SIMULATION

Advances in modelling techniques and uncertainty workflows initiated parallel research activities using modern computing frameworks for reservoir simulation (Schiozer 1999, Schulze-Riegert 2001, Landa 2005). In the face of complex reservoir models and long simulation times, distributed computing has become more important for delivering uncertainty quantification assessments in time and in quality. This chapter gives a short overview of the key features of a distributed computing environment designed for reservoir simulation, in order to use cost-efficient computing resources effectively.

Reservoir simulation problems are diverse, and no single optimisation method has the potential to solve all problems. This gives rise to the need to integrate a number of different sequential and parallel optimisation methods into a common framework for optimisation and Experimental Designs. Some key features of such a framework are (Krosche 2005):
- Adaptable software architecture
- Extensibility
- Reuse of available software modules
- Plug-in functionality
- Ability to handle large optimisation problems
- Multi-platform capability
- Parallel scalability in a heterogeneous computing environment
- Host management and load balancing

The following diagram presents the potential of selected optimisation algorithms to exploit distributed computing capabilities. As explained in a previous section, Evolutionary Algorithms are built on parent-to-child relationships and are designed to exploit parallel computing capabilities. All new individual members of one generation are independent of each other and can therefore run simultaneously. This applies to Evolution Strategies as much as to Genetic Algorithms. Each diamond in the diagram below represents an interaction with the respective algorithm, and each circle represents the simulation of a candidate case. All simulations between two interaction points (diamonds) can in principle run simultaneously.

Due to the nature of Experimental Designs, e.g. Latin Hypercube, any number of generated candidates can run at the same time. However, there is no generational sequence as with optimisation methods. Gradient techniques show a special feature in connection with parallel computing: the calculation of the gradient information can run in parallel. In the simplest case of a black-box gradient technique, a minimum of (n+1) candidates can run simultaneously to generate data for an n-dimensional uncertainty parameter space. This is related to the one-parameter-at-a-time approach for calculating sensitivities. The gradient calculation is followed by a regression calculation to find the next local optimum, which may require a few steps that are purely sequential.
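The generational synchronisation pattern described above maps naturally onto a task pool. A minimal sketch using Python's standard library follows; the thread pool and the toy run_simulation function stand in for submitting real simulation decks to distributed execution hosts.

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def run_simulation(params):
    """Stand-in for launching one full-field simulation on a compute host;
    returns the mismatch for one candidate parameter set."""
    return float(np.sum((np.asarray(params) - 0.5) ** 2))

def evaluate_generation(candidates, max_workers=4):
    """All candidates of one generation are independent, so they are
    submitted simultaneously; results are collected at the synchronisation
    point (a 'diamond' in the diagram) before the next generation starts."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_simulation, candidates))

rng = np.random.default_rng(5)
generation = rng.uniform(0.0, 1.0, size=(4, 3))  # e.g. the 4 children of ES(2+4)
fitness = evaluate_generation(generation)
```

The same pattern covers Experimental Designs (one large batch, no generational sequence) and parallel gradient estimation (a batch of n+1 perturbed candidates).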

The diagram compares: an Evolution Strategy ES(2+4); a Genetic Algorithm GA(50,100), or ES(10,50), or EnKF(50); a Latin Hypercube LH(100); and a gradient method with regression. Diamonds mark the algorithm steps (generation of candidates); circles mark the simulation and processing of candidates.

Figure 18: Selected optimisation schemes and the use of parallel computing capabilities

The notation indicates the number of parallel runs. An Evolution Strategy ES(2+4) generates 4 independent candidates which can be simulated simultaneously. A Genetic Algorithm GA(50,100) uses a population of 50 candidates which are completely replaced in each generation, i.e., 50 candidates are potentially simulated at the same time. Ensemble Kalman Filter techniques follow a different optimisation scheme in the sense that models are sequentially updated; each processing step (diamond) therefore reflects the calculation of the Kalman gain (Naevdal 2003). In the example above it is assumed that 50 ensemble members are simulated forward in time; in the best case, all 50 ensemble members are simulated simultaneously.

All algorithms rely on efficient computing frameworks for distributing and managing simulation as well as pre- and post-processing tasks. The following flow chart gives an example with basic design requirements. Ideally the modules, i.e. the user interface, the main processing unit including the algorithm management, as well as the modelling module which handles pre- and post-processing tasks and the simulation

42

Modern Techniques for History Matching

R. Schulze-Riegert / S. Ghedan

are processed on different CPUs. This is fully automated by modern queuing systems, which allow submission of independent computing tasks on distributed execution hosts. An intelligent host and task management system is required to collect results and communicate information for further processing.

Figure 19: Overview of a software design (middle part) and a hardware implementation (left-hand side) for an optimisation environment with distributed computing capabilities. The design comprises a user interface (project setup, monitoring, computational steering, analysis), an experiment generator (optimisation algorithms, Experimental Design techniques) with algorithm management and a data store on an application server host, communication middleware, and a modelling module (templates, pre-processor, simulation launcher, simulator, post-processor) running on the networked computing hosts #1 to #N.

All these techniques are available for modern History Matching. The reservoir engineer remains the key factor in this process of reservoir modelling and model validation. This person prepares and controls simulation strategies, monitors processes and makes decisions that successfully advance and conclude a simulation project.

REFERENCES
[1] Aanonsen, S.I., Aavatsmark, I., Cominelli, A., Gosselin, O.: "Integration of 4D Data in the History Match Loop by Investigating Scale Dependent Correlations in the Acoustic Impedance Cube," 8th European Conference on the Mathematics of Oil Recovery, Freiberg, Germany, September 3-6, 2002
[2] Al Salhi, M.S., Van Rijen, M.F., Alias, Z.A., Visser, F., Dijk, H., Lee, H.M., Timmerman, R.H., Upadhyaya, A.A., Wei, L.: "Reservoir Modelling for Redevelopment of a Giant Fractured Carbonate Field, Oman: Experimental Design for Uncertainty Management and Production Forecasting," paper IPTC 10537 presented at the IPTC 2005, Doha, Qatar, 21-23 November 2005
[3] Anterion, F., Eymard, R., Karcher, B.: "Use of Parameter Gradients for Reservoir History Matching," paper SPE 18433 presented at the SPE 1989 Symposium on Reservoir Simulation, Houston, February 6-8, 1989
[4] Axmann, J.K.: "Parallel Optimization of Technical Systems - Adaptive Evolutionary Algorithms, Workstation Clusters and Multi-Processor Systems," Shaker, Aachen (1999)
[5] Bäck, T.: "Evolutionary Algorithms in Theory and Practice," Oxford University Press, Oxford (1996)
[6] Ballin, P.R., Aziz, K., Journel, A.G.: "Quantifying the Impact of Geological Uncertainty on Reservoir Performing Forecasts," paper SPE 25238 presented at the 12th SPE Symposium on Reservoir Simulation, New Orleans, USA, 28 February - 3 March 1993
[7] Begg, S., Bratvold, R.B., Campbell, J.M.: "Improving Investment Decisions Using a Stochastic Asset Model," paper SPE 71414 prepared for the Annual Technical Conference and Exhibition, New Orleans, 30 September - 3 October 2001
[8] Bissell, R.: "Calculating optimal parameters for history matching," Proc. of the 4th European Conference on the Mathematics of Oil Recovery, Roros, Norway, 7-10 June 1994
[9] Brun, B., Gosselin, O., Baker, J.W.: "Use of Prior Information in Gradient-Based History Matching," paper SPE 66353 presented at the SPE 2001 Reservoir Simulation Symposium, Houston, USA
[10] Castellini, A., Yeten, B., Singh, U., Vahedi, A., Sawiris, R.: "History Matching and Uncertainty Quantification Assisted by Global Optimization Techniques," 10th European Conference on the Mathematics of Oil Recovery, Amsterdam, The Netherlands, 4-7 September 2006
[11] Coello, C.A., Veldhuizen, D.A., Lamont, G.B.: "Evolutionary Algorithms for Solving Multi-Objective Problems," Kluwer, New York, 2002
[12] Corre, B., Thore, de Feraudy, V., Vincent, G.: "Integrated Uncertainty Assessment for Project Evaluation and Risk Analysis," paper SPE 65205 presented at the SPE European Petroleum Conference, Paris, France, 24-25 October 2000
[13] Craig, P.S., Goldstein, M., Seheult, A.H., Smith, J.A.: "Pressure Matching for Hydrocarbon Reservoirs: A Case Study in the Use of Bayes Linear Strategies for Large Computer Experiments," Case Studies in Bayesian Statistics, Volume III; Technometrics, Vol. 40, No. 1 (February 1998)
[14] Cullick, S., Johnson, D., Shi, G.: "Improved and More-Rapid History Matching With a Nonlinear Proxy and Global Optimization," paper SPE 101933 presented at the SPE 2006 Annual Technical Conference and Exhibition, San Antonio, USA
[15] Da Cruz, P.S., Horne, R.N., Deutsch, C.V.: "The Quality Map: A Tool for Reservoir Uncertainty Quantification and Decision Making," SPEREE (February 2004), 6
[16] Damsleth, E., Hage, A., Volden, R.: "Maximum Information at Minimum Cost: A North Sea Field Development Study with Experimental Design," JPT (December 1992) 1350-1360
[17] Deb, K.: "Multi-Objective Optimization Using Evolutionary Algorithms," Wiley, Chichester, UK, 2001
[18] Eiben, A.E., Smith, J.E.: "Introduction to Evolutionary Computing," Springer, Berlin (2003)

[19] Eide, A.L., Holden, L., Reiso, E., Aanonsen, S.I.: "Automatic History-Matching by use of Response Surfaces and Experimental Design," Proc. of the 4th European Conference on the Mathematics of Oil Recovery, Røros, Norway, 1994

[20] Evensen, G.: "Data Assimilation: The Ensemble Kalman Filter," Springer, Berlin (2007)

[21] Evensen, G., Hove, J., Meisingset, H.C., Reiso, E., Seim, K.S., Espelid, O.: "Using the EnKF for Assisted History Matching of a North Sea Reservoir Model," paper SPE 106184 presented at the 2007 SPE Reservoir Simulation Symposium, The Woodlands, USA, 2007

[22] Floris, F.J.T. et al.: "Comparison of production forecast uncertainty quantification methods - an integrated study," Petroleum Geoscience (May 2001) 587

[23] Friedmann, F., Chawathe, A., Larue, D.K.: "Assessing Uncertainty in Channelized Reservoirs Using Experimental Designs," SPEREE (August 2003) 264

[24] Gen, M., Cheng, R.: "Genetic Algorithms & Engineering Optimization," John Wiley & Sons, Inc., New York (2000)

[25] Goldberg, D.E.: "Genetic Algorithms in Search, Optimization and Machine Learning," Addison-Wesley, Reading (1989)

[26] Gómez, S., Gosselin, O., Barker, J.W.: "Gradient-Based History Matching With a Global Optimization Method," paper SPE 56756 prepared for the SPE Annual Technical Conference and Exhibition, Houston, 3-5 October 1999

[27] He, N., Reynolds, A.C., Oliver, D.S.: "Three-Dimensional Reservoir Description From Multiwell Pressure Data and Prior Information," SPEJ (September 1997) 413

[28] Holland, J.H.: "Adaptation in Natural and Artificial Systems," University of Michigan Press, Ann Arbor, Michigan (1975)

[29] Jutila, H.A., Goodwin, N.H.: "Schedule Optimization to Complement Assisted History Matching and Prediction under Uncertainty," paper SPE 100253 presented at the SPE Europec/EAGE Annual Conference and Exhibition, Vienna, Austria, 12-15 June 2006

[30] Kelkar, M., Perez, G.: "Applied Geostatistics for Reservoir Characterization," SPE, 2002

[31] Kromah, M.J., Liou, J., MacDonald, D.G.: "Step Change in Reservoir Simulation Breathes Life Into Mature Oil Field," paper SPE 94940 presented at the SPE Latin American and Caribbean Petroleum Engineering Conference, Rio de Janeiro, Brazil, 20-23 June 2005

[32] Krosche, M., Axmann, J.K., Pajonk, O., Schulze-Riegert, R., Haase, O.: "A Software Component Based Parallel Simulation and Optimization Environment for Reservoir Simulation," Proceedings of the DGMK Spring Conference 2005, ISBN 3-936418-35-7

[33] Landa, J.L., Güyagüler, B.: "A Methodology for History Matching and the Assessment of Uncertainties Associated with Flow Predictions," paper SPE 84465 prepared for the SPE Annual Technical Conference and Exhibition, Denver, Colorado, USA, 5-8 October 2003

[34] Landa, J.L., Kalia, R.K., Nakano, A., Nomura, K., Vashishta, P.: "History Match and Associated Forecast Uncertainty Analysis - Practical Approaches Using Cluster Computing," paper IPTC 10751 presented at the IPTC 2005, Doha, Qatar, 21-23 November 2005

[35] Leitao, H., Schiozer, D.: "A New Automated History Matching Algorithm Improved by Parallel Computing," paper SPE 53977 presented at the SPE 1999 Latin American and Caribbean Petroleum Engineering Conference, Caracas, Venezuela

[36] Lepine, O.J., Bissell, R.C., Aanonsen, S.I., Pallister, I., Barker, J.W.: "Uncertainty Analysis in Predictive Reservoir Simulation Using Gradient Information," paper SPE 48997 presented at the SPE 1998 Annual Technical Conference and Exhibition, New Orleans, USA

[37] Li, R., Reynolds, A.C., Oliver, D.S.: "History Matching of Three-Phase Flow Production Data," paper SPE 66351 presented at the SPE 2001 Reservoir Simulation Symposium, Houston, USA


[38] Little, A.J., Jutila, H.A., Fincham, A.: "History Matching With Production Uncertainties Eases Transition Into Prediction," paper SPE 100206 presented at the SPE Europec/EAGE Annual Conference and Exhibition, Vienna, Austria, 12-15 June 2006

[39] Manceau, E., Mezghani, M., Zabalza-Mezghani, I., Roggero, F.: "Combination of Experimental Design and Joint Modeling Methods for Quantifying the Risk Associated With Deterministic and Stochastic Uncertainties - An Integrated Test Study," paper SPE 71620 prepared for the SPE Annual Technical Conference and Exhibition, New Orleans, 30 September - 3 October 2001

[40] Meisingset, K.K.: "Uncertainties in Reservoir Fluid Description for Reservoir Modeling," SPEREE (October 1999) 2 (5)

[41] Michalewicz, Z.: "Genetic Algorithms + Data Structures = Evolution Programs," Springer, Berlin, 3rd Edn., 1996

[42] Montgomery, D.C.: "Design and Analysis of Experiments," John Wiley & Sons, Inc., New York (2001)

[43] Nævdal, G., Johnsen, L., Aanonsen, S.: "Reservoir Monitoring and Continuous Model Updating Using Ensemble Kalman Filter," SPEJ (March 2005) 66

[44] Nicotra, G., Godi, A., Cominelli, A., Christie, M.: "Production Data and Uncertainty Quantification: A Real Case Study," paper SPE 93280 presented at the SPE Reservoir Simulation Symposium, Houston, 31 January - 2 February 2005

[45] Olea, R.A.: "Geostatistical Glossary and Multilingual Dictionary," Oxford University Press, 1991

[46] Parish, R.G., Calderbank, V.J., Watkins, A.J., Muggeridge, A.H., Robinson, P.R.: "Effective History Matching: The Application of Advanced Software Techniques to the History Matching Process," paper SPE 25250 presented at the SPE Reservoir Simulation Symposium, New Orleans, 28 February - 3 March 1993

[47] Peake, W.T., Abadah, M., Skander, L.: "Uncertainty Assessment using Experimental Design: Minagish Oolite Reservoir," paper SPE 91280 presented at the SPE Reservoir Simulation Symposium, Houston, Texas, 31 January - 2 February 2005

[48] Rechenberg, I.: "Evolution Strategy: Optimizations of Technical Systems Based on Principles of Biological Evolution," Frommann-Holzboog, Stuttgart (1972)

[49] Rietz, D., Usmani, A.: "Reservoir Simulation and Reserves Classification - Guidelines for Reviewing Model History Matches To Help Bridge the Gap Between Evaluators and Simulation Specialists," paper SPE 96410 prepared for the SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, 9-12 October 2005

[50] Roggero, F., Hu, L.Y.: "Gradual Deformation of Continuous Geostatistical Models for History Matching," paper SPE 49004 presented at the SPE 1998 Annual Technical Conference and Exhibition, New Orleans, USA

[51] Romero, C.E., Carter, J.N., Zimmermann, R.W., Gringarten, A.C.: "Improved Reservoir Characterization through Evolutionary Computation," paper SPE 62942 prepared for the SPE Annual Technical Conference and Exhibition, Dallas, 1-4 October 2000

[52] Schiozer, D.J.: "Use of Reservoir Simulation, Parallel Computing and Optimization Techniques to Accelerate History Matching and Reservoir Management Decisions," paper SPE 53979 prepared for the SPE Latin American and Caribbean Petroleum Engineering Conference, Caracas, 21-23 April 1999

[53] Schulze-Riegert, R., Axmann, J., Haase, O., Rian, D., You, Y.: "Evolutionary Algorithms Applied to History Matching of Complex Reservoirs," SPEREE (April 2002) 5 (2)

[54] Schulze-Riegert, R., Haase, O., Nekrassov, A.: "Combined Global and Local Optimization Techniques Applied to History Matching," paper SPE 79668 presented at the SPE Reservoir Simulation Symposium, Houston, Texas, USA, 3-5 February 2003

[55] Schulze-Riegert, R., Krosche, M., Fahimuddin, A., Ghedan, S.: "Multi-Objective Optimization with Application to Model Validation and Uncertainty Quantification," paper SPE 105313 presented at the 15th SPE Middle East Oil & Gas Show and Conference, Kingdom of Bahrain, 11-14 March 2007


[56] Schwefel, H.P.: "Numerical Optimization of Computer Models with the Evolution Strategy," Birkhäuser, Basel (1977)

[57] Selberg, S., Schulze-Riegert, R., Stekolschikov, K.: "Event Targeting Model Calibration Used for History Matching Large Simulation Cases," paper SPE 106044 prepared for the 2007 SPE Reservoir Simulation Symposium, Houston, Texas, USA, 26-28 February 2007

[58] Smith, P.J., Hendry, D.J., Crowther, A.R.: "The Quantification and Management of Uncertainty in Reserves," paper SPE 26056 presented at the SPE Western Regional Meeting, Anchorage, Alaska, 1993

[59] Soleng, H.H.: "Oil Reservoir Production Forecasting with Uncertainty Estimation using Genetic Algorithms," Proceedings of the 1999 Congress of Evolutionary Computing, 1999

[60] Srivastava, R.M.: "An Application of Geostatistical Methods for Risk Analysis in Reservoir Management," paper SPE 20608 presented at the 65th SPE Annual Technical Conference and Exhibition, New Orleans, 23-28 September 1990

[61] Subbey, S., Christie, M., Sambridge, M.: "A Strategy for Rapid Quantification of Uncertainty in Reservoir Performance Prediction," paper SPE 79678 presented at the SPE 2003 Reservoir Simulation Symposium, Houston, USA

[62] Tarantola, A.: "Inverse Problem Theory - Methods for Data Fitting and Model Parameter Estimation," Elsevier Science Publishers, Amsterdam, 1987

[63] Vasco, D.W., Yoon, S., Datta-Gupta, A.: "Integrating Dynamic Data Into High-Resolution Reservoir Models Using Streamline-Based Analytic Sensitivity Coefficients," SPEJ (December 1999) 389

[64] Verbruggen, R., Pannett, S., Stone, G.: "Understanding Reserves Uncertainties in a Mature Field by Reservoir Modeling," paper SPE 77896 presented at the SPE Asia Pacific Oil and Gas Conference and Exhibition, Melbourne, Australia, 8-10 October 2002

[65] Vincent, G., Corre, B., Thore, P.: "Managing Structural Uncertainty in a Mature Field for Optimal Well Placement," SPEREE (August 1999) 2 (4)

[66] Walker, G.J., Pettigrew, S.: "Measures of Efficiency for Assisted History Matching," 10th European Conference on the Mathematics of Oil Recovery, Amsterdam, The Netherlands, 4-7 September 2006

[67] Walstrom, J.E., Mueller, T.D., McFarlane, R.C.: "Evaluating Uncertainty in Engineering Calculations," JPT (December 1967) 1595

[68] Watkins, A.J., Parish, R.G., Modine, A.D.: "A Stochastic Role for Engineering Input to Reservoir History Matching," paper SPE 23738 presented at the SPE Latin American Petroleum Engineering Conference, Caracas, Venezuela, 8-11 March 1992

[69] Wen, X.H., Chen, W.: "Real-Time Reservoir Model Updating Using Ensemble Kalman Filter," paper SPE 92991 presented at the 2005 SPE Reservoir Simulation Symposium, The Woodlands, USA, 2005

[70] Wen, X.H., Deutsch, C.V., Cullick, A.S., Reza, Z.A.: "Integration of Production Data in Generating Reservoir Models," Centre for Computational Geostatistics, University of Alberta, Monograph Series Vol. 1 (2005)

[71] Williams, G.J.J., Mansfield, M., MacDonald, D., Bush, M.D.: "Top-Down Reservoir Modelling," paper SPE 89974 presented at the SPE Annual Technical Conference and Exhibition, Houston, Texas, 26-29 September 2004

[72] Williams, M., Keating, J., Barghouty, M.: "The Stratigraphic Method: A Structured Approach to History Matching Complex Simulation Models," SPEREE (April 1998) 169

[73] Wolpert, D.H., Macready, W.G.: "No Free Lunch Theorems for Optimisation," IEEE Transactions on Evolutionary Computation, 1:1, pp. 67-82, 1997


Acknowledgement

The authors would like to thank Olaf Haase and Stig Selberg for critical feedback and Stefan Djupvik for preparing selected graphics. Special thanks go to Matthew Harris and Arthur Cropley, who supported proofreading of the manuscript. The authors acknowledge and appreciate the SPT Group for its provision of resources.

Authors

Ralf Schulze-Riegert holds a Ph.D. in theoretical physics from the University of Hanover, Germany. For more than ten years he has worked as a Principal Consultant in the Oil & Gas and Automotive industries, with a focus on project management and software development. He is a part-time lecturer at the Technical University of Clausthal, Germany, specialising in Model Validation, Computer-Aided Design and Optimisation. At SPT he is currently Technology Development Manager of the MEPO optimisation tool and is responsible for developing and evaluating optimisation techniques and new workflow scenarios. With application to reservoir simulation, he regularly presents and co-authors publications on research findings and case studies in model validation, optimisation and uncertainty quantification.

Shawket G. Ghedan is an associate professor of Petroleum Engineering at the Petroleum Institute of Abu Dhabi and a Schlumberger-NExT instructor. He has over 20 years of diverse academic and industrial experience. Before joining the PI, he worked with different oil and gas operating companies on reservoir operations and management, reservoir characterization and modelling, as well as long-term development planning studies. He is the principal author of a number of technical papers. His research areas include modelling of oil recovery in transition zones, dynamic rock typing, modelling of heterogeneous fractured reservoirs, and stochastic optimisation techniques for history matching and full-field development planning. Dr. Ghedan is the chairman of the Abu Dhabi SPE Section and was a 2005-2006 SPE Distinguished Lecturer. He holds a B.Sc. degree in Petroleum and Minerals Engineering from Baghdad University, and M.E. and Ph.D. degrees in Reservoir Engineering from the Colorado School of Mines.

