Data Calibration in Basin Modeling – The Ultimate Integration Step

AAPG Hedberg Conference, The Evolution of Petroleum Systems Analysis

Abstract

The reconstruction of the thermal, erosion and uplift history of sedimentary basins with computer models is a technology that has been developed over the last 40+ years. Many processes, chemical and physical, have been added over this time. As a result, modern basin or petroleum systems models are now capable of describing basin evolution and the presence and nature of hydrocarbons within sedimentary basins. These models are often used in the oil and gas industry to predict hydrocarbon occurrence in conventional traps or for source rock plays. In addition, some models are used to predict predrill hydrocarbon properties such as API gravity, gas‐oil ratios (GOR) and hydrocarbon volumes in source rocks or traps. The fundamental workflow for making these predictions is to calibrate the different processes calculated as part of a model against measured properties. This talk will (a) discuss how a valid calibration can lead to predictive petroleum systems models and (b) demonstrate how perfectly calibrated models can be non‐predictive. Examples and case studies will elaborate on the difficulty of calibrating to temperature, porosity, pore pressure, API, GOR, etc., and cover how scenario testing can be used to validate the geological interpretation. The chosen case studies include the Gulf of Mexico and some “unconventional” plays in onshore basins.

There are three main pieces of the puzzle that impact the results and ultimately the accuracy of a basin model: (1) the equations and their parametrization in the solver used to describe a physical or chemical process, (2) the conceptual model of the basin evolution, and (3) the uncertainty in the calibration data itself.

(1) Equations and their parametrization: As mentioned, the nature of the equations and their parametrization will fundamentally impact the calculation and therefore the prediction away from control or calibration data. For example, the temperature calculation looks at first glance like a linear relationship, allowing for a simple and easy calculation of the thermal field within a sedimentary basin. Fourier’s law of heat conduction is used to describe the problem: q = −Λ · dT/dz, where Λ is the thermal conductivity (W/m/K), T is temperature (K), z is depth (m), and q is the heat flow (W/m²). However, the thermal conductivity Λ is not a constant. For many rocks, Λ is a non‐linear function of temperature, as is the heat capacity. In addition, porosity decreases as the sediments are buried. The seemingly simple linear heat flow equation thus becomes a complex, non‐linear problem (a minimal numerical sketch is given below).

(2) Conceptual model: The second important piece that introduces ambiguity is the conceptual model of the basin and its realization as a numerical basin model. The discretization of the geometry (stratigraphy) and of time is usually a simplification and will impact the result. For example, assume that a model has a late erosion and exhumation event. The geological data describing this event is ambiguous, with the initiation ranging between 10 and 2 million years before present (Ma). Basin models with a 10‐ versus a 2‐Ma erosional event will show different present‐day temperatures, depending on the erosion amount and the time available to re‐equilibrate the heat energy within the basin after the disturbance ends.
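A minimal numerical sketch can illustrate the non‐linearity described under (1): a temperature‐dependent thermal conductivity turns a seemingly linear temperature calculation into an iterative one. The Python snippet below is only an illustration; the heat flow, surface temperature, conductivity law and depth range are assumed values, not inputs from the case studies.

```python
# Minimal sketch: 1D steady-state temperature with a temperature-dependent
# thermal conductivity, solved by fixed-point iteration.
# All numbers here are illustrative assumptions, not calibrated inputs.
import numpy as np

Q = 0.060          # heat flow magnitude, W/m^2 (assumed)
T_SURF = 20.0      # surface temperature, deg C (assumed)
LAM_20 = 2.5       # conductivity at 20 deg C, W/m/K (assumed)

z = np.linspace(0.0, 5000.0, 501)   # depth grid, m
dz = z[1] - z[0]

def conductivity(T_c):
    """Illustrative conductivity law: decreases with temperature,
    as it does for many rocks (assumed functional form)."""
    return LAM_20 * (293.15 / (T_c + 273.15))

# Start from the linear, constant-conductivity solution ...
T = T_SURF + (Q / LAM_20) * z

# ... then iterate on q = Lambda(T) * dT/dz, i.e. dT/dz = q / Lambda(T)
for _ in range(50):
    grad = Q / conductivity(T)                      # local temperature gradient, K/m
    increments = 0.5 * (grad[1:] + grad[:-1]) * dz  # trapezoidal integration
    T_new = T_SURF + np.concatenate(([0.0], np.cumsum(increments)))
    if np.max(np.abs(T_new - T)) < 1e-6:
        break
    T = T_new

print(f"T at 5 km, constant conductivity:              {T_SURF + (Q/LAM_20)*5000.0:.1f} C")
print(f"T at 5 km, temperature-dependent conductivity: {T[-1]:.1f} C")
```

Even in this simplified setting, the temperature at depth deviates noticeably from the constant‐conductivity solution, and it is exactly this kind of parametrization‐driven deviation that propagates into predictions away from the calibration points.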
(3) The data itself: All data have an inherent uncertainty range that must be considered when calibrating a basin model. For the temperature example used above, the measurement (device) error is usually negligible. The error is introduced through the circumstances (where and when) under which the temperature is measured. It is well known that a temperature reading within a borehole needs to be corrected based on the time since circulation of the drilling mud stopped (one common form of such a correction is sketched at the end of this abstract). However, that key information is in many cases not available, and often an “experience” factor of +10% or +10–20°C is added to “correct” the raw temperature measurement. The questions remain: should the model match the data, or is the model that runs too warm actually the better one? And to which of the data should one calibrate? The latter question arises quite often for unconventional projects, simply because of the data density of onshore projects. Other parameters are even more complex than temperature, e.g. permeability, the bubble/dew point pressure of a hydrocarbon mixture, or gas‐oil ratios. The talk will provide specific examples for these types of data and place them in the context of how to make a good prediction (or not).

In summary, understanding the uncertainty of the conceptual model is the foundation for creating a predictive model that helps sharpen the understanding of geological processes and/or predict the presence of hydrocarbons. Within the model building process, it is also very important to understand the interaction of the parameterization within the frame of the conceptual model. These two aspects pre‐determine whether a calibrated model has any predictive power or not. The calibration process itself is therefore not necessarily the most important step. Rather, the first step is to question whether the calculated porosities, temperatures, source rock maturities, GORs, etc., make sense within the framework.

A case study from the Gulf of Mexico clearly illustrates the need to fully understand the geology and the way geological aspects are parsed into a basin model. With its unique salt bodies and mini‐basins, the GoM creates a special challenge for creating a valid and predictive basin model. Uncertainties range from which seismic wavelet to pick, to depth‐conversion confidence, to imaging quality in general, providing a challenging framework. Simple tasks such as thermal calibration become very complex, and it remains to be seen whether accurate predictions of hydrocarbon fluid properties can be achieved.
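One commonly used form of the circulation‐time correction mentioned above is a Horner‐plot extrapolation of repeated bottom‐hole temperature readings. The sketch below uses made‐up readings and an assumed circulation time purely to illustrate the idea; it is not data from the case studies.

```python
# Minimal sketch of a Horner-plot style bottom-hole temperature correction,
# i.e. the "time since circulation stopped" correction referred to above.
# Times and temperatures are made-up illustrative numbers.
import numpy as np

t_circ = 4.0                          # mud circulation time, hours (assumed)
dt = np.array([6.0, 12.0, 24.0])      # hours since circulation stopped (assumed)
bht = np.array([95.0, 102.0, 107.0])  # measured bottom-hole temperatures, deg C (assumed)

# Horner-type linearization: T(dt) ~ T_formation + C * ln((t_circ + dt) / dt).
# As dt -> infinity the log term -> 0, so the intercept of a straight-line
# fit estimates the undisturbed formation temperature.
x = np.log((t_circ + dt) / dt)
slope, intercept = np.polyfit(x, bht, 1)

print(f"warmest raw reading:                        {bht[-1]:.1f} C")
print(f"Horner-extrapolated formation temperature:  {intercept:.1f} C")
```

The point of the sketch is not the particular correction scheme but the fact that it needs the circulation history; when that information is missing, any “corrected” temperature carries an uncertainty that the calibration has to respect.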