Reservoir Modeling – An Insider's History of a Key Enabling Technology

AAPG ACE 2018

Datapages, Inc.


Abstract

As the capability of computers and software increased and costs significantly decreased in the early 1980s, it did not take long for geological workstations and reservoir modeling software to become a key enabling technology for the industry. The initial tools included integrated and interactive applications that allowed geologists to generate cross sections, maps, and 3D reservoir property models with relative ease, facilitated by databases that could be easily updated and revised. Early adopters were generally project teams working on large assets with hundreds to thousands of wells, for whom the workstation environment provided a clear benefit in terms of efficiency, technical quality, and cross-discipline cooperation. The “cultural” gap between the geoscience and reservoir engineering disciplines began to shrink in the early 1990s as technology improvements allowed increasingly detailed 3D reservoir property models to be readily up-scaled for the dynamic models used by reservoir engineers to evaluate development options and generate production forecasts. The 1990s also witnessed the rapid acceptance of a variety of geostatistical algorithms (e.g., kriging, conditional simulation, multiple-point modeling, object-based modeling, and process-mimicking modeling) to populate the increasingly detailed reservoir models. The ability to generate very large and very detailed reservoir models gave rise to the still unresolved issue of how much model complexity is actually useful – an issue variously referred to as “fit-for-purpose” modeling or, somewhat divisively, as “Gilligan vs. Frankenstein” modeling.
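To make one of the geostatistical building blocks named above concrete, the sketch below shows a minimal 1-D ordinary kriging estimate in Python with NumPy. This is an illustrative sketch only, not any specific commercial implementation; the sample locations, values, and spherical variogram parameters are hypothetical.

```python
import numpy as np

def ordinary_kriging(xs, zs, x0, variogram):
    """Estimate the property value at location x0 from samples (xs, zs)
    by ordinary kriging. `variogram` maps lag distance h to semivariance."""
    n = len(xs)
    # Kriging system: semivariances between samples, bordered by a
    # row/column of ones enforcing that the weights sum to 1.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(np.abs(xs[:, None] - xs[None, :]))
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.abs(xs - x0))
    w = np.linalg.solve(A, b)         # weights plus Lagrange multiplier
    return float(w[:n] @ zs)          # weighted average of sample values

def spherical(h, a=10.0, sill=1.0):
    """Hypothetical spherical variogram with range a and sill."""
    h = np.minimum(h, a)
    return sill * (1.5 * h / a - 0.5 * (h / a) ** 3)

# Hypothetical well data along one line, estimating at x0 = 5.0.
xs = np.array([0.0, 4.0, 9.0])
zs = np.array([1.2, 2.0, 1.5])
z_hat = ordinary_kriging(xs, zs, x0=5.0, variogram=spherical)
```

Because the weights are constrained to sum to one, kriging a constant field returns that constant exactly, which is a convenient sanity check. Conditional simulation methods build on the same system but add a stochastic residual so that multiple equally likely realizations can be generated.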
The incorporation of a variety of geostatistical algorithms also led to significant improvements in the industry’s assessment and use of uncertainty in reservoir development decisions. By the early 2000s the reservoir modeling “toolkit” had moved largely from proprietary software to vendor-provided software. This change significantly improved cooperation and decision making among private and national oil companies. In less than four decades, the industry’s reservoir modeling capability went from models with a few thousand grid cells, with dimensions on the order of hundreds to thousands of feet, to today’s models that may have up to a few billion cells (the so-called “giga-cell” models) with grid dimensions of a few tens of feet or smaller.