
AAPG Hedberg Conference, The Evolution of Petroleum Systems Analysis


30+ Years – Petroleum Geochemistry: Basin Evaluation and Field Development

Abstract

The role of petroleum geochemistry in basin evaluation and field development has changed significantly over the past 50 years, and it will continue to change as industry develops the energy resources that will power society in the future. It is critical to remember that we deal with a PETROLEUM SYSTEM. To impact value and success, we need to understand all five petroleum system parameters: source rock, reservoir rock, seal rock, migration route, and trap. As geologists, we also need to understand the sequential timing of these critical parameters, and we must ensure that we integrate them across the earth science and engineering platforms.

Early exploration focused on finding reservoirs and traps. Classic structural and stratigraphic geology drove exploration, and the developing seismic and well-logging technologies used simple models. Many fields were discovered and developed near seeps and existing fields, and most offshore programs were limited to stepouts. Saudi Arabia’s major discovery was made by identifying a slight dip anomaly sighted from Bahrain. As seismic technology and sequence-stratigraphic concepts developed, more “wildcat” exploration plays were pursued with improved mapping and modeling of geologic trends, aided by newly developing tectonic concepts. Exploration risk decreased as worldwide experience grew.

The 1960s and early 1970s ushered in an expansion of exploration with data from the Deep Sea Drilling Project and the continued development of plate-tectonic concepts (Cox, 1969). Oceanographic ideas of source-material types, accumulation, preservation, and burial were added to the “systems” approach (Lisitzin, 1972; Degens & Ross, 1974; Tissot & Welte, 1978; Hunt, 1979). This period increased the use of geochemistry across the spectrum of technologies to better define the petroleum system (Katz, ed., 1994). “Early” geochemistry was used with electric and petrophysical logs, the development of models for water/rock chemistry, and isotopic measurements of oils and gases. Organic geochemistry began making an impact not only in Production but also in Exploration as various types of oils were identified. Biomarkers were identified and studied relative to various source materials and thermal-alteration processes.

The 1970s–80s: the FUN begins. Computer technologies and modeling concepts stepped in with “super” computers; the oil and gas industry was among the first users, along with the military and automobile industries. Seismic processing and modeling took leaps forward. However, without good geologic and geochemical data, “Nintendo” geology and “End of Oil” hysteria drove research as well as Exploration/Production efforts, leading to poorly defined exploration plays. Fortunately, some research and development programs allowed geologists and geochemists to “beta” test seismic modeling packages, adding real rock and fluid properties to geophysical modeling programs. Satellite remote-sensing data improved oceanographic circulation models and our understanding of global tectonics.

For example, early exploration (and related geophysics) in the Gulf of Mexico was driven by the model that the oil was sourced from post-Salt source materials; the salt was considered “basement.” When biomarkers indicated that the oil came from pre-Salt sources, geophysical research shifted to developing through-salt and sub-salt seismic techniques. Salt-tectonic models were improved as data from post-Salt fields indicated migration of fluids up salt-dome flanks and through fault systems.
Satellite images enabled identification of coastal salt and mud diapiric structures, and R&D projects reexamined “old” DSDP cores, core data, and maps to identify new offshore and deepwater plays.

Today, computer research and development has encountered the dilemma geochemists faced in the 1970s–80s: “the study of petroleum is too complicated… too many variables!” Exploration and Production groups have dealt with “BIG DATA” for decades. We were early adopters of Cray and other supercomputers; now we need to drive new developments in computing technologies. Automated SCADA systems and power microgrids are being tested to continuously monitor fields for fluid production, pump efficiencies, power-grid usage, and more. The key is to use the data in real time for continuous optimization: increased efficiency, lower operating costs, and improved reservoir drainage.

New procedures are being developed for drilling and oilfield production monitoring, including fluid DNA analyses to optimize well placement, completions, and frac patterns. Continuous, real-time seismic networks are “listening” to well operations and microseismic responses. Measurement-while-drilling capabilities and logging tools are improving, as are enhanced oil recovery (EOR) concepts. To add value, these technologies must be incorporated into dynamic reservoir/production models. The old days of using static models with infrequent updating are over.

We must move toward real-time, inline geochemical analyses and monitoring to improve reservoir production optimization. Identifying changing fluid-flow patterns in the reservoir can lead to significant improvement in optimization, and it must also lead to increased value through the volume and quality of hydrocarbons produced. We need to increase our fundamental understanding of reservoir compartmentalization, fluid flow, and changes in rock/fluid properties. We need to build “intelligent” systems that will improve company performance, and stay away from Nintendo geology! There is still a strong future for Oil & Gas Exploration and Production technology development: conventional, unconventional, or alternative energy.
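To make the real-time monitoring idea concrete, the following is a minimal sketch, not part of the abstract, of how a streamed produced-fluid geochemical measurement might be screened against a rolling baseline to flag a possible change in reservoir fluid-flow patterns. The choice of measurement (a single gas-wetness-style ratio), the window size, and the z-score threshold are all illustrative assumptions, not a prescribed workflow.

```python
from collections import deque
from statistics import mean, stdev

def flag_fluid_change(readings, window=30, z_limit=3.0):
    """Screen a stream of produced-fluid geochemical readings
    (e.g., a gas-wetness ratio) against a rolling baseline and
    yield (index, value, z_score) whenever a reading departs from
    the baseline by more than z_limit standard deviations.

    Illustrative sketch only: real reservoir monitoring would
    handle sensor noise, drift, and multivariate geochemical
    signatures rather than a single ratio.
    """
    baseline = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(baseline) == window:
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and abs(value - mu) / sigma > z_limit:
                yield i, value, (value - mu) / sigma
        baseline.append(value)

# Hypothetical example: a stable ratio that shifts abruptly, as
# might happen if a new reservoir compartment begins contributing.
stream = [0.80 + 0.01 * ((i * 7) % 5 - 2) for i in range(60)]
stream += [0.65] * 10  # assumed step change in the produced fluid
for idx, val, z in flag_fluid_change(stream):
    print(f"reading {idx}: value={val:.2f}, z={z:+.1f}")
```

In a field setting, a flag like this would not itself diagnose compartmentalization; it would simply trigger the kind of dynamic model update and follow-up interpretation the abstract argues for.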