Abstract: Al-Khalij Field (Block 6, Qatar) Development Optimization: A New Dedicated Reprocessing, by Enrico Zamboni; #90105 (2010)

Datapages, Inc.

AAPG GEO 2010 Middle East
Geoscience Conference & Exhibition
Innovative Geoscience Solutions – Meeting Hydrocarbon Demand in Changing Times
March 7-10, 2010 – Manama, Bahrain

Al-Khalij Field (Block 6, Qatar) Development Optimization: A New Dedicated Reprocessing

Enrico Zamboni1

(1) TOTAL SA, Pau, France.

Description of the Proposed Paper
The scope of this presentation is to illustrate the seismic reprocessing recently applied to the Al-Khalij field, Block 6, Qatar. The main objective of this reprocessing was to better define the subtle fault/fracture pattern over the field and to allow the incorporation of new water-breakthrough data observed in the development wells. The results will also be used to target future development well locations over the field. In the framework of this project, particular effort has been put into attenuating the surface-related multiples without damaging the subtle faults/lineaments needed for reservoir delineation and flow understanding. Moreover, the numerous production facilities in the area and the shallow nature of the reservoir led to non-negligible effects on the fold and footprint pattern. The fold in the undershoot area was handled by a 3D multi-parametric interpolation, and geostatistical filtering was used as an alternative solution to tackle the remaining footprint effect.

Production from the Upper Mishrif reservoir of the Al-Khalij field in Qatar has seen an increase in water production in recent years. Until recently, the impact of faults (or fracture corridors) on the dynamic reservoir behaviour was considered minor. However, recent monitoring data (pressure interference, water-cut) clearly show that faults/fractures are responsible for interference between a number of adjacent wells. Despite several previous processing projects, the structural imaging was not clear enough to track and interpret the small lineaments and faults necessary for a comprehensive understanding of the Mishrif reservoir. With this in mind, a reprocessing of a test area, including an extensive testing phase, was performed in 2008. The idea behind this was to use the test area as a driver for the quality of the processing, enabling a go/no-go decision on the full-field processing; moreover, the limited extent of the zone allowed complete testing in a reasonable amount of time. Particular care was devoted to multiple attenuation, footprint attenuation and linear-noise attenuation.

Processing Flow
The data was acquired in 1998 with a four-streamer configuration (50 m spacing) and group and shot intervals of 6.25 m and 12.5 m respectively. The maximum available offset was 2300 m. Based on previous experience, the main objective was pure structural imaging.

The data is characterised by quite strong linear noise and by the presence of multiples (surface-related and internal). With pure structural imaging in mind, the main reprocessing challenges can be listed in order of importance:

  • Improve fault imaging within the Upper Mishrif reservoir interval
  • Attenuate multiples
  • Reduce the acquisition footprint
  • Attenuate linear noise
  • Data interpolation

Considerable noise was removed from the Al-Khalij data by designature, swell-noise attenuation, and a Tau-p transform with a mute to remove the strong linear dipping energy. Particular attention was paid to preserving the far offsets at the target level, which are severely contaminated by direct-arrival energy.
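The principle behind the Tau-p (slant-stack) step is that linear dipping energy focuses at its slowness in the transform domain, where it can be muted. A minimal sketch on an invented toy gather (the geometry, amplitudes and slowness scan below are all assumptions, not the project's parameters):

```python
import numpy as np

# Toy shot gather: a flat reflection plus strong linear dipping energy
# (e.g. a direct arrival). All geometry values below are assumptions.
nt, nx, dt, dx = 200, 48, 0.004, 12.5      # samples, traces, s, m
t = np.arange(nt) * dt
gather = np.zeros((nt, nx))
gather[100, :] = 1.0                       # flat event at t = 0.4 s
p_noise = dt / dx                          # linear-noise slowness, 3.2e-4 s/m
for ix in range(nx):
    gather[12 + ix, ix] += 2.0             # dipping event: t = 0.048 + p_noise*x

# Forward Tau-p (slant stack): taup[tau, p] = sum over x of d(tau + p*x, x)
p_axis = np.linspace(0.0, 8e-4, 81)        # slowness scan (s/m)
taup = np.zeros((nt, p_axis.size))
for ip, p in enumerate(p_axis):
    for ix in range(nx):
        # sample each trace along the line t = tau + p * offset
        taup[:, ip] += np.interp(t + p * ix * dx, t, gather[:, ix],
                                 left=0.0, right=0.0)

# The dipping energy focuses near p = 3.2e-4 s/m, far from the flat
# event at p = 0; in production it would be muted there and the data
# inverse-transformed.
it_max, ip_max = np.unravel_index(np.argmax(taup), taup.shape)
```

Because the linear event maps to a single (tau, p) point while reflections stay near small slowness, a mute in this domain removes the noise with little damage to the far offsets.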

There were also numerous multiples of different periods in the data, and their attenuation was one of the main objectives of the processing. After extensive testing of state-of-the-art demultiple tools, a two-step strategy was chosen for this case (shallow water bottom and limited extent of the acquisition spread). The first step tackled the short-period multiple with a Tau-p deconvolution; a 2D SRME was then applied starting from the first strong reflector below the water bottom. This technique removed the short-period water-bottom multiple and then addressed multiples with a period longer than the water-layer period. Choosing the parameters and evaluating the tests was difficult and time-consuming because the seismic-to-well correlation was rather insensitive to all the applied tests. For this reason, strong interaction with the interpreters and continuous comparison with the previous processing were essential. A fast-track cube (post-stack time migrated), produced just after the demultiple phase, showed that the demultiple was efficient and that the details of the faults/lineaments were preserved.
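As an illustration of the short-period step, gapped (predictive) Wiener deconvolution removes periodic multiples by predicting them from the trace's own autocorrelation. The trace, multiple period, gap and operator length below are invented for the sketch and are not the project's parameters:

```python
import numpy as np
from scipy.linalg import solve_toeplitz

# Synthetic trace: a primary spike plus a decaying periodic multiple
# train (period 40 samples, as from a shallow water bottom). All values
# here are invented for illustration.
n, period = 400, 40
trace = np.zeros(n)
trace[50] = 1.0                              # primary
for k in range(1, 6):
    trace[50 + k * period] = (-0.6) ** k     # surface-related multiple train

# Gapped predictive deconvolution: design a Wiener prediction filter
# from the autocorrelation, then subtract the predictable (periodic)
# part of the trace. Gap and operator length are assumptions.
gap, nop = 35, 30
ac = np.correlate(trace, trace, mode="full")[n - 1:]   # lags 0..n-1
ac[0] *= 1.001                               # pre-whitening for stability
f = solve_toeplitz(ac[:nop], ac[gap:gap + nop])
pred = np.convolve(trace, f)[:n]             # prediction, advanced by `gap`
decon = trace.copy()
decon[gap:] -= pred[:n - gap]                # primary untouched, train removed
```

The prediction distance (gap to gap + operator length) is chosen to span the multiple period, so the filter predicts the periodic train while leaving the unpredictable primary intact.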

The footprint removal was the other main objective and also involved detailed testing. The tidal static correction was computed and applied using geostatistical filtering of the picked bathymetry (no tidal chart was available). The first pass of amplitude footprint attenuation was a standard three-term technique.
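The tidal correction exploits the fact that the water-bottom pick varies smoothly in space while the tide shifts each sail line. A toy 1D sketch, with a simple moving average standing in for the geostatistical filter (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Water-bottom picks across 60 sail lines: smooth regional bathymetry
# (two-way time) plus a random per-line tidal shift. All values invented.
nlines = 60
line = np.arange(nlines)
bathy = 0.080 + 0.0004 * np.sin(line / 12.0)   # smooth bathymetry (s)
tide = 0.003 * rng.standard_normal(nlines)     # per-line tidal static (s)
picks = bathy + tide

# Estimate the long-wavelength bathymetry by smoothing (a moving average
# standing in for the geostatistical filter); the residual per line is
# the tidal static to remove.
win = 11
smooth = np.convolve(np.pad(picks, win // 2, mode="edge"),
                     np.ones(win) / win, mode="valid")
tidal_static = picks - smooth
corrected = picks - tidal_static               # tide-free bathymetry estimate
```

The split works because bathymetry and tide occupy different spatial wavelengths; the filter only has to separate the two bands.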

The data was then regularised/interpolated using a 3D interpolator in the common-offset domain, which provided an excellent solution for filling in missing data and improving the imaging in critical areas. Some shallow palaeo-channels were perfectly preserved and imaged.
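The idea can be pictured as filling empty bins from neighbouring live traces in a common-offset volume. A toy 2D sketch using SciPy's scattered-data interpolation (the grid, gap geometry and amplitude model are invented; the actual 3D multi-parametric interpolator is more sophisticated):

```python
import numpy as np
from scipy.interpolate import griddata

# Toy common-offset amplitude map: smooth variation over a 64 x 64 bin
# grid, with a block of bins blanked out as in a platform undershoot.
il, xl = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
true_amp = np.sin(il / 20.0) * np.cos(xl / 25.0)

live = np.ones((64, 64), dtype=bool)
live[28:36, 28:36] = False                 # missing bins (undershoot gap)

# Fill the gap by interpolating from the live bins onto the full grid
filled = griddata((il[live], xl[live]), true_amp[live],
                  (il, xl), method="linear")
```

Live bins are returned unchanged and the gap is filled consistently with its surroundings, which is why subtle features such as shallow palaeo-channels can survive the regularisation.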
Next, the data passed through a standard Kirchhoff pre-stack time migration, a step of high-density, high-resolution velocity analysis and a high-resolution Radon filtering, and was then stacked. Despite extensive post-stack filtering tests, a non-negligible level of residual footprint was still present. A carefully designed geostatistical filter effectively removed the residual footprint while leaving the subtle lineaments and faults untouched and clearly visible on the time slices. Such geostatistical filtering, commonly used for velocity QC and destriping, can be envisaged for wider use, especially for residual footprint removal.
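The destriping idea can be illustrated in the wavenumber domain: the footprint sits at a known spatial period, so notching that wavenumber removes the stripes while leaving the rest of the slice intact. A toy sketch with a spectral notch standing in for the geostatistical (factorial-kriging style) filter; all numbers are invented:

```python
import numpy as np

# Toy time slice: smooth "geology" plus a periodic crossline stripe
# mimicking a sail-line footprint. All values are invented.
n = 128
il, xl = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
geology = np.sin(il / 17.0) + np.cos(xl / 23.0)
stripe_period = 8                              # bins between stripes
stripe = 0.5 * np.sin(2 * np.pi * xl / stripe_period)
noisy = geology + stripe

# Notch the stripe wavenumber (and one bin either side) in the
# crossline spectrum; a stand-in for the geostatistical filter.
spec = np.fft.rfft(noisy, axis=1)
k = np.arange(spec.shape[1])
k_stripe = n // stripe_period                  # stripe falls exactly on bin 16
keep = np.abs(k - k_stripe) > 1
filtered = np.fft.irfft(spec * keep, n=n, axis=1)
```

Because the geological signal occupies low wavenumbers well away from the notch, the stripes are removed with minimal damage to the underlying image, which is the same selectivity the geostatistical filter provides.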

Conclusion and Comments
Compared with the previous processing, the final result of this reprocessing shows an encouraging improvement in the imaging of the N20 and N170 faults and of the carbonate-platform progradations. The success of the test-area reprocessing allowed us to proceed with the full-field reprocessing, which is currently ongoing. A detailed analysis of the flow and a comparison with the previous ones clearly highlight that the effectiveness of the demultiple, coupled with the interpolation and the geostatistical footprint filtering, has been key to the success of this project. Because the seismic-to-well correlation was of limited diagnostic value, daily interaction with the interpreters and the generation of attribute maps at each main processing step did the rest.