
Summer training report

Seismic Data Processing


ONGC, Baroda

Department of Applied Geophysics


INDIAN SCHOOL OF MINES, Dhanbad

AN EFFORT BY
Abhishek Pandey, Ambarish Anand Khavnekar, Anushree Pandit, Bipasha Sinha, Gollapudi Mahesh, Kuntal Bhukta, Sadanand Chaurasiya, Shantanu Samanta, Siva Ganesh Mulagapaka, Tamoghna Biswas

Acknowledgement
I am grateful to Dr. B.J. Rao, DGM (GP), Dehradun, for scheduling our training. I am thankful to Dr. U.S.D. Pandey, DGM (GP), Vadodara, for giving us an opportunity to learn from the finest experts of ONGC. I am grateful to Sri U.K. Chatterjee, DGM (GP), and Mrs. Geeta for coordinating our summer training at ONGC, Baroda. It is a matter of pride for me to have undergone summer training under the guidance of a brilliant team of geophysicists of ONGC, who supported me at every stage of my training, building up my confidence and boosting my interest in 3D seismic data acquisition and processing. I am thankful to the staff of the RCC, Gaveshna Bhavan. I am thankful to Sri M. Singh for taking us through the fundamentals and basics of processing early on, which made the learning process smooth. I am much obliged to Sri C. Chakravorty for lending his precious time to clear my concepts whenever I needed. I am highly indebted to Sri Vikas Chandra for walking me through the finest details of seismic processing and placing his valuable time at my disposal. I am thankful to Sri R. K. Rawat, Sri Harish Chander, Sri R. K. Garg, and Mrs. Viwvaja Devalla for their valuable lectures on the basic interpretation sequences used. I am grateful to my family and friends for their emotional support and love, which helped a lot. I am very thankful to all my friends and colleagues for making my stay at Baroda peaceful and fun. Last but not the least, I am thankful to all those who have directly or indirectly rendered help and support during the course of my training.

Contents

Introduction
General geology of the area
Acquisition Parameters
Processing Sequence
Processing Parameters
Quality of data
Conclusion

Processing
I. Introduction

Data processing converts field recordings into meaningful seismic sections that reveal and help delineate the subsurface stratigraphy and structure that may bear fossil hydrocarbons. The final interpretation of the seismic data is only as good as the validity of the processed data. It is imperative that the interpreter be aware of all the problems encountered in field data acquisition and at the data processing stage.

II. General geology of the area

A data processing geophysicist must know and understand the regional geology of the basin and the particulars of each processing step. Each geologic setting presents its own specific problems to solve. Before routine processing for a prospect, extensive testing should be done on the data to study the problems involved and to design the optimum parameters for each step in the data processing flow.

III. Acquisition Parameters

GEOMETRY PARAMETERS

A sample of the geometry parameters is given below.


Sl No  ITEM                             PARTICULARS
1      Spread geometry                  10 receiver lines, orthogonal shooting
2      Type of shooting                 Symmetric split spread
3      No. of channels per line         168 (84 + 84)
4      Total active channels per shot   1680
5      Bin size                         20 m x 20 m
6      Fold                             60 (12 x 5)
7      No. of receiver lines            10
8      Maximum far offset               3863 m (in-line) / 4321 m (cross-line)
9      Near offset                      20 m
10     Receiver line interval           280 m
11     Shot line interval               280 m
12     Shot interval (cross-line)       40 m
13     Shooting pattern                 Orthogonal, in between receiver lines
14     Direction of shooting            E to W
15     Direction of swath roll-over     North to South
16     Source location                  In between receiver lines

SOURCE PARAMETERS
Sl No  ITEM            PARTICULARS
1      Source type     Dynamite
2      Charge size     6.0 kg
3      Charge depth    18 to 45 m
4      Source pattern  Single hole

RECEIVER PARAMETERS
Sl No  ITEM            PARTICULARS
1      Geophone type   SM-24
2      Natural freq.   10 Hz
3      Type of array   06 (bunched)

INSTRUMENT PARAMETERS
Sl No  ITEM                PARTICULARS
1      Seismograph         I/O System Four (Scorpion)
2      Recording format    SEGY IEEE (32 bit)
3      Record length       6 s
4      Sampling interval   2 ms
5      Pre-amplifier gain  G0 (unity) / G6
6      Low cut filter      3.0 Hz
7      High cut filter     187 Hz (0.75 Fn, minimum phase)
8      Polarity            Normal
9      Aux. channels       3

IV. Processing Sequence

Processing flow chart


PREPROCESSING FLOW CHART

1. PREPROCESSING
---- Format Conversion
---- Merging with Field Geometry
---- Editing
---- Geometric Spreading Correction
---- Application of Field Statics
---- Surface Consistent Amplitude Correction
2. SURFACE CONSISTENT DECONVOLUTION

---- Dephasing Filter from Instrument Response

3. CMP SORTING

4. VELOCITY ANALYSIS
5. RESIDUAL STATIC CORRECTIONS
6. VELOCITY ANALYSIS
7. PRE STACK TIME MIGRATION
8. RMS VELOCITY REFINEMENT
9. FINAL PRE STACK MIGRATION
---- MUTING
---- STACKING
10. POST STACK PROCESS
---- Random Noise Attenuation
---- Decon After Stack
---- Time Variant Filtering & Phase Conversion from Minimum to Zero Phase

Processing flow chart for PSTM Stack


1. PREPROCESSING

---- Format Conversion
SEG D format data from the field is converted into a software-compatible format, the .cst extension for CGG Veritas; at every step the processed data is saved in this same format, compatible with the workstation software used for preprocessing.

---- Merging with Field Geometry

The combination of seismic data with the navigation data is called merging. The input data in SEGY format was converted to CGG format and subsequently merged with SPS (Shell Processing Support), a set of three different files giving data about the navigation, the survey, and the position of every shot: the s-file, the r-file, and the x-file. These files relate the different data; the SPS is supplied by the field crew. Field statics provided by the party are used for processing. An a priori velocity, a reference velocity obtained from a prior survey or a VSP survey, was used for the spherical divergence correction.

---- Editing
Bad traces containing more noise than signal are rejected. This process long used to be manual, because rejecting a trace needs very judicious decision making: a trace is all signal if it is interpretable and all noise if we cannot interpret it. Today there are subroutines (modules) to take care of this. The records from every shot were displayed on a workstation for quality control. The bad traces containing spikes were automatically edited by determining a suitable threshold level for spike removal. The shot records were then QC'd and further noisy traces were manually edited.

---- Geometric Spreading Correction
Every step in processing is followed by QC (quality control) to make sure we have not jeopardised the information while removing noise. Geometry merging is one operation that needs extensive QC, because once the data is merged you move on to the next step. A brute stack volume along with a foldage plot was generated to check for any irregularity. Simultaneously, time slices were also generated for QC purposes. A few quality checks are performed to check for deterioration of data quality. These are:
1. Regional elevation variation
2. Elevation and receiver statics
3. QC of elevation, shot depth, uphole time and shot static
4. Raw gather
5. Denoise gather
6. Decon gather
7. Brute stack

---- NOISE ATTENUATION ON SHOTS
The commonly occurring ground roll is largely eliminated through the band-pass filter. The surface wave noise attenuation module attenuates surface wave noise by forming low-frequency arrays. It transforms the data from the time domain to the frequency domain.

Frequency components higher than the cutoff frequency remain unchanged. Finally, the data are transformed back to the time-space domain. This methodology was capable of removing most of the unwanted noise. Sometimes errors occur: one error was reported due to incorrect numbering of au and du. Commonly occurring human errors of this nature can create serious trouble for the processing geophysicist, and this error is hard to identify if it is not reported in the FFID. There are still a lot of human errors that create havoc in processing.
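The frequency-domain workflow described above can be sketched in a few lines of Python. This is a simplified illustration, not the actual module; the cutoff and attenuation factor are assumed values.

```python
import numpy as np

def attenuate_low_freq(trace, dt, cutoff_hz, factor=0.1):
    # Transform to the frequency domain, scale down components below the
    # cutoff (where surface waves live), leave the rest unchanged, and
    # transform back to the time domain.
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    spec[freqs < cutoff_hz] *= factor  # attenuate rather than zero, to limit ringing
    return np.fft.irfft(spec, n=len(trace))

# Synthetic trace: 5 Hz "ground roll" plus a 40 Hz "reflection", 2 ms sampling
dt = 0.002
t = np.arange(0, 1.0, dt)
trace = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
clean = attenuate_low_freq(trace, dt, cutoff_hz=12.0)
```

After filtering, the 5 Hz component is strongly suppressed while the 40 Hz component is untouched, which is exactly the behaviour the text describes.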

NOISE

The beauty of noise: everything in the seismic trace which cannot be interpreted is noise; the rest, from which you can infer information, is signal.

Noise: noises of different nature and characteristics appear on the records. They can be of the following nature: coherent noise and incoherent noise.

Coherent noise has the characteristic of constant frequency throughout the trace in a given channel. It may be cultural, occurring due to pumps running in the field. Power line noise consists of a 50 Hz frequency throughout and occurs in channels near the power line.

We can apply manual denoising operations because we are not dealing with a single trace but with a volume of traces for every shot. Incoherent noise may be random, e.g. vehicle movement.

It might be difficult to remove using a predefined algorithm, and we cannot always have a denoising module for removing incoherent noise. Spikes: high amplitude events. Ground roll: motion of the ground recorded due to the source creating vibrations in near-surface layers. Cultural noise: various coherent and random noises due to human activities in the vicinity of the field operation.

There are close to 46 denoising modules in CGG Veritas. One must be very judicious while choosing which module best suits the requirement, because with every computer-adaptive module that we use we risk information.

So noise modules are applied, but we must be sure of what we are risking and of the acceptable degree of loss of information.

Some traces that might contain more noise than signal are deleted off the record; this is called editing.

Spike removal: we set parameters for spike removal, say x, y, z, and the event is smoothened, clipped or muted depending on whether the event's amplitude exceeds (100 + x)% of the average amplitude of the trace.
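As a rough sketch of the idea (not the actual module, whose x, y, z parameters are not documented here), a clipping-style despike keyed to a percentage of the trace's typical amplitude might look like:

```python
import numpy as np

def despike_clip(trace, x_percent=200.0):
    # Clip any sample whose absolute amplitude exceeds (100 + x)% of the
    # trace's typical amplitude. The median is used rather than the mean so
    # the spike itself does not inflate the threshold.
    typical = np.median(np.abs(trace))
    limit = (100.0 + x_percent) / 100.0 * typical
    return np.clip(trace, -limit, limit)

trace = np.array([0.5, -0.4, 0.6, 25.0, -0.5, 0.3])  # one obvious spike
out = despike_clip(trace, x_percent=200.0)           # threshold = 3 x median amplitude
```

The spike at 25.0 is clipped down to the threshold while every normal sample passes through unchanged; a production module would additionally smooth or mute depending on its parameters.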

Ground roll: this is low-velocity noise and is removed by the TDNFK module. We parameterise it by providing a lower velocity limit, say 1000 m/s; the module then concerns itself with these events only.

Despiking: a module, FDNAT, is used and parameters for 4 different frequencies are provided. This makes a band-pass filter with tapering ends on both left and right. Noise should not be dealt with at the cost of data.
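A four-corner-frequency band-pass of this kind can be sketched as follows (a generic illustration; FDNAT's actual parameters and implementation are not reproduced here):

```python
import numpy as np

def trapezoid_bandpass(trace, dt, f1, f2, f3, f4):
    # Frequency-domain band-pass with linear tapers: zero below f1, ramp up
    # from f1 to f2, flat pass from f2 to f3, ramp down from f3 to f4, zero
    # above f4. Frequencies in Hz.
    n = len(trace)
    freqs = np.fft.rfftfreq(n, d=dt)
    gain = np.interp(freqs, [f1, f2, f3, f4], [0.0, 1.0, 1.0, 0.0])
    return np.fft.irfft(np.fft.rfft(trace) * gain, n=n)

# Synthetic test signal: 2 Hz, 30 Hz and 120 Hz components, 2 ms sampling
dt = 0.002
t = np.arange(0, 1.0, dt)
trace = (np.sin(2 * np.pi * 2 * t)
         + np.sin(2 * np.pi * 30 * t)
         + np.sin(2 * np.pi * 120 * t))
filtered = trapezoid_bandpass(trace, dt, 8.0, 12.0, 80.0, 100.0)
```

The tapered (rather than brick-wall) edges are what keeps ringing out of the filtered traces.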

Seismic data and navigation data are merged together and extensive QC is run to make sure every trace is numbered.

---- Application of Field Statics
Field statics are applied to bring receiver and shot positions to the datum level. The uphole time recorded while shooting, integrated with the navigation data, helps determine the static correction for the shot point. Survey data is used to correct for receiver point elevation.
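As an illustrative sketch of the elevation part of the correction (the actual statics computation also uses uphole times and a near-surface model; the replacement velocity and elevations here are assumed values):

```python
def datum_static(surface_elev_m, datum_m, v_repl_mps):
    # Time shift (in seconds) that refers a shot or receiver station to the
    # datum plane, assuming a constant replacement velocity below the station.
    # Negative when the station is above datum (arrival times are reduced).
    return (datum_m - surface_elev_m) / v_repl_mps

# Receiver at 120 m elevation, datum at 100 m, replacement velocity 2000 m/s
shift = datum_static(120.0, 100.0, 2000.0)  # -0.01 s, i.e. a 10 ms reduction
```

The shot static is computed the same way, with the uphole time taking the wave from the charge depth back to the surface reference.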

---- Surface Consistent Amplitude Correction
When a shot is fired, all the energy transmitted by the shot spreads out in an expanding wavefront which grows with increasing time. The energy reaching different channels is not equal; it decreases with offset. Energy loss is proportional to the square of the distance the wavefront has travelled. The distance r is in turn proportional to the travel time T and the velocity V, so the spherical divergence part of the energy loss is proportional to (T x V)^2. Geometric spreading correction is applied to compensate for wavefront divergence early in processing, before deconvolution. In practice, a suitable gain curve is decided upon by undertaking a suite of trials on shot records representative of the dataset.
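Since energy falls off as (T x V)^2, amplitude falls off as 1/(t * v), and the correction multiplies each sample by t * v(t). A minimal sketch, with an assumed constant reference velocity in place of the a priori velocity function:

```python
import numpy as np

def divergence_correction(trace, times, v_ref):
    # Scale each sample by t * v(t) to undo the 1/(t * v) amplitude decay
    # caused by wavefront spreading. v_ref may be a scalar or a per-sample
    # velocity array.
    return trace * times * v_ref

# A synthetic trace whose amplitude decays exactly as 1/(t * v) comes back flat
dt = 0.002
times = (np.arange(500) + 1) * dt   # start at 2 ms to avoid t = 0
v_ref = 2000.0                      # assumed reference velocity, m/s
raw = 1.0 / (times * v_ref)
flat = divergence_correction(raw, times, v_ref)
```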

Surface consistent amplitude compensation estimates and adjusts the relative amplitude contributions of sources and receivers on a surface-consistent basis. The traces of a weak shot will show lower than normal amplitudes. All traces recorded at a certain ground station can be of higher amplitude because of better than average geophone coupling at that surface location. One or more channels may differ in response from the average. The shots are consistent after the application of surface consistent amplitudes.
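The idea can be sketched with a toy decomposition: model each trace amplitude as the product of a shot scalar and a receiver scalar, solve in the log domain, and divide the scalars out. The scalars below are hypothetical; real modules solve a large sparse least-squares system and typically include offset and channel terms as well.

```python
import numpy as np

s = np.array([1.0, 0.5, 2.0])        # hypothetical shot-strength scalars
r = np.array([1.0, 1.5, 0.8, 1.2])   # hypothetical receiver-coupling scalars
A = s[:, None] * r[None, :]          # observed amplitudes: a_ij = s_i * r_j

# Log domain: log a_ij = log s_i + log r_j, separable via row/column means
L = np.log(A)
G = L.mean()
log_s = L.mean(axis=1) - 0.5 * G     # shot terms (up to a shared constant)
log_r = L.mean(axis=0) - 0.5 * G     # receiver terms
A_model = np.exp(log_s[:, None] + log_r[None, :])

corrected = A / A_model              # surface-consistent scalars removed
```

After the division every trace sits at the same level: the weak shot and the well-coupled station no longer stand out.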

---- Dephasing Filter from Instrument Response

A dephasing filter designed from the impulse response of the recording instrument was applied to the data before deconvolution.

DECONVOLUTION BEFORE STACK

Deconvolution is done to increase the frequency content of the data and to compress the wavelet. The source wavelet is convolved with the earth's reflectivity series to give the seismic trace; the process of recovering the earth's reflectivity series using the inverse filter of the source wavelet is deconvolution.

4. VELOCITY ANALYSIS
The velocity field is used in the normal moveout correction of CMP gathers. Usually, to start with, initial velocity analysis is carried out on a grid of approximately 500 m x 500 m using inbuilt modules. These velocity analyses are performed on denoised and deconvolved CDP gathers. Inlines are chosen at certain intervals, those whose CDP gathers are representative of most traces; definite CDP points are then chosen at certain intervals, taking an equal number of CDP points on either side of the CDP chosen for velocity picking. Subsequently, a broadband filter and single-window equalization were applied to the data to facilitate the velocity picking. The velocities picked from these analyses were applied to the data to generate an initial velocity stack. A velocity semblance curve is obtained and velocities are picked from it, ensuring we neither overcorrect nor undercorrect.
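The moveout being corrected follows the familiar hyperbola (a standard textbook relation; the values below are illustrative, not survey values):

```python
import numpy as np

def nmo_time(t0, offset, v_nmo):
    # Two-way reflection time at a given offset for a flat reflector:
    # t(x) = sqrt(t0^2 + x^2 / v^2). NMO correction shifts each sample from
    # t(x) back to t0; too low a picked velocity overcorrects the event,
    # too high a velocity undercorrects it.
    return np.sqrt(t0**2 + (offset / v_nmo)**2)

# Event at t0 = 1.0 s, picked velocity 2000 m/s, offset 2000 m
t_x = nmo_time(1.0, 2000.0, 2000.0)   # sqrt(2) ~ 1.414 s
nmo_shift = t_x - 1.0                 # moveout removed at this offset
```

The semblance panel is, in effect, a scan over trial v_nmo values: the velocity that best flattens the hyperbola gives the semblance peak that is picked.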

5. RESIDUAL STATIC CORRECTIONS
A second pass of velocities is generated after the first residual statics application on a 500 m x 500 m grid, and a third pass of velocities (500 m x 500 m) is generated after the application of the second pass of residual statics. Iterative residual statics are applied in 2 stages after NMO correction.

6. VELOCITY ANALYSIS
Velocity analysis is done in a second pass and velocities are generated.

7. PRE STACK TIME MIGRATION
Pre-stack time migration on selected lines (every 24th line) was carried out using the 3rd pass velocity field. RMS velocity was then picked on the PSTM gathers in a grid of 480 m x 480 m. The refined RMS velocity volume (on PSTM gathers) was finally used for the PSTM with an aperture of 4000 m x 4000 m at 4000 ms.

8. RMS VELOCITY REFINEMENT
RMS velocities are refined using Dix's formula, which is also available as a module.

9. FINAL PRE STACK MIGRATION
Pre-stack time migration on the chosen lines is carried out using the 3rd pass velocity field.
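Dix's formula relates the picked RMS velocities to the interval velocity between two reflectors (a standard relation; the numbers below are a consistency check, not survey values):

```python
import numpy as np

def dix_interval_velocity(v_rms_1, t_1, v_rms_2, t_2):
    # Interval velocity between reflectors at zero-offset times t_1 < t_2,
    # given the RMS velocities down to each reflector:
    # v_int = sqrt((v2^2 t2 - v1^2 t1) / (t2 - t1))
    return np.sqrt((v_rms_2**2 * t_2 - v_rms_1**2 * t_1) / (t_2 - t_1))

# Two-layer check: 2000 m/s for the first second, 3000 m/s for the next.
v1, t1 = 2000.0, 1.0
v2 = np.sqrt((2000.0**2 * 1.0 + 3000.0**2 * 1.0) / 2.0)  # RMS down to t = 2 s
v_int = dix_interval_velocity(v1, t1, v2, 2.0)           # recovers 3000 m/s
```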

---- MUTING
The first break mute is applied by picking the first break times on representative traces, and the mute is extended to the rest of the data. If there is some noise to be removed surgically, that is also performed here.

---- STACKING

10. POST STACK PROCESS

---- Random Noise Attenuation
The pre-stack time migrated data is subjected to random noise attenuation to enhance the signal-to-noise ratio by removing the random noise present in the data. This is not done earlier in the sequence so as not to lose valuable information.

---- Decon After Stack
Predictive deconvolution is done to suppress short-period multiples and improve the resolution of the traces by compressing the wavelet. This allows better recognition of closely spaced layers at the time of interpretation.

PROCESSING SEQUENCE

A master grid is prepared by fixing an x and y coordinate origin. The 3-D processing grid utilized for the processing of the data is given below.

POINTS   In Line   Cross line   Coordinates
A        313       343          2578824.88   621766.43
B        313       1146         2593918.47   627253.43
C        1670      343          2588097.42   596259.58
D        1670      1146         2603191.02   601746.58

SPS data, edit files, observer logs etc., along with the raw data, were thoroughly checked swath-wise and corrections were incorporated after interaction with the party. Subsequently, the fold map was generated using the above grid, which is shown in the plates.

Raw gathers were analyzed in different swaths across the area to study the noise pattern present in the data.

V. Quality of data

The initial gathers are full of noise and spikes, hence denoising is done; the despiking module in particular deletes 8-9% of the traces. The S/N of the initial gathers is not good enough due to the different types of noise present in the traces. Efforts were made to denoise the data and considerable time was spent on the same. The northern part of the data was full of spikes and ground roll. Problems associated with the SPS and NTBC files were resolved after discussion with party members. Data quality in the southern part was very good. Extensive test processing, quality control at different levels of processing, and close interaction with the interpreter helped in improving the processed output. A comparison of processed outputs at different stack levels clearly shows the enhancement of continuity in the shallower as well as the deeper parts of the section due to processing (plates 8 to 21). The final PSTM stack and RMS velocity volume were transferred to the interpretation group by FTP just after the processing was completed.

VI. Conclusion

Extreme care is taken to preserve the frequency content in the zone of interest, and different QCs are performed. Close-grid velocity analysis is carried out to generate the 3D velocity model. Noise is taken care of first: different noises, such as spikes, are eliminated; ambient noise is removed by the first break mute; and different modules in Veritas such as TDNFK, FDNAT etc. are used to remove ground roll. A 3D velocity volume is obtained for the area, and the entire 3D volume is then pre-stack time migrated. Due to the presence of simple or severe logistics constraints in the area of operation, irregular recoveries have to be resorted to in order to keep the foldage uniform. Events are brought out in the zone of interest with good S/N.

SWATH GEOMETRY

SWATH GEOMETRY PARAMETERS UNIT TEMPLATE

NEAR SURFACE MODEL

FOLD MAP

FOLDAGE LEGEND

QC FIELD SPS

SHOT GATHER

DENOISE GATHER

DECON GATHER

FREQUENCY SPECTRUM OF DENOISED GATHER (OF THE AREA IN RECTANGLE)

BRUTE STACK ALONG INLINE

VELOCITY SPECTRUM

VELOCITY STACK
