
Time to Depth Conversion and Uncertainty Assessment using Average Velocity Modeling

David C. Bartel*, Mark Busby, Jeff Nealon, Joerg Zaske, Chevron Energy Technology Co.

Abstract

Time to depth (or depth to time) conversion of interpreted surfaces, seismic data, and well data is an important part of the interpretation workflow. Velocity models provide a linkage between time and depth. Building the velocity model in average velocity allows direct mapping between time and depth and avoids some pitfalls associated with interval velocity modeling. In many cases, an average velocity model can easily be built with sufficient detail to meet the velocity accuracy required for successful time to depth conversion. Multiple velocity models built using different methodologies allow for the quantitative assessment of uncertainty in velocity, and therefore depth. The data used for modeling may include seismic velocity data, well velocities (from checkshots, VSPs, and/or synthetic seismograms), well markers, and surfaces. Either interpreted geologic surfaces or isovelocity surfaces (surfaces of equal average velocity) can be used. Our workflow for time to depth velocity modeling includes data gathering, editing, and modeling; a careful review of the model; and a quantitative assessment of the uncertainty of the velocity prediction.

Introduction

Time to depth velocity models are an integral part of the interpretation workflow, allowing the interpreter to match seismic time-domain data with well data that were collected in the depth domain. Even after prestack depth migration, a velocity model may still be necessary to accurately match borehole and seismic data because seismic velocities are picked to enhance image quality, not to model earth velocities (Al-Chalabi, 1979). An inadequate earth model may have been used for imaging (e.g., ignoring anisotropy, or assuming the wrong type of anisotropy), or insufficient data may have been available. An accurate velocity model is needed to fully utilize the seismic and well data for exploration, development, and reservoir modeling.

Steps to build a velocity model include gathering and editing the data, modeling the data to provide an initial velocity model, correcting the model to match the desired well data (residual modeling), reviewing the model for quality control, and providing an estimate of uncertainty. There are several ways to build a good velocity model, and having a good set of procedures helps ensure that it is done efficiently and with consistent results. We typically create time to depth velocity models using average velocity, and we have produced a set of workstation tools to help velocity modelers edit data, develop models, and assess velocity/depth residuals and uncertainty.

Average velocity versus interval velocity

Interval velocity is an intrinsic property of the rocks within the Earth. However, in order to develop an accurate velocity model for time to depth conversion using interval velocities, we need to know the interval velocities accurately from the top of the model down to our objective level. This necessary accuracy can be severely challenged in areas of poor seismic data and no well control. Additionally, among the commonly used methods for determining velocities, only sonic logs directly measure a proxy for interval velocity (they measure interval slownesses, the inverse of velocity). Sonic log measurements, however, are greatly affected by near-borehole conditions and may not be indicative of the true rock velocity. Checkshots and VSPs measure average velocity (time-depth points), and seismic stacking and time migration velocities are rms-type velocities. Accuracy in estimating interval velocities from the time-velocity pairs is affected by the spacing of the measurements and the complexity of the Earth. The Dix equation (Dix, 1955), for instance, assumes flat, isotropic, constant-velocity layers in order to be correct. Depth migration velocity analysis is performed in interval velocity, but as noted above, migration and stacking velocities are determined to best image the data and may not represent the true velocity of the Earth. Seismic and well velocities have different scales of resolution (Chang et al., 1998), and the frequency of the velocity measurement can also affect the derived interval velocity (Winkler, 1986).

Average velocities, on the other hand, are directly implied by checkshots, VSPs, and synthetic seismogram measurements. Converting time-depth pairs to time-average velocity pairs is a simple and unambiguous calculation. Converting seismic velocity measurements to average velocity may still be arguable, but less so than converting to interval velocity. Furthermore, an error in the average velocity estimate at one depth on a velocity curve, or in a velocity model, does not affect the depth estimate at deeper points along the curve.

To illustrate these issues, let us first consider the potential interval and average velocity errors implied by a checkshot/VSP data set (Figure 1). The error associated with a change of 1 foot in depth and 0.5 ms in time is up to 3000 ft/s in interval velocity. This magnitude of change is certainly within the realm of potential errors (Bartel, 1999). The same mispick error in average velocity is basically imperceptible and, based on these criteria, reduces with depth.

[Figure 1 chart ("Velocity Chart for VSP data"): interval velocity and average velocity (ft/s) versus depth subsea (ft).]
Figure 1 – VSP data from a Gulf of Mexico well expressed as interval and average velocity. Error bars assume 1 foot depth error and 0.5 ms time pick error. Data from Bartel (1999).
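
The sensitivity contrast is easy to reproduce with a few lines of arithmetic. The sketch below is purely illustrative (hypothetical checkshot values, not the Figure 1 data): it applies the 1 ft and 0.5 ms errors quoted above to one level and compares the resulting swing in the locally derived interval velocity with the swing in the average velocity.

```python
# Minimal sketch (hypothetical numbers): effect of a 1 ft depth error and a
# 0.5 ms time-pick error on interval vs. average velocity at one checkshot level.

# Two closely spaced checkshot levels: one-way time (s) and depth (ft)
t1, z1 = 1.000, 10000.0
t2, z2 = 1.005, 10050.0          # assumed spacing of 5 ms / 50 ft

dz, dt = 1.0, 0.0005             # 1 ft depth error, 0.5 ms time error

v_int     = (z2 - z1) / (t2 - t1)
v_int_bad = (z2 + dz - z1) / (t2 - dt - t1)
v_avg     = z2 / t2
v_avg_bad = (z2 + dz) / (t2 - dt)

print(f"interval velocity swing: {v_int_bad - v_int:8.1f} ft/s")  # ~1300 ft/s here;
                                                                  # thousands for tighter spacing
print(f"average velocity swing:  {v_avg_bad - v_avg:8.1f} ft/s")  # a few ft/s, shrinking with depth
```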

Another area where working in average velocity holds a distinct advantage over interval velocity is in the editing of anomalous data points (Figure 2). When the data points are edited in interval velocity, the entire time-average velocity (and time-depth) relationship below the edited data point is changed. When the data points are edited in average velocity, the anomalous data point is removed and the time-average velocity (and time-depth) relationships below are not affected.

[Figure 2 plot: time-average velocity curves, with the data points to be edited marked; curves shown: original, edited in interval velocity, edited in average velocity.]
Figure 2 – Example checkshot data with data points edited in interval and average velocity. Note change in time-average velocity curve when data edited in interval velocity. Data from Bartel and Neal (2004).
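
A minimal sketch of this behavior, using hypothetical checkshot values rather than the Figure 2 data: deleting a suspect pair from the time-average velocity curve leaves the remaining time-depth points untouched, whereas editing one interval velocity and re-integrating shifts every depth below the edit.

```python
import numpy as np

# Hypothetical one-way checkshot times (s) and depths (ft); index 2 is a suspect pick.
t = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
z = np.array([1500.0, 3300.0, 5600.0, 7400.0, 9500.0])

v_avg = z / t                          # time-average velocity pairs
v_int = np.diff(z) / np.diff(t)        # interval velocities between levels

# Edit in average velocity: drop the suspect pair; the deeper points are untouched.
keep = [0, 1, 3, 4]
z_avg_edit = (v_avg * t)[keep]         # identical to z[keep]

# Edit in interval velocity: change one interval, then re-integrate depths.
v_int_edit = v_int.copy()
v_int_edit[1] = 0.5 * (v_int[0] + v_int[2])       # replace the suspect interval
z_int_edit = z[0] + np.concatenate(([0.0], np.cumsum(v_int_edit * np.diff(t))))

print(z_avg_edit)            # deeper depths unchanged
print(z_int_edit - z)        # every depth below the edited interval has shifted
```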

Preparing the data

Well and seismic velocity data must be properly edited in order to be useful in velocity modeling. Seismic velocity data can have noisy data points, bad picks, and other effects associated with seismic imaging. Seismic velocity data sets are large, and it would be time consuming to edit each curve individually. Therefore, criteria must be developed to identify and cull bad data points in a bulk fashion; median validation is one such procedure that we have used with success. Median validation involves first determining a median value for the velocity at a coarse sampling of the data in space and time, calculating the residuals, and then examining the residuals to determine bulk editing criteria to be applied (e.g., all curves with residuals above a certain level are deleted). The median value is most useful for this type of editing because it will not be influenced by a single extreme point within a certain area.
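
A minimal sketch of median validation is shown below. It assumes the seismic picks are supplied as flat arrays of (x, y, time, velocity); the bin sizes and residual threshold are illustrative placeholders, not values from our workflow.

```python
import numpy as np

def median_validate(x, y, t, v, dx=2000.0, dy=2000.0, dt=0.2, max_resid=500.0):
    """Flag seismic velocity picks whose residual against a coarse space/time
    median exceeds a threshold (bin sizes and threshold are illustrative)."""
    v = np.asarray(v, dtype=float)
    # Coarse bin index for every pick: x, y in survey units, t in seconds.
    keys = np.stack([np.floor(x / dx), np.floor(y / dy), np.floor(t / dt)], axis=1).astype(int)

    resid = np.empty(len(v), dtype=float)
    # Median per occupied bin; the median resists a single extreme pick in a bin.
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    for k in range(inverse.max() + 1):
        in_bin = inverse == k
        resid[in_bin] = v[in_bin] - np.median(v[in_bin])

    # Bulk-edit rule: cull picks whose residual exceeds the chosen level.
    return np.abs(resid) > max_resid

# usage sketch: bad = median_validate(x, y, t, v); keep = ~bad
```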

Our editing program allows other bulk editing of velocity curves based on user-defined characteristics such as velocity value, time range, or user-defined polygons. Another important item in preparing velocity data sets is to get the proper water or near-surface velocity. The velocity data curves should be anchored at time 0 and at the base of the water or near-surface layer with the proper average velocity.

Well data are the standard that the time to depth velocity model must match in each project and deserve much more scrutiny in the editing process. The well velocity data used should represent the vertical time-average velocity relationship at the borehole. Each individual well curve should be examined for overall data quality and for the application of appropriate time and datum shifts. While we want to edit the velocity curves in average velocity, the data should also be viewed in interval velocity space to help identify spurious and conflicting data points. The quality of the final velocity model will depend strongly on internally consistent borehole velocity data, especially from closely spaced wells.

Many times, well data will not extend into the shallow portion of the geologic section. That portion of the curve can be derived from the seismic velocities where it is reasonable to assume that the seismic velocity measurements are fairly accurate. Also, well velocity curves only extend as deep as the well, and different wells in a project will terminate at different depths. Hence, a method for extending or extrapolating the well velocity trend is needed. Our technique is to splice the trend of the seismic velocities in the vicinity of the well onto the end of the well curve. Additional editing is needed to ensure conformity between all the well curves so that there will not be any data busts in the final velocity model.
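
One simple way to implement such a splice is sketched below, under the assumption that both the well curve and the nearby seismic trend are available as time-average velocity pairs; a bulk shift ties the seismic trend to the deepest well value before appending it. More elaborate tapers could be used instead.

```python
import numpy as np

def extend_well_curve(t_well, v_well, t_seis, v_seis):
    """Extend a well time-average velocity curve below its last sample using the
    trend of nearby seismic velocities. Both curves are (increasing) one-way time
    vs. average velocity."""
    t_td = t_well[-1]                               # time at well total depth
    below = t_seis > t_td                           # seismic samples deeper than the well

    # Seismic trend value at the well's last time, by interpolation.
    v_seis_at_td = np.interp(t_td, t_seis, v_seis)
    shift = v_well[-1] - v_seis_at_td               # anchor the trend to the well

    t_ext = np.concatenate([t_well, t_seis[below]])
    v_ext = np.concatenate([v_well, v_seis[below] + shift])
    return t_ext, v_ext
```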

Modeling the data

There are many ways to make a velocity model – interpolation, neural networks, and geostatistics, to name a few. We favor using horizon-based interpolation for most of our time to depth velocity modeling. Interpolation and geostatistical methods are somewhat similar, except that geostatistics is better able to use the potential predictability of the data based on its spatial distribution.

Our goal is to use a modeling method that can help us accurately predict data away from the known well data. Cross-validation is a technique to test how well a modeling scheme can predict the proper velocity. In cross-validation, each curve is removed from the data set one at a time, and the remaining curves are used to predict the velocity at the removed curve site. Good predictive modeling schemes will have low cross-validation errors. Closely spaced wells may also have low cross-validation errors. An example of cross-validation errors using horizon-based interpolation with different sets of horizons is shown in Figure 3. In our experience, isovelocity surfaces usually have the lowest cross-validation residuals when used in horizon-based interpolation. We extract the isovelocity surfaces from a smooth interpolation of the velocity data (normally the seismic velocity data) using geologic surfaces. The isovelocity surfaces will show the influence of the geology but will not follow the geologic surfaces exactly. In Tertiary marine basins, where a major influence on velocity is the depth below the waterbottom, the isovelocity surfaces will be largely sub-parallel to the waterbottom.

Figure 3 – Velocity cross-validation errors using different horizons in horizon-based interpolation for an example well data set. Horizons used are none (red), waterbottom only (yellow), geologic horizons (purple), and isovelocity horizons (cyan). Overall, isovelocity horizons give the lowest cross-validation error for this interpolator and would likely be the best to use in interpolating a velocity model.
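
The leave-one-well-out test is straightforward to script. The sketch below is only a stand-in for the horizon-based interpolation described above: it treats each well as a single (x, y) location with an average velocity at one time level and uses SciPy's generic griddata interpolator.

```python
import numpy as np
from scipy.interpolate import griddata

def cross_validation_errors(xy, v):
    """Leave-one-out cross-validation of a simple spatial interpolator.
    xy: (n, 2) well locations; v: (n,) average velocity at one time level."""
    xy = np.asarray(xy, dtype=float)
    v = np.asarray(v, dtype=float)
    errors = np.full(len(v), np.nan)
    for i in range(len(v)):
        mask = np.ones(len(v), dtype=bool)
        mask[i] = False
        # Predict the held-out well from the remaining wells; fall back to
        # nearest-neighbour where the point lies outside the convex hull.
        pred = griddata(xy[mask], v[mask], xy[i][None, :], method="linear")
        if np.isnan(pred[0]):
            pred = griddata(xy[mask], v[mask], xy[i][None, :], method="nearest")
        errors[i] = pred[0] - v[i]
    return errors

# A good scheme yields small errors; compare schemes by, e.g., the RMS of `errors`.
```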

Interpolation and neural network modeling lend themselves well to the use of cross-validation to verify the modeling scheme. It is more difficult to use cross-validation in geostatistical modeling because, beyond the range of the semi-variogram, the output of geostatistical modeling will tend to the global mean of the data. Interpolation will tend to the local mean of the data outside of the input data area. The behavior of neural network models outside the area of the input data is more difficult to predict because neural networks can, to some extent, extrapolate trends in the data.

Modeling of velocity data with geostatistics may also require additional pre-processing of the data (see Journel, 1989). Geostatistical modeling relies on the data having a stationary mean. Thus, trends in the data, especially temporal trends, must be removed prior to modeling and added back afterwards. Various vendor packages are available for geostatistical modeling. A sufficient quantity of data that is spatially distributed across the model area is also necessary to properly determine the semi-variogram for geostatistical modeling.
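
A minimal sketch of that trend handling, under the assumption of a simple low-order temporal trend in average velocity:

```python
import numpy as np

# Illustrative only: fit and remove a low-order temporal trend before geostatistical
# modeling of the residuals, then add the trend back to the modeled output.
def remove_trend(t, v, order=1):
    coeff = np.polyfit(t, v, order)          # e.g., a linear v(t) trend
    trend = np.polyval(coeff, t)
    return v - trend, coeff                  # residuals have a (more) stationary mean

def restore_trend(t, v_detrended, coeff):
    return v_detrended + np.polyval(coeff, t)
```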

After the initial velocity model is constructed, there is a high likelihood that corrections to the model will be necessary in order to fit the control data (e.g., the well data). Residual modeling can be done in the same fashion as the initial modeling, just using the residuals as the input data. Our most common scheme is to use horizon-based interpolation to back-project the residuals onto the starting velocity model. This is done in an iterative fashion, starting with large operators to smoothly achieve bulk corrections of the entire data set and then using successively smaller radii of influence to correct the model close to the control data points (wells). Geostatistical modeling is well suited for residual modeling, since we would more likely expect a zero mean in the residual data, or at least a stationary mean indicative of a necessary bulk shift to the model.
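
The iterative correction can be sketched schematically as below. This is not the horizon-based back-projection itself but a simplified stand-in: well residuals are spread onto a gridded velocity slice with Gaussian weights whose radius shrinks on each pass, so early passes apply smooth bulk corrections and later passes tie the model tightly near the wells.

```python
import numpy as np

def residual_correct(grid_xy, v_model, well_xy, v_well, radii=(20000.0, 8000.0, 2000.0)):
    """Iteratively correct a gridded velocity slice toward the well control.
    grid_xy: (m, 2) grid node coordinates; v_model: (m,) model velocities;
    well_xy: (n, 2) well locations; v_well: (n,) control velocities at the wells."""
    v = v_model.astype(float).copy()
    for r in radii:                                   # large radius first, then smaller
        # Residuals at the wells against the current model (nearest grid node).
        node = np.argmin(((grid_xy[None, :, :] - well_xy[:, None, :]) ** 2).sum(-1), axis=1)
        resid = v_well - v[node]

        # Gaussian-weighted spread of the residuals onto every grid node.
        d2 = ((grid_xy[:, None, :] - well_xy[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2.0 * r * r))
        v += (w * resid).sum(axis=1) / np.maximum(w.sum(axis=1), 1e-12)
    return v
```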

Assessing uncertainty

As mentioned in the previous section, there are many ways to make a velocity model. Each method has its positive and negative aspects, and the methods may be equally valid in most circumstances. Velocity prediction near well control is relatively easy; it is more important to properly assess the uncertainty away from well control. We can use the differences between modeling methods to help us assess the velocity uncertainty in an area. Once we have the velocity uncertainty, we can directly compute the appropriate depth uncertainty.

Our method of assessing uncertainty relies on using different methodologies to compute several velocity models. All the velocity models are tied to the same well control data and should be the same at the wells. Away from the wells, the models will differ. Just using different interpolator operators to compute the models will not give sufficiently meaningful differences between the velocity models. What is needed are truly different ways to compute the base velocity model; for instance, horizontal interpolation versus horizon-based interpolation with isovelocity surfaces, or interpolation versus neural network modeling. Once the various velocity models are computed, the velocity values at each sample point are examined. We assume the variation in the velocity values at a sample point represents a normally distributed set of values about the "true" value of average velocity at that sample point in x, y, and time. Furthermore, since we have a limited number of samples (because we only calculated a few velocity models), we use the Student-t correction to the normal distribution to derive a mean average velocity value and an estimate of the variance at each sample point in the velocity model. Near the well control, the uncertainty is small since all the models were fit to the same control data (Figure 4). These estimates of the mean velocity and its uncertainty are used further in estimating structural and reservoir uncertainty.
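
A hedged sketch of that calculation: given the velocities predicted by a handful of independently built models at one sample point, form the sample mean, a Student-t half-width on the mean, and the implied depth uncertainty. The velocity-to-depth conversion assumes average velocity paired with one-way time, and the confidence level is an illustrative choice.

```python
import numpy as np
from scipy.stats import t as student_t

def velocity_and_depth_uncertainty(v_models, t_oneway, confidence=0.90):
    """v_models: velocities from N different velocity models at one sample point (ft/s);
    t_oneway: one-way time of the sample point (s). Returns the mean average velocity,
    its Student-t half-width, and the implied depth uncertainty (ft)."""
    v = np.asarray(v_models, dtype=float)
    n = v.size
    v_mean = v.mean()
    s = v.std(ddof=1)                                  # sample standard deviation

    # Student-t correction for the small number of models.
    half_width = student_t.ppf(0.5 + confidence / 2.0, df=n - 1) * s / np.sqrt(n)

    # Depth = average velocity x one-way time, so depth uncertainty scales the same way.
    return v_mean, half_width, half_width * t_oneway

# Example with hypothetical values from four models at a 2.0 s (one-way) sample:
# velocity_and_depth_uncertainty([9800.0, 10050.0, 9900.0, 10200.0], t_oneway=2.0)
```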

Figure 4 – Example of calculated depth uncertainty in meters for one surface in an oilfield. The uncertainty is relatively small near the well control at the top of the structure and increases along the flanks. The larger uncertainty on the right flank is due to geologic conditions that influence the velocity determination in that area.

Conclusions

Time to depth (and depth to time) conversion of surfaces and seismic data remains an important part of the interpretation workflow and requires proper methods to do it correctly. Our standard methodology for computing velocity models is to construct the velocity models in time-average velocity. This approach helps minimize potential errors from inadequate shallow velocity data and inherent errors in the data set. We have developed a set of tools, both internal and vendor-based, which help us edit, model, and visualize the data and assess the model uncertainty. Proper editing of the velocity data in the initial phases of a project is crucial to the timely completion of the project.

A variety of methods can be used to construct a velocity model, including interpolation, neural networks, and geostatistics. Horizon-based interpolation using isovelocity surfaces is a relatively simple and quick means to construct a high-quality velocity model. Differences between models constructed in different ways can be used to quantify velocity uncertainty. By combining these models and examining the mean and variance of the velocity at each sample point, we are led to a reasonable estimate of the "true" velocity and its uncertainty.

Acknowledgements

The authors would like to acknowledge the many current and past members of the Velocity Modeling Services and Research Teams for their contributions to the velocity modeling workflow and programs used within Chevron. We would also like to thank Chevron Energy Technology Co. management and various operating companies for permission to present this paper and the associated data.

References

Al-Chalabi, M., 1979, Velocity determination from seismic reflection data, in Fitch, A. A., ed., Developments in geophysical exploration studies, 1, 1–68.

Bartel, D. C., 1999, Velocities before, during, and after drilling: 69th Annual International Meeting, SEG, Expanded Abstracts, 168–171.

Bartel, D. C., and S. Neal, 2004, Piecewise Occam inversion for smoothing time-depth data from checkshots and VSPs: 74th Annual International Meeting, SEG, Expanded Abstracts, 2493–2496.

Chang, C., D. Hoyle, R. Coates, M. Kane, K. Dodds, C. Esmeroy, and J. Foreman, 1998, Localized maps of the subsurface: Oilfield Review, 10, no. 1, 56–66.

Dix, C. H., 1955, Seismic velocities from surface measurements: Geophysics, 20, 68–86.

Journel, A. G., 1989, Fundamentals of geostatistics in five lessons: American Geophysical Union, Short Course in Geology, 8.

Winkler, K. W., 1986, Estimates of velocity dispersion between seismic and ultrasonic frequencies: Geophysics, 51, 183–189.
