Quality control in processing
The conventional processing sequence is outlined in Figure 1.5-30. Each of the processes described above is presented in detail in subsequent chapters. In a seismic data processing sequence, the step that is most vulnerable to human errors is defining the geometry for the survey under consideration and merging it with the seismic data. This involves correctly assigning sources and receivers to their respective surface locations and correctly specifying the source-receiver separation and azimuth for each recorded trace in the survey.
To demonstrate just how important it is to specify the geometry of a survey correctly, consider the impact of a deliberately incorrect geometry assignment on velocity estimation and normal-moveout correction. Figure 1.5-31 shows CMP gathers before and after moveout correction and velocity spectra at three analysis locations along a seismic traverse. The case shown in Figure 1.5-31a does not exhibit any abnormal moveout behavior. The velocity spectrum yields a fairly unambiguous primary velocity function, and primary events on the moveout-corrected gather are nearly flat. The case shown in Figure 1.5-31b, however, begins to show signs of something being wrong with the data. Although the velocity spectrum again yields a fairly unambiguous primary velocity function, note that the events associated with the major primary reflections in the CMP gather are not properly flattened by the normal-moveout correction. Such moveout behavior might be attributed to a physical phenomenon, for instance, anisotropy or nonhyperbolic moveout caused by lateral velocity variations. In this case, however, it is caused by an incorrect geometry specification, namely, wrong offsets assigned to the traces in the gather. The abnormal moveout behavior is strikingly more obvious in the case shown in Figure 1.5-31c. Note the ambiguous semblance peaks in the velocity spectrum, which cause the normal-moveout correction to fail to flatten the primary events in the gather. Note also the differences in the degree of abnormal event moveout from one location to another (Figures 1.5-31a,b,c); the simpler and flatter the subsurface structure, the less obvious the adverse impact of incorrect geometry on the moveout.
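The sensitivity of moveout correction to an offset error can be illustrated with a small numerical sketch, assuming simple hyperbolic moveout t(x) = sqrt(t0² + x²/v²); the function name and the numbers below are hypothetical, not from the text:

```python
import numpy as np

def nmo_residual(t0, v, offset_true, offset_assumed):
    """Residual moveout (s) left after NMO correction when a trace is
    corrected with a wrongly assigned offset instead of its true offset.
    Assumes hyperbolic moveout: t(x) = sqrt(t0^2 + x^2 / v^2)."""
    t_true = np.sqrt(t0**2 + offset_true**2 / v**2)     # actual arrival time
    t_corr = np.sqrt(t0**2 + offset_assumed**2 / v**2)  # moveout removed by NMO
    return t_true - t_corr                              # nonzero => event not flat

# A 1.0-s reflection at 2500 m/s: the correct offset flattens the event
# exactly, while a 200-m offset error leaves residual moveout on that trace.
exact = nmo_residual(1.0, 2500.0, 2000.0, 2000.0)  # 0.0
wrong = nmo_residual(1.0, 2500.0, 2000.0, 1800.0)  # positive residual
```

The residual grows with offset and with the size of the offset error, which is why the far traces in Figure 1.5-31b,c are the first to show the failure to flatten.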
The care required for correct assignment of the geometry of a survey, of course, does not diminish the care required for proper specification of the parameters associated with any other step in a processing sequence. Each step must be executed with the necessary quality control. Displays of appropriate data attributes, such as the amplitude spectrum and autocorrelogram, help the analyst understand the signal and noise characteristics of the recorded data and the effect on the data of each step included in the processing sequence, thus facilitating appropriate specification of the parameters associated with that step. Figures 1.5-32 through 1.5-41 show quality control panels that are examples of recommended standard displays for parameter selection at various stages in the analysis. All displays include, on the top row, the amplitude spectrum (averaged over the gather for a prestack test panel, or over the portion of the stack for a poststack test panel) and, on the bottom row, the autocorrelogram of the respective data type.
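As a minimal sketch of how these two standard QC attributes might be computed for such a display (the function `qc_attributes` and its arguments are hypothetical, not from the text):

```python
import numpy as np

def qc_attributes(gather, dt, max_lag_s=0.5):
    """gather: (n_traces, n_samples) array sampled at dt seconds.
    Returns the amplitude spectrum averaged over all traces (top-row
    display) and the normalized autocorrelogram, one autocorrelation
    trace per input trace, for positive lags (bottom-row display)."""
    n_traces, n_samples = gather.shape
    # Average amplitude spectrum over the gather
    spec = np.mean(np.abs(np.fft.rfft(gather, axis=1)), axis=0)
    freqs = np.fft.rfftfreq(n_samples, d=dt)
    # Autocorrelogram, normalized so that the zero-lag value is 1
    n_lags = int(max_lag_s / dt)
    acorr = np.empty((n_traces, n_lags))
    for i, tr in enumerate(gather):
        full = np.correlate(tr, tr, mode="full")[n_samples - 1:]  # lags >= 0
        acorr[i] = full[:n_lags] / full[0]
    return freqs, spec, acorr
```

Periodic banding in the autocorrelogram flags reverberations and short-period multiples, while the averaged spectrum shows passband width and flatness before and after a processing step.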
Figure 1.5-32 is the quality control panel for prestack signal processing. Shown from left to right are: (a) a CMP gather that exhibits strong, low-frequency swell noise; (b) low-cut filtering to remove the swell noise; (c) t² scaling to correct for geometric spreading (gain applications); (d) prestack spiking deconvolution (optimum Wiener filters, predictive deconvolution in practice, and field data examples); and (e) wide bandpass filtering to remove the high-frequency noise boosted by spiking deconvolution. Note that, after t² scaling, the autocorrelogram better exhibits the characteristics of the source waveform, reverberations, and multiples over the entire cable length. Also note that spiking deconvolution has removed much of the energy associated with the reverberations and multiples. The broadening and flattening of the amplitude spectrum after spiking deconvolution are indicative of the increase in vertical resolution.
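Step (c), the t² gain for geometric spreading, is simple enough to sketch directly (a hedged illustration; the function name is hypothetical):

```python
import numpy as np

def t2_scale(gather, dt):
    """Apply a t^2 gain to compensate for geometric (spherical) spreading.
    gather: (n_traces, n_samples) array sampled at dt seconds."""
    t = np.arange(gather.shape[1]) * dt  # time of each sample, in seconds
    return gather * t**2                 # gain curve broadcast across all traces
```

The gain restores amplitudes at late times so that deep reverberations and multiples become visible in the autocorrelogram, as noted above.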
Figure 1.5-33 shows the spectra associated with the gathers, from left to right, in Figure 1.5-32. The horizontal axis is frequency in Hz and the vertical axis is two-way traveltime in s. Note from (a) that the swell noise at very low frequencies occupies the spectrum along the entire time axis. Note also that the energy in the gather is largely confined to shallow times within a bandlimited region of the spectrum. Following the low-cut filtering (b), note the elimination of the swell-noise energy. The t² scaling (c) has restored the energy at late times, and deconvolution (d) has broadened the spectrum. Following the wide bandpass filtering (e), note that the signal bandwidth has been preserved [compare with (a)], and the spectrum has been flattened within the passband.
Figures 1.5-34 and 1.5-35 show two standard test panels for determining prestack deconvolution parameters. With the help of the amplitude spectrum and autocorrelogram, the analyst chooses an optimum operator length and prediction lag. Figure 1.5-34 shows the test panel for prestack spiking deconvolution. Shown from left to right are: the input gather after low-cut filtering and t² scaling as in Figure 1.5-32, followed by deconvolution using operator lengths of 120 ms, 160 ms, 240 ms, 360 ms, and 480 ms. Note that deconvolution using an operator length of 480 ms best flattens the spectrum within the signal passband. The failure of deconvolution to flatten the spectrum at very high frequencies is most likely due to nonstationarity of the signal. This effect usually is accounted for by time-variant spectral whitening after stack. Since the autocorrelation of the input data is used in designing the deconvolution operator, it is appropriate to examine the autocorrelation before and after deconvolution. Note from the autocorrelograms in Figure 1.5-34 that operator length dictates the ability of deconvolution to remove reverberations and short-period multiples.
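A minimal sketch of how a spiking deconvolution operator of a given length might be designed from the data autocorrelation (a Wiener-Levinson-style least-squares inverse; the function name, prewhitening default, and direct Toeplitz solve are simplifying assumptions, not the text's implementation):

```python
import numpy as np

def spiking_decon(trace, n_op, prewhitening=0.1):
    """Spiking deconvolution sketch: design a least-squares inverse
    (spiking) operator of n_op samples from the trace autocorrelation
    and apply it. prewhitening is in percent of the zero-lag value."""
    n = len(trace)
    # Autocorrelation lags 0 .. n_op-1, estimated from the data itself
    ac = np.correlate(trace, trace, mode="full")[n - 1:n - 1 + n_op]
    ac[0] *= 1.0 + prewhitening / 100.0  # stabilize the normal equations
    # Toeplitz autocorrelation matrix (Levinson recursion in practice)
    R = np.array([[ac[abs(i - j)] for j in range(n_op)] for i in range(n_op)])
    rhs = np.zeros(n_op)
    rhs[0] = 1.0                          # desired output: zero-delay spike
    inv_filter = np.linalg.solve(R, rhs)  # spiking (inverse) operator
    return np.convolve(trace, inv_filter)[:n]
```

The operator length `n_op` plays the role tested in the panel: too short an operator leaves reverberation energy in the autocorrelogram, while a longer operator flattens more of the spectrum within the passband.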
Figure 1.5-35 shows the test panel for prestack predictive deconvolution. Shown from left to right are: the input gather after low-cut filtering and t² scaling as in Figure 1.5-32, followed by deconvolution using prediction lags of 2 ms (unit prediction lag), 8 ms, 16 ms, 24 ms, and 32 ms, with the same operator length of 480 ms. Note that the unit prediction lag yields a flat spectrum across the passband, while increasing the prediction lag results in a departure from a flat spectrum. Prediction lag controls the ability of deconvolution to increase the vertical resolution.
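The role of the prediction lag can be sketched as follows (a hedged illustration under the same least-squares framework as spiking deconvolution; the function name and the direct Toeplitz solve are assumptions for clarity):

```python
import numpy as np

def predictive_decon(trace, n_op, lag):
    """Predictive deconvolution sketch: predict the trace `lag` samples
    ahead with an n_op-sample filter and output the prediction error.
    With lag equal to one sample, this reduces to spiking deconvolution."""
    n = len(trace)
    ac = np.correlate(trace, trace, mode="full")[n - 1:]  # lags >= 0
    R = np.array([[ac[abs(i - j)] for j in range(n_op)] for i in range(n_op)])
    R[np.diag_indices(n_op)] *= 1.001     # prewhitening for stability
    g = ac[lag:lag + n_op]                # right-hand side: lags lag .. lag+n_op-1
    pred_filter = np.linalg.solve(R, g)
    # Prediction-error filter: (1, 0 x (lag-1), -pred_filter)
    pef = np.concatenate(([1.0], np.zeros(lag - 1), -pred_filter))
    return np.convolve(trace, pef)[:n]
```

With `lag` at one sample the prediction-error filter whitens the spectrum (spiking action); larger lags leave the first `lag` samples of the wavelet untouched and attack only the predictable tail, which is the departure from spectral flatness the panel displays.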
Figures 1.5-36 and 1.5-37 show two standard test panels for determining poststack deconvolution parameters. Note from the average amplitude spectrum of the section on the left-hand side of each test panel that CMP stacking inherently attenuates high frequencies, which need to be restored by poststack deconvolution. Figure 1.5-36 shows the test panel for poststack spiking deconvolution. Shown from left to right are: the input stack, followed by deconvolution using operator lengths of 120 ms, 160 ms, 240 ms, 360 ms, and 480 ms, and high-cut filtering to retain the acceptable signal band and remove the high-frequency noise.
Figure 1.5-37 shows the test panel for poststack predictive deconvolution. Shown from left to right are: the input stack, followed by deconvolution using prediction lags of 2 ms (unit prediction lag), 8 ms, 16 ms, 24 ms, and 32 ms, using the same operator length of 480 ms, and high-cut filtering to retain the acceptable signal band and remove the high-frequency noise. Again, note that the unit prediction lag yields a flat spectrum across the passband, while increasing the prediction lag results in a departure from a flat spectrum.
Figure 1.5-38 shows the standard quality control panel for poststack signal processing. Shown from left to right are: a portion of the stacked section with prestack processing as described by Figure 1.5-32; spiking deconvolution (field data examples) to restore the high frequencies attenuated by the stacking process; time-variant spectral whitening to account for nonstationarity and to further flatten the spectrum — all three steps followed by high-cut filtering; bandpass filtering to retain the acceptable signal band and remove the high-frequency noise; instantaneous AGC scaling and rms amplitude AGC scaling.
Figures 1.5-39 and 1.5-40 show the test panels for defining the parameters for time-variant filtering (the 1-D Fourier transform). A portion of the stacked section is bandpass filtered using a 10-Hz bandwidth that slides from the low- to the high-frequency end of the spectrum. Note that the coherent signal in the high-frequency bands is confined to shallow times. Specifically, these filter panels indicate that signal up to 90 Hz is present in the data down to 2.2 s, and signal up to 100 Hz is present down to 1.4 s.
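Such a set of sliding narrow-band panels might be generated as sketched below (a minimal frequency-domain illustration; the function name is hypothetical, and the boxcar passbands would be tapered in practice):

```python
import numpy as np

def filter_panels(section, dt, band_hz=10.0, f_max=100.0):
    """Split a stacked section into narrow-band panels by zero-phase
    bandpass filtering in the frequency domain. section is a
    (n_traces, n_samples) array sampled at dt seconds. Returns a dict
    mapping each (f_lo, f_hi) band to its filtered panel."""
    n_traces, n_samples = section.shape
    freqs = np.fft.rfftfreq(n_samples, d=dt)
    spec = np.fft.rfft(section, axis=1)
    panels = {}
    f_lo = 0.0
    while f_lo < f_max:
        f_hi = f_lo + band_hz
        mask = (freqs >= f_lo) & (freqs < f_hi)  # boxcar passband
        panels[(f_lo, f_hi)] = np.fft.irfft(spec * mask, n=n_samples, axis=1)
        f_lo = f_hi
    return panels
```

Because the passbands partition the spectrum, the panels sum back to the filtered input; the analyst inspects each panel to judge down to what time coherent signal persists in that band.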
Finally, Figure 1.5-41 shows the test panel for poststack noise attenuation using f – x deconvolution (linear uncorrelated noise attenuation). A parameter that needs to be tested for f – x deconvolution is the percent add-back of the estimated noise, which circumvents the smeared appearance of events following noise attenuation. Shown from left to right are: a portion of the stacked section with poststack deconvolution, time-variant spectral whitening, and bandpass filtering; then noise attenuation with 80, 60, 40, 20, and 0 percent add-back. Note that, without any add-back, the amplitude spectrum of the section after noise attenuation indicates attenuation of high-frequency energy that may be attributed to the random noise uncorrelated from trace to trace.
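The add-back idea can be sketched with a deliberately simplified f – x scheme (a hedged illustration, not the text's algorithm: here each trace is predicted from a single neighbor with a one-coefficient complex filter per frequency, whereas production implementations use longer spatial prediction filters in sliding windows):

```python
import numpy as np

def fx_decon(section, addback_percent):
    """f-x deconvolution sketch. In the f-x domain, the laterally
    coherent (predictable) part of each frequency slice is kept as
    signal, and addback_percent of the rejected (unpredictable) part
    is added back to avoid a smeared appearance of events."""
    n_traces, n_samples = section.shape
    spec = np.fft.rfft(section, axis=1)  # rows: traces, columns: frequencies
    out = spec.copy()
    for k in range(spec.shape[1]):
        x = spec[:, k]
        prev, curr = x[:-1], x[1:]
        denom = np.vdot(prev, prev)      # least-squares fit of curr = a * prev
        a = np.vdot(prev, curr) / denom if abs(denom) > 0 else 0.0
        predicted = a * prev             # laterally coherent estimate
        noise = curr - predicted         # trace-to-trace uncorrelated part
        out[1:, k] = predicted + (addback_percent / 100.0) * noise
    return np.fft.irfft(out, n=n_samples, axis=1)
```

With 100 percent add-back the input is returned unchanged, and with 0 percent only the predictable, laterally coherent energy survives, which is why the 0-percent panel in Figure 1.5-41 shows the strongest damping of random high-frequency energy.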
The test panels for quality control in processing of seismic data are not limited to those presented in Figures 1.5-32 through 1.5-41. Additional panels with appropriate and convenient format may be constructed to test parameters associated with refraction and residual statics corrections, multiple attenuation, dip-moveout correction, and migration. Powerful interactive tools, including 3-D visualization techniques, facilitate efficient parameter testing and quality control in processing.