Improved normalization of time-lapse seismic data using normalized root mean square repeatability data to improve automatic production and seismic history matching in the Nelson field

Authors: K.D. Stephen and A. Kazemi
Journal name: Geophysical Prospecting
Issue: Vol 62, No 5, September 2014 pp. 1009 - 1027
DOI: 10.1111/1365-2478.12109
Organisations: Wiley
Language: English
Info: Article, PDF (2.34 MB)

Summary:
Updating reservoir models by history matching of 4D seismic data along with production data gives a better understanding of changes to the reservoir, reduces risk in forecasting and leads to better management decisions. This process of seismic history matching requires an accurate representation of predicted and observed data so that they can be compared quantitatively during automated inversion. However, observed seismic data are often obtained only as a relative measure of the reservoir state or its change. The data, usually attribute maps, need to be calibrated before they can be compared to predictions. In this paper we describe an alternative approach in which we normalize the observed data by scaling them to the model data in regions where predictions are good. To remove measurements of high uncertainty and make normalization more effective, we use a measure of repeatability of the monitor surveys to filter the observed time-lapse data. We apply this approach to the Nelson field. We normalize the 4D signature by deriving a least-squares regression equation between the observed and synthetic data, which consist of attributes representing measured acoustic impedances and predictions from the model. Two regression equations are derived as part of the analysis: in one, the whole 4D signature map of the reservoir is used, while in the second, 4D seismic data are used from the vicinity of wells with a good production match. The repeatability of time-lapse seismic data is assessed using the normalized root mean square (NRMS) of measurements outside the reservoir. Where NRMS is high, observations and predictions are ignored. Net-to-gross ratio and permeability are modified to improve the match. The best results are obtained by using the NRMS-filtered maps of the 4D signature, which better constrain the normalization.
The misfit over the first six years of history data is reduced by 55%, while the misfit for the forecast of the following three years is reduced by 29%. The well-based normalization uses fewer data when repeatability is used as a filter, and the result is poorer. The value of seismic data is demonstrated by matching production data only, where the history and forecast misfit reductions are 45% and 20% respectively while the seismic misfit increases by 5%; in the best case using seismic data, the seismic misfit drops by 6%. We conclude that normalization with repeatability-based filtering is a useful approach in the absence of full calibration and improves the reliability of seismic data.
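The two ingredients of the workflow above — NRMS-based repeatability screening and least-squares scaling of observed attributes to the model — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the attribute arrays, and the NRMS cutoff value are all assumptions for the example; only the NRMS definition (twice the RMS of the difference divided by the sum of the RMS values, expressed in percent) and the idea of fitting a linear regression over well-predicted points come from standard practice and the summary.

```python
import numpy as np


def nrms(base, monitor):
    """Normalized RMS difference between base and monitor measurements,
    in percent: NRMS = 200 * RMS(monitor - base) / (RMS(monitor) + RMS(base)).
    Low values indicate high survey repeatability."""
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 200.0 * rms(monitor - base) / (rms(monitor) + rms(base))


def normalize_observed(observed, synthetic, nrms_map, nrms_max=40.0):
    """Scale observed 4D attribute values to the scale of the model's
    synthetic values via least-squares regression, fitted only over
    points whose NRMS (e.g. measured outside the reservoir) is below
    the cutoff nrms_max. The cutoff of 40% is an illustrative choice."""
    keep = nrms_map < nrms_max
    # Fit synthetic = a * observed + b over the repeatable points only,
    # then apply the mapping to the whole observed attribute map.
    a, b = np.polyfit(observed[keep], synthetic[keep], 1)
    return a * observed + b
```

In a regional variant of the fit, the `keep` mask could additionally be restricted to the vicinity of wells with a good production match, mirroring the second regression equation described in the summary.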
