Quantitative Quality Control: a Tool for Seismic Data Processing Monitoring and Comparison

Authors: A. Nateganov, A. Popov and B. Pagliccia
Event name: Saint Petersburg 2018
Session: Application of Seismic Data Processing
Publication date: 09 April 2018
DOI: 10.3997/2214-4609.201800239
Organisations: EAGE
Language: English
Info: Extended abstract, PDF (774.98 KB)

Summary:
Quality Control (QC) is an essential means of following up seismic data processing, necessary to ensure the appropriateness of the final deliverables. A qualitative approach, based mainly on visual analysis of spectra, gathers, sections, slices, etc. before and after key processing steps, is the most commonly used form of such QC. The introduction of so-called "well-driven/guided" methodologies can be regarded as a shift towards a more quantitative way of performing seismic data processing QC. At Total, the "Quantitative QC" methodology was extended even further, beyond well-to-seismic ties, by integrating different seismic attributes whose numerical statistics quantify seismic data quality in different domains (signal quality; its vertical, lateral and AVO stability and consistency; etc.) after the key stages of any particular processing workflow. The present paper describes the main principles of the "Quantitative QC" implementation, followed by a real case example illustrating that it allowed not only evaluation of the re-processing potential compared with the legacy dataset, but also comparison and ranking of the five contractors that participated in a pilot re-processing competition. The quantitative approach is not, however, intended to fully substitute for the qualitative one, but rather to complement it, providing a broader view of (re-)processing effectiveness and its eventual impact on seismic data quality.
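The core idea, computing numerical statistics from seismic attributes and using them to compare datasets after key processing stages, can be illustrated with a minimal sketch. The attribute choices and metric names below (RMS amplitude, a trace-correlation proxy for lateral stability) are illustrative assumptions, not the authors' actual implementation; a real Quantitative QC workflow would draw on a much richer attribute set across signal, vertical, lateral and AVO domains.

    # Minimal sketch of attribute-statistics QC, assuming each dataset is a
    # 2-D NumPy array of traces (n_traces x n_samples); all function and
    # metric names here are illustrative, not from the paper.
    import numpy as np

    def rms_amplitude(data):
        # Per-trace RMS amplitude: a basic signal-strength attribute.
        return np.sqrt(np.mean(data ** 2, axis=1))

    def lateral_stability(data):
        # Mean correlation between neighbouring traces, used here as a
        # crude proxy for lateral consistency of the signal.
        corrs = [np.corrcoef(data[i], data[i + 1])[0, 1]
                 for i in range(len(data) - 1)]
        return float(np.mean(corrs))

    def qc_summary(data):
        # Collapse the attributes into a few scalar statistics so that
        # datasets (legacy vs. re-processed) can be compared numerically.
        rms = rms_amplitude(data)
        return {
            "mean_rms": float(rms.mean()),
            "rms_cv": float(rms.std() / rms.mean()),  # amplitude consistency
            "lateral_stability": lateral_stability(data),
        }

    # Toy comparison: rank two synthetic "contractor" volumes against a
    # legacy volume by their QC statistics.
    rng = np.random.default_rng(0)
    legacy = rng.normal(size=(50, 500))
    candidates = {
        "legacy": legacy,
        "contractor_A": legacy + 0.1 * rng.normal(size=legacy.shape),
        "contractor_B": rng.normal(size=(50, 500)),
    }
    for name, volume in candidates.items():
        print(name, qc_summary(volume))

Tabulating such statistics per contractor and per processing stage gives the kind of numerical ranking the abstract describes, while the qualitative visual inspection remains the complementary check.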

