An innovative approach to automation for velocity model building

Authors: Tony Martin and Marcus Bell
Journal name: First Break
Issue: Vol 37, No 6, June 2019, pp. 57-65
DOI: EAGE-EXPORT-FAKE-DOI
Special topic: Embracing Change - Creativity for the Future
Language: English
Info: Article, PDF (14.15 MB)
Price: € 30

Summary:
While some efforts have been made to quantify structural uncertainty on seismic images (Osypov et al., 2011; Letki et al., 2013), the seismic data processing industry has found it challenging to measure the effectiveness of processing algorithms on seismic data. Understanding the success of a single algorithm can be time consuming, can require significant work, including research and development, and is therefore costly. Despite this, the demand for ‘error bars’ on processes is growing, while expectations are that projects should be completed faster. At the same time, seismic projects are getting bigger: it is not uncommon to see projects recording up to 20 trillion (2 × 10¹³) samples. Each project will typically have 15 to 20 major processing components, managed through intermediate data outputs, each with unique characteristics. More than those in any other industry, seismic acquisition and processing companies should be at the forefront of big data analysis. However, the industry has made little progress in advanced analytics or artificial intelligence, and there has been no major effort by companies to metamorphose from geophysics into geophysical data science. The innovation must therefore come from within, from the geophysicists who use processing workflows on a daily basis. Modifications and manipulations to established systems enable advanced analytics, reductions in turnaround, and the opportunity to provide confidence levels on output data volumes.
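
The two quantitative threads in the summary, project scale and ‘error bars’ on processing outputs, can be illustrated with a minimal Python sketch. Everything below is an assumption for illustration: the 4-byte sample size, the synthetic data, and the per-trace RMS-amplitude metric are hypothetical, and the bootstrap interval is a generic statistical device, not the authors' method.

import numpy as np

# Rough storage arithmetic for a survey of ~2e13 recorded samples
# (assuming 32-bit float samples; both figures are illustrative).
SAMPLES = 20_000_000_000_000
BYTES_PER_SAMPLE = 4
print(f"Raw volume: {SAMPLES * BYTES_PER_SAMPLE / 1e12:.0f} TB")  # ~80 TB

# A hypothetical per-trace quality metric on synthetic data:
# RMS amplitude of each of 1,000 traces with 500 samples apiece.
rng = np.random.default_rng(0)
traces = rng.normal(size=(1_000, 500))
metric = np.sqrt((traces ** 2).mean(axis=1))

# Bootstrap: resample traces with replacement to put an 'error bar'
# (a 95% confidence interval) on the mean of the metric.
boot = np.array([
    rng.choice(metric, size=metric.size, replace=True).mean()
    for _ in range(2_000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Mean RMS amplitude: {metric.mean():.3f}  95% CI: [{lo:.3f}, {hi:.3f}]")

Resampling-based intervals of this kind are one generic way to attach a confidence level to a derived quantity without rerunning the full processing sequence; whether the article's workflow does anything similar is not stated in this summary.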

