Optimizing performance of data processing

Authors: Samuel Brown and Tony Martin
Journal name: First Break
Issue: Vol 37, No 12, December 2019 pp. 47 - 50
DOI: 10.3997/1365-2397.2019040
Language: English
Info: Article, PDF (1.84 Mb)
Price: € 30

Summary:
The industry associates data processing with the application of geophysical algorithms to seismic data, but this definition is too narrow. We should consider a broader meaning, particularly in these times of increased automation and of time and cost consciousness. If processing is a method of producing information from the manipulation of data, then we should consider the framework that enables this process. A system must exist to facilitate the work. Generally, systems are a collection of computers, networks, processes and, to a lesser extent, people. For any given input, they create an output, which in seismic data processing can be the application of a geophysical algorithm, or it can be the reordering, reducing or aggregating of data. In this system, the headline-grabbing geophysical process may be only a small part of the work. The underlying framework needs to be as efficient as it can be, whether it is managing the data, enabling the geophysical process, or both.
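The distinction the summary draws can be sketched in code. The following is a hypothetical illustration, not taken from the article: the "system" output here is purely a data-management step, reordering traces into gathers and reducing each gather by stacking, with no geophysical algorithm involved at all. The function name, the common-midpoint (CMP) keying, and the trace layout are all assumptions made for the example.

```python
from collections import defaultdict

def reorder_and_stack(traces):
    """Hypothetical data-management step: aggregate traces into gathers,
    reorder the gathers, and reduce each gather by stacking.

    traces: list of (cmp_id, [samples]) tuples in arbitrary order.
    Returns {cmp_id: stacked_samples}, visiting gathers in sorted CMP order.
    """
    gathers = defaultdict(list)
    for cmp_id, samples in traces:      # aggregate: group traces by CMP
        gathers[cmp_id].append(samples)
    stacked = {}
    for cmp_id in sorted(gathers):      # reorder: sorted CMP order
        group = gathers[cmp_id]
        n = len(group)
        # reduce: sample-wise mean (stack) across the gather
        stacked[cmp_id] = [sum(col) / n for col in zip(*group)]
    return stacked

traces = [(2, [1.0, 2.0]), (1, [0.0, 4.0]), (2, [3.0, 2.0])]
print(reorder_and_stack(traces))  # {1: [0.0, 4.0], 2: [2.0, 2.0]}
```

In a real processing system, steps like this one typically dominate I/O and runtime, which is the summary's point: the framework around the geophysical algorithm is where much of the efficiency is won or lost.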

