Seismic Imaging

Reflection Seismic

Reflected seismic waves carry high-resolution information about the geology and rock properties of the subsurface. Reflection seismics is therefore the dominant method for finding oil and gas reservoirs. The method explores the subsurface by means of acoustic waves that are excited by a large number of artificial sources at the surface.

Seismic Data Processing

The method of reflection seismics provides insight into the structure of the Earth's subsurface by recording seismic waves that are reflected at rock-property contrasts. In particular, it makes it possible to identify oil and gas reservoirs, which is why the method has been used for several decades in hydrocarbon exploration.
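The strength of such a reflection is governed by the acoustic impedance contrast between two layers. As a minimal sketch (the rock parameters below are illustrative assumptions, not measured values):

```python
# Normal-incidence reflection coefficient at a layer boundary:
#   R = (Z2 - Z1) / (Z2 + Z1),  with impedance Z = density * velocity.
# All numeric values below are illustrative assumptions.

def reflection_coefficient(rho1, v1, rho2, v2):
    """Normal-incidence reflection coefficient between two layers."""
    z1 = rho1 * v1  # impedance of the upper layer
    z2 = rho2 * v2  # impedance of the lower layer
    return (z2 - z1) / (z2 + z1)

# A drop in impedance (e.g. shale over gas sand) gives a strong
# negative reflection; identical layers give no reflection at all.
r = reflection_coefficient(2400.0, 2700.0, 2100.0, 2100.0)
```

It is exactly these impedance contrasts that show up as reflectors in the recorded data.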

Most of the easily detectable and accessible reservoirs have already been exploited. Further reservoirs are therefore expected in more complex and deeper structures, which places higher demands on the accuracy and efficiency of seismic data processing.

HPC for Seismic Data Processing

We account for these enhanced requirements by applying modern concepts of high performance computing (HPC) to the development of seismic data processing methods. Seismic data sets keep growing and cover ever larger parts of the subsurface. At the same time, solid conclusions about the prospectivity of a structure can only be drawn if details that are very small compared to the overall problem size are resolved and interpreted. The demand for highly efficient computation must therefore not compromise the accuracy of the results.

Measured seismic data must pass through a lengthy processing sequence whose components are chosen according to the acquisition geometry, the data quality, and the questions the data are meant to answer. At ITWM, we have a strong focus on seismic migration. This method can be understood as a high-dimensional data mapping and generates images of the subsurface that geologists can interpret. However, migration results are not directly suitable for this interpretation: they must be further conditioned in order to guarantee the required data quality and accuracy. Techniques such as time series analysis, multi-dimensional filtering, and data transformations are used for this purpose. Tools developed for these tasks became part of the PSPRO interpretation and processing software.
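One of the simplest conditioning steps of the kind mentioned above is band-pass filtering of individual traces in the frequency domain. The following sketch assumes illustrative corner frequencies and a 4 ms sampling interval; it is not taken from the PSPRO code base:

```python
import numpy as np

def bandpass(trace, dt, f_low, f_high):
    """Zero out spectral components outside [f_low, f_high] Hz."""
    spectrum = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    spectrum[(freqs < f_low) | (freqs > f_high)] = 0.0
    return np.fft.irfft(spectrum, n=len(trace))

# Synthetic trace: a 30 Hz signal buried in 5 Hz and 120 Hz noise
dt = 0.004                      # 4 ms sampling interval (assumption)
t = np.arange(0.0, 2.0, dt)
trace = (np.sin(2 * np.pi * 30 * t)
         + 0.5 * np.sin(2 * np.pi * 5 * t)
         + 0.5 * np.sin(2 * np.pi * 120 * t))
clean = bandpass(trace, dt, 10.0, 60.0)   # keeps only the 30 Hz signal
```

Production conditioning applies such filters, and far more sophisticated multi-dimensional variants, across entire pre-stack volumes.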


Our Portfolio of Projects and Products in the Oil and Gas Market Comprises:

  • Development of algorithms and production-ready software solutions for seismic data processing
  • Optimization and parallelization of clients' software
  • Special processing of true-amplitude prestack migration plus data conditioning

The method of reflection seismology scales across a wide range of wavelengths. It can therefore also be applied to shallow targets, for example to answer questions about the stability of constructions such as wind power plants.

Example Projects and Methods


Boulder-Detection with Machine Learning

We developed a process that detects and locates even small boulders and other obstructions in the ground.



The project »Deep Learning for Large Seismic Applications« (DLseis) covers everything from basic research to ready-to-use deep learning tools for seismic applications.


Seismic Prestack-Depth Migration GRT

»Generalized Radon Transform« (GRT) depth migration is an excellent example of how the combined competences of the HPC department are used to develop a tailor-made product.
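At the heart of GRT- and Kirchhoff-style migration lies a diffraction-stack summation: each image point collects the recorded amplitudes along its diffraction traveltime curve. The sketch below assumes a constant velocity and zero-offset geometry for readability; the production GRT code handles true amplitudes and general velocity models:

```python
import numpy as np

def diffraction_stack(data, dt, dx, velocity, nz, dz):
    """Migrate zero-offset data by summing along diffraction hyperbolas.

    data: (nx, nt) array of traces; constant velocity assumed.
    """
    nx, nt = data.shape
    image = np.zeros((nx, nz))
    x = np.arange(nx) * dx
    for ix in range(nx):            # image point lateral position
        for iz in range(1, nz):     # image point depth
            z = iz * dz
            # two-way traveltime from each surface position to (x[ix], z)
            t = 2.0 * np.sqrt(z ** 2 + (x - x[ix]) ** 2) / velocity
            it = np.round(t / dt).astype(int)
            valid = it < nt
            image[ix, iz] = data[np.arange(nx)[valid], it[valid]].sum()
    return image
```

Energy recorded along a hyperbola collapses back to the scatterer that produced it, which is why a point diffractor in the data focuses to a single bright spot in the image.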


The method of reverse time migration (RTM) stands out for its high imaging quality even in cases of high geological complexity. Because it solves the full wave equation, it can accurately image structures with strongly contrasting seismic velocities.
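The numerical core of RTM is a finite-difference time stepper for the wave equation, applied forward in time for the source wavefield and backward in time for the recorded data. A minimal 1-D sketch of such a stepper (grid size, velocity model, and pulse are illustrative assumptions):

```python
import numpy as np

def step(u_prev, u_curr, v, dt, dx):
    """One leapfrog time step of u_tt = v^2 u_xx (edges held fixed)."""
    u_next = np.empty_like(u_curr)
    lap = u_curr[:-2] - 2.0 * u_curr[1:-1] + u_curr[2:]
    u_next[1:-1] = (2.0 * u_curr[1:-1] - u_prev[1:-1]
                    + (v[1:-1] * dt / dx) ** 2 * lap)
    u_next[0] = u_next[-1] = 0.0
    return u_next

# Two-layer model with a strong velocity contrast (illustrative values);
# v * dt / dx = 0.6 at most, so the scheme is stable (CFL condition).
nx, dx, dt = 300, 5.0, 0.001
v = np.full(nx, 1500.0)
v[150:] = 3000.0

# Smooth initial pulse with zero initial velocity: it splits into two
# half-amplitude pulses travelling in opposite directions.
x = np.arange(nx) * dx
pulse = np.exp(-((x - 250.0) / 50.0) ** 2)
u_prev = pulse.copy()
u_curr = pulse.copy()
for _ in range(100):
    u_prev, u_curr = u_curr, step(u_prev, u_curr, v, dt, dx)
```

Because the same stepper runs equally well with a reversed time axis, the receiver wavefield can be propagated backwards and cross-correlated with the source wavefield to form the RTM image.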

Pre-Stack PRO

Pre-Stack Pro is pre-stack seismic analysis software that combines pre-stack visualization, processing, and interpretation in one powerful platform.


ALOMA is a failure-tolerant runtime system that helps to meet the challenges of large-scale seismic processing by executing workflows on large-scale distributed systems.