Seismic change is afoot

11 May 2020


Ludger Mintrop’s balls were huge – certainly bigger than most balls the residents of turn-of-the-century Göttingen, Germany, had ever seen.

The scientist had managed to haul one of these spheres, forged out of solid steel, to the top of a 48ft-high metal frame, holding it in position using a crude winch.

A pupil of the pioneering seismologist Emil Johann Wiechert, Mintrop had a deep understanding of the character and scale of the tremors that daily rumbled beneath the earth. The student, however, wanted to go one better than the master; he would create his own earthquakes, dropping his great steel balls from the metal frame to hit the ground with more force than anything seen before in Göttingen. By repeating this experiment day after day, instead of waiting for equivalent natural tremors, Mintrop would go on to fine-tune his seismological equipment and deepen humanity’s understanding of the world that lies beneath our feet.

It was this simple feat of engineering that established the rules behind seismic exploration on a grand scale. Within two decades, geologists were able to survey the rock structures beneath their feet with increasing accuracy by generating similarly artificial tremors – first, by detonating explosives, then through very loud pulses of sound – and measuring the time it took for them to return from a variety of different vantage points.

Oil and gas companies have relied on a variation of this method to uncover huge hydrocarbon deposits on land and at sea. Known as seismic imaging, it involves sending a focused sonic pulse down into the ground, and measuring the intensity of whatever sounds echo back using hydrophones. From this, oil companies strive to create a 3D map of any reservoir that might lie beneath, informing the location of exploratory drilling platforms.
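
To make the basic arithmetic concrete, here is a minimal sketch of the travel-time calculation that underpins the method; the velocity and echo time are illustrative assumptions, not figures from any real survey.

```python
# Minimal sketch of the travel-time arithmetic behind seismic imaging.
# The velocity and two-way time below are illustrative assumptions only.

water_velocity_m_s = 1500.0   # approximate speed of sound in seawater
two_way_time_s = 2.0          # time for the pulse to travel down and echo back

# The pulse travels down and back up, so the reflector sits at half the
# total distance covered during the two-way travel time.
depth_m = water_velocity_m_s * two_way_time_s / 2
print(f"Estimated reflector depth: {depth_m:.0f} m")   # prints 1500 m
```

In practice, velocity varies with depth and rock type, which is part of why the interpretation described below is so demanding.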

The method might be straightforward, but interpreting the data generated is not. Divining the shape and character of the earth beneath from the echoes of seismic waves requires not only the use of complex algorithms, but also significant raw processing power to find the signal in all the noise propagated by the initial sonic pulse.

Go figure

Since the 1980s, that has led oil and gas companies to invest heavily in the potential of in-house supercomputers. In February, Italian energy giant Eni announced the creation of ‘HPC5’, a supercomputer capable of performing 52 million billion operations every second (around 52 petaflops), which it would use not only to run Echelon, its reservoir simulator, but also to hone AI-based approaches to processing. Other notable competitors in this space include Total, with its ‘Pangea III’ supercomputer, and ExxonMobil, a prominent user of the Blue Waters supercomputer at the University of Illinois.

The appeal of using such machines, either in-house or through lease agreements, is simple: the more raw processing power they have, the greater clarity oil and gas companies are able to obtain in their seismic imaging from the data available.

“They want to process it in a variety of different ways, using different algorithms to try to get the clearest picture that they can,” explains Marc Spieler, business lead for energy at Nvidia. “Because time is limited, they want to make sure that when they process it, they’re processing it in such a way that they’re able to get the most accurate picture as quickly as possible, so that if they don’t see what they’re looking for, they’re able to move on to other potential prospects.”

Dark frontiers

While many people might associate Nvidia with its work creating graphics cards to improve the visual quality of video games, the company also liaises extensively with oil and gas companies seeking to upscale their computing resources and better contend with processing seismic data. Its work in both markets, however, springs from the same source: the graphics processing unit (GPU).

GPUs work by manipulating the shading, colour and light of individual pixels on computer screens – values that are, essentially, the outputs of mathematical equations. “Over time, people in the academic field realised that they could actually program those processor cores on the GPU to solve maths equations and run highly parallelised code across those cores,” says Spieler.

What’s more, GPUs can house thousands of cores – the individual processors within the unit that each work on a single task – far more than the more familiar central processing units (CPUs), which support at most a few dozen. As such, the supercomputers capable of processing the huge volumes of data generated by offshore seismic surveys once did so by uniting the processing power of many hundreds of CPUs. The emergence of GPUs would increase the raw computing power available to them by orders of magnitude.
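
As a rough illustration of that difference, the sketch below runs the same transform over a block of synthetic traces on the CPU with NumPy and, where a GPU is available, with CuPy, a NumPy-compatible GPU array library; the array sizes and the choice of an FFT are assumptions made for the example, not any company’s actual workflow.

```python
# Illustrative CPU-vs-GPU comparison on synthetic "traces" (random numbers).
import numpy as np

traces_cpu = np.random.rand(4096, 2048)    # 4,096 traces, 2,048 samples each

# On the CPU, NumPy spreads this work across a handful of cores.
spectra_cpu = np.abs(np.fft.rfft(traces_cpu, axis=1))

try:
    import cupy as cp                      # GPU path, if CuPy and a CUDA device are present
    traces_gpu = cp.asarray(traces_cpu)    # copy the traces into GPU memory
    # The same transform now runs across thousands of GPU cores at once.
    spectra_gpu = cp.asnumpy(cp.abs(cp.fft.rfft(traces_gpu, axis=1)))
except ImportError:
    spectra_gpu = None                     # no GPU stack available; the CPU result stands
```

The point is not the particular transform but the shape of the workload: the same independent operation repeated across a very large number of samples, which is exactly what thousands of GPU cores are built for.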

“You can imagine the level of detail coming up,” says Spieler. “Being able to actually compute higher-frequency pictures has allowed them to actually get much more clear and accurate pictures of the subsurface that they just couldn’t do before. In the past, it just wasn’t cost-effective to get the same level of resolution that you can now get with a GPU basis.”

Not every imaging problem, however, can be solved with more processing power. Subsea salt deposits, for example, pose a challenge for even the most powerful supercomputer. While such salt formations can indicate the presence of hydrocarbons beneath them, this is not easily confirmed by conventional seismic imaging methods, as the salt tends to distort and refract the signal in ways that are difficult for traditional algorithms to interpret.

Several energy companies have made progress in attacking this problem at source. In 2018, BP announced its development of Wolfspar, a seismic sound generator that emits sonic pulses at a far lower frequency than conventional air guns. Designed to survey the salt beds on the ocean floor in the Gulf of Mexico, the unit – once paired with an innovative algorithm devised by a company researcher – succeeded in uncovering a huge new reserve in the region, boosting BP’s production there by 100,000 barrels a day.

The approach has not enjoyed unqualified success, however. BP’s expertise in seismic surveys failed to achieve tangible results in the Peroba deepwater block, a salt-heavy region off the Brazilian coast that many assumed hid vast unexploited reserves. Nevertheless, the shift away from exploratory drilling towards sophisticated modelling techniques has led to significant cost savings for BP, and Spieler predicts further changes in how such models are used in the years ahead.

“It used to be that, in order to [achieve] high-class imaging, you had to have big computing resources,” he says. “In order to have a lot of computing resources, you had to be either a very large company that could afford it, or you had to be a service provider.”

That’s changing with the growing popularity of cloud computing. Companies with in-house supercomputers, like Total and Eni, may be left behind by competitors buying processing power on the open market.

“You’ll see that a lot of folks are now also turning to buying big systems, but also bursting to the cloud for excess capacity when they have big projects,” Spieler continues. “But this is also evening out the playing field. Smaller companies couldn’t afford to have 2,000 servers sitting in their data centre, but had a project that might take three months [to process]. They can now use an Azure, or an Amazon Cloud service, and have access to the same size supercomputers that the large oil and gas companies do.”

Open access

Spieler is also excited by the work being done to pair the huge processing power available to energy companies with deep learning. “They have a lot of data that they have been able to accumulate and keep,” he explains. “Now, what we’re finding is that with... new deep-learning capabilities, they’re able to actually go through and process tremendous amounts of data in real time, which will give them insights into previous decisions made.”

Surveys previously thought to have revealed little useful information could now, potentially, be re-examined. Data could also be analysed more quickly, with machine-learning algorithms helping researchers identify which areas are more promising than others. Spieler doesn’t think the energy industry is ready to let AI make all the decisions quite yet, however. Nevertheless, such a development could bring significant cost savings related to recruitment.
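
A purely hypothetical sketch of that kind of triage is below, using scikit-learn; the features, labels and notion of a ‘promising patch’ are invented for the illustration and do not reflect how any operator actually labels its surveys.

```python
# Hypothetical example: score patches of an archived survey so the most
# promising ones are re-examined first. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Pretend each row summarises one patch of a past survey:
# [reflection amplitude, signal-to-noise ratio, structural dip]
X_train = rng.random((500, 3))
y_train = (X_train[:, 0] > 0.6).astype(int)    # stand-in "promising" label

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Rank patches of an archived survey for a second look.
X_archive = rng.random((20, 3))
scores = model.predict_proba(X_archive)[:, 1]
print(np.argsort(scores)[::-1][:5])            # indices of the five most promising patches
```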

“It’s going to help oil companies that are finding it more difficult to pull younger people into the industry, to basically get through the best amount of data more quickly,” says Spieler.

It would be a far cry from the straightforward methods of Mintrop’s experiments with his vast steel balls. Nevertheless, the basic method – generate an artificial movement in the earth, measure it and map the subterranean world – remains sound. Whether or not machines will be undertaking the third in that trio of tasks remains to be seen.


