A wavefield snapshot with interval velocity overlay. (Image courtesy Tierra Geophysical)
For most E&P projects using seismic imaging, the process seems fairly straightforward: acquisition, processing and interpretation. Now a fourth element can be added to that outline: modeling.
It is no surprise to the industry that seismic data is often imperfect. Knowing the extent of that imperfection is critical to upstream decision-makers. New finite-difference software modeling can make that happen.
Start-up company Tierra Geophysical LLC, based in Denver, specializes in software products, such as finite-difference modeling packages, that help geoscientists model a seismic survey before it is actually acquired, to ensure the information gleaned from the survey will be accurate and will justify the cost.
While this may appear to be putting the cart before the horse, the industry has long realized such modeling could be a significant risk reducer. The problem is that it has been too expensive to be practical for most surveys. That is changing.
“Not only is the process getting much cheaper, but with more realistic modeling, it’s accurate enough that it can reliably address key risk issues about a lot of large decisions,” says founder and senior scientist Christof Stork.
Here’s how it works. “Once you get to the point of considering a seismic survey, you know the geology you’re looking for. This technique is almost the classical scientific process, which we’ve almost never performed before in geophysics. You make a hypothesis, and then you test it,” he says.
A finite-difference model includes the hypothesized geology, and the computer simulates a seismic-acquisition survey over it. The simulated data is then processed and migrated to produce a sample image, which can be compared with the original hypothesized geology.
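At its core, a finite-difference simulation advances the wave equation through time on a grid of the hypothesized velocity model. As an illustrative sketch only (not Tierra's actual code, and far simpler than a production 3-D simulator), a minimal 1-D acoustic version in Python/NumPy might look like this; the function and parameter names are invented for the example:

```python
import numpy as np

def ricker(freq, dt, nt):
    """Ricker (Mexican-hat) source wavelet with peak frequency `freq` Hz."""
    t = (np.arange(nt) - nt // 2) * dt
    a = (np.pi * freq * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def simulate_1d_acoustic(velocity, dt, dx, nt, src_idx, wavelet):
    """Second-order finite-difference solution of the 1-D acoustic wave
    equation p_tt = v(x)^2 * p_xx, with a point source injected at grid
    index src_idx. Returns an (nt, nx) array of pressure snapshots."""
    nx = len(velocity)
    c2 = (velocity * dt / dx) ** 2          # squared Courant number per cell
    assert c2.max() <= 1.0, "time step violates the CFL stability limit"
    p_prev = np.zeros(nx)
    p_curr = np.zeros(nx)
    snaps = np.zeros((nt, nx))
    for it in range(nt):
        lap = np.zeros(nx)                  # centered second difference in x
        lap[1:-1] = p_curr[2:] - 2.0 * p_curr[1:-1] + p_curr[:-2]
        p_next = 2.0 * p_curr - p_prev + c2 * lap
        if it < len(wavelet):
            p_next[src_idx] += wavelet[it]  # inject the source term
        p_prev, p_curr = p_curr, p_next
        snaps[it] = p_curr
    return snaps

# Hypothesized model: a constant 2,000 m/s medium on a 4 km line, 10 m cells.
vel = np.full(401, 2000.0)
wav = ricker(15.0, dt=0.002, nt=100)
wavefield = simulate_1d_acoustic(vel, dt=0.002, dx=10.0, nt=500,
                                 src_idx=200, wavelet=wav)
```

A real survey simulation extends the same time-stepping idea to 3-D, many shots, realistic boundary conditions and elastic physics, which is what historically made it so expensive to run at scale.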
Because seismic interpretation often involves analyzing subtle features in the presence of noise, such modeling is valuable: it accurately predicts how the geology will appear in the final seismic image, visualizing how subtle features and noise express themselves.
The outcome helps geoscientists determine what key features to look for in seismic images, whether a specific feature is real or merely noise, and which features can be reliably interpreted.
“The modeling is so realistic that it will include noise such as multiples, ground roll and surface scattering. You can also include acquisition artifacts, such as production platforms, that create acquisition holes.
“Being able to see how the geology represents itself in the seismic can help decide whether it’s worth acquiring data, whether to believe the image enough to drill a well and how to best do reservoir characterization.”
This has been done before, most notably by BP, which modeled a wide-azimuth survey over its Mad Dog Field in the Gulf of Mexico before firing a single shot of the expensive groundbreaking acquisition. The cost was not insignificant. Stork says the company spent several million dollars on a modeling cluster and almost a year simulating the data.
Stork says the combination of imaging more subtle targets and new acquisition hardware makes the time ripe for a modeling revolution. Also, Stork was able to model the Mad Dog survey at a fraction of BP’s cost.
“My cost for that large survey might be $200,000 or less,” he says. “The bottom line is that people can now do this realistic modeling at a reasonable cost. The technology has been around for 30 years, and everyone knows it’s been great. But when the technology finally reaches the point where it’s cheap enough and people find the right use, it explodes. I think that’s going to happen now.”
Shell Oil Co. is also on board. “That was a wonderful shot in the arm for me because Shell is very technical, has high standards, and does most of its work in-house,” he says. “But they did several tests of my software and bought it in a snap.” WesternGeco, a unit of Schlumberger Ltd., has also licensed the software.
Stork says the potential of the technology should grab the attention of high-level decision-makers in major oil companies because they typically make risk-reward decisions based on imperfect seismic data.
“If somebody is going to acquire a $10-million seismic survey, drill a $50-million well based on seismic data or start a $100-million enhanced-recovery program, knowing how much he can believe that data adds a lot of value.”