People in the oil and gas industry constantly strive to improve exploration and development success rates, and one of the most effective tools of the past decade has been 3-D seismic. Today, while that technology is fully integrated into the business, it continues to evolve.

In February, the Rocky Mountain Association of Geologists and the Denver Geophysical Society hosted their 8th annual 3-D seismic symposium in Denver. The event, which drew more than 400 people, focused on creating value through the use of 3-D seismic technology. Featured presentations covered the latest trends in acquisition, interpretation and imaging techniques, as well as case histories illustrating the commercial value of 3-D seismic.

The upshot? The use of 3-D seismic is expanding beyond prospect-level interpretation into a broader force in the business.

Certainly, there are constant improvements and tweaks that geophysicists can make to 3-D data. Presentations illustrated new analytical techniques, such as the ability to image fluid-filled fractures using long-offset amplitude data, or the use of high-frequency data to image very thin beds. New methods of inverting seismic data were also discussed. Case histories buttressed 3-D seismic's already formidable reputation, offering examples of how the technology aids success rates.

But the real intrigue comes from the way 3-D seismic is opening new doors, using its tremendous store of information to fundamentally change the traditional relationships between geophysicists and their data, and between geophysics and the rest of the industry.

Visualization

Certainly, one of the most exciting advances of the past several years has been the explosion of visualization technology, which is rapidly becoming an essential tool for oil and gas professionals.

Michael Zeitlin, president and chief executive officer of Magic Earth, a subsidiary of Halliburton, noted that the business drivers in the geophysical sector will absolutely require acceptance of visualization technology. First, cycle times are being compressed. "There are half the number of geologists and geophysicists working 10 times as much data as we had before, and we have accuracy issues."

Data volume is another driver. The industry has acquired huge amounts of data: more than 1 million square kilometers of 3-D seismic are currently on the market, and nine lines out of 10 are simply never looked at. The third driver is demographics, the aging of the geophysical workforce. "We don't have many young people coming into the pipeline."

These forces mean geophysicists must use whatever means are available to be more successful and more productive. Today, a single geophysicist might cover a data area that encompasses 21 offshore blocks by 14 blocks, with 4,000 lines of data. In the early 1980s, a typical work area was one-tenth that size. Speed and accuracy are crucial, and visualization greatly aids that quest.

Until recently, geophysicists had to interpret 3-D data in 2-D planes, cutting 3-D seismic volumes into inlines and crosslines. Now they can employ visualization technology to work that data in three dimensions. Visualization also allows interpreters to interact with extremely large volumes of data at different resolutions, and fresh insights into the subsurface are gained through pattern recognition. Other advantages include an improved ability to collaborate, to reach partner consensus, to gain project approvals, and to quality-control a project.
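To make the contrast concrete, here is a minimal sketch in Python with numpy of the difference between slicing a seismic cube into 2-D planes and operating on the full volume at once. The array names, dimensions and threshold are illustrative assumptions, not any speaker's actual workflow or tool.

```python
import numpy as np

# Illustrative 3-D seismic amplitude cube indexed (inline, crossline, time).
# Real volumes are loaded from SEG-Y files; these dimensions are toy values.
rng = np.random.default_rng(seed=0)
volume = rng.standard_normal((200, 150, 500)).astype(np.float32)

# Traditional 2-D interpretation: cut the cube into planes.
inline_section = volume[100, :, :]    # vertical section along one inline
crossline_section = volume[:, 75, :]  # vertical section along one crossline
time_slice = volume[:, :, 250]        # horizontal slice at a fixed time

# Volume-based work treats all samples at once, e.g. flagging every voxel
# whose absolute amplitude exceeds a threshold, a crude stand-in for the
# pattern recognition that interactive 3-D visualization enables.
threshold = 3.0
bright = np.abs(volume) > threshold
print(f"{bright.sum()} of {volume.size} voxels exceed the threshold")
```

Desktop displays, large screens and immersive rooms differ mainly in how interactively an interpreter can explore the results of operations like these.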
New techniques that enhance visualization are also evolving, said Zeitlin. Multi-attribute event picking, combination volumes, co-rendering and the proper use of color are adding even deeper levels of understanding for interpreters working with data volumes.

Geoffrey A. Dorn, executive director of the BP Center for Visualization at the University of Colorado, noted that leaps in visualization technology have been made in resolution, interactivity and display size. Visualization encompasses a range of technologies, from small workstations to fully immersive environments.

The basic level employs small desktop displays. While very valuable, the approach focuses the interpreter on domain-specific applications. Desktop visualizations enhance the traditional geophysical tasks, improving accuracy, completeness and efficiency, but do not fundamentally change the way people relate to the data. That approach is now being supplanted. "During the past four years, the industry has been moving to large-screen systems for certain applications," he said. A step up from desktop systems, large-screen displays supply a semi-immersive environment, foster the integration of different data types and raise collaboration to a new level.

Fully immersive environments are the pinnacle of visualization technology. In these, users interact with data in the same way they would interact with the natural world around them. "This allows people to perceive and interact with the data using the tools they have developed since childhood to understand and interact with the real world. It frees up the cognitive portion of their mind to focus on science, engineering and other technical issues."

While the industry is intrigued with these cutting-edge environments, three concerns are still preventing widespread embrace: high cost, a lack of applications and a lack of case histories that prove significant economic value. People in the visualization field are working hard to overcome those hurdles, noted Dorn.

Reservoir simulation

Another emerging use of seismic technology is in reservoir simulation. A simulation is built through the integration of petrophysical, geological, geophysical and reservoir engineering data. It constructs a model of the subsurface that defines a reservoir's limits, its structure, volume, heterogeneity and properties. An accurate simulation allows a company to understand a reservoir, and from that understanding to extract the maximum value from the asset. Ideally, a simulation determines the distribution of hydrocarbons and how those hydrocarbons will move within a reservoir.

Keynote speakers William Keach and Nicholas Purday, volume visualization and interpretation managers with Landmark Graphics Corp., a Halliburton company, spoke about making better economic decisions by combining 3-D seismic with reservoir simulation. Traditionally, simulating a reservoir is a huge, complex process that covers only a very small area, and conventional models are time consuming and laborious to construct. Today, however, the additions of 3-D seismic and visualization technologies are ushering in a fresh approach. Seismic data can now be used to directly estimate reservoir properties. And, by using newly developed volume reconnaissance and delineation techniques, accurate and useful reservoir models can be developed rapidly.
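One common way to estimate a reservoir property directly from seismic is to calibrate a relationship between a seismic attribute and a log-derived property at well locations, then apply it across the volume. The sketch below illustrates that idea only; it is not the specific Landmark workflow, and every name and number in it is hypothetical.

```python
import numpy as np

# Hypothetical calibration data: a seismic attribute (say, acoustic
# impedance) extracted at well locations, paired with log-derived porosity.
impedance_at_wells = np.array([7800.0, 8200.0, 8900.0, 9400.0, 9900.0])
porosity_at_wells = np.array([0.24, 0.21, 0.17, 0.13, 0.10])

# Fit a simple linear trend: porosity ~ a * impedance + b.
a, b = np.polyfit(impedance_at_wells, porosity_at_wells, deg=1)

# Apply the calibrated trend to an entire (toy) impedance volume to get a
# first-pass porosity volume suitable for screening a geocellular model.
rng = np.random.default_rng(seed=1)
impedance_volume = rng.uniform(7500.0, 10000.0, size=(50, 50, 100))
porosity_volume = a * impedance_volume + b

print(f"trend: porosity = {a:.2e} * impedance + {b:.2f}")
print(f"predicted porosity range: {porosity_volume.min():.2f} "
      f"to {porosity_volume.max():.2f}")
```

In practice, the calibration would use many wells and more sophisticated inversion, but the principle of carrying well control out into the seismic volume is the same.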
"How much science do we need to do? People can spend too much time doing science projects. From a business aspect, we have to determine what information is essential to make the best decisions," Keach said. Once the critical aspects of an interpretation have been identified, unnecessary data can be stripped from the volume. Geocellular modeling and reservoir analysis can then be tightly focused on key areas. All available data-3-D seismic, velocity and well logs-are used to isolate seismically defined reservoir units. These reservoir units can then be directly incorporated into a reservoir simulation. The resulting simulations offer swift insights. Costs and potential revenues can be compared over a range of drilling options. "Combining a reservoir simulation with 3-D seismic allows a company to make the best business decisions. The results are very accurate and successful well planning." Two key points are that new technologies put seismic-to-reservoir stimulation within the reach of most interpreters, and that the rapid delineation and development of reservoir models leads directly to better economic outcomes. Naturally, barriers to the use of this technique still exist, noted Purday. Getting people to change is always a challenge, and companies need to commit to the technology and process. Nevertheless, momentum is growing. "It's a new framework for understanding seismic, and it's the next generation of interpretation." Seismic monitoring Another exciting use of 3-D seismic adds the dimension of time to the study of the subsurface. In theory, comparing successive seismic surveys acquired over the same area at different times will reveal the movement of fluids in the subsurface. While time-lapse seismic is not a new concept, practitioners of the technology are in the stage-necessarily a long one due to its very nature-of compiling case histories to illustrate its effectiveness. (See "Time-Lapse Seismic," Oil and Gas Investor, May 1998.) In that vein, Tom Davis, professor of geophysics, Colorado School of Mines, offered a case study of successful seismic monitoring of a massive CO2 flood at Weyburn Field in southeast Saskatchewan. The ongoing study is being conducted by the Reservoir Characterization Project (RCP), a consortium composed of industry and academia. Weyburn Field contains 1.4 billion barrels of oil in place, but since its discovery in 1954 only 24% of it has been produced. A waterflood was initiated in the early 1960s, but clearly some enhanced recovery methods were needed to further boost recoveries. In October 2000, operator PanCanadian Petroleum initiated a CO2 flood with the goal of increasing production by at least 15% incremental oil. The company uses horizontal wells as injectors, with laterals of about 2,500 feet that are about 400 feet apart. About 7 million cubic feet of CO2 is injected each day. The objective of the RCP study is to look at injection and receiving patterns in the reservoir in four dimensions. In September 2000, the group acquired a baseline multicomponent 3-D survey over nine square kilometers (covering four CO2-injection patterns). Last year, after some 23 billion cubic feet of CO2 had been injected into the reservoir, it acquired another survey in the same area. Another is planned for this fall. Davis is heartened by the early results. By comparing differences in the seismic volumes, changes can be seen over time. The reservoir of interest is about 100 feet thick and occurs at 5,000 feet. 
The first phase of the monitoring project has been successful in identifying areas where the oil-CO2 mixture is moving, and in identifying areas that remain unswept. For instance, CO2 could be seen fingering along off-trend fracture zones, the cause of early response and breakthrough. "We have shown that seismic monitoring is feasible in this very thin reservoir at a mile deep. The economic impact is huge. The monitoring of the flood's conformance is the key to its success."

The benefits of this dynamic approach are clear: "We can get a lot more oil and gas out of our existing reservoirs. Seismic monitoring is an emerging technology, and it is going to lead to more production."

And that is precisely the point. As Michael Zeitlin noted, "Remember what business we are in: the oil and gas finding business, not the technology business."