No stranger to what the rest of the world is calling “Big Data,” the oil and gas industry is nonetheless undergoing tectonic shifts in its traditional workflows and business practices. Driving these shifts are the advent of unconventional hydrocarbons and the complex, nonlinear, multivariate problem of commercializing and optimizing their production.

The information these techniques yield, critical to a CEO or anyone primarily incentivized by production growth, is often challenging for the operations side of the business because, for the first time, its performance can be compared directly with that of its peers.

State of the industry

The oil industry is an interesting beast. In some ways, it is and has been the most progressive user of computational power and techniques in realms such as seismology and reservoir modeling. Yet, it has been completely behind the curve in various other work silos.

In many of these silos, we apply computers merely to digitizing workflows that have been practiced since the 1950s or earlier with pencil and paper. Further, the most popular software packages in these areas are built on 25- to 35-year-old code bases.

The conceptual framework that still dominates is a point-specific, empty software package into which limited amounts of data are imported on a one-time basis and worked on by a technical professional, who puts out a qualified or semi-quantified product that is semi-integrated with the workflows of other professionals through various GIS layers … at best.

We have thousands of professionals building out tens of thousands of little islands of relevant knowledge that end up being stranded and unintegrated.

The amount of data available to us is increasing exponentially, but most of it is going fallow. Horizontal, hydraulically fractured unconventional wells are an order of magnitude more complicated and expensive than the vertical, unstimulated wells they replace.

As an industry, we have never had more known proved undeveloped reserves (PUDs) or probable and possible locations to drill, yet we have to develop them with fewer technical professionals entering the industry than are retiring.

The power of our software is still in its infancy. We still prefer the one-time qualified opinion of a consultant over the quantified insights offered by a machine and the building of organizational knowledge.

How do we capture that knowledge? At its best, software and data enhance our experts. What if every technical professional you employed had access to the ideas and knowledge of those who came before?

Think of tomorrow’s data and software as an Iron Man outfit you provide to each of your technical professionals, with each doing the work of three or five or 20 people today. That is a far better model than trying to figure out how to double, triple, or vigintuple (multiply twentyfold) your staff!

Let’s take a look at other industries. The financial sector has long been crunching the vast amounts of data that continually stream through the markets. Almost instantaneously, stock and option prices, commodity futures and company valuations adjust as new data and information are gathered. Complex statistical models allow analysts and managers to keep abreast of changing conditions. Analogous analytical tools, coupled with “dashboard” technologies, can help E&P management organize, present and forecast metrics of critical interest.

In another arena, high-speed DNA sequencers have led to the common and broad use of genetic code in fields from criminology to genealogy. Once an intractable problem, processing genetic sequences to correlate everything from ancestry to genetic health risks is now the norm. Many of the geological “unknowns” we have qualitatively solved for over the years lend themselves to these deterministic techniques.

Companies can use more than a hundred geological measurements from tens of thousands of wells to quantify the natural productivity of particular unconventional formations, with each formation enjoying a separate and distinct production signature.

Huge opportunities lie buried in the vast amount of data available to companies. A typical oil and gas company today holds about 3.9 million gigabytes (7,800 laptops’ worth of data), and most of it is lost or inaccessible. Typically, only one-half of one percent of this data has been analyzed to create actionable value.

For instance, drilling and stimulation optimization is a very complex problem or, as our statistics experts would say, a nonlinear multivariate problem. However, we are now able to quantify the effects of various engineering approaches as a function of the quality of the rock.
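To make the idea concrete, here is a minimal sketch, in Python, of a nonlinear, multivariate model of this kind. Everything in it is a stand-in: the file wells.csv, the rock and completion column names and the EUR target are hypothetical, and gradient-boosted trees are just one reasonable choice of nonlinear learner.

```python
# Illustrative sketch only; all file and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

wells = pd.read_csv("wells.csv")  # hypothetical per-well table

rock = ["porosity", "toc", "net_pay_ft"]               # rock-quality inputs
design = ["lateral_length_ft", "proppant_lb_per_ft"]   # engineering inputs
X, y = wells[rock + design], wells["eur_mboe"]          # EUR as the target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Gradient-boosted trees capture nonlinear effects and the interaction
# between rock quality and completion design without hand-built terms.
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("holdout R^2:", r2_score(y_test, model.predict(X_test)))
```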

This is just the tip of the iceberg. We see a whole new world where knowledge isn’t lost. We see a world in which patterns in data emerge that are beyond normal human cognitive capacity. Lastly, we see a world where applications aren’t dumb tools; they contain knowledge and have the capacity to learn.

Leveled playing field

Data from every facet of the oil and gas industry is being collected at ever-increasing rates. Organizations of all sizes can now take advantage of statistical analysis and data management to stretch scarcer, less leverageable resources such as time, capital and talent. The combination of new data management schemas, which avoid monolithic and expensively curated data stores, with integrated analytics software in fact creates a strong advantage for smaller organizations that are not encumbered by complex, last-generation legacy systems. Whether true wildcatters testing new zones or fast followers, companies of every scale can take advantage of these new value multipliers.

Few, if any, oil company CEOs love IT. At best, it is the necessary cost of keeping the technical professional productive. Properly used, it serves as the most cost-effective multiplier of technical professional productivity. It provides far more bang for the buck than any other investment, other than a great 10-banger well. But improperly used, it can evolve into an out-of-control beast that can absorb an infinite amount of money with little or no incremental productivity improvement.

Data maximization, management

Most E&P companies spend significant money managing vast amounts of data of extremely variable types, leaving executives wondering whether they are getting a good return on the investment.

A couple of items stand out. Dirty data is a blocker: garbage in, garbage out. Cleaning data, especially in real time, is becoming a critical core competence, yet one that most companies lack.
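As a rough illustration of the kind of cleaning step meant here, the pandas sketch below deduplicates records and flags suspect values rather than silently deleting them. The file and column names (production_raw.csv, api_number, oil_bbl, test_date) are hypothetical.

```python
# Minimal data-cleaning sketch; all names are invented for the example.
import pandas as pd

raw = pd.read_csv("production_raw.csv", parse_dates=["test_date"])

clean = (
    raw.drop_duplicates(subset=["api_number", "test_date"])  # duplicate reports
       .dropna(subset=["oil_bbl"])                           # records missing the key value
)

# Flag physically implausible volumes so they can be routed back to the
# data owner for correction instead of quietly disappearing.
clean["suspect"] = (clean["oil_bbl"] < 0) | (clean["oil_bbl"] > 50_000)
```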

Unconventional wells are also increasing demand for more efficient methods of managing and analyzing big data, since maximizing the return on investment of these expensive wells means the difference between pure loss and wild success. We now live in a world where one company can make a fortune out of the acreage that busted another.

High-performance analytics hold the promise of synthesizing this complex big data into more manageable results that support better decisions. Today’s compute power allows for much more complicated predictive models constrained and validated by huge amounts of data. Examples include:

  • Finding patterns of activity in leasing and permitting;
  • Identifying anomalous events that could signal emerging plays;
  • Revealing best practices in drilling/completion/production within the context of detailed earth models;
  • Benchmarking operator efficiencies within a given play for competitor analysis or for investment opportunities;
  • Better and more complex models for financial forecasting and recovery prediction;
  • Identifying the geological and reservoir parameters that most affect production (see the sketch after this list); and
  • Identifying step changes in oilfield services that open up previously poor or “barren” rock.
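As a concrete, if simplified, illustration of the geological-parameters bullet above, the sketch below ranks candidate rock attributes by permutation importance. The file wells.csv, the column names and the random-forest model are all assumptions made for the example.

```python
# Hypothetical illustration: which geological parameters matter most?
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

wells = pd.read_csv("wells.csv")  # hypothetical per-well table
geo = ["porosity", "toc", "net_pay_ft", "thermal_maturity"]  # candidate drivers
X, y = wells[geo], wells["eur_mboe"]  # eur_mboe: hypothetical production target

model = RandomForestRegressor(random_state=0).fit(X, y)

# Permutation importance: shuffle one column at a time and measure how much
# predictive skill drops; a bigger drop means production depends on it more.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in sorted(zip(geo, result.importances_mean),
                          key=lambda kv: kv[1], reverse=True):
    print(f"{name:>18s}  {score:.3f}")
```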

The unconventional oil and gas boom has also ushered in an era of step-change increases in acreage prices. Getting in front of a play before it is widely recognized is the dream of many, and many small companies are now able to anticipate these new or underdeveloped unconventional resource plays. Even when chasing conventional reservoirs, companies should fully evaluate unconventional resource projections and the stage of unconventional resource development in the area. This practice creates an acreage portfolio opportunity in a world where the step in value between conventional and unconventional acreage can be 100-fold or higher.

Workforce changes

Today, more technical professionals are retiring from the oil and gas industry than are being replaced by new entrants. Every new graduate who comes into the industry is replacing two retirees, each with 40 to 50 years’ experience. These new entrants are also getting managerial authority much earlier than their predecessors did, owing to the “valley of death” caused by essentially no hiring from the mid-1980s until the mid-2000s.

Without advanced technology, we will never be able to keep up with the drilling overhang that currently exists. This means that our existing knowledge must be captured in workflow processes and software tools so it is not lost.

On the bright side, recent graduates are far more computer-literate than their grayer antecedents and expect to use sophisticated computer-based analyses to process data and solve problems. The combination of dwindling historical expertise, new technical challenges, rapid advances in computer science and a more techno-savvy user base is leading us to exciting new innovations and methodologies.

Forward-looking trends

Predictive analytics are becoming better each year. We have always had systems that help predict where hydrocarbons are, where the water is, where the fault is and so forth—those continue to get better. In the past few years we have seen the advent of systems that predict potential drilling problems or downtime. These are systems that learn from past events on other wells and then predict the future, thus saving operators millions of dollars in drilling costs.
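A hedged sketch of what such a system might look like under the hood: a classifier trained on historical drilling records to flag conditions that preceded a problem such as stuck pipe. The file, the feature columns and the event label are invented for the example.

```python
# Illustrative only: learn from past drilling events to predict new ones.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

records = pd.read_csv("drilling_history.csv")  # hypothetical event log
features = ["rop_ft_hr", "torque_ftlb", "mud_weight_ppg", "hook_load_klb"]
X, y = records[features], records["stuck_pipe"]  # 1 = event occurred

# class_weight="balanced" because problem events are (thankfully) rare.
clf = RandomForestClassifier(class_weight="balanced", random_state=0)
print("cross-validated AUC:",
      cross_val_score(clf, X, y, scoring="roc_auc").mean())
```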

We are seeing a rebirth of neural-net technology that learns from the dynamic databases built up through daily work. Start-up seismic interpretation systems built around neural nets are just now making an appearance. It will be a few years before they become mainstream, but they are expected to make a big impact on the industry.
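For illustration only, the sketch below trains a small neural net on synthetic trace attributes of the kind such a system might use; the attributes and the facies label are fabricated stand-ins, not real seismic data.

```python
# Toy neural-net sketch on synthetic data; nothing here is a real workflow.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                  # e.g. amplitude, frequency, coherence
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # stand-in facies label

# One small hidden layer; in practice the net would be retrained as newly
# interpreted traces are added to the database.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                    random_state=0).fit(X, y)
print("training accuracy:", net.score(X, y))
```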

Prescriptive analytics, the name given to the branch of analysis whose results “prescribe” which actions will maximize a desired future outcome, are quite new and drive exciting ROI opportunities. An example of this technique is analytics that successfully prescribe how best to stimulate a well, given a particular rock signature, to maximize ROI or internal rate of return (IRR).
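A minimal sketch of that prescriptive step, assuming a fitted predictive model like the one in the first sketch (here called model) plus invented price and cost figures: enumerate candidate completion designs for a fixed rock signature and recommend the one with the highest predicted return.

```python
# Prescriptive sketch: search designs against a predictive model.
# Prices, costs and the rock signature are hypothetical placeholders.
import itertools
import pandas as pd

OIL_PRICE = 50_000                       # hypothetical $ per mboe of EUR
COST_PER_FT, COST_PER_LB = 1_000, 0.5    # hypothetical completion costs

rock = {"porosity": 0.08, "toc": 4.0, "net_pay_ft": 120}  # fixed rock signature

def predicted_profit(lateral_ft, proppant_lb_ft, model):
    X = pd.DataFrame([{**rock, "lateral_length_ft": lateral_ft,
                       "proppant_lb_per_ft": proppant_lb_ft}])
    revenue = model.predict(X)[0] * OIL_PRICE
    cost = lateral_ft * (COST_PER_FT + proppant_lb_ft * COST_PER_LB)
    return revenue - cost

# Prescribe: choose the candidate design with the highest predicted profit.
designs = itertools.product([5_000, 7_500, 10_000], [1_000, 1_500, 2_000])
best = max(designs, key=lambda d: predicted_profit(*d, model))
print("recommended lateral length / proppant loading:", best)
```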

Interestingly, one can also identify those operators that minimized their opportunity, leaving behind great acreage apparently condemned by the drillbit. It is a spectacular time to be in the oil business!

Allen Gilmer is co-founder, chairman and CEO of Drillinginfo. Previously, he was an independent oilman for seven years, co-founding three profitable E&Ps. He holds several patents in the field of multi-component seismology.