As oil prices drop and shale operators around the U.S. sweat, one statistic sticks out from the 2014 DUG Eagle Ford conference—the open admission on stage by one of the largest energy services companies that “60% of all fracture stages are ineffective.” The presenter continued by asserting that a “step change” in completions and productivity is needed to sustain the unconventional energy revolution.

Such declarations shouldn’t surprise anyone in the oil and gas industry, although it is important to place them in context. Our industry’s use of terms such as “shale manufacturing” and “factory wells” raises the question: Can any factory operate profitably when three of every five products it makes are defective? This state of affairs is, no pun intended, unsustainable over the long term.

Fortunately, every mid- to large-size operator is already collecting the “big data” that contains the recipes to transform completion effectiveness. The top-performing operators are putting the bits and the atoms together and improving performance.

Completions: Good, bad and ugly

A good completion is one that mechanically works as designed and creates a fracture network that maximizes well production as a function of one or more financial metrics (NPV, ROR, etc.). The cement isolates fluid movement, the plugs hold pressure (and don’t move), the casing doesn’t leak or split and the right sleeves actuate when balls are dropped. Beyond the mechanical equipment working as designed, a good completion also uses the optimum number of stages and clusters, as well as appropriate stage spacing and inter-well spacing, to maximize effectiveness and minimize the cost of the stimulation treatment.
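To make the financial framing concrete, here is a minimal, purely hypothetical sketch of how candidate completion designs might be compared on NPV. The production response, decline rate, prices and costs below are invented for illustration and are not from the article.

```python
# Illustrative only: hypothetical figures, not field data or the article's numbers.
# Scores candidate completion designs by the net present value (NPV) they generate.

def npv(cash_flows, discount_rate):
    """Discount annual cash flows (year 0 first) back to present value."""
    return sum(cf / (1.0 + discount_rate) ** year for year, cf in enumerate(cash_flows))

def design_npv(stage_count, oil_price=60.0, cost_per_stage=120_000,
               base_capex=4_000_000, discount_rate=0.10):
    """Toy model: each added stage lifts first-year output, with diminishing returns."""
    year1_bbl = 90_000 * (1 - 0.92 ** stage_count)    # hypothetical response curve
    annual_decline = 0.55                             # simplified shale-style decline
    production = [year1_bbl * (1 - annual_decline) ** yr for yr in range(5)]
    revenue = [bbl * oil_price for bbl in production]
    capex = base_capex + stage_count * cost_per_stage
    return npv([-capex] + revenue, discount_rate)

if __name__ == "__main__":
    for stages in (15, 25, 35, 45):
        print(f"{stages} stages -> NPV ${design_npv(stages):,.0f}")
```

Under these made-up assumptions, adding stages stops paying for itself once the incremental production no longer covers the incremental cost, which is the kind of trade-off a completion design must resolve.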

Many operators have moved to “factory-type” well implementations to improve unconventional asset economics. These well factories use the same drilling and completion recipes to leverage bulk purchasing power and reduce project costs.

Although the hardware and materials going into wells are uniform, production is not. Same-pad production can vary by 100% or more across wells. On occasion, widely varying well results may originate from laterals only 150 feet apart. Such wide variation is counterintuitive in an industry that has completed tens of thousands of wells in unconventional plays.

So why are there performance variations between wells for the same operator or between operators with adjacent wells? Is it geology? It is easy to blame poor performance on the reservoir. Does production variability originate in reservoir heterogeneities such as total organic carbon (TOC), brittleness and fracture density, or in a combination of reservoir and treatment heterogeneities? Are operators collecting the data necessary to identify these variables? Ultimately, the industry doesn’t have control over reservoir quality (although designer wells can treat sweet spots and bypass poorer rock), but operators can control and optimize each fracture stage if in-well instrumentation provides the correct data.

Operators have had plenty of data on the variables that affect well performance, but have lacked the technology to make sense of it.

Secondly, is variability in performance a reflection of poor stimulation efficiency? Did the treatment include the correct chemistry, optimum proppant size and volumes, and optimized pumping rates? Did the stimulation process achieve compartmentalization and zonal isolation?

Microseismic and tiltmeter monitoring are the primary diagnostic methods to monitor stimulation treatments. Both are far-field diagnostic tools. It is not possible to accurately track completion equipment and cluster-level performance using monitoring technologies designed to listen to rocks cracking.

If a service provider drops a ball and the wrong sleeve opens, how many possible explanations are there for why the microseismic events aren’t showing up where expected? Was it fluid traveling up a cement channel and then hitting a weak zone, or fluid hitting an unmapped fault or natural fracture network? Was it a casing leak? Too often, the last hypothesis considered is that the completion hardware itself did not perform as designed, because the industry tends to assume that plugs, sleeves and other hardware work exactly as intended.

Few shale wells are instrumented to the level where there is enough data to validate the effect of program modifications. One approach is to heavily instrument a few wells per field to ensure that all downhole equipment, cement and treatment recipes work as designed. True reservoir potential can only be calculated by identifying the failed or underperforming stages. Once enough data is available to identify what is working, operators can match stage performance to rock and seismic properties. Doing so allows the industry to drive well design to maximize project NPV or any other economic metric.

Compressing the learning curve

Big data is the most underutilized resource in the oil and gas industry today. Many operators have data containing more than 1,000 variables that can affect well performance and cost.

While the datasets operators collect are impossible to process and make sense of using existing technologies, a new technology from outside the industry is making headway and improving the bottom line. Prescriptive analytics, a software technology that incorporates all data, regardless of source, structure, size or format, to prescribe the optimal recipes for drilling, completing and producing wells, can maximize value at every point during a well’s serviceable life. The premise is as follows (a simplified, illustrative sketch of such a workflow appears after the list):

• The technology takes into account all pertinent data that an operator has in order to understand subsurface characteristics and the instrumentation recordings from far-field and near-wellbore monitoring.

• The technology prescribes a specific completion recipe for each new well (see software screenshot) that an operator is going to drill. This recipe is derived from the data the operator already has and takes into account the actionable and the nonactionable variables that affect completions within the context of the operator’s business objectives, preferences and constraints.

• The technology also illustrates which variables have the greatest sensitivity to well productivity. If an operator has a dollar to spend on well monitoring, where should that dollar be invested to collect the data needed to improve treatment?

• A prescribed recipe updates automatically and improves with each new well. Prescriptive analytics is adaptive by design.
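The article does not disclose Ayata’s algorithms, so the sketch below is only a rough illustration, under stated assumptions, of what an adaptive “ingest, prescribe, refit” workflow like the one outlined above could look like: a simple model is trained on completed wells, a constrained search over the actionable variables produces a recipe for the next well, and each new result is appended to the history before the next prescription. The column names, model choice, cost formula and budget are all hypothetical.

```python
# Rough sketch of an adaptive "ingest, prescribe, refit" loop.
# Feature names, model and cost formula are assumptions, not Ayata's method.
import itertools
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

ACTIONABLE = {                                   # operator-controlled variables (hypothetical)
    "stage_count": [20, 30, 40],
    "proppant_lb_per_ft": [1000, 1500, 2000],
    "cluster_spacing_ft": [15, 25, 40],
}
NONACTIONABLE = ["toc_pct", "brittleness", "natural_frac_density"]   # reservoir descriptors

def prescribe(history: pd.DataFrame, reservoir: dict, max_cost: float) -> dict:
    """Fit on completed wells, then return the actionable recipe with the best
    predicted 180-day production whose (toy) cost estimate stays under budget."""
    features = list(ACTIONABLE) + NONACTIONABLE
    model = GradientBoostingRegressor().fit(history[features], history["prod_180d"])
    best = None
    for combo in itertools.product(*ACTIONABLE.values()):
        recipe = dict(zip(ACTIONABLE, combo))
        est_cost = 4e6 + recipe["stage_count"] * 1.2e5          # toy cost model
        if est_cost > max_cost:
            continue
        candidate = pd.DataFrame([{**recipe, **reservoir}])[features]
        predicted = float(model.predict(candidate)[0])
        if best is None or predicted > best[1]:
            best = (recipe, predicted)
    if best is None:
        raise ValueError("no recipe fits within max_cost")
    return {"recipe": best[0], "predicted_prod_180d": best[1]}

# Adaptive by design: once the new well is completed, append its actual result to
# `history`; the next call to prescribe() learns from it automatically.
```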

Prescriptive analytics software takes into account all data in prescribing the best possible completion recipe for each new well.

The software screenshot is an example of prescriptive analytics in action. The technology combines disparate scientific disciplines that haven’t been combined before (see diagram) to offer functionality that wasn’t previously possible. It determines which operator-controlled variables have the greatest influence on production, given what is known about the reservoir in which the well will be drilled and completed. The technology generates recommendations, or prescriptions, in the form of precise settings, combinations or values for each actionable variable, taking into consideration economic constraints and the operator’s asset management strategy.
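The article does not say how the software ranks variable influence. One common, generic way to surface which variables matter most is permutation importance, sketched below with the same hypothetical columns used earlier; this is an illustration, not Ayata’s proprietary method.

```python
# Generic permutation-importance sketch: rank variables by how much shuffling
# each one degrades the model's accuracy. Not Ayata's proprietary method.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance

def rank_sensitivity(history: pd.DataFrame, features: list) -> pd.Series:
    """Return features ordered by their estimated influence on 180-day production."""
    model = GradientBoostingRegressor().fit(history[features], history["prod_180d"])
    result = permutation_importance(model, history[features], history["prod_180d"],
                                    n_repeats=10, random_state=0)
    return pd.Series(result.importances_mean, index=features).sort_values(ascending=False)

# Variables near the top of the ranking are where an extra monitoring dollar
# buys the most information about treatment effectiveness.
```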

For example, much of today’s discourse is focused on longer laterals, more sand and more fracture stages, with the common wisdom defaulting to “more is better.” The reality is more nuanced and complex. Where is the proppant going? How are stages spaced along the lateral? What is the optimal cluster spacing and perforation density for a particular well?

Intuitively, proppant selection, proppant density and proppant concentration are important, but how do they relate to decisions about pump pressures, pump rates and fluid composition? Treatment sequencing and timing matter, as do decisions about diverter use.

Most importantly, how do individual decisions interrelate? What is the collective impact on ultimate production and the economics of the asset portfolio? How do choices made during the completion phase affect the design, timing, implementation and effectiveness of down-the-line production decisions surrounding artificial lift and refracturing, and ultimately EUR?

With so many moving parts, operators have found it difficult to identify best practices for a particular field, let alone implement unique designs for individual wells. Methodically establishing a baseline, manipulating a single variable at a time and recording the impact on production and well economics will eventually result in improvement. But most operators don’t have the budgets or the timelines for trial and error to produce superior returns, and investors aren’t typically that patient.

Prescriptive analytics software produces better answers faster, compressing learning curves, shrinking timelines and reducing risk and exposure. The result is not just more production, but also more efficient and more predictable production.

Software screenshot showing prescribed settings (in green) for operator-controlled variables to affect production and financial performance.

Better completions, today

The smartest people among us can possibly keep track of eight variables—moving in different directions at different speeds—in their heads. Most of us are not that fortunate, not by a long shot. Prescriptive analytics technology can make sense of thousands of variables across hundreds of wells—extracted from videos, images, sounds, texts and numbers—and derive big-data-driven prescriptions on how to improve the next well completion. For the well after that, the prescription takes into account the latest well completed, adapts and gets better.

Atanu Basu is the CEO of Ayata, a software company that invented and refined prescriptive analytics. Daniel Mohan is the senior vice president of sales and marketing; Marc Marshall is senior vice president of engineering; and Glenn McColpin is an industry consultant to Ayata.