Monday, October 22, 2018

Obtaining Reservoir Engineering Parameters in Each Layer

Once the reservoir geometry has been defined, if not actually computed, one step remains before synthesizing the complete reservoir model. This is the estimation of key reservoir engineering parameters in each defined interval across the areal extent of the reservoir. Key parameters are net thickness, porosity, oil, gas and water saturations, and horizontal and vertical permeabilities. The computation proceeds in two stages. 

First, in each well the parameters must be averaged for each interval from the petrophysical interpretations. This is performed in the component property module and relies on careful selection of cutoffs to exclude sections of formation that do not contribute to fluid movement. The cutoffs are chosen with the help of sensitivity plots showing how the averaged parameter varies with cutoff value, preferably in a well with well-test data available to validate the choices.
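For a feel of how such cutoffs behave, here is a minimal Python sketch - not the component property module itself. Thickness-weighted interval averages are computed only over samples that pass hypothetical porosity and shale-volume cutoffs, and the porosity cutoff is swept to mimic a sensitivity plot; all log values and cutoff numbers are illustrative.

import numpy as np

def interval_average(por, vsh, dz, por_cut, vsh_cut):
    """Thickness-weighted average porosity and net thickness for one
    interval, keeping only samples that pass both cutoffs."""
    net = (por >= por_cut) & (vsh <= vsh_cut)
    if not net.any():
        return 0.0, float("nan")
    return dz[net].sum(), np.average(por[net], weights=dz[net])

# sensitivity of the averaged porosity to the porosity cutoff
por = np.array([0.02, 0.11, 0.18, 0.22, 0.05, 0.15])  # porosity per sample
vsh = np.array([0.70, 0.20, 0.10, 0.05, 0.55, 0.25])  # shale volume per sample
dz = np.full_like(por, 0.5)                           # sample spacing, m
for cut in (0.04, 0.08, 0.12):
    h_net, avg_por = interval_average(por, vsh, dz, cut, vsh_cut=0.4)
    print(f"porosity cutoff {cut:.2f}: net {h_net:.1f} m, avg porosity {avg_por:.3f}")

A flat stretch in such a sweep suggests a robust cutoff choice; a steep stretch warns that the average is sensitive to it.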

Second, the averaged parameters for each interval must be gridded, or mapped, across the reservoir. In the log property mapping module, the RM package brings into play powerful algorithms that use seismic data to guide the mapping. The key to the method is establishing a relationship at the wells between some attribute of the seismic data and a combination of the averaged well parameters, and then using that relationship to interpolate the averaged parameter everywhere in the reservoir. The seismic attribute could be amplitude, acoustic impedance calculated earlier using the inversion module, one of several attributes routinely calculated on seismic interpretation workstations and then imported to the RM system, or simply depth.

The relationship may be linear - that is, the combination of averaged parameters is defined as a simple weighted sum of seismic attributes - or nonlinear, in which an elaborate neural network approach juggles several linear relationships at the same time, picking the best one for a given input. Linear relationships easily handle smooth dependencies, such as that between acoustic impedance and porosity. The nonlinear approach is required for averaged parameters, such as saturations, that may vary abruptly across a field.
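A minimal sketch of the linear case, assuming a single attribute (acoustic impedance) and purely illustrative well values; the actual module can combine several attributes in its weighted sum.

import numpy as np

# at each well: the co-located seismic attribute and the averaged
# interval porosity from the component property step (illustrative)
impedance_at_wells = np.array([6.2e6, 5.8e6, 6.9e6, 6.5e6, 5.5e6])
porosity_at_wells  = np.array([0.21, 0.24, 0.15, 0.18, 0.26])

# linear relationship: porosity = a * impedance + b (least squares)
a, b = np.polyfit(impedance_at_wells, porosity_at_wells, 1)

# apply the calibrated relationship everywhere the attribute is mapped
impedance_grid = np.linspace(5.4e6, 7.0e6, 5)   # stand-in for the 3D grid
porosity_map = a * impedance_grid + b
print(porosity_map.round(3))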


In practice, the log property mapping module guides the interpreter through the essential stages: choosing the interval to map, comparing seismic data at the well intersections with the averaged well data, establishing relationships that show a good degree of correlation, and then proceeding with the mapping. The advantage of log property mapping over conventional mapping was demonstrated in both the Conoco Indonesia, Inc. and Pertamina Sumbagut case studies. Research continues into finding ways of using all available data to assist the mapping of log data across the reservoir.

Building the Reservoir Model and Estimating Reserves

The stage is set for the final piece of the RM package - the model builder module. This module fully characterizes the reservoir by integrating the geometric interpretation established with the correlation and section modeling modules, including definitions of reservoir tanks and fluid levels, with the reservoir engineering parameters established using the component property and log property mapping modules.

 The main task is constructing the exact shape of the reservoir layers. This is achieved by starting at a bottom reference horizon and building up younger layers according to their assigned descriptors, mimicking the actual process of deposition and erosion. For example, if a layer top has been defined as sequential and conformable, it will be constructed roughly parallel to the layer's bottom horizon. If a reference horizon has been described as an unconformity, then underlying layers can approach it at any angle, while layers above can be constrained to track roughly parallel. 
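The stacking logic can be caricatured in a few lines of Python. This is only a toy under simple assumptions - depth increasing downward, constant layer thicknesses, a flat unconformity - and not the model builder's actual algorithm.

import numpy as np

def build_layers(base, thicknesses, unconformity=None):
    """Stack layer tops upward from a base horizon (depth in m,
    increasing downward). Each conformable top parallels the surface
    below it; an optional unconformity truncates anything above it."""
    tops, surface = [], base.astype(float)
    for h in thicknesses:                # oldest layer first
        surface = surface - h            # conformable: constant thickness
        if unconformity is not None:
            surface = np.maximum(surface, unconformity)  # erosion
        tops.append(surface.copy())
    return tops

x = np.linspace(0.0, 1.0, 6)
base = 2500.0 - 100.0 * x                # dipping base horizon
erosion = np.full_like(x, 2350.0)        # flat unconformity surface
for i, top in enumerate(build_layers(base, [60.0, 60.0, 60.0], erosion), 1):
    print(f"layer {i} top:", top.round(0))

In the printout, tops that would rise above the unconformity are clipped to it, mimicking deposition followed by erosion.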


The areal bounds on layers are determined within the model builder module by several factors. First, specific geometries can be imported. Second, areal bounds may be implied through the geometries created with the section modeling module. Third, the contours of petrophysical parameters established during log property mapping can establish areal limits. Fourth, thickness maps of layers can be interactively created and edited prior to model building.

The key dividend of model building is the establishment of reserve estimates for each tank. Oil in place, total pore volume, net-pay pore volume, water volume, reservoir bulk volume, net-pay area and net-pay bulk thickness are some of the parameters that can be calculated and tabulated on the workstation. Conoco Indonesia Inc.'s estimates using the RM package were in close agreement with standard calculation procedures. During appraisal, when the oil company decides whether to proceed to development, establishing reserve estimates is crucial. As a result, the many steps leading to this moment will be reexamined and almost certainly rerun to assess different assumptions about the reservoir.
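The arithmetic behind such tabulations is standard volumetrics. A sketch for one tank, in field units (7,758 bbl per acre-ft), with illustrative values:

# Volumetric oil in place for one tank - the same kind of quantity the
# model builder tabulates. All input values are illustrative.
area_acres = 1200.0    # net-pay area
h_net_ft   = 45.0      # net-pay thickness
porosity   = 0.22
sw         = 0.30      # water saturation
bo         = 1.25      # oil formation volume factor, RB/STB

pore_volume_bbl = 7758.0 * area_acres * h_net_ft * porosity  # 7758 bbl/acre-ft
stoiip_stb = pore_volume_bbl * (1.0 - sw) / bo
print(f"net-pay pore volume: {pore_volume_bbl:.3e} bbl")
print(f"oil in place: {stoiip_stb:.3e} STB")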


Say, for example, a geologist is working on correlating logs and creating geologic tops, while the geophysicist is preparing an inversion to obtain acoustic impedance. If both want to work concurrently, the version manager simply grows two branches. 

Similarly, a reservoir engineer may wish to try several scenarios for mapping the distribution of porosity within a layer - say, by mapping well log values only and, alternatively, by using seismic data to guide the mapping with the log property mapping module. Two versions can be made in parallel, with a branch for each scenario. Several further steps along each interpretation path may be necessary before it becomes clear which mapping technique is better.

Material Balance Analysis and Preparation for Simulation

For reservoir managers striving to improve the performance of developed fields - for example, investigating placement of new wells or reconfiguring existing producers and injectors to improve drainage - the RM package has two more modules to offer. One provides a sophisticated material balance analysis that assesses whether the established reservoir model is compatible with historical production data. The second converts the reservoir model into a format suitable for simulating reservoir behavior and predicting future production.


Material balance analysis is performed using the Formation reservoir test system module. In traditional material balance analysis, reservoir volume is estimated by noting how reservoir pressure decreases as fluids are produced. The more fluids produced, the greater the expected pressure decrease. Exactly how much depends on the compressibility of the fluids, which can be determined experimentally from downhole samples through pressure-volume-temperature (PVT) analysis; the compressibility of the rock, which can be determined from core samples in the lab; and, of course, reservoir volume. Faster declines in pressure than expected from such an analysis might indicate a smaller reservoir than first thought. Slower declines might indicate a high-volume aquifer driving production or, more rarely, connected and as yet undiscovered extensions to the reservoir. This traditional analysis of reservoir size and drive mechanism requires no a priori knowledge of reservoir geometry, only production, pressure and PVT data.
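As a reminder of the underlying arithmetic, here is a minimal sketch of the simplest case - an undersaturated oil tank with no gas cap or aquifer. The fluid properties are illustrative, and the module's analysis is of course far more general.

# Simplest material balance: produced volume = expansion of oil,
# connate water and pore space.
#   N = Np * Bo / (Boi * ce * dP),  ce = (co*So + cw*Swi + cf) / (1 - Swi)
np_stb  = 2.0e6       # cumulative oil produced, STB
bo, boi = 1.28, 1.25  # FVF at current and initial pressure, RB/STB
dp      = 900.0       # observed pressure drop, psi
co, cw, cf = 1.2e-5, 3.0e-6, 4.0e-6   # compressibilities, 1/psi
swi     = 0.25        # initial water saturation

ce = (co * (1 - swi) + cw * swi + cf) / (1 - swi)
n_stb = np_stb * bo / (boi * ce * dp)
print(f"implied oil in place: {n_stb:.3e} STB")
# If this volume is much smaller than the volumetric estimate, the tank
# may be smaller than mapped; if larger, suspect aquifer support or
# communication with a neighboring tank.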


The module uses these basic principles of material balance, but applies them within the geometrically defined reservoir tanks of the established reservoir model. This allows not only verification of tank volumes, but also estimation of fluid communication between tanks. Communication between tanks could be due to an intervening low-permeability bed or a fault that is only partially sealing. Another result is the prediction of how fluid contacts are moving.

Sunday, October 14, 2018

Correlating Seismic and Well Data

Correlation is performed in several stages. The first is establishing geologic tops on each well using the detailed correlation module. With individual well data displayed for up to four wells simultaneously, the interpreter can correlate horizons from one well to the next, registering consistent geologic tops in every well across the field. All well data have the potential to aid in this process, with core information, petrophysical log interpretations, wireline testing results and production logs equally able to contribute to identifying significant geologic horizons.

The next step signals the beginning of the merging of seismic and well data. In the well tie module, the 3D seismic trace at a given well is displayed versus two-way time alongside all pertinent well data, which are already converted to time using borehole seismic or check-shot data. The main purpose of this combined display is to tie events recognized on the seismic trace - seismic horizons - to the recently established geologic tops found from the well data. These ties, or links, between the two data types are crucial at several subsequent stages during the construction of the reservoir model. In addition, seismic markers found at this stage can be transferred to a seismic interpretation workstation for horizon tracking.

The first use of the tie, or link, between seismic and well data is in the velocity mapping module that enables the 3D seismic record versus time to be converted to a record versus depth. This crucial step subsequently allows the seismic data to guide the mapping of geologic horizons between wells. 

A velocity map for each layer is first assessed from the stacking velocities used in the 3D seismic processing. These are average velocities down to the depth in question and must be converted to interval velocities using Dix's formula. The interpreter then maps these velocities for a given horizon, using one of four available algorithms, including the sophisticated kriging technique, and reviews their appearance in plan view. Gradual changes in velocity are normal, but anomalies such as bull's-eye effects - isolated highs or lows - that are geologically unacceptable can be edited out.
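Dix's formula itself is simple; a short sketch, with illustrative stacking velocities and two-way times:

import numpy as np

def dix_interval_velocity(v_rms, t0):
    """Dix's formula: interval velocity between consecutive horizons
    from stacking (RMS) velocities and zero-offset two-way times.
      v_int = sqrt((v2^2*t2 - v1^2*t1) / (t2 - t1))"""
    v, t = np.asarray(v_rms, float), np.asarray(t0, float)
    num = v[1:] ** 2 * t[1:] - v[:-1] ** 2 * t[:-1]
    return np.sqrt(num / (t[1:] - t[:-1]))

# stacking velocities (m/s) at three horizons and their two-way times (s)
print(dix_interval_velocity([2000, 2200, 2500], [0.8, 1.2, 1.8]).round(0))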

Next, values of velocity at the intersections of horizons with wells are compared with velocity values obtained from acoustic log or borehole seismic data. The differences, determined in all wells, are also mapped and then used to correct the original velocity map. Finally, the corrected velocity map is used to convert the 3D seismic record to depth. To check the result, structural dip azimuth as estimated from dipmeter logs can be superimposed on the resulting map - structural dip azimuth should follow the line of greatest slope as indicated by the map.

With seismic data converted to depth, the interpreter can begin building a stratified model of the reservoir using the correlation module. First, seismic data acting as a guide allow geologic tops in one well to be firmly correlated with tops in adjacent wells. This display may be further enhanced by superimposing dipmeter stick plots and other forms of dipmeter interpretation along the well trajectories. Another display that shifts data to an arbitrary datum, generally an already correlated horizon, provides a stratigraphic perspective. Second, each geologic correlation is allocated descriptors that determine how it relates geometrically to its neighbors above and below. These descriptors are later used to build up the actual reservoir model. Third, all available information about reservoir compartmentalization - for example, saturation interpretations from well logs and wireline testing results - is used to identify flow barriers, such as a sealing fault, so the reservoir can be divided into a set of isolated volumes called tanks, essential for correctly estimating reserves.

Sometimes, the interpreter may want to manually dictate the geometry of a horizon or other feature - such as a fault, bar or channel - rather than let it be guided by established horizons on the 3D seismic data. This can be accomplished using the section modeling module, which offers an array of graphic tools to create and edit elements of the reservoir model in the vertical section. This labor-intensive manual creation of a reservoir model becomes mandatory when there are no seismic data or only sparse 2D data.

One source of data that may contribute to the definition of tanks and faults is the well test. Well tests give an approximation of tank size and, in particular, provide distance estimates from the well to sealing faults. The azimuth to the fault, however, remains undetermined.
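One common field-units approximation estimates that distance from the time at which the semilog slope doubles on a drawdown or buildup test. The sketch below uses the classical single-sealing-fault constant with illustrative inputs; real analyses rely on full pressure-derivative interpretation.

import math

def distance_to_fault_ft(k_md, t_x_hr, phi, mu_cp, ct_psi):
    """Distance from well to a single sealing fault (field units),
    from the intersection time t_x of the two semilog straight lines:
      L = 0.01217 * sqrt(k * t_x / (phi * mu * ct))"""
    return 0.01217 * math.sqrt(k_md * t_x_hr / (phi * mu_cp * ct_psi))

# e.g., k = 150 md, slope doubling at 30 hr, phi = 0.20, mu = 0.8 cp,
# ct = 1.5e-5 1/psi -> distance in feet (azimuth stays unknown)
print(f"{distance_to_fault_ft(150, 30, 0.20, 0.8, 1.5e-5):.0f} ft")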

Thursday, October 11, 2018

Integrated Reservoir Interpretation

Every field is unique, and not just in its geology. Size, geographical location, production history, available data, the field's role in overall company strategy, the nature of its hydrocarbon - all these factors determine how reservoir engineers attempt to maximize production potential. Perhaps the only commonality is that decisions are ultimately based on the best interpretation of data. For that task, there is a variability of approaches to match the fields being interpreted.


In an oil company with separate geological, geophysical and reservoir engineering departments, the interpretation of data tends to be sequential. Each discipline contributes and then hands over to the next discipline. At the end of the line, the reservoir engineer attempts to reconcile the cumulative understanding of the reservoir with its actual behavior. Isolated from the geologist and geophysicist who have already made their contributions, the reservoir engineer can play with parameters such as porosity, saturation and permeability, but is usually barred, because of practical difficulties, from adjusting the underlying reservoir geometry.


This scenario is giving way to the integrated asset team, in which all the relevant disciplines work together, hopefully in close enough harmony that each individual's expertise can benefit from insight provided by others in the team. There is plenty of motivation for seeking this route, at any stage of a field's development. Reservoirs are so complex, and the art and science of characterizing them still so convoluted, that the uncertainties in exploitation, from exploration to maturity, are generally higher than most would care to admit.


In theory, uncertainty during the life of a field goes as follows: During exploration, uncertainty is highest. It diminishes as appraisal wells are drilled and key financial decisions have to be made regarding expensive production facilities - for offshore fields, these typically account for around 40% of the capital outlay during the life of the field. As the field is developed, uncertainty on how to most efficiently exploit it diminishes further. By the time the field is dying, reservoir engineers understand their field perfectly.

A realistic scenario may be more like this: During exploration, uncertainty is high. But during appraisal, the need for crucial decisions may encourage tighter bounds on the reservoir's potential than are justifiable. Later, as the field is developed, production fails to match expectations, and more data, for example 3D seismic data, have to be acquired to plug holes in the reservoir understanding. Uncertainties begin to increase rather than diminish. They may even remain high as parts of the field become unproducible due to water breakthrough and as reservoir engineers still struggle to fathom the field's intricacies.

Asset teams go a long way toward maximizing understanding of the reservoir and placing a realistic uncertainty on reservoir behavior. They are the best bet for making the most sense of the available data. What they may lack, however, are the right tools. Today, interpretation is mainly performed on workstations with the raw and interpreted data paraded in their full multidimensionality on the monitor. Occasionally, hard-copy output is still the preferred medium - for example, logs taped to walls for correlating large numbers of wells.

There are workstation packages for 3D seismic interpretation, for mapping, for viewing different parts of the reservoir in three dimensions, for petrophysical interpretation in wells, for performing geostatistical modeling in unsampled areas of the reservoir, for creating a grid for simulation, for simulating reservoir behavior, and more. But for the reservoir manager, these fragmented offerings lack cohesion. In the perceived absence of an integrated reservoir management package, many oil companies pick different packages for each specific application and then connect them serially.

Any number of combinations is possible. The choice depends on oil company preferences, the history of the field and the problem being addressed. Modeling a mature elephant field in the Middle East with hundreds of wells and poor seismic data may require a different selection of tools than a newly discovered field having high-quality 3D seismic coverage and a handful of appraisal wells. Reservoir management problems vary from replanning an injection strategy for mature fields, to selecting horizontal wells for optimum recovery, to simply estimating the reserves in a new discovery about to be exploited.

Whatever the scenario, the tactic of stringing together diverse packages creates several problems. First is data compatibility. Since the industry has yet to firm up a definitive geoscience data model, each package is likely to accept and output data in slightly different ways. This forces a certain amount of data translation as the interpretation moves forward - indeed, a small industry has emerged to perform such translation. Second, the data management system supporting this fragmented activity must somehow keep track of the interpretation as it evolves. Ideally, the reservoir manager needs to know the history of the project, who made what changes, and if necessary how to backtrack.

Postprocessing Seismic Data

The interpretation path obviously depends on the data available. For two of the three fields considered here, there were excellent 3D seismic data. And in all three fields, there was at least one well with a borehole seismic survey. The first goal in working with seismic data is to ensure that the borehole seismic and the surface seismic at the borehole trajectory look as similar as possible. If that is achieved, then the surface seismic can be tightly linked to events at the borehole and subsequently used to correlate structure and evaluate properties between wells. If no borehole seismic data are available, an alternative is to use synthetic seismograms, computed from acoustic and density logs.
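A minimal sketch of such a synthetic, assuming blocky logs already resampled to two-way time and a zero-phase Ricker wavelet; all values are illustrative.

import numpy as np

def ricker(f_hz=30.0, dt=0.002, n=41):
    """Zero-phase Ricker wavelet with peak frequency f_hz."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f_hz * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

# blocky acoustic and density logs versus two-way time (illustrative)
vel = np.r_[np.full(50, 2500.0), np.full(50, 3200.0), np.full(50, 2800.0)]
rho = np.r_[np.full(50, 2.30),   np.full(50, 2.55),   np.full(50, 2.40)]

z = vel * rho                              # acoustic impedance
rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])   # reflection coefficients
trace = np.convolve(rc, ricker(), mode="same")  # synthetic seismogram
peaks = np.argsort(np.abs(trace))[-2:]     # samples at the two interfaces
print(sorted(peaks), trace[peaks].round(3))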

Differences in seismic data arise because of difficulties in achieving a zero-phase response, a preferred format for displaying seismic results in which each peak on the trace corresponds exactly to an acoustic impedance contrast and, by inference, a geologic interface. Processing seismic data to obtain a zero-phase response depends on accurately assessing the signature of the acoustic source. This is straightforward in borehole seismics because the measured signal can be split into its downgoing and upgoing components, and the former yields the source signature. In surface seismics, the downgoing field is unmeasured and statistical techniques must be used to assess the signature, leading to less reliable results. In Conoco Indonesia, Inc.'s field, the surface seismic and borehole seismic data initially matched poorly. With the residual processing module, the mismatch was resolved by comparing the frequency spectra of the two data sets and designing a filter to pull the surface seismic data into line with the borehole seismic data. In this case, the postmatch alignment was excellent.

However, if the alignment resulting from this treatment remains poor, it may prove necessary to vary the design of the filter versus two-way time. This is achieved by constructing filters for several specific time intervals along the well trajectory and then interpolating between them to obtain a representative filter at any two-way time.
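In spirit, the matching amounts to a stabilized spectral division. The toy sketch below designs and applies a single frequency-domain matching filter; the stabilization constant eps is an assumption, and the residual processing module's actual design is certainly more refined. A time-varying version would repeat this over several windows and interpolate.

import numpy as np

def match_to_borehole(surface, borehole, eps=1e-3):
    """Design and apply a frequency-domain matching filter that pulls
    the surface seismic trace into line with the borehole seismic:
    filter spectrum = B * conj(S) / (|S|^2 + white noise)."""
    S, B = np.fft.rfft(surface), np.fft.rfft(borehole)
    F = B * np.conj(S) / (np.abs(S) ** 2 + eps * np.max(np.abs(S)) ** 2)
    return np.fft.irfft(np.fft.rfft(surface) * F, n=len(surface))

# toy check: a scaled, noisy copy of the borehole trace is restored
rng = np.random.default_rng(0)
borehole = rng.standard_normal(256)
surface = 0.6 * borehole + 0.05 * rng.standard_normal(256)
matched = match_to_borehole(surface, borehole)
print(f"correlation after matching: {np.corrcoef(matched, borehole)[0, 1]:.3f}")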

The next step is to perform a seismic inversion on the matched seismic data, using the inversion module. This process converts the matched seismic data to acoustic impedance, defined as the product of rock density and acoustic velocity. Acoustic impedance can be used to classify lithology and fluid type. Mapped across a section in two dimensions or throughout space in three dimensions, acoustic impedance provides a valuable stratigraphic correlation tool. For Conoco Indonesia, Inc., inversion provided valuable insight into the lateral extent of the reservoir.

The inversion computation requires the full spectrum of acoustic frequencies. Very low frequencies are missing from the seismic record, so these are estimated interactively from acoustic well logs. Between wells, this low-frequency information is interpolated, and for a 3D inversion, the information must be mapped everywhere in reservoir space.
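One classical scheme for the conversion is recursive (trace-integration) inversion, sketched below. Here the starting impedance z0 stands in for the missing low-frequency trend, taken from an acoustic log at the nearest well; the inversion module's algorithm is certainly more elaborate.

import numpy as np

def recursive_inversion(reflectivity, z0):
    """Recursive inversion: walk down the trace, updating impedance
    at each reflection coefficient:
      Z[i+1] = Z[i] * (1 + r[i]) / (1 - r[i])"""
    z = [z0]
    for r in reflectivity:
        z.append(z[-1] * (1 + r) / (1 - r))
    return np.array(z)

# band-limited reflectivity (illustrative) plus a well-derived z0
r = np.array([0.0, 0.08, 0.0, -0.05, 0.0])
print(recursive_inversion(r, z0=5.75e6).round(0))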

 

Wednesday, October 3, 2018

Beating the Exploration Schedule With Integrated Data Interpretation

Oil companies have made great strides in improving the success rate of exploration and production, mainly by using seismic data to guide well placement. But data gathering and interpretation do not stop here. Once a well is spudded, the company commits considerable resources to gather more data such as mud logs, cores, measurements-while-drilling (MWD) information and wireline logs. This creates a huge volume of data - often at different scales and qualities - that must be efficiently handled to ensure maximum return on the exploration investment.

Communications hardware already allows large volumes of information to be moved rapidly from one computer to another, even from remote locations. And data management and universal data standards are gradually being introduced throughout the industry to facilitate data access.

To take full advantage of this new environment, however, geoscientists need their interpretation applications integrated into a common system that allows data sharing.

Until recently, no attempt was made to standardize the data format used by interpretation software packages, which meant that they could not communicate with each other or use a common database. More time was spent on converting and loading data than on interpretation. Applications often ran in series rather than in parallel, introducing further delays. This resulted in drilling or completion decisions being made using incomplete answers while full interpretations took weeks or months.