Browsing by Subject "Reservoir Simulation"
Now showing 1 - 13 of 13
Item: A Hierarchical History Matching Method and its Applications (2012-02-14). Yin, Jichao.

Modern reservoir management typically involves simulations of geological models to predict future recovery estimates, providing the economic assessment of different field development strategies. Integrating reservoir data is a vital step in developing reliable reservoir performance models. Currently, most effective strategies for traditional manual history matching follow a structured approach with a sequence of adjustments from global to regional parameters, followed by local changes in model properties. In contrast, many recent automatic history matching methods utilize parameter sensitivities or gradients to directly update the fine-scale reservoir properties, often ignoring geological consistency. There is therefore a need to combine elements of all of these scales in a seamless manner. We present a hierarchical streamline-assisted history matching method within a framework of global-to-local updates. A probabilistic approach, consisting of design of experiments, response surface methodology, and a genetic algorithm, is used to understand the uncertainty in the large-scale static and dynamic parameters. This global update step is followed by a streamline-based model calibration of high-resolution reservoir heterogeneity. This local update step assimilates dynamic production data. We apply the genetic global calibration to an unconventional shale gas reservoir; specifically, we include the stimulated reservoir volume (SRV) as a constraint term in the data integration to improve history matching and reduce prediction uncertainty. We introduce a novel approach for efficiently computing well drainage volumes for shale gas wells with multistage fractures and fracture clusters, and we filter stochastic shale gas reservoir models by comparing the computed drainage volume with the measured SRV within specified confidence limits.
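The model-filtering step described above, keeping only the stochastic realizations whose computed drainage volume falls within confidence limits of the measured SRV, can be sketched as follows. The volumes, units, and tolerance below are hypothetical placeholders, not values from the thesis:

```python
# Illustrative sketch: filter stochastic reservoir models by comparing a
# computed drainage volume against the measured stimulated reservoir volume
# (SRV) within specified confidence limits. The thesis computes the drainage
# volumes from multistage-fracture well models; here they are just numbers.

def filter_models_by_srv(drainage_volumes, measured_srv, rel_tolerance=0.15):
    """Return indices of models whose drainage volume lies within
    +/- rel_tolerance of the measured SRV."""
    lo = measured_srv * (1.0 - rel_tolerance)
    hi = measured_srv * (1.0 + rel_tolerance)
    return [i for i, v in enumerate(drainage_volumes) if lo <= v <= hi]

# Hypothetical drainage volumes (arbitrary units) for 6 realizations
volumes = [80.0, 95.0, 102.0, 118.0, 130.0, 99.0]
accepted = filter_models_by_srv(volumes, measured_srv=100.0)
print(accepted)  # -> [1, 2, 5]
```

Realizations outside the confidence band are simply discarded before the local history-matching step.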
Finally, we demonstrate the value of integrating downhole temperature measurements as a coarse-scale constraint during streamline-based history matching of dynamic production data. We first derive coarse-scale permeability trends in the reservoir from temperature data. This coarse information is then downscaled to fine-scale permeability by sequential Gaussian simulation with block kriging, and updated by local-scale streamline-based history matching. The power and utility of our approaches have been demonstrated using both synthetic and field examples.

Item: Continuous reservoir model updating using an ensemble Kalman filter with a streamline-based covariance localization (Texas A&M University, 2007-04-25). Arroyo Negrete, Elkin Rafael.

This work presents a new approach that combines the comprehensive capabilities of the ensemble Kalman filter (EnKF) with the flow-path information from streamlines to eliminate and/or reduce some of the problems and limitations of using the EnKF for history matching reservoir models. The recent use of the EnKF for data assimilation and assessment of uncertainties in future forecasts in reservoir engineering appears promising. The EnKF provides ways of incorporating any type of production data or time-lapse seismic information in an efficient way. However, the use of the EnKF in history matching comes with its share of challenges and concerns. The overshooting of parameters leading to loss of geologic realism, a possible increase in the material-balance errors of the updated phase(s), and limitations associated with non-Gaussian permeability distributions are some of the most critical problems of the EnKF. The use of a larger ensemble size may mitigate some of these problems but is prohibitively expensive in practice.
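As context for the conditioning technique developed in this work, the plain (unlocalized) EnKF analysis step that it modifies can be sketched as below. Dimensions, data, and the observation operator are synthetic; the thesis applies this to reservoir state/parameter ensembles:

```python
import numpy as np

# Minimal textbook EnKF analysis step with perturbed observations. The
# streamline-based localization studied in the thesis would condition the
# ensemble covariance terms below; this sketch omits any localization.

def enkf_update(X, d_obs, H, obs_std, rng):
    """X: (n_state, n_ens) forecast ensemble; H: (n_obs, n_state) linear
    observation operator; d_obs: (n_obs,) data; obs_std: obs-noise std."""
    n_obs, n_ens = H.shape[0], X.shape[1]
    Xm = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    C_HT = Xm @ (H @ Xm).T / (n_ens - 1)          # C_f H^T from the ensemble
    HCH = (H @ Xm) @ (H @ Xm).T / (n_ens - 1)     # H C_f H^T
    R = (obs_std ** 2) * np.eye(n_obs)
    K = C_HT @ np.linalg.inv(HCH + R)             # Kalman gain
    D = d_obs[:, None] + obs_std * rng.standard_normal((n_obs, n_ens))
    return X + K @ (D - H @ X)                    # updated ensemble

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 50))                 # 10 state variables, 50 members
H = np.zeros((2, 10)); H[0, 0] = H[1, 5] = 1.0    # observe components 0 and 5
d_obs = np.array([1.5, -0.5])
Xa = enkf_update(X, d_obs, H, obs_std=0.1, rng=rng)
print(Xa.shape)
```

With small ensembles, spurious long-range correlations in `C_HT` cause the overshooting described above, which is what a streamline-derived conditioning of the covariance aims to suppress.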
We present a streamline-based conditioning technique that can be implemented with the EnKF to eliminate or reduce the magnitude of these problems, allowing the use of a reduced ensemble size and thereby leading to significant time savings during field-scale implementation. Our approach involves no extra computational cost and is easy to implement. Additionally, the final history-matched model tends to preserve most of the geological features of the initial geologic model. An overview of the procedure is provided to enable its incorporation into current EnKF implementations. Our procedure uses the streamline path information to condition the covariance matrix in the Kalman update. We demonstrate the power and utility of our approach with synthetic examples and a field case. Our results show that with the conditioning technique presented in this thesis, the overshooting/undershooting problems disappear and the limitations of working with non-Gaussian distributions are reduced. Finally, an analysis of the scalability of a parallel implementation of our computer code is given.

Item: Effective Reservoir Management Using Streamline-Based Reservoir Simulation, History Matching and Rate Allocation Optimization (2014-08-28). Tanaka, Shusei.

The use of streamline-based methods for reservoir management has received increased interest in recent years because of their computational advantages and intuitive appeal for reservoir simulation, history matching, and rate allocation optimization. The streamline-based method uses snapshots of the flow paths of convective flow. Previous studies proved its applicability to convection-dominated processes such as waterflooding and tracer transport. However, for gas injection cases with strong capillary and gravity effects, the streamline-based method tends to lose its advantages for reservoir simulation, which may result in loss of accuracy and applicability for history-matching and optimization problems.
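The core streamline idea referenced above, decomposing multidimensional transport into independent 1-D solves along each flow path, can be illustrated with a single-streamline sketch. The fractional-flow parameters and first-order upwind scheme here are illustrative, not the thesis's formulation:

```python
# Sketch of transport along one streamline in the time-of-flight (TOF)
# coordinate: water saturation is advanced with explicit first-order
# upwinding for a simple Buckley-Leverett fractional flow.

def frac_flow(sw, mobility_ratio=2.0):
    """Water fractional flow with quadratic relative permeabilities."""
    krw, kro = sw ** 2, (1.0 - sw) ** 2
    return krw / (krw + kro / mobility_ratio)

def advance_streamline(sw, d_tau, dt, n_steps):
    """Explicit upwind update along one streamline; sw[0] is the
    injector-end boundary condition (held fixed)."""
    for _ in range(n_steps):
        f = [frac_flow(s) for s in sw]
        for i in range(len(sw) - 1, 0, -1):   # flow goes from low to high TOF
            sw[i] = sw[i] + dt / d_tau * (f[i - 1] - f[i])
    return sw

sw = [1.0] + [0.0] * 49                       # water injected at the inlet node
sw = advance_streamline(sw, d_tau=1.0, dt=0.25, n_steps=100)
print(round(sw[10], 3))                       # front has propagated past node 10
```

Because each streamline is a 1-D problem, a full field solve becomes many cheap independent solves, which is where the computational advantage (and the difficulty with transverse gravity/capillary fluxes) comes from.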
In this study, we first present the development of a 3D, three-phase black-oil and compositional streamline simulator. Then, we introduce a novel approach to incorporate capillary and gravity effects via an orthogonal projection method. The novel aspect of our approach is the ability to incorporate transverse effects into streamline simulation without adversely affecting its computational efficiency. We demonstrate our proposed method for various cases, including a CO2 injection scenario. The streamline model is shown to be particularly effective for examining and visualizing the interactions between heterogeneity and the resulting impact on vertical and areal sweep efficiencies. Next, we apply the streamline simulator to history matching and rate optimization problems. In the conventional approach to streamline-based history matching, the objective is to match the flow-rate history, assuming that the reservoir energy (e.g., the pressure distribution) has already been matched. The proposed approach incorporates pressure information as well as production flow rates, so that the reservoir energy is also reproduced during production-rate matching. Finally, we develop an NPV-based optimization method using a streamline-based rate reallocation algorithm. The NPV is calculated along streamlines and used to generate diagnostic plots of the effectiveness of wells. The rates are updated to maximize the field NPV. The proposed approach avoids the use of complex optimization tools. Instead, we emphasize the visual and intuitive appeal of streamline methods and utilize flow diagnostic plots for optimal rate allocation.
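The rate-reallocation idea can be sketched as follows. The per-well efficiencies below are made-up placeholders for the streamline-derived NPV diagnostics described above, and the single-swap update rule is a simplification of the thesis's algorithm:

```python
# Illustrative sketch of rate reallocation: each injector is scored by an
# NPV-type efficiency (value generated per unit injected), and rate is
# shifted from the least- to the most-efficient well while keeping the
# total field injection rate constant.

def reallocate_rates(rates, efficiencies, step=0.1):
    """Move a fraction `step` of the total rate from the least- to the
    most-efficient injector, preserving the field total."""
    total = sum(rates)
    worst = min(range(len(rates)), key=lambda i: efficiencies[i])
    best = max(range(len(rates)), key=lambda i: efficiencies[i])
    shift = min(step * total, rates[worst])   # cannot take more than the well has
    new_rates = list(rates)
    new_rates[worst] -= shift
    new_rates[best] += shift
    return new_rates

rates = [500.0, 300.0, 200.0]   # injection rates per well (hypothetical units)
eff = [0.8, 1.5, 0.3]           # NPV per unit injected (hypothetical)
new = reallocate_rates(rates, eff)
print(new, sum(new))            # -> [500.0, 400.0, 100.0] 1000.0
```

Iterating this kind of update, with efficiencies recomputed from the flow diagnostics after each step, drives the field allocation toward higher NPV without a general-purpose optimizer.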
We conclude that our proposed approach of streamline-based simulation, inversion, and optimization improves the computational efficiency and accuracy of the solution, leading to a highly effective reservoir management tool that satisfies industry demands.

Item: Extending the Petrel Model Builder for Educational and Research Purposes (2013-04-11). Nwosa, Obiajulu C.

Reservoir simulation is a very powerful tool used in the oil and gas industry to perform various functions including, but not limited to, predicting reservoir performance, conducting sensitivity analyses to quantify uncertainty, optimizing production, and overall reservoir management. Compared to reservoirs explored in the past, present-day reservoirs are more complex in extent and structure. As a result, reservoir simulators and the algorithms used to represent dynamic systems of flow in porous media have invariably become just as complex. In order to provide the best solutions for analyzing reservoir performance, there is a need to continuously develop reservoir simulators and simulation algorithms that best represent the performance of the reservoir without compromising efficiency or accuracy. Several commercial reservoir simulation packages exist in the market that have proven extremely resourceful, with functionality covering a wide range of interests in reservoir simulation, yet there is a constant need for better and more efficient methods and algorithms to study and manage our reservoirs. This thesis aims at bridging the gap in the framework for developing such algorithms. To this end, this project has both an educational and a research component. It is educational because it builds a strong understanding of reservoir simulation, a topic that can be daunting for students, especially those who require a more direct experience to fully comprehend the subject matter.
It is research focused because it serves as the foundation for a framework that integrates custom-built external simulators and algorithms with the workflow of the model builder of our reservoir simulation package of choice, i.e., Petrel with the Ocean programming environment, in a seamless manner for simulating large-scale multi-physics problems of flow in highly heterogeneous porous media. Of particular interest are the areas of model-order reduction and production optimization. In-house algorithms are being developed for these areas of interest, and with the completion of this project we hope to have developed a framework whereby we can take our algorithms, specifically developed for areas of interest, and add them to the workflow of the Petrel Model Builder. Currently, we have taken one of our in-house simulators, i.e., a two-dimensional, oil-water, five-spot waterflood pattern, as a starting point and have been able to integrate it successfully into the "Define Simulation Case" process of Petrel as an additional simulation choice for the end user. In the future, we will expand this simulator with updates to improve its performance and efficiency and extend its capabilities to incorporate areas of research interest.

Item: Fast history matching of finite-difference model, compressible and three-phase flow using streamline-derived sensitivities (Texas A&M University, 2006-10-30). Cheng, Hao.

Reconciling high-resolution geologic models to field production history is still a very time-consuming procedure. Recently, streamline-based assisted and automatic history matching techniques, especially production data integration by "travel-time matching," have shown great potential in this regard. However, no systematic study has been done to examine the merits of travel-time matching compared to more traditional amplitude matching for field-scale application.
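The distinction between the two misfit definitions can be sketched on synthetic water-cut curves. The integer-lag scan below is a simple stand-in for the generalized travel-time approach, and the curves are invented:

```python
# Contrast of misfit definitions: an amplitude misfit (point-by-point least
# squares) versus a travel-time shift that best aligns the observed and
# simulated responses, found here by scanning integer lags.

def amplitude_misfit(obs, sim):
    return sum((o - s) ** 2 for o, s in zip(obs, sim))

def travel_time_shift(obs, sim, max_lag=10):
    """Integer time shift of `sim` that minimizes the overlap misfit."""
    def shifted_misfit(lag):
        if lag >= 0:
            pairs = zip(obs[lag:], sim[:len(sim) - lag])
        else:
            pairs = zip(obs[:len(obs) + lag], sim[-lag:])
        return sum((o - s) ** 2 for o, s in pairs)
    return min(range(-max_lag, max_lag + 1), key=shifted_misfit)

# Synthetic water-cut curves: simulated breakthrough is 3 steps too early
obs = [0.0] * 8 + [0.2, 0.5, 0.8, 0.9, 0.95]
sim = [0.0] * 5 + [0.2, 0.5, 0.8, 0.9, 0.95, 0.95, 0.95, 0.95]
print(travel_time_shift(obs, sim))   # -> 3
```

A single shift parameter per well varies much more smoothly with model changes than the full amplitude misfit, which is the intuition behind the reported difference in nonlinearity.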
In addition, most applications have been limited to two-phase water-oil flow because current streamline models are limited in their ability to incorporate highly compressible flow in a rigorous and computationally efficient manner. The purpose of this work is fourfold. First, we quantitatively investigated the nonlinearities in the inverse problems related to travel-time, generalized travel-time, and amplitude matching during production data integration, and their impact on the solution and its convergence. Results show that the commonly used amplitude inversion can be orders of magnitude more nonlinear than travel-time inversion. Both travel-time and generalized travel-time inversion (GTTI) are shown to be more robust and to exhibit superior convergence characteristics. Second, streamline-based assisted history matching was enhanced in two important aspects that significantly improve its efficiency and effectiveness: we utilize streamline-derived analytic sensitivities to determine the location and magnitude of the changes needed to improve the history match, and we use iterative GTTI for model updating. Our approach leads to significant savings in time and manpower. Third, we developed a novel approach to history matching finite-difference models that combines the efficiency of the analytical sensitivity computation of streamline models with the versatility of finite-difference simulation, which can account for complex physics. Finally, we developed an approach to history matching three-phase flow using a novel compressible streamline formulation and streamline-derived analytic sensitivities. Streamline models were generalized to account for compressible flow by introducing a relative density of total fluids along streamlines and a density-dependent source term in the saturation equation. The analytical sensitivities are calculated based on this rigorous streamline formulation.
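Once sensitivities are available, the model-update step is a regularized least-squares solve. The sketch below uses a generic damped normal-equation form with a synthetic sensitivity matrix; the thesis obtains `G` analytically from streamlines rather than numerically:

```python
import numpy as np

# Generic regularized least-squares model update of the kind driven by
# streamline-derived sensitivities: given a sensitivity matrix G
# (d(data)/d(parameters)) and a data residual, solve a damped system.

def lsq_update(G, residual, damping=0.1):
    """Parameter change minimizing ||G dm - residual||^2 + damping*||dm||^2."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + damping * np.eye(n), G.T @ residual)

rng = np.random.default_rng(1)
G = rng.standard_normal((20, 8))          # 20 data, 8 parameters (synthetic)
true_dm = rng.standard_normal(8)
residual = G @ true_dm                    # noise-free residual for illustration
dm = lsq_update(G, residual, damping=1e-6)
print(np.allclose(dm, true_dm, atol=1e-3))
```

In GTTI, `residual` would be the vector of generalized travel-time shifts per well, which keeps the right-hand side low-dimensional and the inversion well behaved.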
The power and utility of our approaches have been demonstrated using both synthetic and field examples.

Item: Fast History Matching of Time-Lapse Seismic and Production-Data for High Resolution Models (2012-10-19). Rey Amaya, Alvaro.

Seismic data have been established as a valuable source of information for the construction of reservoir simulation models, most commonly for determining the modeled geologic structure, and also for populating static petrophysical properties (e.g., porosity, permeability). More recently, the availability of repeated seismic surveys over time scales of years (i.e., 4D seismic) has shown promising results for the qualitative determination of changes in fluid-phase distributions and pressure, which are required to identify areas of bypassed oil, swept volumes, and pressure maintenance mechanisms. Quantitatively, and currently the state of the art in reservoir model characterization, 4D seismic data have proven distinctively useful for calibrating geologic spatial variability, which ultimately contributes to improved reservoir development and management strategies. Among the limited variety of techniques for integrating dynamic seismic data into reservoir models, streamline-based techniques have been demonstrated to be among the more efficient approaches as a result of their analytical sensitivity formulations. Although streamline techniques have been used in the past to integrate time-lapse seismic attributes, the applications were limited to the simplified modeling scenarios of two-phase fluid flow and invariant streamline geometry throughout the production schedule. This research builds upon and advances existing approaches to streamline-based seismic data integration to include both production and seismic data under varying field conditions.
The proposed approach integrates data from reservoirs under active reservoir management, and the corresponding simulation models can be constrained using highly detailed, realistic schedules. Fundamentally, a new derivation of seismic sensitivities is proposed that is able to represent a complex reservoir evolution between consecutive seismic surveys. The approach is further extended to handle compositional reservoir simulation with dissolution effects and gravity-convective-driven flows, which, in particular, are typical of CO2 transport behavior following injection into deep saline aquifers. As a final component of this research, the benefits of dynamic data integration for determining the volumes swept by injection and drained by production are investigated. Several synthetic and field reservoir modeling scenarios are used for an extensive demonstration of the efficacy and practical feasibility of the proposed developments.

Item: Fast Marching Method with Multiphase Flow and Compositional Effects (2014-08-06). Fujita, Yusuke.

In the current petroleum industry, there is a lack of effective reservoir simulators for modeling shale and tight sand reservoirs. Unconventional resource modeling requires an accurate flow characterization of the complex transport mechanisms arising from the interactions among fractures, inorganic matrices, and organic rocks. Pore sizes in shale and tight sand reservoirs are typically on the order of nanometers, which results in ultralow permeability (nanodarcies) and high capillary pressure in the confined space. In such extremely low permeability reservoirs, adsorption/desorption and diffusive flow processes play important roles in fluid flow behavior in addition to heterogeneity-driven convective flow. In this study, the concept of "diffusive time of flight" (DTOF) is generalized to multiphase and multicomponent flow problems on the basis of asymptotic theory. The proposed approach consists of two decoupled steps:
(1) calculation of well drainage volumes along a propagating "peak" pressure front, and (2) numerical simulation based on the transformed 1-D coordinates. Geological heterogeneities distributed in 3-D space are integrated by tracking the propagation of the "peak" pressure front using a fast marching method (FMM), and subsequently the drainage volumes are evaluated along the outwardly propagating contours. A DTOF-based numerical simulation is performed by treating the DTOF as a spatial coordinate. This approach is analogous to streamline simulation, whereby a multidimensional simulation is transformed into 1-D coordinates, resulting in substantial savings in computational time and thus allowing high-resolution simulation. However, instead of a convective time of flight (CTOF), a diffusive time of flight is introduced to model the pressure-front propagation. The overall workflow, which consists of the FMM and numerical simulation, is described in detail for single-phase, two-phase, black-oil, and compositional cases. The model validation is first performed on single-porosity systems with and without geological heterogeneity, then extended to multi-continuum domains, including a dual-porosity fractured reservoir and a triple-continuum system. Large-scale unconventional models are finally demonstrated, incorporating a permeability correction for shale gas systems and capillarity for confined phase behavior in multiphase shale oil systems.

Item: History matching and uncertainty quantification using sampling methods (2009-05-15). Ma, Xianlin.

Uncertainty quantification involves sampling the reservoir parameters correctly from a posterior probability function that is conditioned to both static and dynamic data. Rigorous sampling methods like Markov chain Monte Carlo (MCMC) are known to sample from the distribution but can be computationally prohibitive for high-resolution reservoir models.
Approximate sampling methods are more efficient but less rigorous for nonlinear inverse problems. There is a need for an efficient and rigorous approach to uncertainty quantification for nonlinear inverse problems. First, we propose a two-stage MCMC approach using sensitivities for quantifying uncertainty in history matching geological models. In the first stage, we compute the acceptance probability for a proposed change in reservoir parameters based on a linearized approximation to flow simulation in a small neighborhood of the previously computed dynamic data. In the second stage, those proposals that pass the first-stage criterion are assessed by running full flow simulations to ensure rigor. Second, we propose a two-stage MCMC approach using response surface models for quantifying uncertainty. The formulation allows us to history match three-phase flow simultaneously. The constructed response surface exists independently of expensive flow simulations and provides efficient samples for the reservoir simulation and MCMC in the second stage. Third, we propose a two-stage MCMC approach using upscaling and non-parametric regressions for quantifying uncertainty. A coarse-grid model acts as a surrogate for the fine-grid model via flow-based upscaling. The response of the coarse-scale model is corrected by error modeling via non-parametric regression to approximate the response of the computationally expensive fine-scale model. Our proposed two-stage sampling approaches are computationally efficient and rigorous, with a significantly higher acceptance rate than traditional MCMC algorithms. Finally, we developed a coarsening algorithm to determine an optimal reservoir simulation grid by grouping fine-scale layers in such a way that the heterogeneity measure of a defined static property is minimized within the layers. The optimal number of layers is then selected based on a statistical analysis.
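The two-stage screening idea can be sketched on a one-dimensional toy problem: a cheap proxy likelihood filters each proposal, and only survivors pay for the "expensive" exact evaluation. The target and proxy below are toy Gaussians, not reservoir responses:

```python
import math, random

# Minimal two-stage Metropolis-Hastings sketch. Stage 1 screens proposals
# with a cheap proxy; stage 2 evaluates the exact (expensive) posterior with
# an acceptance ratio that corrects for the proxy screening, so the chain
# still targets the exact posterior.

def two_stage_mcmc(log_post, log_proxy, x0, n_steps, step=0.5, seed=42):
    random.seed(seed)
    x, lp, lq = x0, log_post(x0), log_proxy(x0)
    chain, expensive_calls = [x], 0
    for _ in range(n_steps):
        y = x + random.gauss(0.0, step)
        lqy = log_proxy(y)
        # Stage 1: cheap proxy screening
        if math.log(random.random()) < lqy - lq:
            expensive_calls += 1
            lpy = log_post(y)                    # Stage 2: exact evaluation
            # Second-stage ratio corrects for the first-stage screening
            if math.log(random.random()) < (lpy - lp) + (lq - lqy):
                x, lp, lq = y, lpy, lqy
        chain.append(x)
    return chain, expensive_calls

log_post = lambda x: -0.5 * x * x               # exact target (standard normal)
log_proxy = lambda x: -0.5 * (x / 1.2) ** 2     # slightly wrong, cheap proxy
chain, calls = two_stage_mcmc(log_post, log_proxy, 0.0, 2000)
print(len(chain), calls)
```

The saving is exactly the fraction of proposals rejected at stage 1, which never trigger a flow simulation; a better proxy (e.g., a response surface) rejects poor proposals earlier.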
The power and utility of our approaches have been demonstrated using both synthetic and field examples.

Item: Performance Analysis & Optimization of Well Production in Unconventional Resource Plays (2013-05-01). Sehbi, Baljit Singh.

Unconventional resource plays, consisting of the lowest tier of resources (large volumes that are the most difficult to develop), have been the main focus of US domestic activity in recent times. Horizontal drilling and hydraulic fracturing completion technology have been primarily responsible for this paradigm shift. The concept of drainage volume is examined using pressure diffusion along streamlines. We use the diffusive time of flight to optimize the number of hydraulic fracture stages in horizontal well applications for tight gas reservoirs. Numerous field case histories are available in the literature for optimizing the number of hydraulic fracture stages, although the conclusions are case-specific. In contrast, a general method is presented that can be used to augment the field experiments necessary to optimize the number of hydraulic fracture stages. The optimization results for the tight gas example are in line with the results from economic analysis. Fluid flow simulation for naturally fractured reservoirs (NFR) is performed with dual-permeability or dual-porosity formulations. Microseismic data from a Barnett Shale well are used to characterize the hydraulic fracture geometry. Sensitivity analysis, uncertainty assessment, and manual and computer-assisted history matching are integrated to develop a comprehensive workflow for building reliable reservoir simulation models. We demonstrate that incorporating the proper physics of flow is the first step in building reliable reservoir simulation models; a lack of proper physics often leads to unreasonable reservoir parameter estimates. The workflow demonstrates reduced non-uniqueness for the inverse history matching problem.
The behavior of near-critical fluids in liquid-rich shale plays defies the production behavior observed in conventional reservoir systems, where an increased gas-oil ratio is observed once the flowing bottomhole pressure falls below the saturation pressure. This production behavior is examined by building a compositional simulation model of an Eagle Ford well. An extremely high pressure drop along the multiple transverse hydraulic fractures and a high critical gas saturation are responsible for this production behavior. Integrating pore-scale flow modeling (such as lattice Boltzmann methods) with field-scale reservoir simulation may enable quantifying the effects of high capillary pressure and the phase-behavior alteration due to confinement in the nano-pore system.

Item: Performance of Assisted History Matching Techniques When Utilizing Multiple Initial Geologic Models (2011-11-15). Aggarwal, Akshay.

History matching is a process wherein changes are made to an initial geologic model of a reservoir so that the predicted reservoir performance matches the known production history. Changes are made to the model parameters, which include rock and fluid parameters (viscosity, compressibility, relative permeability, etc.) or properties within the geologic model. Assisted history matching (AHM) provides an algorithmic framework to minimize the mismatch in simulation and helps accelerate this process. The changes made by AHM techniques, however, cannot ensure a geologically consistent reservoir model; in fact, the performance of these techniques depends on the initial starting model. In order to understand the impact of the initial model, this project explored the performance of the AHM approach using a specific field case, but working with multiple distinct geologic scenarios.
This project involved an integrated seismic-to-simulation study, wherein I interpreted the seismic data, assembled the geological information, and performed petrophysical log evaluation along with well-test data calibration. The ensemble of static models obtained was carried through the AHM methodology. I used sensitivity analysis to determine the most important dynamic parameters that affect the history match. These parameters govern the large-scale changes in the reservoir description and are optimized using an evolutionary strategy algorithm. Finally, streamline-based techniques were used for local modifications to match the water cut well by well. The following general conclusions were drawn from this study: a) The use of multiple simple geologic models is extremely useful for screening possible geologic scenarios and especially for discarding unreasonable alternative models. This was especially true for the large-scale architecture of the reservoir. b) The AHM methodology was very effective in exploring a large number of parameters, running the simulation cases, and generating the calibrated reservoir models. The calibration step consistently worked better when the models had more spatial detail, instead of the simple models used for screening. c) The AHM methodology implemented a sequence of pressure and water-cut history matching. An examination of specific models indicated that a better geologic description minimized the conflict between these two match criteria.

Item: Reservoir Simulation and Evaluation of the Upper Jurassic Smackover Microbial Carbonate and Grainstone-Packstone Reservoirs in Little Cedar Creek Field, Conecuh County, Alabama (2013-04-25). Mostafa, Moetaz Y.

This thesis presents an integrated study of mature carbonate oil reservoirs (Upper Jurassic Smackover Formation) undergoing gas injection in the Little Cedar Creek Field located in Conecuh County, Alabama.
This field produces from two reservoirs, one grainstone-packstone and the other microbial boundstone. The main objective of the study is to determine a potential redevelopment plan to increase oil recovery from the field by targeting the remaining oil saturation. The study uses numerical reservoir simulation to identify the remaining recoverable oil distribution throughout the field. The 3-D geological model, which served as input for the dynamic reservoir simulation performed in this study, was provided by another author. Reservoir simulation indicates that potentially high recoverable oil saturation remains in the unitized area in the southwestern part of the field. The simulation studies also show that the following redevelopment plan has the potential to recover up to 5 MMSTB of oil by January 2017: converting 3 wells to inject water into the microbial boundstone reservoir; converting one more well to inject recycled gas into the grainstone-packstone reservoir; performing work-over operations on 18 wells; sidetracking plugged-and-abandoned well 10560, already completed in the grainstone-packstone reservoir, to another location in the same reservoir; and drilling 7 new wells in the grainstone-packstone reservoir and 5 new wells in the microbial boundstone reservoir. All 12 new wells should be drilled on 160-acre units according to the field rules. Moreover, reservoir simulation showed that drilling an additional 6 wells on units smaller than 160 acres (infill drilling) could recover up to an additional 0.7 MMSTB of oil from the grainstone-packstone reservoir. No cost-benefit analysis has been performed in this thesis.
Thus, the redevelopment plan investigated cannot be recommended for implementation until such analyses have been conducted.

Item: Subsurface Flow Management and Real-Time Production Optimization using Model Predictive Control (2012-02-14). Lopez, Thomas Jai.

One of the key challenges in the oil and gas industry is to best manage reservoirs under different conditions, constrained by production rates based on various economic scenarios, in order to meet energy demands and maximize profit. To address these challenges, a transformation in the paradigm of utilizing "real-time" data has to be brought to bear, moving from static decision making to dynamic, data-driven management of production in conjunction with real-time risk assessment. The use of modern methods of computational modeling and simulation may be the only means to account for the two major tasks involved in this paradigm shift: (1) large-scale computations; and (2) efficient utilization of the deluge of data streams. Recently, history matching and optimization were brought together in the oil industry into an integrated and more structured approach called optimal closed-loop reservoir management. Closed-loop control algorithms have already been applied extensively in other engineering fields, including aerospace, mechanical, electrical, and chemical engineering. However, their application to porous-media flow, such as current practices and improvements in oil and gas recovery, aquifer management, bio-landfill optimization, and CO2 sequestration, has been minimal, because the large-scale nature of the underlying problems generates models that are too complex for controller design and real-time implementation; for the same reason, their applicability to realistic fields remains an open topic.
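The closed-loop (receding-horizon) control pattern at the heart of MPC can be sketched on a toy scalar system: optimize an input sequence over a short horizon, apply only the first input, then repeat. The dynamics, cost, and brute-force "solver" below are deliberately simple stand-ins for a reservoir model and a proper QP solver:

```python
# Schematic receding-horizon (MPC) loop for a toy linear system
# x[k+1] = a*x[k] + b*u[k]. Reservoir MPC replaces this with a
# (reduced-order) reservoir model and an economic objective.

def mpc_step(x, a, b, horizon, candidates):
    """Pick the best constant input over the horizon by exhaustive search
    (a deliberately simple stand-in for a QP solver)."""
    def cost(u):
        xi, total = x, 0.0
        for _ in range(horizon):
            xi = a * xi + b * u
            total += xi * xi + 0.01 * u * u   # regulate x to 0, penalize effort
        return total
    return min(candidates, key=cost)

a, b = 1.1, 0.5                               # unstable open-loop dynamics
x = 5.0
candidates = [u / 10.0 for u in range(-30, 31)]   # bounded, quantized inputs
for _ in range(20):                           # closed loop: apply first input only
    u = mpc_step(x, a, b, horizon=5, candidates=candidates)
    x = a * x + b * u
print(round(x, 3))                            # state driven toward zero
```

The "receding" part is the outer loop: after each applied input, the state is re-measured (here, simulated) and the optimization is redone, which is what gives MPC its feedback robustness.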
Basically, three sources of high dimensionality can be identified in the underlying reservoir models: the size of the parameter space, the size of the state space, and the number of scenarios or realizations necessary to account for uncertainty. In this work we address the problem of high dimensionality by focusing on mitigating the size of the state-space models by means of model-order reduction techniques in a systems framework. We show how one can obtain accurate reduced-order models that are amenable to fast implementation in the closed-loop framework. The research focuses on system identification (System-ID) (Jansen, 2009) and model predictive control (MPC) (Gildin, 2008) to serve this purpose. A mathematical treatment of System-ID and MPC as applied to reservoir simulation is presented. Linear MPC is studied on two specific reservoir models after generating low-order reservoir models using System-ID methods. All comparisons are drawn from a set of realistic simulations using the commercial reservoir simulator Eclipse. With effective improvements in oil recovery and reductions in water production for both cases considered, we reinforce our case for implementing MPC and System-ID toward the ultimate goal of "real-time" production optimization.

Item: Techniques of High Performance Reservoir Simulation for Unconventional Challenges (2013-12-05). Wang, Yuhe.

The quest to improve the performance of reservoir simulators has been evolving with the newly encountered challenges of modeling more complex recovery mechanisms and related phenomena. Reservoir subsidence, fracturing, fault reactivation, etc., require coupled flow and poroelastic simulation. These features, in turn, place a heavy burden on linear solvers. The booming unconventional plays, such as shale/tight oil in North America, demand reservoir simulation techniques that can handle more physics (or more hypotheses).
This dissertation addresses three aspects of improving the performance of reservoir simulation toward these unconventional challenges. Compositional simulation is often required for reservoir studies with complex recovery mechanisms such as gas injection, but it is time-consuming, and its parallelization often suffers from severe load-imbalance problems. In the first section, a novel approach based on domain over-decomposition is investigated and implemented to improve the parallel performance of compositional simulation. For a realistic reservoir case, it is shown that the speedup improves from 29.27 to 62.38 on 64 processors using this technique. Another critical part that determines the performance of a reservoir simulator is the linear solver. In the second section, a new type of linear solver based on the combinatorial multilevel method (CML) is introduced and investigated for several reservoir simulation applications. The results show that CML has better scalability and performance empirically and is well suited for coupled poroelastic problems. These results also suggest that CML may be a promising preconditioning approach for flow simulation with and without coupled poroelastic calculations. In order to handle unconventional petroleum fluid properties for tight oil, the third section incorporates extended vapor-liquid equilibrium calculations into a simulator to account for the capillarity effect caused by dynamic nanopore properties. The enhanced simulator correctly captures the pressure-dependent impact of nanopores on rock and fluid properties. It is shown that including this enhanced physics in simulation leads to significant improvements in field-operation decision-making and greatly enhances the reliability of recovery predictions.
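A back-of-the-envelope calculation shows why nanopore capillarity matters for confined phase behavior: the Young-Laplace capillary pressure grows inversely with pore radius and reaches the MPa range at nanometer scales, enough to shift vapor-liquid equilibrium relative to bulk behavior. The interfacial tension and radii below are generic values, not from the dissertation:

```python
import math

# Young-Laplace capillary pressure for a cylindrical pore:
#   Pc = 2 * sigma * cos(theta) / r
# As r shrinks from microns to nanometers, Pc grows by orders of magnitude.

def capillary_pressure_pa(ift_n_per_m, contact_angle_deg, pore_radius_m):
    """Capillary pressure in Pa for interfacial tension (N/m), contact
    angle (degrees), and pore radius (m)."""
    return 2.0 * ift_n_per_m * math.cos(math.radians(contact_angle_deg)) / pore_radius_m

ift = 0.02   # gas-oil interfacial tension, N/m (generic order of magnitude)
for r_nm in (1000.0, 100.0, 10.0):
    pc = capillary_pressure_pa(ift, 0.0, r_nm * 1e-9)
    print(f"r = {r_nm:7.1f} nm  ->  Pc = {pc / 1e6:8.3f} MPa")
```

At a 10 nm radius this gives a capillary pressure of about 4 MPa, which is no longer negligible next to reservoir pressures, motivating the pressure-dependent nanopore corrections described above.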