Browsing by Subject "History Matching"
Now showing 1 - 11 of 11
Item: A Hierarchical History Matching Method and its Applications (2012-02-14), by Yin, Jichao

Modern reservoir management typically involves simulations of geological models to predict future recovery estimates, providing the economic assessment of different field development strategies. Integrating reservoir data is a vital step in developing reliable reservoir performance models. Currently, the most effective strategies for traditional manual history matching follow a structured approach with a sequence of adjustments from global to regional parameters, followed by local changes in model properties. In contrast, many recent automatic history matching methods use parameter sensitivities or gradients to directly update fine-scale reservoir properties, often ignoring geological consistency. There is therefore a need to combine elements of all of these scales in a seamless manner. We present a hierarchical streamline-assisted history matching framework of global and local updates. A probabilistic approach, consisting of design of experiments, response surface methodology, and a genetic algorithm, is used to understand the uncertainty in the large-scale static and dynamic parameters. This global update step is followed by streamline-based model calibration of high-resolution reservoir heterogeneity; this local update step assimilates dynamic production data. We apply the global genetic calibration to an unconventional shale gas reservoir; specifically, we include the stimulated reservoir volume (SRV) as a constraint term in the data integration to improve history matching and reduce prediction uncertainty. We introduce a novel approach for efficiently computing well drainage volumes for shale gas wells with multistage fractures and fracture clusters, and we filter stochastic shale gas reservoir models by comparing the computed drainage volume with the measured SRV within specified confidence limits.
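As an illustration of the genetic-algorithm global calibration step described above, the following is a minimal sketch of a real-coded GA minimizing a data-misfit function over a few global parameters. The function names, the toy linear "history" model, and all tuning constants are our own illustrative assumptions, not code from the thesis:

```python
import random

def genetic_calibration(misfit, bounds, pop_size=20, generations=40, seed=0):
    """Minimize `misfit` over the box `bounds` with a small real-coded GA:
    elitist selection, arithmetic crossover, and clipped Gaussian mutation."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=misfit)
        elite = scored[: pop_size // 2]            # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)            # two parents from the elite
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]  # crossover
            i = rng.randrange(dim)                 # mutate one gene, clip to bounds
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = elite + children
    return min(pop, key=misfit)

# toy "history match": recover two global multipliers from synthetic rate data
true = [1.5, 0.3]
obs = [true[0] * t + true[1] for t in range(10)]
def misfit(p):
    return sum((p[0] * t + p[1] - d) ** 2 for t, d in zip(range(10), obs))

best = genetic_calibration(misfit, [(0.0, 3.0), (0.0, 1.0)])
```

Because the GA is derivative-free, the same loop works unchanged when `misfit` wraps a full reservoir simulation rather than this toy linear model.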
Finally, we demonstrate the value of integrating downhole temperature measurements as a coarse-scale constraint during streamline-based history matching of dynamic production data. We first derive coarse-scale permeability trends in the reservoir from temperature data. The coarse information is then downscaled into fine-scale permeability by sequential Gaussian simulation with block kriging, and updated by local-scale streamline-based history matching. The power and utility of our approaches have been demonstrated using both synthetic and field examples.

Item: A Hybrid Ensemble Kalman Filter for Nonlinear Dynamics (2011-02-22), by Watanabe, Shingo

In this thesis, we propose two novel hybrid Ensemble Kalman Filter (EnKF) approaches to overcome limitations of the traditional EnKF. The first approach swaps the ensemble mean for the ensemble mode estimate to improve the covariance calculation in the EnKF. The second approach imposes a coarse-scale permeability constraint during the EnKF update. Both hybrid EnKF approaches are coupled with the streamline-based Generalized Travel Time Inversion (GTTI) algorithm for periodic updating of the mean of the ensemble, and the ensemble is updated sequentially in a hybrid fashion. Through the development of the hybrid EnKF algorithm, the characteristics of the EnKF are also investigated. We found that the limits placed on updated values constrain the assimilation results significantly, and that it is important to assess the measurement error variance to strike a proper balance between preserving the prior information and reducing the observation data misfit. Overshooting problems can be mitigated with streamline-based covariance localization and a normal score transformation of the parameters to support the Gaussian error statistics. The mean-mode swapping approach gives a better match to the data as long as the mode solution of the inversion process matches the observation trajectory satisfactorily.
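The normal score transformation mentioned above can be implemented as a simple rank-based mapping to standard-normal scores. A minimal sketch (the function name and the plotting-position convention are our own choices):

```python
import numpy as np
from statistics import NormalDist

def normal_score_transform(x):
    """Map samples to standard-normal scores by empirical rank, so that
    non-Gaussian parameters better satisfy the EnKF's Gaussian assumptions."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ranks = x.argsort().argsort()        # 0..n-1 rank of each sample
    p = (ranks + 0.5) / n                # plotting-position quantiles in (0, 1)
    nd = NormalDist()                    # standard normal N(0, 1)
    return np.array([nd.inv_cdf(q) for q in p])

# a skewed (log-normal-like) permeability multiplier becomes ~N(0,1) scores
perm = np.exp(np.random.default_rng(1).normal(size=500))
scores = normal_score_transform(perm)
```

The mapping is monotone, so the rank order of the permeability values is preserved and the transform can be inverted through the stored ranks after the EnKF update.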
The coarse-scale permeability constrained hybrid approach gives better parameter estimation in terms of capturing the main trend of the permeability field, and each ensemble member is driven toward the posterior mode solution of the inversion process. However, the water cut (WWCT) and pressure responses need to be captured by the inversion process to generate physically plausible coarse-scale permeability data for constraining the hybrid EnKF update. Uncertainty quantification methods for the EnKF were developed to verify the performance of the proposed hybrid EnKF against the traditional EnKF. The results show better assimilation quality through the sequence of updates, and a stable solution is demonstrated. The potential of the proposed hybrid approaches is demonstrated through synthetic examples and a field-scale application.

Item: Application of Fast Marching Method in Shale Gas Reservoir Model Calibration (2013-07-26), by Yang, Changdong

Unconventional reservoirs are typically characterized by very low permeabilities, so the pressure depletion from a producing well may not propagate far from the well during the life of a development. Currently, two approaches are widely used for unconventional reservoir analysis: analytical techniques, including decline curve analysis and pressure/rate transient analysis, and numerical simulation. Numerical simulation can rigorously account for complex well geometry and reservoir heterogeneity but is time consuming. In this thesis, we propose and apply an efficient technique, the fast marching method (FMM), to analyze shale gas reservoirs. Our proposed approach stands midway between analytical techniques and numerical simulation: in contrast to analytical techniques, it accounts for complex well geometry and reservoir heterogeneity, and it is less time consuming than numerical simulation.
The fast marching method efficiently provides the solution of the pressure front propagation equation, which can be expressed as an Eikonal equation. Our approach is based on a generalization of the concept of depth of investigation. Its application to unconventional reservoirs can provide the understanding necessary to describe and optimize the interaction between complex multistage fractured wells, reservoir heterogeneity, drainage volumes, pressure depletion, and well rates. The proposed method allows rapid approximation of reservoir simulation results without resorting to detailed flow simulation, and also provides the time evolution of the well drainage volume for visualization. Calibrating reservoir models to match historical dynamic data is necessary to increase confidence in simulation models and to minimize risk in decision making. In this thesis, we propose an integrated workflow: applying the genetic algorithm (GA) to calibrate the model parameters, and using the fast-marching-based approach for forward simulation. This workflow takes advantage of both the derivative-free character of the GA and the speed of the FMM. In addition, we provide a novel approach to incorporate microseismic events (if available) into our history matching workflow so as to further constrain and better calibrate our models.

Item: Applications of Level Set and Fast Marching Methods in Reservoir Characterization (2012-10-19), by Xie, Jiang

Reservoir characterization is one of the most important problems in petroleum engineering. It involves forward reservoir modeling, which predicts fluid behavior in the reservoir, and the inverse problem, which calibrates reservoir models against given data. In this dissertation, we focus on two problems in the field of reservoir characterization: depth of investigation in heterogeneous reservoirs, and history matching and uncertainty quantification of channelized reservoirs.
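The Eikonal solve at the heart of the fast marching approaches above can be sketched with a first-order FMM on a 2D grid: a heap-ordered front expansion that returns a (diffusive) time of flight from a source well, from which drainage volume at any time is just the set of cells already reached. This is a generic textbook FMM, not the theses' code; the slowness field stands in for the reservoir-property term (e.g. sqrt of porosity-viscosity-compressibility over permeability) that the theses derive:

```python
import heapq
import numpy as np

def fast_marching_dtof(slowness, src):
    """First-order fast marching solution of |grad tau| = slowness on a
    unit-spaced 2D grid; tau is the time of flight from source cell `src`."""
    ny, nx = slowness.shape
    tau = np.full((ny, nx), np.inf)
    done = np.zeros((ny, nx), dtype=bool)
    tau[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if done[i, j]:
            continue
        done[i, j] = True                      # smallest tentative value is final
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if not (0 <= ni < ny and 0 <= nj < nx) or done[ni, nj]:
                continue
            # upwind neighbor values along each axis
            a = min(tau[ni - 1, nj] if ni > 0 else np.inf,
                    tau[ni + 1, nj] if ni < ny - 1 else np.inf)
            b = min(tau[ni, nj - 1] if nj > 0 else np.inf,
                    tau[ni, nj + 1] if nj < nx - 1 else np.inf)
            f = slowness[ni, nj]
            if abs(a - b) < f:                 # two-sided quadratic update
                new = 0.5 * (a + b + np.sqrt(2 * f * f - (a - b) ** 2))
            else:                              # one-sided update
                new = min(a, b) + f
            if new < tau[ni, nj]:
                tau[ni, nj] = new
                heapq.heappush(heap, (new, (ni, nj)))
    return tau

# homogeneous example: tau grows with distance from the well at (10, 10)
tau = fast_marching_dtof(np.ones((21, 21)), (10, 10))
```

With heterogeneous `slowness`, the same call yields the heterogeneity-aware depth of investigation; thresholding `tau` at a given time gives the drainage volume used for SRV comparison.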
The concept of depth of investigation is fundamental to well test analysis. Much of current well test analysis relies on analytical solutions based on homogeneous or layered reservoirs. However, such analytical solutions are severely limited for heterogeneous and fractured reservoirs, particularly for unconventional reservoirs with multistage hydraulic fractures. We first generalize the concept to heterogeneous reservoirs and provide an efficient tool to calculate drainage volume using fast marching methods and to estimate pressure depletion based on a geometric pressure approximation. The applicability of the proposed method is illustrated with two applications in unconventional reservoirs: flow regime visualization and stimulated reservoir volume estimation. Because of the high permeability contrast and non-Gaussianity of the channelized permeability field, it is difficult to history match and quantify the uncertainty of channelized reservoirs using traditional approaches. We treat facies boundaries as level set functions and solve the moving boundary problem (history matching) with the level set equation. In addition to level set methods, we also address the problem with a pixel-based approach, in which a reversible jump Markov chain Monte Carlo method searches the parameter space with flexible dimensions. Both proposed approaches are demonstrated with two- and three-dimensional examples.

Item: Continuous reservoir model updating using an ensemble Kalman filter with a streamline-based covariance localization (Texas A&M University, 2007-04-25), by Arroyo Negrete, Elkin Rafael

This work presents a new approach that combines the comprehensive capabilities of the ensemble Kalman filter (EnKF) with the flow path information from streamlines to eliminate or reduce some of the problems and limitations of the EnKF for history matching reservoir models.
The recent use of the EnKF for data assimilation and for assessing uncertainty in future forecasts in reservoir engineering appears promising. The EnKF provides an efficient way of incorporating any type of production data or time-lapse seismic information. However, its use in history matching comes with its share of challenges and concerns. Overshooting of parameters leading to loss of geologic realism, a possible increase in the material balance errors of the updated phase(s), and limitations associated with non-Gaussian permeability distributions are some of the most critical problems of the EnKF. A larger ensemble size may mitigate some of these problems but is prohibitively expensive in practice. We present a streamline-based conditioning technique that can be implemented with the EnKF to eliminate or reduce the magnitude of these problems, allowing the use of a reduced ensemble size and thereby leading to significant time savings during field-scale implementation. Our approach involves no extra computational cost and is easy to implement. Additionally, the final history-matched model tends to preserve most of the geological features of the initial geologic model. An overview of the procedure is provided that enables its implementation in current EnKF codes. Our procedure uses the streamline path information to condition the covariance matrix in the Kalman update. We demonstrate the power and utility of our approach with synthetic examples and a field case. Our results show that with the conditioning technique presented in this thesis, the overshooting/undershooting problems disappear and the limitations of working with non-Gaussian distributions are reduced.
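Conditioning the covariance in the Kalman update can be sketched as a Schur (element-wise) product of the ensemble cross-covariance with a localization mask. The sketch below uses a generic 0/1 mask as a stand-in for the streamline-derived conditioning; all names and the toy linear observation model are our own illustrative assumptions:

```python
import numpy as np

def enkf_update(X, Y, d_obs, R, rho):
    """EnKF analysis step with covariance localization.
    X: (n_param, n_ens) parameter ensemble; Y: (n_obs, n_ens) predicted data;
    d_obs: (n_obs,) observations; R: (n_obs, n_obs) obs-error covariance;
    rho: (n_param, n_obs) localization mask, e.g. 1 only for cells on the
    streamlines feeding each observation well (Schur product)."""
    n_ens = X.shape[1]
    Xp = X - X.mean(axis=1, keepdims=True)
    Yp = Y - Y.mean(axis=1, keepdims=True)
    Cxy = rho * (Xp @ Yp.T) / (n_ens - 1)        # localized cross-covariance
    Cyy = (Yp @ Yp.T) / (n_ens - 1) + R
    K = Cxy @ np.linalg.inv(Cyy)                 # localized Kalman gain
    rng = np.random.default_rng(0)
    D = d_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(d_obs)), R, size=n_ens).T   # perturbed observations
    return X + K @ (D - Y)

# toy: 4 parameters, only the first two influence the single observation
rng = np.random.default_rng(1)
X = rng.normal(size=(4, 50))
H = np.array([[1.0, 1.0, 0.0, 0.0]])
Y = H @ X
rho = np.array([[1.0], [1.0], [0.0], [0.0]])     # zero out spurious updates
Xa = enkf_update(X, Y, np.array([2.0]), np.eye(1) * 0.01, rho)
```

The mask leaves parameters outside the localization region untouched, which is how spurious long-range correlations from a small ensemble are suppressed.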
Finally, an analysis of the scalability of a parallel implementation of our computer code is given.

Item: Effective Reservoir Management Using Streamline-Based Reservoir Simulation, History Matching and Rate Allocation Optimization (2014-08-28), by Tanaka, Shusei

The streamline-based method has received increased interest for reservoir management in recent years because of its computational advantages and intuitive appeal for reservoir simulation, history matching, and rate allocation optimization. The streamline-based method uses snapshots of the flow paths of convective flow. Previous studies proved its applicability to convection-dominated processes such as waterflooding and tracer transport. However, for gas injection with strong capillary and gravity effects, the streamline-based method tends to lose its advantages for reservoir simulation, which may result in loss of accuracy and applicability for history matching and optimization problems. In this study, we first present the development of a 3D, three-phase black oil and compositional streamline simulator. We then introduce a novel approach to incorporate capillary and gravity effects via an orthogonal projection method. The novel aspect of our approach is the ability to incorporate transverse effects into streamline simulation without adversely affecting its computational efficiency. We demonstrate our proposed method for various cases, including a CO2 injection scenario. The streamline model is shown to be particularly effective for examining and visualizing the interactions between heterogeneity and the resulting impact on vertical and areal sweep efficiencies. Next, we apply the streamline simulator to history matching and rate optimization problems. In the conventional approach to streamline-based history matching, the objective is to match the flow rate history, assuming that the reservoir energy, such as the pressure distribution, has already been matched.
The proposed approach incorporates pressure information as well as production flow rates, so that the reservoir energy is also reproduced during production rate matching. Finally, we develop an NPV-based optimization method using a streamline-based rate reallocation algorithm. The NPV is calculated along streamlines and used to generate diagnostic plots of the effectiveness of wells; the rates are then updated to maximize the field NPV. The proposed approach avoids the use of complex optimization tools. Instead, we emphasize the visual and intuitive appeal of streamline methods and use flow diagnostic plots for optimal rate allocation. We conclude that our proposed streamline-based simulation, inversion, and optimization algorithms improve the computational efficiency and accuracy of the solution, leading to a highly effective reservoir management tool that satisfies industry demands.

Item: Fast history matching of finite-difference model, compressible and three-phase flow using streamline-derived sensitivities (Texas A&M University, 2006-10-30), by Cheng, Hao

Reconciling high-resolution geologic models to field production history is still a very time-consuming procedure. Recently, streamline-based assisted and automatic history matching techniques, especially production data integration by "travel-time matching," have shown great potential in this regard. However, no systematic study has examined the merits of travel-time matching compared to more traditional amplitude matching for field-scale application. Moreover, most applications have been limited to two-phase water-oil flow because current streamline models are limited in their ability to incorporate highly compressible flow in a rigorous and computationally efficient manner. The purpose of this work is fourfold.
First, we quantitatively investigated the nonlinearities in the inverse problems related to travel-time, generalized travel-time, and amplitude matching during production data integration, and their impact on the solution and its convergence. The results show that the commonly used amplitude inversion can be orders of magnitude more nonlinear than travel-time inversion. Both travel-time inversion and generalized travel time inversion (GTTI) are shown to be more robust and to exhibit superior convergence characteristics. Second, streamline-based assisted history matching was enhanced in two important respects that significantly improve its efficiency and effectiveness: we use streamline-derived analytic sensitivities to determine the location and magnitude of the changes needed to improve the history match, and we use iterative GTTI for model updating. Our approach leads to significant savings in time and manpower. Third, we developed a novel approach to history matching finite-difference models that combines the efficiency of the analytical sensitivity computation of streamline models with the versatility of finite-difference simulation, which can account for complex physics. Finally, we developed an approach to history matching three-phase flow using a novel compressible streamline formulation and streamline-derived analytic sensitivities. The streamline models were generalized to account for compressible flow by introducing a relative density of total fluids along streamlines and a density-dependent source term in the saturation equation. The analytical sensitivities are calculated based on this rigorous streamline formulation.
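The generalized travel-time shift at the core of GTTI can be illustrated as the single time shift that best aligns the simulated response with the observed one, a far less nonlinear quantity than the amplitude misfit itself. A minimal sketch with synthetic water-cut curves (function name, sigmoid curves, and the shift grid are our own illustrative choices):

```python
import numpy as np

def generalized_travel_time(t, sim, obs, shifts):
    """Return the time shift of the simulated response that minimizes the
    squared misfit against the observed response (generalized travel time)."""
    best, best_err = 0.0, np.inf
    for dt in shifts:
        shifted = np.interp(t, t + dt, sim)   # shifted(tq) = sim(tq - dt)
        err = np.sum((shifted - obs) ** 2)
        if err < best_err:
            best, best_err = dt, err
    return best

# synthetic water-cut curves: simulated breakthrough is 5 days late
t = np.linspace(0.0, 100.0, 201)
obs = 1.0 / (1.0 + np.exp(-(t - 40.0) / 5.0))
sim = 1.0 / (1.0 + np.exp(-(t - 45.0) / 5.0))
dt = generalized_travel_time(t, sim, obs, np.arange(-10.0, 10.5, 0.5))
```

The recovered shift (here, moving the simulated curve 5 days earlier) is the scalar whose sensitivity to reservoir properties drives the GTTI model update.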
The power and utility of our approaches have been demonstrated using both synthetic and field examples.

Item: History matching and uncertainty quantification using sampling method (2009-05-15), by Ma, Xianlin

Uncertainty quantification involves sampling the reservoir parameters correctly from a posterior probability function that is conditioned to both static and dynamic data. Rigorous sampling methods such as Markov chain Monte Carlo (MCMC) are known to sample from the correct distribution but can be computationally prohibitive for high-resolution reservoir models. Approximate sampling methods are more efficient but less rigorous for nonlinear inverse problems. There is a need for an approach to uncertainty quantification that is both efficient and rigorous for nonlinear inverse problems. First, we propose a two-stage MCMC approach using sensitivities for quantifying uncertainty in history matching geological models. In the first stage, we compute the acceptance probability for a proposed change in reservoir parameters based on a linearized approximation to flow simulation in a small neighborhood of the previously computed dynamic data. In the second stage, proposals that pass the first-stage criterion are assessed by running full flow simulations to ensure rigor. Second, we propose a two-stage MCMC approach using response surface models for quantifying uncertainty. The formulation allows us to history match three-phase flow simultaneously. The fitted response surface exists independently of the expensive flow simulation and provides efficient samples for the reservoir simulation and MCMC in the second stage. Third, we propose a two-stage MCMC approach using upscaling and non-parametric regression for quantifying uncertainty. A coarse-grid model acts as a surrogate for the fine-grid model via flow-based upscaling.
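The two-stage screening idea common to these approaches can be sketched as a delayed-acceptance Metropolis-Hastings loop: a proposal is first filtered with an inexpensive surrogate posterior, and only survivors pay for the full simulation, with a second-stage acceptance ratio that keeps the chain invariant for the exact posterior. All names and the toy Gaussian targets below are our own illustrative assumptions:

```python
import math
import random

def two_stage_mcmc(x0, cheap_logpost, exact_logpost, proposal, n_iter=2000, seed=0):
    """Two-stage (delayed-acceptance) MH with a symmetric proposal:
    stage 1 screens with a cheap surrogate posterior; stage 2 corrects
    with the exact posterior. Returns the chain and the count of
    expensive (stage-2) evaluations."""
    rng = random.Random(seed)
    x, n_full = x0, 0
    chain = []
    for _ in range(n_iter):
        y = proposal(x, rng)
        # stage 1: screen with the cheap surrogate
        a1 = min(1.0, math.exp(cheap_logpost(y) - cheap_logpost(x)))
        if rng.random() < a1:
            n_full += 1                   # "full simulation" happens only here
            a2 = min(1.0, math.exp(exact_logpost(y) - exact_logpost(x)
                                   + cheap_logpost(x) - cheap_logpost(y)))
            if rng.random() < a2:
                x = y
        chain.append(x)
        # (repeated cheap/exact evaluations could be cached; omitted for brevity)
    return chain, n_full

# toy: exact posterior N(1, 1); surrogate is slightly biased, N(0.8, 1)
exact = lambda v: -0.5 * (v - 1.0) ** 2
cheap = lambda v: -0.5 * (v - 0.8) ** 2
prop = lambda v, rng: v + rng.gauss(0.0, 1.0)
chain, n_full = two_stage_mcmc(0.0, cheap, exact, prop, n_iter=5000)
```

The surrogate bias is corrected by the stage-2 ratio, so the chain still targets the exact posterior while the expensive evaluations are only a fraction of the iterations.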
The response correction of the coarse-scale model is performed by error modeling via non-parametric regression to approximate the response of the computationally expensive fine-scale model. Our proposed two-stage sampling approaches are computationally efficient and rigorous, with a significantly higher acceptance rate than traditional MCMC algorithms. Finally, we developed a coarsening algorithm to determine an optimal reservoir simulation grid by grouping fine-scale layers in such a way that the heterogeneity measure of a defined static property is minimized within the layers. The optimal number of layers is then selected based on a statistical analysis. The power and utility of our approaches have been demonstrated using both synthetic and field examples.

Item: History-Matching Production Data Using Ensemble Smoother with Multiple Data Assimilation: A Comparative Study (2014-12-18), by Xia, Xiaoyang

Reservoir simulation models are built by petroleum engineers to optimize field operation and production, thus maximizing oil recovery. History matching methods are extensively used for reservoir model calibration and petrophysical property estimation by matching numerical simulation results with the true production history. The sequential reservoir model updating technique known as the ensemble Kalman filter (EnKF) has gained popularity in automatic history matching because of its simple conceptual formulation and ease of implementation, and its computational cost is relatively affordable compared with other, more sophisticated assimilation methods. The Ensemble Smoother is a viable alternative to the EnKF. Unlike the EnKF, the Ensemble Smoother computes a global update by simultaneously assimilating all available data, providing a significant reduction in simulation time. However, the Ensemble Smoother typically yields a data match significantly inferior to that obtained with the EnKF.
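ES-MDA improves on the plain Ensemble Smoother by assimilating the same data several times with an inflated measurement-error covariance, with the inflation factors chosen so their reciprocals sum to one. A toy linear-Gaussian sketch (function names, the tiny linear "simulator," and the factor schedule are our own illustrative assumptions):

```python
import numpy as np

def es_mda(m0, forward, d_obs, Cd, alphas, seed=0):
    """Ensemble smoother with multiple data assimilation: the same data are
    assimilated len(alphas) times with the observation-error covariance
    inflated by alpha_i, where sum(1/alpha_i) = 1 (e.g. alphas = [4,4,4,4])."""
    rng = np.random.default_rng(seed)
    M = m0.copy()                               # (n_param, n_ens) ensemble
    n_ens = M.shape[1]
    for a in alphas:
        D = forward(M)                          # (n_obs, n_ens) predicted data
        Mp = M - M.mean(axis=1, keepdims=True)
        Dp = D - D.mean(axis=1, keepdims=True)
        Cmd = Mp @ Dp.T / (n_ens - 1)
        Cdd = Dp @ Dp.T / (n_ens - 1)
        K = Cmd @ np.linalg.inv(Cdd + a * Cd)   # gain with inflated obs error
        noise = rng.multivariate_normal(
            np.zeros(len(d_obs)), a * Cd, size=n_ens).T
        M = M + K @ (d_obs[:, None] + noise - D)
    return M

# toy linear "simulator": d = G m, true m = [1, -2]
G = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
forward = lambda M: G @ M
d_obs = G @ np.array([1.0, -2.0])
M0 = np.random.default_rng(2).normal(size=(2, 100))
Ma = es_mda(M0, forward, d_obs, 0.01 * np.eye(3), [4.0, 4.0, 4.0, 4.0])
```

For a linear forward model the repeated small updates reproduce the single Kalman update; for nonlinear simulators the repetition is what recovers the data match that a single global smoother update loses.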
The ensemble smoother with multiple data assimilation (ES-MDA) was developed as an efficient iterative form of the Ensemble Smoother, and we compare it with the conventional EnKF. In ES-MDA, the same set of data is assimilated multiple times with an inflated covariance matrix of the measurement error. We apply ES-MDA and the EnKF to generate multiple realizations of the permeability field by history matching production data including bottom-hole pressure, water cut, and gas-oil ratio. Both algorithms have been applied to a synthetic heterogeneous case and the Goldsmith field case. Moreover, ES-MDA coupled with various covariance localization methods (distance-based, streamline-based, and hierarchical ensemble filter localization) is compared in terms of both the quality of the history match and the permeability distribution.

Item: Performance of Assisted History Matching Techniques When Utilizing Multiple Initial Geologic Models (2011-11-15), by Aggarwal, Akshay

History matching is a process wherein changes are made to an initial geologic model of a reservoir so that the predicted reservoir performance matches the known production history. Changes are made to the model parameters, which include rock and fluid parameters (viscosity, compressibility, relative permeability, etc.) or properties within the geologic model. Assisted history matching (AHM) provides an algorithmic framework to minimize the simulation mismatch and helps accelerate this process. The changes made by AHM techniques, however, cannot ensure a geologically consistent reservoir model; in fact, the performance of these techniques depends on the initial starting model. To understand the impact of the initial model, this project explored the performance of the AHM approach on a specific field case while working with multiple distinct geologic scenarios.
This project involved an integrated seismic-to-simulation study, in which I interpreted the seismic data, assembled the geological information, and performed petrophysical log evaluation along with well test data calibration. The ensemble of static models obtained was carried through the AHM methodology. I used sensitivity analysis to determine the most important dynamic parameters affecting the history match. These parameters govern the large-scale changes in the reservoir description and are optimized using an evolutionary strategy algorithm. Finally, streamline-based techniques were used for local modifications to match the water cut well by well. The following general conclusions were drawn from this study:
a) The use of multiple simple geologic models is extremely useful for screening possible geologic scenarios and especially for discarding unreasonable alternative models. This was especially true for the large-scale architecture of the reservoir.
b) The AHM methodology was very effective in exploring a large number of parameters, running the simulation cases, and generating the calibrated reservoir models. The calibration step consistently worked better when the models had more spatial detail than the simple models used for screening.
c) The AHM methodology implemented a sequence of pressure and water cut history matching. An examination of specific models indicated that a better geologic description minimized the conflict between these two match criteria.

Item: Streamline-based three-phase history matching (Texas A&M University, 2008-10-10), by Oyerinde, Adedayo Stephen

Geologic models derived from static data alone typically fail to reproduce the production history of a reservoir; hence the importance of reconciling simulation models to the dynamic response of the reservoir. This necessity has been the motivation behind active research in history matching.
Traditionally, history matching is performed manually by applying local and regional changes to reservoir properties. While this is still general practice, the subjective overtones of this approach, its time and manpower requirements, and the potential loss of geologic consistency have led to the development of a variety of alternative workflows for assisted and automatic history matching. Automatic history matching requires the solution of an inverse problem by minimizing an appropriately defined misfit function. Recent advances in geostatistics have led to high-resolution geologic models consisting of millions of cells, most of which are scaled up to below a million cells for reservoir simulation purposes. History matching even the scaled-up models is computationally prohibitive. The associated cost in time and manpower has led to increased interest in efficient history matching techniques, in particular sensitivity-based algorithms, because of their rapid convergence. Among the sensitivity-based methods, streamline-based production data integration has proven to be extremely efficient computationally. In this work, we extend the history matching capability of the streamline-based technique to three-phase production while addressing, in general, pertinent issues associated with history matching. We deviate from the typical approach of formulating the inverse problem in terms of derived quantities such as GOR and water cut, or measured phase rates, and instead concentrate on the fundamental variables that characterize such quantities. The presented formulation is in terms of well-node saturations and pressures. Production data are transformed into composite saturation quantities, the time variation of which is matched in the calibration exercise. The dependence of the transformation on pressure highlights its importance and thus the need for a pressure match.
To address this need, we follow a low-frequency asymptotic formulation for the pressure equation. We propose a simultaneous inversion of the saturation and pressure components to account for their interdependence and thus the high nonlinearity of three-phase inversion. We also account for global parameters through experimental design methodology and response surface modeling. The validity of the proposed history matching technique is demonstrated through application to both synthetic and field cases.
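The experimental-design and response-surface step used for global parameters in several of the theses above can be sketched as a quadratic surrogate fitted by least squares to a small factorial design. The function names and the toy two-factor response are our own illustrative assumptions:

```python
import numpy as np
from itertools import combinations_with_replacement

def fit_quadratic_surface(X, y):
    """Fit a full quadratic response surface
    y ~ b0 + sum_i b_i x_i + sum_{i<=j} b_ij x_i x_j
    by least squares; X is (n_runs, n_factors) from an experimental design.
    Returns a callable surrogate predictor."""
    def basis(X):
        cols = [np.ones(len(X))]
        cols += [X[:, i] for i in range(X.shape[1])]
        cols += [X[:, i] * X[:, j]
                 for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
        return np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)
    return lambda Xq: basis(np.atleast_2d(np.asarray(Xq, dtype=float))) @ coef

# 3-level full-factorial design in 2 global parameters (coded -1, 0, +1)
levels = np.array([-1.0, 0.0, 1.0])
X = np.array([(a, b) for a in levels for b in levels])
y = 2.0 + X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 0] * X[:, 1] + X[:, 1] ** 2
surface = fit_quadratic_surface(X, y)
```

Once fitted from a handful of simulation runs, the surrogate can be evaluated thousands of times inside MCMC or a genetic algorithm at negligible cost, which is exactly the role response surfaces play in the workflows described above.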