Browsing by Subject "History matching"
Now showing 1 - 9 of 9
Item: A Hierarchical Multiscale Approach to History Matching and Optimization for Reservoir Management in Mature Fields (2012-10-19)
Park, Han-Young

Reservoir management typically focuses on maximizing oil and gas recovery from a reservoir, based on facts and information, while minimizing capital and operating investments. Modern reservoir management uses history-matched simulation models to predict the range of recovery or to provide the economic assessment of different field development strategies. Geological models are becoming increasingly complex and detailed, with several hundred thousand to millions of cells and large sets of subsurface uncertainties. Current issues associated with history matching therefore include extensive computation (flow simulation) time, preserving geologic realism, and the non-uniqueness problem. Many recent rate optimization methods use constrained optimization techniques, often making them inaccessible for field reservoir management. Field-scale rate optimization problems involve highly complex reservoir models, production and facilities constraints, and a large number of unknowns. We present a hierarchical multiscale calibration approach using global and local updates on coarse and fine grids. In the global update we calibrate large-scale parameters to match global field-level energy (pressure); this is followed by a local update in which we match well-by-well performance by calibrating local cell properties. The inclusion of multiscale calibration, integrating production data on a coarse grid and then on successively finer grids, is critical for history matching high-resolution geologic models because it significantly reduces simulation time. For rate optimization, we develop a hierarchical analytical method using streamline-assisted flood efficiency maps.
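The global-then-local calibration loop described above can be sketched with a toy model. Everything below (the linear "simulator" responses, the observed values, the multiplier ranges) is a hypothetical illustration of the two-stage idea, not the author's actual reservoir model:

```python
import numpy as np

# Toy two-stage (global-then-local) calibration. The linear "simulator"
# responses and the observed values below are hypothetical illustrations.
def field_pressure(global_mult):
    return 100.0 * global_mult                    # coarse-grid (field) response

def well_rate(local_mult, well_index):
    return 10.0 * local_mult * (1 + well_index)   # fine-grid (well) response

obs_pressure = 250.0
obs_rates = [15.0, 24.0, 45.0]

# Global update: calibrate the large-scale parameter to field-level energy.
g = np.linspace(0.5, 5.0, 451)
global_mult = g[np.argmin((field_pressure(g) - obs_pressure) ** 2)]

# Local update: with the global parameter frozen, match each well separately.
l = np.linspace(0.1, 5.0, 491)
local_mults = [l[np.argmin((well_rate(l, w) - target) ** 2)]
               for w, target in enumerate(obs_rates)]

print(global_mult, local_mults)   # 2.5 and [1.5, 1.2, 1.5]
```

The point of the split is that the cheap global stage fixes the field-scale energy balance before any expensive well-by-well tuning begins.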
The proposed approach avoids the use of complex optimization tools; rather, we emphasize the visual and intuitive appeal of the streamline method and utilize analytic solutions derived from the relationship between streamline time of flight and flow rates. The approach is analytic, easy to implement, and well suited for large-scale field applications. Finally, we present a hierarchical Pareto-based approach to history matching under conflicting information. Here we focus on the multiobjective optimization problem, particularly conflicting objectives during history matching of reservoir performance. We incorporate a Pareto-based multiobjective evolutionary algorithm and the Grid Connectivity-based Transformation (GCT) to account for history matching with conflicting information. The power and effectiveness of our approaches are demonstrated using both synthetic and real field cases.

Item: A probabilistic workflow for uncertainty analysis using a proxy-based approach applied to tight reservoir simulation studies (2016-08)
Wantawin, Marut; Sepehrnoori, Kamy; Yu, Wei

Uncertainty associated with reservoir simulation studies should be thoroughly captured during the history matching process and adequately propagated into production forecasts. Lacking information and the limited accuracy of measurements typically make reservoir properties in simulation models uncertain. Unconventional tight reservoirs, for instance, often involve complex dynamic flow behavior and inexact hydraulic fracture dimensions that directly affect production estimates. Non-unique history matching solutions should be treated probabilistically to avoid underestimating the uncertainty in prediction results. Assisted history matching techniques have been widely proposed in the literature to quantify this uncertainty. However, few applications have addressed unconventional reservoirs, where some distinct uncertain factors can significantly influence well performance.
In this thesis, a probabilistic workflow was developed using a proxy-modeling approach to encompass the uncertain parameters of unconventional reservoirs and obtain reliable predictions. Proxy models were constructed using Design of Experiments (DoE) and Response Surface Methodology (RSM). Preliminary screening identified the significant parameters, removing insignificant ones to reduce the problem dimensions. Proxy models were then systematically built to approximate the actual simulation, and sampling algorithms, e.g. the Markov Chain Monte Carlo (MCMC) method, estimated probabilistic history matching solutions. An iterative procedure was also introduced to gradually improve the accuracy of the proxy models in the region of interest, where history matching errors are low. The workflow was applied to case studies in the Middle Bakken reservoir and the Marcellus Shale formation. In addition to estimating the misfit function for the errors, proxy models were also regressed on the simulated measurements at various points in time, which proved very useful. This alternative method was applied to a synthetic tight reservoir model to analyze the impact of a complex fracture network on instantaneous well performance at different stages. The results in this thesis show that the proxy-based approach provides a reasonable, simplified approximation of the actual simulation. Proxy models are also flexible and practical for demonstrating non-unique history matching solutions and analyzing the probability distributions of complicated reservoir and fracture properties.
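The DoE → response surface → MCMC chain above can be sketched with a one-parameter stand-in for the simulator. The misfit function, design points, and proposal width below are hypothetical illustrations of the pattern, not the thesis's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an expensive simulator run: history matching misfit as a
# function of one uncertain parameter (hypothetical, e.g. a fracture
# half-length multiplier).
def simulator_misfit(x):
    return (x - 2.0) ** 2 + 0.5

# Design of Experiments: a handful of simulator runs at chosen design points.
design = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
responses = simulator_misfit(design)

# Response Surface Methodology: quadratic proxy fitted to the design runs.
coeffs = np.polyfit(design, responses, deg=2)
proxy = lambda x: np.polyval(coeffs, x)

# Metropolis MCMC on the *proxy* misfit (cheap), never the simulator itself.
x, chain = 0.0, []
for _ in range(20000):
    cand = x + rng.normal(0, 0.5)
    # accept with probability min(1, exp(misfit(x) - misfit(cand)))
    if np.log(rng.uniform()) < proxy(x) - proxy(cand):
        x = cand
    chain.append(x)

print(np.mean(chain[2000:]))   # concentrates near the history-matched region
```

The iterative refinement the abstract describes would re-run the simulator near the low-misfit region visited by the chain and refit the proxy there.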
Ultimately, the developed workflow delivers probabilistic production forecasts with modest computational requirements.

Item: Heterogeneous Reservoir Characterization Utilizing Efficient Geology Preserving Reservoir Parameterization through Higher Order Singular Value Decomposition (HOSVD) (2015-01-21)
Afra, Sardar

Petroleum reservoir parameter inference is a challenging problem in many reservoir simulation workflows, especially for real reservoirs with a high degree of complexity, non-linearity, and dimensionality. Estimating a large number of unknowns in an inverse problem leads to a very costly computational effort. Moreover, it is important that reservoir parameter adjustments remain geologically consistent as data are assimilated in the history matching process, i.e., the process of adjusting the parameters of the reservoir system so that the output of the reservoir model matches past production data. It is therefore of great interest to approximate reservoir petrophysical properties such as permeability and porosity while reparameterizing them through reduced-order models. Petroleum reservoir models are in general complex, nonlinear, and large-scale, i.e., they have large numbers of states and unknown parameters. A practical approach that reduces the number of reservoir parameters, so that the reservoir model can be reconstructed in a lower dimension, is thus of high interest. Furthermore, de-correlating system parameters while keeping the geological description intact is paramount for controlling the ill-posedness of history matching and reservoir characterization problems. In the first part of this work, we introduce a novel parameterization method based on higher order singular value decomposition (HOSVD).
We show that HOSVD outperforms classical parameterization techniques with respect to computational and implementation cost. It also provides more reliable and accurate predictions in the petroleum reservoir history matching problem, owing to its ability to preserve geological features of reservoir parameters such as permeability. The power of HOSVD is investigated on several synthetic and real petroleum reservoir benchmarks, and all results are compared to those of classic SVD. In addition to the parameterization problem, we also address the ability of HOSVD to reproduce production data close to those of the original reservoir system. To generate the results, we employ the commercial reservoir simulator ECLIPSE. In the second part of the work, we address the inverse modeling problem, i.e., reservoir history matching. We employ the ensemble Kalman filter (EnKF), an ensemble-based characterization approach, to solve the inverse problem, and integrate the new parameterization technique into the EnKF algorithm to study the suitability of HOSVD-based parameterization for reducing the dimensionality of the parameter space and for estimating geologically consistent permeability distributions. Several numerical examples, including synthetic and real reservoir benchmarks, illustrate the characteristics of the proposed method, and the advantages of HOSVD are discussed by comparing its performance to the classic SVD (PCA) parameterization approach.
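A truncated HOSVD (Tucker decomposition) amounts to one SVD per mode-n unfolding of the property tensor; a minimal numpy sketch, in which the small random low-rank tensor stands in for a permeability cube (purely illustrative dimensions and ranks):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated higher-order SVD (Tucker): one SVD per mode unfolding."""
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    core = T
    for n, Un in enumerate(U):   # project each mode onto its leading factors
        core = np.moveaxis(np.tensordot(Un.T, np.moveaxis(core, n, 0), axes=1), 0, n)
    return core, U

def reconstruct(core, U):
    T = core
    for n, Un in enumerate(U):   # expand each mode back to full dimension
        T = np.moveaxis(np.tensordot(Un, np.moveaxis(T, n, 0), axes=1), 0, n)
    return T

# Hypothetical low-rank "permeability cube" (nx, ny, nz) with multilinear
# rank at most (2, 2, 2), built as a rank-2 sum of outer products.
rng = np.random.default_rng(1)
T = np.einsum('ia,ja,ka->ijk', rng.normal(size=(10, 2)),
              rng.normal(size=(12, 2)), rng.normal(size=(8, 2)))

core, U = hosvd(T, ranks=(2, 2, 2))
err = np.linalg.norm(reconstruct(core, U) - T) / np.linalg.norm(T)
print(err)   # ~0: the small core captures the full tensor
```

The calibration then operates on the small core (and factor columns) instead of the full cell-by-cell property field, which is the dimensionality reduction the abstract refers to.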
Item: History Matching and Optimization Using Stochastic Methods: Applications to Chemical Flooding (2014-09-04)
Zhang, Zheng

A typical lifecycle of an oil and gas field is characterized by three stages: primary recovery by natural depletion, secondary recovery by fluid injection, and enhanced oil recovery (EOR). The primary goal of reservoir management is to increase hydrocarbon recovery while reducing capital and operational expenditures.
Two key techniques for successful reservoir management are model calibration and production optimization. History matching is used to calibrate existing geological models against measured data and to predict the range of future recovery. Production optimization on calibrated reservoir models provides economic assessment of different field development plans and suggests optimal strategies to maximize recovery and minimize cost. We first present a workflow for history matching in chemical flooding. Evolutionary algorithms are the method of choice because of their ability to calibrate various parameter types and their global search nature. The chemical flooding simulator UTCHEM, developed by The University of Texas at Austin, is coupled into the history matching process to model complex mechanisms such as phase behavior and chemical and physical transformations. Next, we apply the proposed workflow to calibrate models in multiple stages, which efficiently reduces the large number of uncertain parameters in alkaline-surfactant-polymer (ASP) flooding. Each stage of model calibration proceeds from the field scale to the individual well scale, accounting for ASP-specific behaviors such as surfactant and polymer adsorption. The proposed multi-stage history matching workflow delivers better history matching results and significantly reduces the uncertainty of the large number of parameters involved in chemical flooding. Lastly, we extend the evolutionary workflows to multi-objective optimization by introducing the concept of Pareto optimality. A Pareto front method is proposed, instead of the weighted-sum method, to handle conflicting objective functions such as oil production and chemical efficiency when optimizing ASP flooding. The Non-dominated Sorting Genetic Algorithm II (NSGA-II) is used to search for Pareto optimal solutions.
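The core operation behind any Pareto front method is the dominance comparison; a minimal sketch (the misfit pairs below are purely illustrative stand-ins for conflicting objectives such as oil-production and chemical-efficiency misfits):

```python
import numpy as np

def dominates(a, b):
    """a Pareto-dominates b when a is no worse in every objective and
    strictly better in at least one (all objectives are minimized)."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical (objective-1 misfit, objective-2 misfit) pairs for five
# candidate models; the numbers are illustrative only.
candidates = [(1.0, 5.0), (2.0, 2.0), (5.0, 1.0), (3.0, 3.0), (6.0, 6.0)]
front = pareto_front(candidates)
print(front)   # [(1.0, 5.0), (2.0, 2.0), (5.0, 1.0)]
```

Unlike a weighted sum, the front retains the whole trade-off curve, so (1.0, 5.0) and (5.0, 1.0) both survive even though neither is best on both objectives; NSGA-II builds on repeated non-dominated sorts of this kind.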
The robustness and practical feasibility of our approaches are demonstrated through both synthetic and field examples.

Item: Leak-off test (LOT) models combining wellbore and near-wellbore mechanical and thermal behaviors (2015-08)
Gandomkar, Arjang; Gray, Kenneth E.; Daigle, Hugh C.

Considerable effort has gone into modeling leak-off tests (LOTs) and leak-off behavior. Altun presented a model that estimates leak-off volume by dividing the wellbore system into four sub-systems: mud compression, casing expansion, fluid leakage, and borehole expansion (Altun 2001). The volume response from each sub-system is then combined to represent the total volume pumped during a LOT. Most existing leak-off models do not account for the mechanical behavior of the cement and rock formations around the wellbore; although their compressibilities are small, the associated volume changes can be significant. In this research, a mechanical expansion model has been developed, based on the linearly elastic concentric-cylinder theory of Norris (Norris 2003). The model is an extension of the Lamé equations to multiple concentric cylinders and assumes that the horizontal stresses on the system's boundary are applied equally in all directions, i.e., the horizontal far-field stresses around the system are isotropic. The resulting model simulates the compound radial displacements of casing, cement, and formation along the cased hole, based on the pressures inside the wellbore and in the far-field stress region. The volume generated from concentric-cylinder expansion is then combined with Altun's model to simulate the total volume pumped during a LOT. One use of the model is the estimation of the minimum horizontal far-field stress: since the model consists of concentric cylinders, the pressure on the outside boundary can approximate the minimum horizontal far-field stress, which in turn is related to the overburden pressure.
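The Lamé solution the model builds on gives closed-form stresses in each cylinder; a single-cylinder sketch, with hypothetical (not field) geometry and pressures, showing the boundary conditions the solution must satisfy:

```python
import numpy as np

def lame_radial_stress(r, a, b, p_in, p_out):
    """Radial stress in a thick-walled elastic cylinder (Lamé solution):
    inner radius a under pressure p_in, outer radius b under p_out.
    Compression is negative."""
    A = (p_in * a**2 - p_out * b**2) / (b**2 - a**2)
    B = (p_in - p_out) * a**2 * b**2 / (b**2 - a**2)
    return A - B / np.asarray(r)**2

# Hypothetical single-casing geometry and pressures (illustrative values):
a, b = 0.10, 0.12          # m, inner / outer radius of the cylinder
p_in, p_out = 30e6, 20e6   # Pa, wellbore pressure / pressure at the outer face

# Boundary checks: radial stress equals the applied pressure at each face.
print(lame_radial_stress(a, a, b, p_in, p_out))  # -p_in at the inner wall
print(lame_radial_stress(b, a, b, p_in, p_out))  # -p_out at the outer wall
```

In the multi-cylinder extension, continuity of radial stress and displacement at each casing/cement/formation interface chains these single-cylinder solutions together.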
The pressure inside the innermost cylinder is calculated from the known mud weight. With an initial estimate for the far-field stress and iterative methods, the minimum horizontal stress can be estimated. The developed models were applied to field LOT data from the Gulf of Mexico. The results show that leak-off volume along the cased hole should be analyzed as a compound expansion of casing, cement, and formation.

Item: Model selection for CO₂ sequestration using surface deflection and injection data (2015-08)
Nwachukwu, Chiazor; Srinivasan, Sanjay; Sepehrnoori, Kamy

In recent years, sequestration of CO₂ in the subsurface has been studied extensively as an approach to curbing carbon emissions into the atmosphere. Monitoring the fate and migration of the CO₂ plume in the aquifer is of utmost interest to regulators and operators. Current monitoring techniques such as time-lapse seismic are expensive and have limited applicability; moreover, they have little predictive value unless embedded within a feedback-style control scheme. Provided that field data such as bottom-hole pressures, well rates, or even surface deformation are available, geologic models of the aquifer can be created and used as input to a flow simulator to predict the migration of CO₂. A history matching approach has been developed, within a model selection framework, to select and refine geologic models until they represent the spatial heterogeneity of the target aquifer and produce forecasts with relatively small uncertainty. An initial large suite of models can be created based on prior information about the aquifer. Predicting the response from all of these models, however, is computationally expensive. A particle-tracking algorithm has been developed to estimate the flow response from geologic models while significantly reducing computational costs.
This algorithm serves as a fast approximation of finite-difference flow simulation and is meant to provide a rapid estimate of the connectivity of the aquifer models. A finite element method (FEM) solver was also developed to approximate the geomechanical effects in the rock caused by the injection of CO₂; it uses a partial coupling scheme to sequentially solve the flow and geomechanical equilibrium equations. The validity of the proxies is tested on both 2D and 3D field cases, and the solutions are shown to correlate reasonably well with full-physics simulations. We also demonstrate the application of the model selection algorithm to a 3D reservoir with complex topography. The algorithm includes three main steps: (1) predicting the flow and geomechanical response of a large prior ensemble of models using the proxies; (2) grouping models with similar responses into clusters using multidimensional scaling together with a k-means clustering approach; and (3) selecting the model cluster that produces the minimum deviation from the observed field data. The model selection procedure can be repeated using the sub-group of models within a selected cluster to further refine the forecasts of future plume migration. The entire iterative scheme is demonstrated using injection data from the Krechba reservoir in Algeria, an active site for CO₂ sequestration.

Item: Particle tracking proxies for prediction of CO₂ plume migration within a model selection framework (2014-05)
Bhowmik, Sayantan; Srinivasan, Sanjay; Bryant, Steven L.

Geologic sequestration of CO₂ in deep saline aquifers has been studied extensively over the past two decades as a viable method of reducing anthropogenic carbon emissions. Monitoring and predicting the movement of injected CO₂ are important for assessing containment of the gas within the storage volume and for taking corrective measures if required.
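The cluster-then-select steps used in the model selection items above (proxy responses → k-means clusters → cluster closest to the observed data) can be sketched with toy data. The response vectors, the two "geologic scenarios", and the observed vector below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

def kmeans(X, k, iters=50):
    """Minimal Lloyd's k-means on model response vectors."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = np.argmin(d, axis=1)
        # keep a center unchanged if its cluster momentarily empties
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Hypothetical proxy responses (e.g. an injection-pressure series sampled at
# five times) for a prior ensemble of 60 models from two geologic scenarios.
responses = np.vstack([rng.normal(1.0, 0.1, size=(30, 5)),
                       rng.normal(2.0, 0.1, size=(30, 5))])
observed = np.full(5, 1.95)                     # observed field response

labels, centers = kmeans(responses, k=2)
best = int(np.argmin(np.linalg.norm(centers - observed, axis=1)))
selected = responses[labels == best]            # posterior sub-ensemble
print(len(selected))                            # models surviving selection
```

Iterating the scheme, as the abstracts describe, means re-clustering within `selected` to shrink the forecast spread further.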
Given the uncertainty in the geologic architecture of storage aquifers, it is reasonable to depict our prior knowledge of the project area using a vast suite of aquifer models. Simulating such a large number of models with traditional numerical flow simulators to evaluate uncertainty is computationally expensive. A novel stochastic workflow for characterizing plume migration, based on a model selection algorithm developed by Mantilla in 2011, has been implemented. The approach includes four main steps: (1) assessing the connectivity/dynamic characteristics of a large prior ensemble of models using proxies; (2) clustering the models using principal component analysis or multidimensional scaling coupled with k-means clustering; (3) selecting models by applying Bayes' rule on the reduced model space; and (4) expanding the selected models using an ensemble pattern-based matching scheme. In this dissertation, two proxies based on particle tracking have been developed to assess the flow connectivity of models in the initial set. The proxies serve as fast approximations of finite-difference flow simulation and provide rapid estimates of the connectivity of the aquifer models. Modifications have also been made to the model selection workflow to accommodate the particular problem of application to a carbon sequestration project. The applicability of the proxies is tested on both synthetic models and real field case studies. The first proxy is shown to capture areal migration reasonably well while failing to adequately capture vertical, buoyancy-driven flow of CO₂. This limitation is addressed in the second proxy, whose applicability is demonstrated not only for horizontal migration but also for buoyancy-driven flow.
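A particle-tracking proxy of the kind described above advects a cloud of particles through an approximate velocity field instead of solving the full flow equations. A minimal forward-Euler sketch; the 2D velocity field (lateral drift that slows with distance, plus a constant buoyant rise) is a hypothetical stand-in, not the dissertation's proxy:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical 2D proxy velocity field for the CO2-rich particles.
def velocity(p):
    vx = 1.0 / (1.0 + p[:, 0])           # lateral speed decays with distance
    vz = np.full(len(p), 0.2)            # buoyancy-driven vertical component
    return np.column_stack([vx, vz])

# Release a cloud of particles near the injector and advect with Euler steps;
# the particle cloud stands in for the migrating CO2 plume.
particles = rng.uniform(0.0, 0.1, size=(500, 2))
dt, steps = 0.1, 100
for _ in range(steps):
    particles += dt * velocity(particles)

print(particles.mean(axis=0))   # mean lateral extent and buoyant rise
```

The abstract's distinction between the two proxies maps onto whether the vertical (buoyancy) component of such a velocity field is modeled at all.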
Both proxies are tested as standalone approximations of numerical simulation and within the larger model selection framework.

Item: Predicting the migration of CO₂ plume in saline aquifers using probabilistic history matching approaches (2012-05)
Bhowmik, Sayantan; Srinivasan, Sanjay; Bryant, Steven L.

During the operation of a geological carbon storage project, verifying that the CO₂ plume remains within the permitted zone is of particular interest both to regulators and to operators. However, the cost of many monitoring technologies, such as time-lapse seismic, limits their application. Adequate predictions of plume migration require proper representation of heterogeneous permeability fields. Previous work has shown that injection data (pressures, rates) from wells can provide a means of characterizing complex permeability fields in saline aquifers. Given that injection data are readily available, they offer an inexpensive alternative for monitoring; combined with a flow model like the one developed in this work, these data can even be used to predict plume migration. The predicted migration pathways can then be compared to field observations such as time-lapse seismic or satellite measurements of surface deformation to ensure containment of the injected CO₂ within the storage area. In this work, two novel methods for creating heterogeneous permeability fields constrained by injection data are demonstrated. The first method is an implementation of a probabilistic history matching algorithm that creates aquifer models for predicting the movement of the CO₂ plume: the geologic property of interest, for example hydraulic conductivity, is updated conditioned to geological information and injection pressures. The resulting geologically consistent aquifer model can be used to reliably predict the movement of the CO₂ plume in the subsurface.
The second method is a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty, producing a posterior set of models whose injection performance is consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO₂ plume. The applicability of both methods is demonstrated using a field data set from central Algeria.

Item: Quantification of uncertainties associated with reservoir performance simulation (Texas Tech University, 2007-05)
Oghena, Andrew; Heinze, Lloyd R.; Ziaja, Malgorzata; Siddiqui, Shameem

This research presents a method to quantify the uncertainty associated with reservoir performance prediction after history matching by conditioning black oil simulation with compositional simulation. Two test cases were investigated. In the first, a black oil history-matched model of a naturally depleted volatile oil reservoir was used to predict reservoir performance. The same reservoir was then simulated with a compositional model, which was likewise used to forecast performance. The difference between the cumulative oil production predicted by the black oil and compositional models was evaluated using an objective function. To minimize the objective function, the black oil and compositional reservoir descriptions were perturbed equally to generate multiple realizations. These new realizations were used to predict oil recovery, and their forecasts were optimized. Non-linear analysis of the optimization results was used to quantify the range of uncertainty associated with the predicted cumulative oil production. A second test case was studied in which the same volatile reservoir was produced under a water-alternating-gas injection scheme.
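The black-oil-versus-compositional mismatch driving the perturbation loop above can be written as a simple objective function. The forecast arrays below are illustrative numbers, not field data, and the normalized sum-of-squares form is one reasonable choice rather than the thesis's exact formula:

```python
import numpy as np

# Hypothetical cumulative-oil forecasts (STB) from the two simulators at the
# same report times; the numbers are illustrative only.
years = np.array([1, 2, 3, 4, 5])
black_oil     = np.array([1.00, 1.80, 2.40, 2.85, 3.15]) * 1e6
compositional = np.array([0.95, 1.70, 2.30, 2.80, 3.20]) * 1e6

def objective(pred, ref):
    """Normalized sum-of-squares mismatch between two forecasts; this is
    the quantity a perturbation loop would drive toward zero."""
    return float(np.sum(((pred - ref) / ref) ** 2))

mismatch = objective(black_oil, compositional)
print(mismatch)   # small value: the two models nearly agree
```

Perturbing both reservoir descriptions and re-evaluating this objective for each realization yields the spread of outcomes from which the uncertainty range is then quantified.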
As in the first test case, it is shown how optimization followed by non-linear analysis of both the black oil and compositional simulation predictions can be used to assess uncertainty in reservoir performance forecasts. It is well known that a disadvantage of the black oil model is its inability to represent detailed reservoir fluid composition. To overcome this limitation, this research presents a technique based on conditioning black oil output with compositional simulation, in order to better account for the influence of fluid phase behavior and reservoir description on reservoir performance.