Browsing by Subject "uncertainty"
Now showing 1 - 15 of 15
Item A Combined Skeleton Model (2014-12-10) Miller, Daniel Evan
Skeleton representations are a fundamental way of representing a variety of solid models. They are particularly important for representing certain biological models and are often key to visualizing such data. Several methods exist for extracting skeletal models from 3D data sets. Unfortunately, there is usually no single correct definition of what makes a good skeleton, and different methods will produce different skeletal models from a given input. Furthermore, many scanned data sets carry inherent noise and loss of data from the scanning process that can reduce the ability to identify a skeleton. In this document, I propose a method for combining multiple algorithms' skeleton results into a single composite skeletal model. This model leverages various aspects of the geometric and topological information contained in the different input skeletal models to form a single result that may limit the error introduced by particular inputs by means of a confidence function. Using such an uncertainty-based model, one can better understand, refine, and de-noise/simplify the skeletal structure. The following pages describe methods for forming this composite model and examples of applying it to real-world data sets.

Item An Updated Procedure for Tare and Interference Wind Tunnel Testing of Strut-Mounted Models (2014-05-02) Kutz, Douglas M
Despite advances in modern computing and simulation, wind tunnel testing remains the most trusted method for determining aerodynamic vehicle behavior. Because of the presence of wind tunnel walls, corrections are applied to obtain results representative of free-air performance. The standard correction procedure adjusts for the presence of these boundaries using approximations based on linear potential flow theory. Separately, tare and interference removal involves the linear subtraction of mounting-strut effects, accomplished using mirrored mounting systems.
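A hedged sketch of this linear tare-and-interference subtraction, with root-sum-square propagation of the component uncertainties, might look like the following (all function names and numbers are hypothetical, not from the thesis):

```python
import math

def tare_corrected_force(measured, strut_tare, strut_interference):
    # Linear tare-and-interference removal: subtract the strut's direct
    # (tare) and flow-field (interference) contributions, each obtained
    # with a mirrored mounting system.
    return measured - strut_tare - strut_interference

def combined_uncertainty(*sigmas):
    # Root-sum-square propagation for a linear subtraction, assuming
    # independent component uncertainties.
    return math.sqrt(sum(s * s for s in sigmas))

drag = tare_corrected_force(measured=105.0, strut_tare=6.5, strut_interference=1.5)
sigma_drag = combined_uncertainty(0.8, 0.3, 0.2)
```

Because the correction is linear, each subtraction step contributes its own uncertainty term, which is why the thesis tracks uncertainty through every step of the analysis.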
Uncertainty in wind tunnel data is quantified throughout each step of the data analysis procedure. Additionally, an updated procedure for the analysis and correction of wind tunnel data for strut-mounted models is recommended.

Item Applying Calibration to Improve Uncertainty Assessment (2013-08-02) Fondren, Mark Edward
Uncertainty has a large effect on projects in the oil and gas industry, because most aspects of project evaluation rely on estimates. Industry routinely underestimates uncertainty, often significantly. The tendency to underestimate uncertainty is nearly universal. The cost associated with underestimating uncertainty, or overconfidence, can be substantial. Studies have shown that moderate overconfidence and optimism can result in expected portfolio disappointment of more than 30%. It has been shown that uncertainty can be assessed more reliably through look-backs and calibration, i.e., comparing actual results to probabilistic predictions over time. While many recognize the importance of look-backs, calibration is seldom practiced in industry. I believe a primary reason for this is the lack of systematic processes and software for calibration. The primary development of my research is a database application that provides a way to track probabilistic estimates and their reliability over time. The Brier score and its components, mainly calibration, are used for evaluating reliability. The system is general in the types of estimates and forecasts that it can monitor, including production, reserves, time, costs, and even quarterly earnings. Forecasts may be assessed visually, using calibration charts, and quantitatively, using the Brier score. The calibration information can be used to modify probabilistic estimation and forecasting processes as needed to make them more reliable. Historical data may be used to externally adjust future forecasts so they are better calibrated. Three experiments with historical data sets of predicted vs.
actual quantities, e.g., drilling costs and reserves, are presented and demonstrate that external adjustment of probabilistic forecasts improves future estimates. Consistent application of this approach and database application over time should improve probabilistic forecasts, resulting in improved company and industry performance.

Item Automatic history matching in Bayesian framework for field-scale applications (Texas A&M University, 2006-04-12) Mohamed Ibrahim Daoud, Ahmed
Conditioning geologic models to production data and assessment of uncertainty is generally done in a Bayesian framework. The current Bayesian approach suffers from three major limitations that make it impractical for field-scale applications. These are: first, the CPU time of the Bayesian inverse problem using the modified Gauss-Newton algorithm with full covariance as regularization scales quadratically with increasing model size; second, the sensitivity calculation using finite difference as the forward model depends upon the number of model parameters or the number of data points; and third, the high CPU time and memory required for covariance matrix calculation. Different attempts have been made to alleviate the third limitation by using analytically derived stencils, but these are limited to exponential models only. We propose a fast and robust adaptation of the Bayesian formulation for inverse modeling that overcomes many of the current limitations. First, we use a commercial finite difference simulator, ECLIPSE, as a forward model, which is general and can account for the complex physical behavior that dominates most field applications. Second, the production data misfit is represented by a single generalized travel time misfit per well, thus effectively reducing the number of data points to one per well and ensuring the matching of the entire production history. Third, we use both the adjoint method and a streamline-based sensitivity method for sensitivity calculations.
The adjoint method depends on the number of wells integrated, which is generally an order of magnitude less than the number of data points or model parameters. The streamline method is more efficient and faster as it requires only one simulation run per iteration regardless of the number of model parameters or data points. Fourth, for solving the inverse problem, we utilize an iterative sparse matrix solver, LSQR, along with an approximation of the square root of the inverse of the covariance calculated using a numerically derived stencil, which is broadly applicable to a wide class of covariance models. Our proposed approach is computationally efficient and, more importantly, the CPU time scales linearly with respect to model size. This makes automatic history matching and uncertainty assessment using a Bayesian framework more feasible for large-scale applications. We demonstrate the power and utility of our approach using synthetic cases and a field example. The field example is from the Goldsmith San Andres Unit in West Texas, where we matched 20 years of production history and generated multiple realizations using the Randomized Maximum Likelihood method for uncertainty assessment. Both the adjoint method and the streamline-based sensitivity method are used to illustrate the broad applicability of our approach.

Item Control of systems subject to uncertainty and constraints (2009-05-15) Villota Cerna, Elizabeth Roxana
All practical control systems are subject to constraints, namely constraints arising from the actuator's limited range and rate capacity (input constraints) or from imposed operational limits on plant variables (output constraints). A linear control system typically yields the desirable small-signal performance. However, the presence of input constraints often causes undesirable large-signal behavior and potential instability.
An anti-windup control consists of a remedial solution that mitigates the effect of input constraints on the closed loop without affecting the small-signal behavior. Conversely, an override control addresses the control problem involving output constraints and also follows the idea that large-signal control objectives should not alter small-signal performance. Importantly, these two remedial control methodologies must incorporate model uncertainty into their design to be considered reliable in practice. In this dissertation, shared principles of design for the remedial compensation problem are identified which simplify the picture when analyzing, comparing, and synthesizing the variety of existing remedial schemes. Two performance objectives, each one related to a different type of remedial compensation, and a general structural representation associated with both remedial compensation problems will be considered. The effect of remedial control on the closed loop will be evaluated in terms of two general frameworks which permit the unification and comparison of all known remedial compensation schemes. The difference systems describing the performance objectives will be further employed for comparison of remedial compensation schemes under uncertainty considerations and also for synthesis of compensators. On the basis of the difference systems and the general structure for remedial compensation, systematic remedial compensation synthesis algorithms for anti-windup and override compensation will be given and compared. Successful application of the proposed robust remedial control synthesis algorithms will be demonstrated via simulation.

Item Design with Uncertain Technology Evolution (2012-10-19) Arendt, Jonathan Lee
Design is an uncertain human activity involving decisions with uncertain outcomes. Sources of uncertainty in product design include uncertainty in modeling methods, market preferences, and performance levels of subsystem technologies, among many others.
The performance of a technology evolves over time, improving as research and development efforts continue. As the future performance of a technology is uncertain, quantifying the evolution of these technologies poses a challenge in making design decisions. Designing systems involving evolving technologies is a poorly understood problem. The objective of this research is to create a computational method allowing designers to make decisions encompassing the evolution of technology. Techniques for modeling the evolution of a technology that has multiple performance attributes are developed. An S-curve technology evolution model is used: the performance of a technology develops slowly at first, quickly during heavy R&D effort, and slowly again as the performance approaches its limits. Pareto frontiers represent the set of optimal solutions that the decision maker can select from. As the performance of a technology develops, the Pareto frontier shifts to a new location. The assumed S-curve form of technology development allows the designer to apply the uncertainty of technology development directly to the S-curve evolution model rather than applying the uncertainty to the performance, giving a more focused application of uncertainty in the problem. Monte Carlo simulations are used to propagate uncertainty through the decision. The decision-making methods give designers greater insight when making long-term decisions regarding evolving technologies. The scenario of an automotive manufacturing firm entering the electric vehicle market and deciding which battery technology to include in its new line of electric cars is used to demonstrate the decision-making method.
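The S-curve evolution model with Monte Carlo propagation described above can be sketched as follows; the logistic form and all parameter distributions here are illustrative stand-ins, not the dissertation's calibrated values:

```python
import math
import random

def s_curve(t, limit, rate, midpoint):
    # Logistic S-curve: slow early growth, rapid improvement during heavy
    # R&D effort, and saturation as performance approaches its limit.
    return limit / (1.0 + math.exp(-rate * (t - midpoint)))

def sample_future_performance(t_future, n=5000, seed=0):
    # Monte Carlo propagation: uncertainty is placed on the S-curve
    # parameters themselves rather than directly on performance.
    rng = random.Random(seed)
    return [s_curve(t_future,
                    limit=300.0,                      # hypothetical performance ceiling
                    rate=abs(rng.gauss(0.5, 0.1)),    # uncertain development speed
                    midpoint=rng.gauss(2015.0, 2.0))  # uncertain inflection year
            for _ in range(n)]

perf_2020 = sample_future_performance(2020.0)
```

The resulting sample distribution of future performance is what a designer would compare across competing technologies, e.g., candidate battery chemistries.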
Another scenario, of a wind turbine energy company deciding which technology to invest in, demonstrates a more sophisticated technology evolution modeling technique and the decision-making-under-uncertainty method.

Item Essays on Modeling the Economic Impacts of a Foreign Animal Disease on the United States Agricultural Sector (2011-02-22) Hagerman, Amy Deann
Foreign animal disease can cause serious damage to the United States (US) agricultural sector, and foot-and-mouth disease (FMD), in particular, poses a serious threat. FMD causes death and reduced fecundity in infected animals, as well as significant economic consequences. FMD damages can likely be reduced through implementing pre-planned response strategies. Empirical studies have evaluated the economic consequences of alternative strategies, but typically employ simplified models. This dissertation seeks to improve US preparedness for avoiding and/or responding to an animal disease outbreak by addressing three issues related to strategy assessment in the context of FMD: integrated multi-region economic and epidemic evaluation, inclusion of risk, and information uncertainty. An integrated economic/epidemic evaluation is done to examine the impact of various control strategies. This is done by combining a stochastic, spatial FMD simulation model with a national-level, regionally disaggregated agricultural sector mathematical programming economic model. In the analysis, strategies are examined in the context of California's dairy industry. Alternative vaccination, disease detection, and movement restriction strategies are considered, as are trade restrictions. The results reported include epidemic impacts, national economic impacts, prices, regional producer impacts, and disease control costs under the alternative strategies. Results suggest that, including trade restrictions, the median national loss from the disease outbreak is as much as $17 billion when feed can enter the movement restriction zone.
Early detection reduces the median loss and the standard deviation of losses. Vaccination does not reduce the median disease loss, but does have a smaller standard deviation of loss, which indicates it is a risk-reducing strategy. Risk in foreign animal disease outbreaks is present from several sources; however, studies comparing alternative control strategies assume risk neutrality. In reality, there will be a desire to minimize the national loss as well as to minimize the chance of an extreme outcome from the disease (i.e., risk aversion). We perform analysis of FMD control strategies using breakeven risk aversion coefficients in the context of an outbreak in the Texas High Plains. Results suggest that vaccination, while not reducing average losses, is a risk-reducing strategy. Another issue related to risk and uncertainty is the response of consumers and domestic markets to the presence of FMD. Using a highly publicized possible FMD outbreak in Kansas that did not turn out to be true, we examine the role of information uncertainty in futures market response. Results suggest that livestock futures markets respond to adverse information even when that information is untrue. Furthermore, the existence of herding behavior and the potential for momentum trading exaggerate the impact of information uncertainty related to animal disease.

Item Impacts of project management on real option values (Texas A&M University, 2005-02-17) Bhargav, Shilpa Anandrao
The cost of construction projects depends on their size, complexity, and duration. Construction management applies effective management techniques to the planning, design, and construction of a project, from conception to completion, for the purpose of controlling time, cost, and quality. A real options approach in construction projects improves strategic thinking by helping planners recognize, design, and use flexible alternatives to manage dynamic uncertainty.
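As a stylized illustration of how the variance of an uncertain exercise signal drives a real option's value, consider a Monte Carlo sketch (all numbers are hypothetical, not the thesis's model):

```python
import random
import statistics

def option_value_mc(mean_signal, sigma, exercise_threshold, n=20000, seed=1):
    # Monte Carlo value of a stylized real option: the option is exercised
    # only when the uncertain exercise signal exceeds the threshold, so the
    # payoff is max(signal - threshold, 0).
    rng = random.Random(seed)
    payoffs = [max(rng.gauss(mean_signal, sigma) - exercise_threshold, 0.0)
               for _ in range(n)]
    return statistics.mean(payoffs)

value_high_variance = option_value_mc(100.0, 20.0, 110.0)
value_low_variance = option_value_mc(100.0, 5.0, 110.0)
```

Because the downside is truncated at zero while the upside is not, shrinking the variance of the exercise signal shrinks the option's value, which is the intuition behind the hypothesis that project management reduces real option values.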
In order to manage uncertainty using this approach, it is necessary to value the real options. Real option models assume independence between the option holder and the impacts of underlying uncertainties on performance and value. The current work proposes and initially tests whether project management reduces the value of real options. The example of resource allocation is used to test this hypothesis. Based on the results, it is concluded that project management reduces the value of real options by reducing the variance of the exercise signal and the difference between exercise conditions and the mean exercise signal.

Item Integrated Simulation and Optimization for Decision-Making under Uncertainty with Application to Healthcare (2014-11-26) Alvarado, Michelle
Many real applications require decision-making under uncertainty. These decisions occur at discrete points in time, influence future decisions, and have uncertainties that evolve over time. Mean-risk stochastic integer programming (SIP) is one optimization tool for decision problems involving uncertainty. However, it may be challenging to develop a closed-form objective for some problems. Consequently, simulation of the system performance under a combination of conditions becomes necessary. Discrete event system specification (DEVS) is a useful tool for simulation and evaluation, but simulation models do not naturally include a decision-making component. This dissertation develops a novel approach whereby simulation and optimization models interact and exchange information, leading to solutions that adapt to changes in system data. The integrated simulation and optimization approach was applied to the scheduling of chemotherapy appointments in an outpatient oncology clinic. First, a simulation of oncology clinic operations, DEVS-CHEMO, was developed to evaluate system performance from the patient's and management's perspectives. Four scheduling algorithms were developed for DEVS-CHEMO.
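The interaction between the simulation and optimization models can be sketched as a feedback loop; this is purely schematic (not the dissertation's code), with toy stand-ins for the optimizer and simulator:

```python
def integrated_sim_opt(optimize, simulate, meets_criteria, max_rounds=10):
    # Schematic feedback loop: the optimizer proposes a schedule, the
    # simulator evaluates it and returns revised system data, and the loop
    # repeats until the performance criteria are met or the round budget
    # is exhausted.
    data = {}
    schedule = None
    for _ in range(max_rounds):
        schedule = optimize(data)               # e.g. a mean-risk SIP model
        performance, data = simulate(schedule)  # e.g. a DEVS clinic model
        if meets_criteria(performance):
            break
    return schedule

# Toy stand-ins to show the control flow: each round the simulator asks
# for a slightly larger target until the criterion is satisfied.
def toy_optimize(data):
    return data.get("target", 0)

def toy_simulate(schedule):
    return schedule, {"target": schedule + 1}

final_schedule = integrated_sim_opt(toy_optimize, toy_simulate,
                                    lambda perf: perf >= 3)
```

The key design point is that neither model runs in isolation: the optimizer's schedule is validated by simulation, and simulated performance feeds back into the next optimization round.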
Computational results showed that assigning patients to both chairs and nurses improved system performance by reducing appointment duration by 3%, reducing waiting time by 34%, and reducing nurse overtime by 4%. Second, a set of mean-risk SIP models, SIP-CHEMO, was developed to determine the start date and resource assignments for each new patient's appointment schedule. SIP-CHEMO considers uncertainty in appointment duration, acuity levels, and resource availability. The SIP-CHEMO models utilize the expected excess and absolute semideviation mean-risk measures. The SIP-CHEMO models increased throughput by 1%, decreased waiting time by 41%, and decreased nurse overtime by 25% when compared to DEVS-CHEMO's scheduling algorithms. Finally, a new framework integrating DEVS and SIP, DEVS-SIP, was developed. The DEVS-CHEMO and SIP-CHEMO models were combined using the DEVS-SIP framework to create DEVS-SIP-CHEMO. Appointment schedules were determined using SIP-CHEMO and implemented in DEVS-CHEMO. If the system performance failed to meet predetermined stopping criteria, DEVS-CHEMO revised SIP-CHEMO and determined a new appointment schedule. Computational results showed that DEVS-SIP-CHEMO is preferred to using simulation or optimization alone. DEVS-SIP-CHEMO held throughput within 1% and improved nurse overtime by 90% and waiting time by 36% when compared to SIP-CHEMO alone.

Item On the Predictive Uncertainty of a Distributed Hydrologic Model (2009-05-15) Cho, Huidae
We use models to simulate the real world mainly for prediction purposes. However, since any model is a simplification of reality, a great deal of uncertainty remains even after the calibration of model parameters. The model's identifiability of realistic model parameters becomes questionable when the watershed of interest is small and its time of concentration is shorter than the computational time step of the model.
To improve the discovery of more reliable and more realistic sets of model parameters, rather than merely mathematical solutions, a new algorithm is needed. This algorithm should be able to identify mathematically inferior but more robust solutions as well as to take samples uniformly from high-dimensional search spaces for the purpose of uncertainty analysis. Various watershed configurations were considered to test the Soil and Water Assessment Tool (SWAT) model's identifiability of the realistic spatial distribution of land use, soil type, and precipitation data. The spatial variability in small watersheds did not significantly affect the hydrographs at the watershed outlet, and the SWAT model was not able to identify more realistic sets of spatial data. A new population-based heuristic called the Isolated Speciation-based Particle Swarm Optimization (ISPSO) was developed to enhance the explorability and the uniformity of samples in high-dimensional problems. The algorithm was tested on seven mathematical functions and outperformed other similar algorithms in terms of computational cost, consistency, and scalability. One of the test functions was the Griewank function, whose number of minima is not well defined although the function serves as a basis for evaluating multi-modal optimization algorithms. Numerical and analytical methods were proposed to count the exact number of minima of the Griewank function within a hyperrectangle. The ISPSO algorithm was applied to the SWAT model to evaluate the performance consistency of optimal solutions and to perform uncertainty analysis in the Generalized Likelihood Uncertainty Estimation (GLUE) framework without assuming a statistical structure of modeling errors. The algorithm successfully found hundreds of acceptable sets of model parameters, which were used to estimate their prediction limits.
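A GLUE-style derivation of prediction limits from behavioral parameter sets can be sketched as follows; the likelihood measure, threshold, and toy model below are illustrative assumptions, not the SWAT setup:

```python
def weighted_quantile(values_weights, q):
    # Smallest value whose cumulative normalized weight reaches q.
    pairs = sorted(values_weights)
    total = sum(w for _, w in pairs)
    cum = 0.0
    for value, w in pairs:
        cum += w / total
        if cum >= q:
            return value
    return pairs[-1][0]

def glue_prediction_limits(param_sets, model, observed, threshold,
                           q_lo=0.05, q_hi=0.95):
    # GLUE-style sketch: score each parameter set with an informal
    # likelihood, keep the "behavioral" sets above the threshold, and form
    # likelihood-weighted prediction limits at each time step.
    behavioral = []
    for p in param_sets:
        pred = model(p)
        sse = sum((o - y) ** 2 for o, y in zip(observed, pred))
        likelihood = 1.0 / (1.0 + sse)   # informal likelihood measure
        if likelihood >= threshold:
            behavioral.append((likelihood, pred))
    if not behavioral:
        return []
    limits = []
    for t in range(len(observed)):
        vw = [(pred[t], w) for w, pred in behavioral]
        limits.append((weighted_quantile(vw, q_lo), weighted_quantile(vw, q_hi)))
    return limits

# Toy example: a two-step "hydrograph" model y(t) = (p, 2p).
limits = glue_prediction_limits(param_sets=[0.5, 1.0, 1.5],
                                model=lambda p: [p, 2.0 * p],
                                observed=[1.0, 2.0],
                                threshold=0.1)
```

No statistical error structure is assumed; the spread of behavioral simulations, weighted by their informal likelihoods, directly yields the prediction limits.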
The uncertainty bounds of this approach were comparable to those of the typical GLUE approach.

Item Quantification of uncertainty in reservoir simulations influenced by varying input geological parameters, Maria Reservoir, CaHu Field (Texas A&M University, 2005-02-17) Schepers, Karine Chrystel
Finding and developing oil and gas resources requires accurate geological information with which to formulate strategies for exploration and exploitation ventures. When data are scarce, statistical procedures are sometimes substituted to compensate for the lack of information about reservoir properties. The most modern methods incorporate geostatistics. Even the best geostatistical methods yield results with varying degrees of uncertainty in their solutions. Geological information is, by its nature, spatially limited, and the geoscientist is handicapped in determining appropriate values for the various geological parameters that affect the final reservoir model (Massonnat, 1999). This study focuses on reservoir models that depend on geostatistical methods. It quantifies the uncertainty in the outcomes of reservoir simulations as six different geological variables are changed during a succession of reservoir simulations. In this study, variations in total fluid produced are examined by numerical modeling. Causes of uncertainty in the outcomes of the model runs are examined by changing one of six geological parameters for each run. The six geological parameters tested for their impact on reservoir performance are: 1) the variogram range used to krige thickness layers, 2) the morphology around well 14, 3) the shelf edge orientation, 4) the bathymetry ranges attributed to each facies, 5) the variogram range used to simulate facies distribution, and 6) the extent of the erosion at the top of the reservoir. The parameters were assigned values that varied from a minimum to a maximum quantity, determined from petrophysical and core analysis.
After the simulation runs had been completed, a realistic, 3-dimensional reservoir model was developed that revealed a range of reservoir production data. The parameters that had the most impact on reservoir performance were: 1) the amount of rock eroded at the top of the reservoir zone and 2) the bathymetry assigned to the reservoir facies. This study demonstrates how interaction between geological parameters influences reservoir fluid production and how variations in those parameters influence uncertainties in reservoir simulations, and it highlights the interdependencies between geological variables. The analysis of variance method used to quantify uncertainty in this study was found to be rapid, accurate, and highly satisfactory for this type of study. It is recommended for future applications in the petroleum industry.

Item Quantifying the Uncertainty in Estimates of World Conventional Oil Resources (2010-07-14) Tien, Chih-Ming
Since Hubbert proposed the "peak oil" concept to forecast ultimate recovery of crude oil for the U.S. and the world, there have been countless debates over the timing of peak world conventional oil production rate and ultimate recovery. From a review of the literature, forecasts were grouped into those that, like Hubbert's, predict an imminent peak, and those that do not. Both groups have bases for their positions. Viewpoints from the two groups are polarized, and the rhetoric is pointed and sometimes personal. A big reason for the large divide between the two groups is the failure of both to acknowledge the significant uncertainty in their estimates. Although some authors attempt to quantify uncertainty, most use deterministic methods and present single values, with no ranges. This research proposes that those that do attempt to quantify uncertainty underestimate it significantly.
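The Hubbert-style production model at the center of this debate can be sketched directly from its logistic form; the parameters below are illustrative only, not an endorsement of either forecasting camp:

```python
import math

def hubbert_rate(t, urr, k, t_peak):
    # Hubbert's logistic model: cumulative production is
    # Q(t) = urr / (1 + exp(-k * (t - t_peak))), so the production rate is
    # dQ/dt = k * Q * (1 - Q / urr), which peaks at t_peak where Q = urr/2.
    q = urr / (1.0 + math.exp(-k * (t - t_peak)))
    return k * q * (1.0 - q / urr)

# Hypothetical parameters: 2.2 trillion bbl ultimate recovery, 2010 peak.
peak_rate = hubbert_rate(2010.0, urr=2.2e12, k=0.06, t_peak=2010.0)
```

At the peak year the rate equals k*urr/4, and the curve is symmetric about the peak; the probabilistic analysis in the thesis treats urr, k, and t_peak as uncertain rather than as the single deterministic values shown here.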
The objective of this thesis is to rigorously quantify the uncertainty in estimates of ultimate world conventional oil production and time to peak rate. Two different methodologies are used. The first is a regression technique based on historical production data using Hubbert's model; the other uses mathematical models. However, I conduct the analysis probabilistically, considering errors in both the data and the model, which results in likelihood probability distributions for world conventional oil production and time to peak rate. In the second method, I use a multiple-experts analysis to combine estimates from the multitude of papers presented in the literature, yielding an overall distribution of estimated world conventional oil production. Giving due consideration to uncertainty, Hubbert-type mathematical modeling results in large uncertainty ranges that encompass both groups of forecasts (imminent peak and no imminent peak). These ranges are consistent with those from the multiple-experts analysis. In short, the industry does not have enough information at this time to say with any reliability what the ultimate world conventional oil production will be. It could peak soon, somewhere in the distant future, or somewhere in between. It would be wise to consider all of these possible outcomes in planning and making decisions regarding capital investment and the formulation of energy policy.

Item The experience of . . . suspense: understanding the construct, its antecedents, and its consequences in consumption and acquisition contexts (Texas A&M University, 2005-02-17) Guidry, Julie Anna
"Will my flight be cancelled?" "Will I win the eBay auction?" These consumption and product acquisition situations would trigger the experience of . . . suspense.
Suspense is defined as the overall anticipatory arousal associated with the hope and/or fear felt by a consumer assessing the likelihood of occurrence of an important and imminent consumption or acquisition event. If one views a potential outcome as causing pleasure (an approach appraisal), hope will be felt, while if one views a potential outcome as causing pain (an avoidance appraisal), fear will be felt. Other variables expected to indirectly impact suspense are the frequency of probability change, the degree of probability change, and the anticipation time. The conceptual model in this dissertation also proposes that people have an attitude toward the anticipation period and identifies four resolution emotions (satisfaction, disappointment, relief, and anguish) that may occur once the outcome is known. Further, attitude toward the anticipation period and the resolution emotions are expected to affect attitude toward the overall experience. Three studies were conducted. The objective of Studies 1 and 2 was to develop scales yielding reliable scores of hope, fear, and suspense. Fifty words related to hope, fear, and suspense were generated. In Study 1, 553 participants rated the words on the evaluative and activity dimensions using 18 semantic differential scale items. O-technique factor analysis was used to analyze the data in Study 1. In Study 2, 354 participants read one of three suspenseful stories, then indicated their hope, fear, and suspense. Exploratory and confirmatory factor analyses were used in Study 2. Study 3 consisted of an experiment in which 241 participants read a suspenseful house-buying scenario, then indicated their hope, fear, and suspense. Structural equation modeling was used to analyze the data in Study 3. Results supported the conceptualization of suspense: both hope and fear had a positive effect on suspense. Additionally, approach appraisal had a positive effect on hope, and avoidance appraisal had a positive effect on fear.
The moderating effect of frequency of probability change was not supported. However, frequency of probability change did have a positive effect on both hope and fear.

Item Tsallis Entropy Based Velocity Distribution in Open Channel Flows (2010-07-14) Luo, Hao
The Tsallis entropy is applied to derive both 1-D and 2-D velocity distributions in an open channel cross section. These distributions contain a parameter m through which the Tsallis entropy becomes a generalization of the Shannon entropy. Different values of the parameter m are examined to determine the best value for describing the velocity distribution. Two Lagrangian parameters that are involved in the final form of the 1-D velocity distribution equation are determined from observations of the mean velocity and the maximum velocity at the water surface. For channels which are not wide and where the maximum velocity does not occur at the water surface, a 2-D velocity distribution is more appropriate, and the Tsallis entropy is applied to derive 2-D velocity distributions. A new parameter M is introduced which represents the hydraulic characteristics of the channel. The derived velocity distributions are verified using both field data and experimental data. Their advantages are shown by comparison with the Prandtl-von Karman, power-law, and Chiu velocity distributions.

Item Uncertainty evaluation of delayed neutron decay parameters (2009-05-15) Wang, Jinkai
In a nuclear reactor, delayed neutrons play a critical role in sustaining a controllable chain reaction. Delayed neutrons' relative yields and decay constants are very important for modeling reactivity control and have been studied for decades. Researchers have tried different experimental and numerical methods to assess these delayed neutron parameters. The reported parameter values vary widely, much more than the small statistical errors reported with these parameters. Interestingly, the reported parameters fit their individual measurement data well in spite of these differences.
This dissertation focuses on the evaluation of the errors and methods of delayed neutron relative yields and decay constants for thermal fission of U-235. Various numerical methods used to extract the delayed neutron parameters from the measured data, including Matrix Inverse, Levenberg-Marquardt, and Quasi-Newton methods, were studied extensively using simulated delayed neutron data. This simulated data was Poisson distributed around Keepin's theoretical data. The extraction methods produced totally different results for the same data set, and some of the above numerical methods could not even find solutions for some data sets. Further investigation found that ill-conditioned matrices in the objective function were the reason for the inconsistent results. To find a reasonable solution with small variation, a regularization parameter was introduced using a numerical method called Ridge Regression. The results from the Ridge Regression method, in terms of goodness of fit to the data, were good and often better than those of the other methods. Due to the introduction of a regularization number in the algorithm, the fitted result contains a small additional bias, but this method can guarantee convergence no matter how large the condition number of the coefficient matrix. Both saturation and pulse modes were simulated to focus on different groups. Some of the factors that affect the solution stability were investigated, including initial count rate, sample flight time, and initial guess values. Finally, because comparing reported delayed neutron parameters among different experiments cannot determine whether the underlying data actually differ, methods are proposed that can be used to compare the delayed neutron data sets directly.
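The ridge regularization described in this abstract can be sketched as a generic normal-equations solver; this is an illustrative implementation, not the dissertation's code, and the example system is a toy:

```python
def ridge_solve(a_mat, b_vec, lam):
    # Ridge-regularized least squares: solve (A^T A + lam*I) x = A^T b.
    # Adding lam to the diagonal stabilizes the solve when A^T A is
    # ill-conditioned (e.g. closely spaced decay constants), at the cost
    # of a small bias in the fitted parameters.
    n_rows, n_cols = len(a_mat), len(a_mat[0])
    # Form A^T A + lam*I and A^T b.
    m = [[sum(a_mat[r][i] * a_mat[r][j] for r in range(n_rows))
          + (lam if i == j else 0.0)
          for j in range(n_cols)] for i in range(n_cols)]
    v = [sum(a_mat[r][i] * b_vec[r] for r in range(n_rows)) for i in range(n_cols)]
    # Gaussian elimination with partial pivoting.
    for col in range(n_cols):
        piv = max(range(col, n_cols), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n_cols):
            f = m[r][col] / m[col][col]
            for c in range(col, n_cols):
                m[r][c] -= f * m[col][c]
            v[r] -= f * v[col]
    # Back substitution.
    x = [0.0] * n_cols
    for i in range(n_cols - 1, -1, -1):
        x[i] = (v[i] - sum(m[i][j] * x[j] for j in range(i + 1, n_cols))) / m[i][i]
    return x

# Toy overdetermined 3x2 system; lam = 0 recovers ordinary least squares.
x_unregularized = ridge_solve([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]],
                              [1.0, 2.0, 3.0], 0.0)
```

Increasing lam shrinks the solution toward zero, trading a small bias for a solve that converges regardless of the condition number of the coefficient matrix.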