Browsing by Subject "risk"
Now showing 1 - 13 of 13
Item: Assessment of the Potential Effect of Climate Change on Hurricane Risk and Vulnerability in Florida (2014-12-05), Ruiz, Michelle

Hurricanes are a yearly threat to the eastern and Gulf coasts of the United States. An increase in the frequency and intensity of hurricanes is a possible and dangerous consequence of future climate change. To assess this threat, this research addresses how climate change will affect future hurricane activity in Florida. A greater understanding of how climate change will affect hurricanes is vital for regions, such as Florida, that are vulnerable to these powerful storms. Hurricane return periods were calculated for all Florida counties based on the 1900-2010 record. Hurricane landfalls were quantified using a dynamic wind model, which allowed the spatial extent of each storm to be examined. A meta-analysis of the existing literature on the effects of climate change on hurricane behavior was performed. Using the findings from the meta-analysis, a sensitivity analysis was performed to determine how climate change may affect hurricane damage and loss for Florida. The HAZUS-MH Hurricane Model was used to estimate losses and damage from hurricane winds based on Florida's growing population and increasing coastal development. Results show that wind-derived return periods more accurately depict the distribution of a storm's wind field. Counties in southern Florida have the lowest return periods based on both the track-derived and wind-derived return periods. Based on the meta-analysis, hurricane intensity is expected to increase by 2 to 11%. Hurricane frequency is expected to decrease or remain the same, and storm tracks are not expected to change. The sensitivity analysis examined the influence of climate change on baseline (current), moderate (15% increase), and extreme (35% increase) TC intensity scenarios. The most developed and populated regions are the most vulnerable to hurricane damages and losses.
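The return-period and intensity-scenario calculations described in this abstract reduce to a few lines: the empirical return period is the record length divided by the landfall count, and the sensitivity scenarios scale wind speed by 15% and 35%. The county landfall counts below are hypothetical placeholders, not the dissertation's data.

```python
def return_period(record_years, landfall_count):
    """Empirical hurricane return period: years of record per landfall."""
    return record_years / landfall_count

# Hypothetical landfall counts per county over the 1900-2010 record (111 years).
counts = {"Miami-Dade": 22, "Monroe": 26, "Leon": 6}
periods = {county: return_period(111, n) for county, n in counts.items()}

# Climate-change sensitivity scenarios: scale a baseline wind speed.
baseline_mph = 120.0
scenarios = {"baseline": 1.00, "moderate": 1.15, "extreme": 1.35}
winds = {name: baseline_mph * factor for name, factor in scenarios.items()}
```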
Based on the boxplots, the spread of percent values increases for building damage, economic losses, and shelter needs as storm intensity increases. The spread in the data shown in the scatterplots and boxplots is storm specific. This research found that southeastern Florida is at the highest risk of future hurricane landfalls and the most vulnerable to hurricane damages and losses.

Item: Capacity dynamics of feed-forward, flow-matching networks exposed to random disruptions (Texas A&M University, 2006-10-30), Savachkin, Aliaksei

While lean manufacturing has greatly improved the efficiency of production operations, it has left US enterprises in an increasingly risky environment. Causes of manufacturing disruptions continue to multiply, and today seemingly minor disruptions can cause cascading sequences of capacity losses. Historically, enterprises have lacked viable tools for addressing operational volatility. As a result, each year US companies forfeit billions of dollars to unpredictable capacity disruptions and insurance premiums. In this dissertation we develop a number of stochastic models that capture the dynamics of capacity disruptions in complex multi-tier flow-matching feed-forward networks (FFN). In particular, we relax basic structural assumptions of FFN, introduce random propagation times, study the impact of inventory buffers on propagation times, and make initial efforts to model random network topology. These stochastic models are central to future methodologies supporting strategic risk management and enterprise network design.

Item: D2 Dopamine Receptor Mediation of Risky Decision-making (2011-08-08), Simon, Nicholas Wayne

Excessive risk-taking is a characteristic of several psychopathological disorders. In order to alleviate maladaptive risky behavior, a thorough understanding of the neurobiological and pharmacological substrates of risky choice must be developed. In this dissertation, the "risky decision-making task"
was utilized to explore the mechanisms by which dopamine mediates risky choice. In experiment 1, we characterized rats in risky decision-making as well as a variety of other behavioral traits. This was performed to determine whether the behavioral patterns obtained in the risky decision-making task represent an independent cognitive construct rather than a function of a separate behavioral trait. Risky decision-making performance was not correlated with measures of motivation, anxiety, pain tolerance, or other types of decision-making. In contrast, risky choice was correlated with impulsive action as assessed by the Differential Rates of Low Responding task, suggesting that risky choice may be mechanistically similar to impulsive action. In experiment 2, the effects of various dopaminergic drugs on risky decision-making were investigated. Amphetamine administration attenuated risky choice, while the dopamine antagonist α-flupenthixol had no effect on risky choice. Agonists and antagonists specific to D1 dopamine receptors had no effects on risky choice; however, the D2 dopamine receptor agonist bromocriptine reduced risky choice in a manner similar to amphetamine. Furthermore, coadministration of amphetamine with a D2 antagonist abolished amphetamine's effects on risky choice, and amphetamine's effects were unaffected by coadministration of a D1 antagonist. These data suggest that signaling at the D2 receptor is particularly critical to risky decision-making behavior. In experiment 3, D2 dopamine receptor mRNA abundance was assessed, using in situ hybridization, in rats that had been previously characterized in risky decision-making. Levels of D2 cRNA hybridization in both the orbitofrontal cortex (OFC) and medial prefrontal cortex (mPFC) predicted risky decision-making behavior, as assessed by nonlinear curve estimation analyses.
Interestingly, opposite relationships between D2 mRNA abundance and risky choice were observed in these two cortical areas, with OFC D2 mRNA abundance showing a U-shaped relationship with risky choice, and mPFC D2 mRNA resembling an inverted U-curve. Additionally, increased levels of D2 mRNA in the dorsal striatum were observed in risk-averse rats in comparison to risk-taking rats. In conclusion, these data suggest that signaling via D2 dopamine receptors is an important mediator of risky decision-making behavior, and that D2 signaling in frontostriatal circuitry may be particularly relevant to these behaviors.

Item: Essays on Modeling the Economic Impacts of a Foreign Animal Disease on the United States Agricultural Sector (2011-02-22), Hagerman, Amy Deann

Foreign animal disease can cause serious damage to the United States (US) agricultural sector, and foot-and-mouth disease (FMD), in particular, poses a serious threat. FMD causes death and reduced fecundity in infected animals, as well as significant economic consequences. FMD damages can likely be reduced through implementing pre-planned response strategies. Empirical studies have evaluated the economic consequences of alternative strategies, but typically employ simplified models. This dissertation seeks to improve US preparedness for avoiding and/or responding to an animal disease outbreak by addressing three issues related to strategy assessment in the context of FMD: integrated multi-region economic and epidemic evaluation, inclusion of risk, and information uncertainty. An integrated economic/epidemic evaluation is done to examine the impact of various control strategies. This is done by combining a stochastic, spatial FMD simulation model with a national-level, regionally disaggregated agricultural sector mathematical programming economic model. In the analysis, strategies are examined in the context of California's dairy industry.
Alternative vaccination, disease detection, and movement restriction strategies are considered, as are trade restrictions. The results reported include epidemic impacts, national economic impacts, prices, regional producer impacts, and disease control costs under the alternative strategies. Results suggest that, including trade restrictions, the median national loss from the disease outbreak is as much as $17 billion when feed can enter the movement restriction zone. Early detection reduces both the median loss and the standard deviation of losses. Vaccination does not reduce the median disease loss, but does produce a smaller standard deviation of loss, which indicates it is a risk-reducing strategy. Risk in foreign animal disease outbreaks arises from several sources; however, studies comparing alternative control strategies assume risk neutrality. In reality, there will be a desire to minimize the national loss as well as to minimize the chance of an extreme outcome from the disease (i.e., risk aversion). We perform an analysis of FMD control strategies using breakeven risk aversion coefficients in the context of an outbreak in the Texas High Plains. Results suggest that vaccination, while not reducing average losses, is a risk-reducing strategy. Another issue related to risk and uncertainty is the response of consumers and domestic markets to the presence of FMD. Using a highly publicized possible FMD outbreak in Kansas that turned out not to be true, we examine the role of information uncertainty in futures market response. Results suggest that livestock futures markets respond to adverse information even when that information is untrue.
Furthermore, the existence of herding behavior and the potential for momentum trading exaggerate the impact of information uncertainty related to animal disease.

Item: Evaluation of dietary factors associated with spontaneous pancreatitis in dogs (2009-05-15), Lem, Kristina Yvonne

This study estimates the association between dietary factors and spontaneous pancreatitis in dogs. A case-control study was conducted using 198 dogs with a clinical diagnosis of pancreatitis and 187 control dogs with a diagnosis of renal failure without clinical evidence of pancreatitis. Information on signalment, weight, body condition, dietary intake, medical history, diagnostic tests performed, concurrent diseases, treatment, length of hospital stay, and discharge status was extracted from medical records for dogs admitted to the Texas A&M University Small Animal Clinic (TAMU SAC) from January 2000 to December 2005. Information on dietary intake, signalment, weight, and medical, surgical, and environmental history was collected for the same dogs through a telephone questionnaire conducted from November 2006 through January 2007. Descriptive statistics were calculated, tabular analyses were performed, and logistic regression was used to estimate odds ratios (OR) and 95% confidence intervals (CI). Based on information extracted from the medical records, ingesting unusual food (OR=4.3; CI=1.7 to 10.7), ingesting table food (OR=1.5; CI=1.0 to 2.2), or exposure to both of these dietary factors (OR=2.1; CI=1.3 to 3.2) increased the odds of pancreatitis. Based on information collected through the telephone questionnaire, ingesting unusual food (OR=6.1; CI=2.2 to 16.5), ingesting table scraps the week before diagnosis (OR=2.2; CI=1.2 to 3.8) or regularly throughout life (OR=2.2; CI=1.2 to 4.0), and getting into the trash (OR=13.2; CI=2.1 to undefined) increased the odds of pancreatitis.
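Odds ratios with 95% confidence intervals of the kind reported above follow a standard construction: the log odds ratio plus or minus 1.96 standard errors, exponentiated. A minimal sketch, using hypothetical case-control counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for "ingested unusual food" among cases and controls.
or_, lo, hi = odds_ratio_ci(30, 168, 8, 179)
```

When any cell is zero, the upper bound is undefined, which is presumably why one interval above is reported as "CI=2.1 to undefined".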
Multivariable modeling estimated the associations of exposure to one or more dietary factors reported through the telephone questionnaire (OR=2.6; CI=1.4 to 5.0), overweight (OR=1.3; CI=0.7 to 2.5), year of diagnosis (OR=3.5; CI=1.9 to 6.5), neuter status (OR=3.6; CI=1.4 to 9.5), non-neuter surgery (OR=21.1; CI=3.3 to 133.9), and an interaction term between neuter status and non-neuter surgery (OR=0.1; CI=0.01 to 0.4). Dietary factors increase the odds of spontaneous pancreatitis in dogs.

Item: Evaluation of information bundles in engineering decisions (Texas A&M University, 2004-11-15), Bakir, Niyazi Onur

This dissertation addresses the question of choosing the best information alternative in engineering decisions. The decision maker maximizes his expected utility under uncertainty, where both the action he takes and the state of the environment determine the payoff earned. The decision maker has an opportunity to gather information about the decision environment a priori at a certain cost. There might be different information alternatives, and the decision maker has to determine which alternative offers "better" prospects for improving the decision. Any decision environment that is characterized by a finite number of outcomes and a discrete probability distribution over the set of outcomes is a lottery. We analyze the value of information on a single outcome and determine the attributes of each piece of information that maximize its value. Information is valuable when the decision is changed after gathering information. We show that if the number of optimal actions taken under different outcome scenarios is finite, the decision maker does not require perfect information. Further, we analyze the relation between the value of information and its determinants, and show that a monotonic relation exists for a restricted class of information bundles and utility functions.
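The value-of-information idea described above (information is valuable only insofar as it can change the decision) can be illustrated for a discrete lottery: the value of perfect information is the difference between deciding after observing the state and committing to one action beforehand. The payoffs and probabilities below are hypothetical.

```python
def expected_value_of_perfect_information(probs, payoff):
    """probs: probability of each state; payoff[action][state]: payoff table.
    EVPI = E[best action chosen per state] - E[payoff of the best single action]."""
    actions = list(payoff)
    with_info = sum(p * max(payoff[a][s] for a in actions)
                    for s, p in enumerate(probs))
    without_info = max(sum(p * payoff[a][s] for s, p in enumerate(probs))
                       for a in actions)
    return with_info - without_info

# Hypothetical two-state lottery: insure or self-insure against a risk event.
probs = [0.6, 0.4]                                  # P(no event), P(event)
payoff = {"insure": [80, 70], "self-insure": [100, 20]}
evpi = expected_value_of_perfect_information(probs, payoff)
```

If the same action is optimal in every state, EVPI is zero: gathering information cannot change the decision, so it has no value, which is the intuition the abstract states.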
We use different approaches to evaluate information and analyze the cases where preference reversals occur between different approaches. We observe that a priori pricing of information does not necessarily induce the same ranking as the expected utility approach; however, both approaches agree on whether a given piece of information is valuable or not. The second part of this dissertation evaluates information in both static and dynamic coinsurance problems. In static insurance decisions, we analyze the case where the decision maker gathers information about the severity of the risk events, and we rank information bundles in a specific class. In dynamic insurance problems, we present a case study analyzing the different physical risks to which production facilities are exposed. The information in dynamic insurance problems involves more detail with regard to the timing of the multiple risk events. We observe that information on events that pose relatively good scenarios for the decision maker has value; however, its value may diminish as the probability of occurrence decreases. The decision maker purchases more information as the profitability of the product increases and less information as the initial wealth increases. Furthermore, a decreased cost of insurance does not necessarily make information more valuable, as the value is directly related to the change in the decisions rather than the cost of taking a specific action.

Item: Explorative study of African Americans and internet dating (Texas A&M University, 2005-02-17), Spates, Kamesha Sondranek

The online dating industry is estimated to be worth 1.5 billion dollars. Growing trends in technology have resulted in African Americans logging on to the Web at astonishing rates. Therefore, the goal of this research project is to evaluate dating-oriented interaction in the context of virtual communities.
The theoretical perspective of this thesis is the concept of trust, and I examine the role that trust plays in dating-oriented interaction in the context of virtual communities. This study utilizes ethnographic qualitative research methods along with the survey research method to explore African Americans' use of the Internet as a tool to find "quality or compatible dates". The study examines not only dating patterns among African Americans via the Internet, but also the role that technology plays in creating and mediating dating trends.

Item: Looking for a good doctor (or realtor or mechanic): construing quality with credence services (2009-05-15), Mirabito, Ann Marie

Little is known about how people evaluate credence attributes, that is, those attributes which the consumer often cannot fully evaluate even after purchasing and consuming the product. And yet consumers struggle to evaluate quality in several important product categories dominated by credence attributes, such as food safety, medical services, legal services, and pharmaceuticals, among others. This dissertation explores the processes by which people form quality evaluations of services high in credence attributes and the consequences of those evaluations. Drawing on the service quality, dual-process social information processing, expert-novice, and risk literatures, I develop a conceptual model to illustrate how skill and motivation moderate the ways people seek and integrate observable information to infer unobservable quality. The influence of quality evaluations on outcome, satisfaction, value, and loyalty is mapped. The model is tested in the context of a classic credence service, health care, with two large datasets using structural equation modeling.
Study 1 draws on an existing patient satisfaction database (6,280 records) to measure the sources and consequences of quality evaluations. Study 2 validates the Study 1 findings and extends them to show the moderating roles of product expertise and perceived risk in quality evaluation processes. The second study is tested with 1,379 consumers (patients) drawn from an online consumer panel. The research suggests that service quality in this context refers narrowly to the attributes of the core product (here, the physician's medical competence); interpersonal and organizational quality are associated with value, satisfaction, and loyalty, rather than with overall quality. Two paths to quality evaluations appear to exist. In the first, consumers integrate evidence of the physician's capabilities, practices, and prior outcomes to reach evaluations of technical quality. In the second, consumers rely on a trust heuristic in which observed interpersonal and organizational quality signals are used to build trust in the physician; that trust, in turn, influences perceptions of technical quality. The trust heuristic appears to be used when the stakes are low and, counterintuitively, when the stakes are high, just when superior evaluations are most needed.

Item: Mitigating cotton revenue risk through irrigation, insurance, and/or hedging (2009-05-15), Bise, Elizabeth Hart

Texas is the leading U.S. producer of cotton, and the U.S. is the largest international market supplier of cotton. Risks and uncertainties plague Texas cotton producers, including unpredictable weather, insects, diseases, and price variability. Risk management studies have examined the risk-reducing capabilities of alternative management strategies, but few have looked at the interaction of several strategies used in different combinations. The research in this study focuses on managing the risk faced by cotton farmers in Texas using irrigation, put options, and yield insurance.
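As a sketch of how a put option hedges the price side of cotton revenue risk: the put pays the shortfall of price below the strike on the hedged quantity, at the cost of a premium. All figures below are hypothetical, not the study's parameters.

```python
def hedged_revenue(yield_lb, price, strike, premium, hedged_lb):
    """Cotton revenue with a put hedge: the put pays max(strike - price, 0)
    per hedged pound, and the premium is paid per hedged pound up front."""
    crop_revenue = yield_lb * price
    put_payoff = max(strike - price, 0.0) * hedged_lb
    return crop_revenue + put_payoff - premium * hedged_lb

# Hypothetical farm: 800 lb/acre on 1,000 acres, $0.60/lb strike, $0.03/lb premium.
lbs = 800 * 1000
low_price = hedged_revenue(lbs, 0.45, 0.60, 0.03, lbs)   # put is in the money
high_price = hedged_revenue(lbs, 0.75, 0.60, 0.03, lbs)  # put expires worthless
```

The hedge trades a small certain cost (the premium) for a floor under revenue when prices fall, which is why its benefit grows with expected yield, as the abstract notes.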
The primary objective was to analyze the interactions of irrigation, put options, and yield insurance as risk management strategies on the economic viability of a 1,000-acre cotton farm in the Lower Rio Grande Valley (LRGV) of Texas. The secondary objective was to determine the best combination of these strategies for decision makers with alternative preferences for risk aversion. Stochastic values for yields and prices were used in simulating a whole-farm financial statement for a 1,000-acre furrow-irrigated cotton farm in the LRGV with three types of risk management strategies. Net returns were simulated using a multivariate empirical distribution for 16 risk management scenarios. The scenarios were ranked across a range of risk aversion levels using stochastic efficiency with respect to a function. Analyses for risk-averse decision makers showed that multiple irrigations are preferred, and that yield insurance is strongly preferred at lower irrigation levels. The benefits of purchasing put options increase with yields, so they are more beneficial when higher yields are expected from applying more irrigation.

Item: Optimization of a petroleum producing assets portfolio: development of an advanced computer model (2009-05-15), Aibassov, Gizatulla

Portfolios of contemporary integrated petroleum companies consist of a few dozen Exploration and Production (E&P) projects that are usually spread all over the world. Therefore, it is important not only to manage individual projects by themselves, but also to take into account the interactions between projects in order to manage whole portfolios. This study is a step-by-step presentation of a method for optimizing portfolios of risky petroleum E&P projects, based on Markowitz's Portfolio Theory. The method uses the covariance matrix of the projects' expected returns to optimize their portfolio. The developed computer model consists of four major modules.
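The Markowitz machinery this method rests on reduces to two quantities per candidate portfolio: the expected return, a weighted sum of project means, and the variance, the quadratic form of the weights with the covariance matrix. A minimal sketch with hypothetical two-project numbers:

```python
def portfolio_stats(weights, means, cov):
    """Expected return (w . mu) and variance (w' Sigma w) of a project portfolio."""
    mu = sum(w * m for w, m in zip(weights, means))
    var = sum(wi * wj * cov[i][j]
              for i, wi in enumerate(weights)
              for j, wj in enumerate(weights))
    return mu, var

# Hypothetical two-project example: expected returns and their covariance matrix.
means = [0.12, 0.08]
cov = [[0.04, -0.01],
       [-0.01, 0.02]]
mu, var = portfolio_stats([0.5, 0.5], means, cov)
```

The negative off-diagonal covariance is what lets a mixed portfolio carry less variance than either project alone, which is the diversification effect the portfolio selection in the fourth module exploits.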
The first module generates petroleum price forecasts. In our implementation we used a price forecasting method based on Sequential Gaussian Simulation. The second module, Monte Carlo, simulates the distribution of reserves and a set of expected production profiles. The third module calculates expected after-tax net cash flows and estimates performance indicators for each realization, thus yielding a distribution of return for each project. The fourth module estimates the covariance between the return distributions of individual projects and compiles them into portfolios. Using the results of the fourth module, analysts can make their portfolio selection decisions. Thus, an advanced computer model for optimization of a portfolio of petroleum assets has been developed. The model is implemented in a MATLAB computational environment and allows optimization of the portfolio using three different return measures (NPV, GRR, PI). The model has been successfully applied to a set of synthesized projects, yielding reasonable solutions in all three return planes. Analysis of the obtained solutions has shown that the computer model is robust and flexible in terms of input data and output results. Its modular architecture allows further inclusion of complementary "blocks" that may solve optimization problems utilizing different measures of risk and return than those considered, as well as different input data formats.

Item: Risk-Based Technology Assessment for Capital Equipment Acquisition Decisions in Small Firms (2013-08-06), Merriweather, Samuel P.

Companies and organizations must make decisions concerning capital budgeting. Capital budgeting is a decision-making process that determines whether a firm should purchase equipment to be used on a long-term basis. The initial investment in the equipment is expected to be returned through revenue gained by the use of the equipment over its lifetime. However, there is inherent risk associated with these investment decisions.
Therefore, potential purchasers must decide whether the risk involved in investing in the equipment is justified. This dissertation addresses risk-based technology assessment for capital equipment acquisition decisions in small firms. Technology assessment, here, is concerned with understanding the uncertainty associated with assessing the value predicted in the capital budgeting process. When analyzing the risk for a given technology, we assign a probability law to its net present value. Our primary research contribution is an analytical framework, together with a computational strategy, to support capital equipment budgeting in firms where the value of candidate technologies can represent nearly all of the firm's value. Since small firms typically have limited budgets, spending on technology is always a difficult budgeting decision. The organization's administration must decide which, if any, among the available technologies will be best for their operation. The process of acquiring technology in many small firms can be filled with challenges. Most important among them is that capital budgeting is typically a "one-off" decision. These decisions are difficult because the candidate technologies may not have operational data available. Thus, decision makers need some means to predict how the proposed technology (e.g., equipment or machinery) will be used. Hence, firms should follow techniques and procedures based on appropriate normative principles and well-established theory. Senior company executives and/or governance boards are often authorized to approve capital equipment purchases. However, these company leaders may not have adequate expertise in the operation of candidate technologies, or may lack the understanding necessary to determine how new technologies may impact other company operations. Appropriate financial evaluation measures and selection criteria that incorporate risk are critical to making sound, quantitative acquisition decisions.
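Assigning a probability law to net present value, as described above, is commonly done by simulation: draw uncertain cash flows, discount them, and collect the resulting NPV distribution. The sketch below uses hypothetical equipment numbers and a simple normal cash-flow assumption, not the dissertation's discrete-event model.

```python
import random

def simulate_npv(initial_cost, mean_cash, sd_cash, years, rate, n=10_000, seed=1):
    """Monte Carlo sketch of the probability law on NPV: draw uncertain annual
    cash flows, discount at the given rate, and return the NPV samples."""
    rng = random.Random(seed)
    npvs = []
    for _ in range(n):
        npv = -initial_cost
        for t in range(1, years + 1):
            cash = rng.gauss(mean_cash, sd_cash)
            npv += cash / (1 + rate) ** t
        npvs.append(npv)
    return npvs

# Hypothetical purchase: $500k equipment, ~$120k/yr for 7 years, 10% discount rate.
npvs = simulate_npv(500_000, 120_000, 40_000, 7, 0.10)
mean_npv = sum(npvs) / len(npvs)
p_loss = sum(v < 0 for v in npvs) / len(npvs)   # downside risk, P(NPV < 0)
```

A risk-sensitive decision maker can then compare alternatives on both the mean and the loss probability (or any other functional of the distribution), rather than on a single point estimate.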
The research reported here offers an analytical framework for comparing different technology alternatives in capital budgeting decisions. Comparison is based on the expected net present value and the risk (i.e., the probability law on net present value) associated with each decision alternative. To this end, the operational characteristics of each technology alternative are connected to their potential revenue and cost streams. The framework is embedded within a computational architecture that can be customized to account for operations and technologies in specific application scenarios. One major barrier addressed by this research is that new technologies typically have no historical operational data, so characterizing the uncertainty of operations (e.g., the distribution of the equipment lifetime) can be very difficult. Discrete-event simulation is used to generate potential revenue and cost estimates. We demonstrate the tractability and practicality of the analytical framework and computational architecture via a healthcare technology assessment decision. Data extracted from a published journal article detailing a hospital's technology assessment decision are used to find the risk of the medical technology using the computational architecture developed. Widely available, no-cost software tools are employed. Results of the health care example suggest that the financial analysis in the original technology assessment was inadequate and simplistic. Small firms may find this research particularly beneficial because potential investments can be a significant portion of a small firm's value.

Item: Statistical estimation of water distribution system pipe break risk (2009-05-15), Yamijala, Shridhar

The deterioration of pipes in urban water distribution systems is of concern to water utilities throughout the world.
This deterioration generally leads to pipe breaks and leaks, which may reduce the water-carrying capacity of the pipes through tuberculation of their interior walls. Deterioration can also lead to contamination of water in the distribution systems. Water utilities, which already face tight funding constraints, incur large expenses in the replacement and rehabilitation of water mains; hence it becomes critical to evaluate the current and future condition of the system when making maintenance decisions. Quantitative estimates of the likelihood of pipe breaks on individual pipe segments can facilitate inspection and maintenance decisions. A number of statistical methods have been proposed for this estimation problem. This thesis focuses on comparing these statistical models on the basis of short time histories. The goals of this research are to estimate the likelihood of pipe breaks in the future and to determine the parameters that most affect that likelihood. The statistical models reviewed in this thesis are time-linear and time-exponential ordinary least squares regression models, proportional hazards models (PHM), and generalized linear models (GLM). The data set used for the analysis comes from a major U.S. city and includes approximately 85,000 pipe segments with nearly 2,500 breaks from 2000 through 2005. The covariates used in the analysis are pipe diameter, length, material, year of installation, operating pressure, rainfall, land use, soil type, soil corrosivity, soil moisture, and temperature.
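A fitted logistic GLM of the kind compared in this thesis turns such covariates into a per-segment break probability through the inverse-logit link. The coefficients and covariate values below are hypothetical, for illustration only.

```python
import math

def break_probability(intercept, coefs, covariates):
    """Logistic GLM prediction: P(break) = 1 / (1 + exp(-(b0 + sum_i b_i * x_i)))."""
    eta = intercept + sum(coefs[name] * covariates[name] for name in coefs)
    return 1.0 / (1.0 + math.exp(-eta))

# Hypothetical fitted coefficients: age (years), length (100s of ft), pressure (psi).
coefs = {"age": 0.03, "length": 0.10, "pressure": 0.01}
old_long = break_probability(-6.0, coefs, {"age": 80, "length": 8, "pressure": 70})
new_short = break_probability(-6.0, coefs, {"age": 5, "length": 2, "pressure": 60})
```

Ranking segments by this predicted probability is exactly the "rigorous estimation of pipe breakage risk" that lets a utility prioritize inspection regimes.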
The logistic generalized linear model fits can be used by water utilities to choose inspection regimes based on a rigorous estimation of pipe breakage risk in their pipe network.

Item: Studies on Hazard Characterization for Performance-based Structural Design (2010-07-14), Wang, Yue

Performance-based engineering (PBE) requires advances in hazard characterization, structural modeling, and nonlinear analysis techniques to fully and efficiently develop the fragility expressions and other tools forming the basis for risk-based design procedures. This research examined and extended the state of the art in hazard characterization (wind and surge) and risk-based design procedures (seismic). State-of-the-art hurricane models (including wind field, tracking, and decay models) and event-based simulation techniques were used to characterize the hurricane wind hazard along the Texas coast. A total of 10,000 years of synthetic hurricane wind speed records were generated for each zip code in Texas and used to statistically characterize the N-year maximum hurricane wind speed distribution for each zip code location and to develop design non-exceedance probability contours for both coastal and inland areas. Actual recorded wind and surge data, the hurricane wind field model, hurricane size parameters, and a measure of storm kinetic energy were used to develop wind-surge and wind-surge-energy models, which can characterize the wind-surge hazard at a level of accuracy suitable for PBE applications. These models provide a powerful tool to quickly and inexpensively estimate surge depths at coastal locations in advance of a hurricane landfall. They were also used to create surge hazard maps that provide storm surge height non-exceedance probability contours for the Texas coast. The simulation tools, wind field models, and statistical analyses make it possible to characterize risk-consistent hurricane events considering both hurricane intensity and size.
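The N-year maximum wind speed distribution described above can be estimated from a long synthetic record of annual maxima: assuming independence from year to year, the probability that the N-year maximum does not exceed a speed v is F(v)**N, where F is the annual-maximum CDF. The normal synthetic record below is a hypothetical stand-in for the 10,000 years of simulated hurricane wind speeds.

```python
import random

def n_year_nonexceedance(annual_maxima, v, n_years):
    """P(N-year maximum <= v) = F(v) ** N, with F the empirical CDF of the
    annual maxima, assuming independence across years."""
    f_v = sum(x <= v for x in annual_maxima) / len(annual_maxima)
    return f_v ** n_years

# Hypothetical 10,000-year synthetic record of annual maximum wind speeds (mph).
rng = random.Random(7)
annual_maxima = [rng.gauss(90, 25) for _ in range(10_000)]
p50 = n_year_nonexceedance(annual_maxima, 130.0, 50)  # 50-year non-exceedance
```

Evaluating this over a grid of speeds v gives exactly the non-exceedance probability contours used for design.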
The proposed methodology for event-based hurricane hazard characterization, when coupled with a hurricane damage model, can also be used for regional loss estimation and other spatial impact analyses. In considering seismic hazard, a risk-consistent framework for displacement-based seismic design of engineered multistory woodframe structures was developed. Specifically, a database of probability-based scale factors that can be used in a direct displacement design (DDD) procedure for woodframe buildings was created using nonlinear time-history analyses with suitably scaled ground motion records. The resulting DDD procedure produces more risk-consistent designs and therefore advances the state of the art in displacement-based seismic design of woodframe structures.