Browsing by Subject "Correlation"
Now showing 1 - 18 of 18
Item: An investigation into the predictive performance of pavement marking retroreflectivity measured under various conditions of continuous wetting (Texas A&M University, 2007-04-25). Author: Pike, Adam Matthew.

This thesis research investigated the predictive performance of pavement marking retroreflectivity measured under various conditions of continuous wetting. The researcher compared the nighttime detection distance of pavement markings in simulated rain conditions with the retroreflectivity of the same pavement markings in several continuous wetting conditions. Correlation analyses quantified the predictive performance of the resulting retroreflectivity values from the continuous wetting conditions. The researcher measured the retroreflectivity of 18 pavement marking samples under 14 different conditions. The American Society for Testing and Materials (ASTM) has three standards for measuring the retroreflectivity of pavement markings: dry (E-1710), recovery (E-2177), and continuous wetting (E-2176) conditions. Using the three ASTM standard conditions produced three sets of retroreflectivity data, and variations of the continuous wetting standard produced an additional 11 sets of continuous wetting data. The researcher also incorporated detection distances measured for the same 18 pavement marking samples under three different simulated nighttime rainfall conditions: high (0.87 in/hr), medium (0.52 in/hr), and low (0.28 in/hr) flow rates, chosen to simulate typical rainfall rates in the state of Texas. The correlation analyses measured both the linear and the logarithmic relationship between the detection distance and the retroreflectivity of the pavement markings. A pavement marking's retroreflectivity is typically used as an indicator of detection distance performance; therefore, a high degree of correlation between retroreflectivity and detection distance would be desired.
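As an aside, the kind of linear-versus-logarithmic correlation analysis this abstract describes can be sketched in a few lines; the sample values below are invented for illustration, not data from the thesis:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical readings: retroreflectivity (mcd/m^2/lx) and nighttime
# detection distance (ft) for a handful of marking samples.
retro = [35.0, 60.0, 90.0, 150.0, 250.0, 400.0]
detect = [110.0, 150.0, 180.0, 210.0, 240.0, 265.0]

r_linear = pearson_r(retro, detect)
# Log-transforming the predictor captures the diminishing-returns shape
# often seen between retroreflectivity and detection distance.
r_log = pearson_r([math.log(x) for x in retro], detect)
print(f"linear r = {r_linear:.3f}, logarithmic r = {r_log:.3f}")
```

For data with this saturating shape, the logarithmic fit yields the stronger correlation, which is why both relationships are worth checking.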
A high degree of correlation would indicate that a measured retroreflectivity value of a pavement marking provides a good indication of the expected detection distance. The researcher conducted analyses for several subgroups of the pavement markings based on marking type or characteristics. The dry, recovery, and continuous wetting retroreflectivity data were all correlated with the detection distances. The correlation values found in this thesis research did not show a high degree of correlation for most of the subgroups analyzed. This indicates that measured retroreflectivity would not provide very good predictive performance of pavement marking detection distance in rainy conditions.

Item: Coefficient of intrinsic dependence: a new measure of association (Texas A&M University, 2005-08-29). Author: Liu, Li-yu Daisy.

Detecting dependence among variables is an essential task in many scientific investigations. In this study we propose a new measure of association, the coefficient of intrinsic dependence (CID), which takes values in [0,1] and faithfully reflects the full range of dependence for two random variables. The CID is free of distributional and functional assumptions. It can be easily implemented and extended to multivariate situations. Traditionally, the correlation coefficient is the preferred measure of association. However, its effectiveness is considerably compromised when the random variables are not normally distributed, and its interpretation is difficult when the data are categorical. By contrast, the CID is free of these problems. In our simulation studies, we find that the ability of the CID to differentiate levels of dependence remains robust across data types (categorical or continuous) and model features (linear or curvilinear). Also, the CID is particularly effective when the dependence is strong, making it a powerful tool for variable selection.
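A quick numeric illustration of the limitation noted for the classical correlation coefficient: a perfectly dependent but curvilinear relationship can produce a Pearson coefficient of exactly zero.

```python
# Pearson correlation can miss strong but curvilinear dependence: here y is a
# deterministic function of x, yet r comes out exactly zero by symmetry.
xs = [-3, -2, -1, 0, 1, 2, 3]
ys = [x * x for x in xs]  # perfect dependence, but not linear

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
var_x = sum((x - mx) ** 2 for x in xs)
var_y = sum((y - my) ** 2 for y in ys)
r = cov / (var_x ** 0.5 * var_y ** 0.5)
print(r)  # 0.0
```

This is the sort of case where a functional-form-free measure such as the proposed CID is claimed to remain informative.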
As an illustration, the CID is applied to variable selection in two aspects: classification and prediction. The analysis of actual data from a study of breast cancer gene expression is included. For the classification problem, we identify a pair of genes that best classifies a patient's prognosis signature, and for the prediction problem, we identify a pair of genes that best relates to the expression of a specific gene.

Item: Conductivity of proppant mixtures (2014-05). Authors: Schulz, Eric Clinton; Mohanty, Kishore Kumar.

Hydraulic fracturing is a physically complex phenomenon, and many variables, both environmental and operational, affect the overall success of a fracture treatment. Among the operational variables, proppant selection is key to ensuring that the induced fractures remain open and permeable. A variety of physical mechanisms act to degrade the permeability of a given proppant packing after deposition in a fracture, the most important of which is the magnitude of the confining stress. The goal of this work is to understand how mixtures of unlike proppants behave under various stress conditions. Specifically, the permeability and conductivity of various mixtures of unlike proppants are measured as a function of confining stress. A secondary investigation examines the dependence of permeability on the areal concentration of proppant. The choice of proppants is restricted to those currently most common in industry, in terms of both material and size. To that end, the mixtures consisted primarily of ceramics and sands with appropriate grain size distributions; a light-weight plastic proppant was also included in the study. Simple laboratory methods are employed to measure the permeability of the various proppant packings, and values obtained from direct experimentation are compared with values obtained from an independent analytical model.
Given the assumptions inherent in the analytical model, the experimental and analytical results are in satisfactory agreement. A correlation is also developed for single proppants and binary mixtures that predicts permeability as a function of stress, grain size, material, and weight fraction. One key conclusion is that for a binary mixture of proppants, the mixture permeability is not, in general, a weighted linear combination of the pure proppant permeabilities. In other words, the permeability of a mixture comprising 50% (by weight) of one component and 50% of the other will generally not lie halfway between the permeabilities of the single components. A hypothesis is presented which posits that there are threshold weight fractions for each proppant pair that control the permeability of the mixture.

Item: Correlation between Median Household Income and LEED Sustainable Site Criteria for Public Transportation Access and a Regression Model Predicting Appraised Unit Value of Unimproved Parcels in Houston, Texas (2010-07-14). Author: Ji, Qundi.

The Leadership in Energy and Environmental Design (LEED) Green Building Rating System provides third-party verification for environmentally sustainable construction. LEED-certified buildings often provide healthier work and living environments; however, certification does not provide any direct economic incentive to owners and developers. Earlier research suggested that there was a significant correlation between the appraised unit value of a parcel and the LEED sustainable site criteria for public transportation access. Moreover, the regression model for predicting the appraised unit value of a parcel suggested that the coefficient of Number of Light Rail Stations was positive, while the coefficient of Number of Bus Stops was negative. This result contradicted our original expectation that both the number of bus stops and the number of light rail stations would have a positive effect on the appraised unit value.
Hence it became important to conduct further research to explain this phenomenon. In this research, Pearson correlation was examined to determine whether there is a significant correlation between median household income and the number of bus stops and light rail stations, for a given parcel, that meet the LEED sustainable site criteria for public transportation access. After confirming that no significant correlation exists, multiple regression analysis was applied to establish a model for predicting the unit value of a given parcel, using the number of qualifying bus stops and light rail stations, median household income, and parcel area as the independent variables. The Pearson correlation results indicated no significant correlation between median household income and the number of bus stops and light rail stations meeting the LEED criteria. The multiple regression findings suggested that all independent variables were significant predictors of the unit value of a parcel. In addition, this regression model had a higher adjusted R-square value than the model established by Bhagyashri Joshi, meaning that it better predicts the appraised unit value of an unimproved parcel.

Item: Development of reliable pavement models (2011-08). Authors: Aguiar Moya, José Pablo, 1981-; Prozzi, Jorge Alberto; Manuel, Lance; Walton, Michael; Machemehl, Randy B.; Yilmaz, Hilal.

As the cost of designing and building new highway pavements increases and the number of new construction and major rehabilitation projects decreases, it becomes vital to ensure that a given pavement design performs as expected in the field. To address this kind of issue, other fields of civil engineering have used reliability analysis extensively.
However, in the case of pavement structural design, the reliability component is usually neglected or overly simplified. To address this need, this dissertation proposes a framework for estimating the reliability of a given pavement structure regardless of the pavement design or analysis procedure being used. The framework is applied with the Mechanistic-Empirical Pavement Design Guide (MEPDG), with failure defined as a function of rutting of the hot-mix asphalt (HMA) layer. The proposed methodology consists of fitting a response surface, in place of the time-demanding implicit limit state functions used within the MEPDG, combined with analytical approaches to estimating reliability using second-moment techniques (the First-Order and Second-Order Reliability Methods, FORM and SORM) and simulation techniques (Monte Carlo and Latin Hypercube simulation). To demonstrate the methodology, a three-layered pavement structure is selected, consisting of a hot-mix asphalt (HMA) surface, a base layer, and subgrade. Several pavement design variables are treated as random; these include the HMA and base layer thicknesses, the base and subgrade moduli, and the HMA-layer binder and air void content. Information on the variability of and correlation between these variables is obtained from the Long-Term Pavement Performance (LTPP) program, and likely distributions, coefficients of variation, and correlations between the variables are estimated. Additionally, several scenarios are defined to account for climatic differences (cool, warm, and hot climatic regions), truck traffic distributions (mostly single-unit trucks versus mostly single-trailer trucks), and the thickness of the HMA layer (thick versus thin).
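Once a cheap response surface stands in for the expensive mechanistic runs, the simulation step of such a reliability framework is straightforward; a minimal sketch, in which the surface coefficients and failure threshold are invented for illustration rather than fitted MEPDG output:

```python
import random

# Hypothetical quadratic response surface for HMA rut depth (inches) as a
# function of two standardized design variables; coefficients are invented.
def rut_depth(hma_thickness, base_modulus):
    return 0.55 - 0.08 * hma_thickness - 0.05 * base_modulus \
           + 0.01 * hma_thickness * base_modulus

RUT_LIMIT = 0.5  # assumed failure threshold, inches

random.seed(1)
trials = 100_000
failures = 0
for _ in range(trials):
    # standardized (zero-mean, unit-variance) random design variables
    x1 = random.gauss(0.0, 1.0)
    x2 = random.gauss(0.0, 1.0)
    if rut_depth(x1, x2) > RUT_LIMIT:
        failures += 1

pf = failures / trials        # probability of failure
reliability = 1.0 - pf
print(f"P(failure) ~ {pf:.4f}, reliability ~ {reliability:.4f}")
```

Evaluating the polynomial a hundred thousand times takes a fraction of a second, which is the point of fitting the surface instead of simulating the design guide directly.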
First- and second-order polynomial HMA rutting failure response surfaces with interaction terms are fit by running the MEPDG under a full factorial experimental design with 3 levels of each of the aforementioned design variables. These response surfaces are then used to analyze the reliability of the given pavement structures under the different scenarios. Additionally, to check the accuracy of the proposed framework, direct simulation using the MEPDG was performed for the different scenarios. Very small differences were found between the estimates based on the response surfaces and those from direct MEPDG simulation, confirming the accuracy of the proposed procedure. Finally, a sensitivity analysis on the number of MEPDG runs required to fit the response surfaces showed that reducing the experimental design by one level still yields response surfaces that properly fit the MEPDG, ensuring the method's applicability in practice.

Item: Generalizing the multivariate normality assumption in the simulation of dependencies in transportation systems (2010-05). Authors: Ng, Man Wo; Waller, S. Travis; Hasenbein, John J.

By far the most popular way to account for dependencies in the transportation network analysis literature is the multivariate normal (MVN) distribution. While in certain cases there is some theoretical underpinning for the MVN assumption, in others there is none. This can lead to misleading results: results depend not only on whether dependence is modeled, but also on how it is modeled. When assuming the MVN distribution, one is limited to a specific set of dependency structures, which can substantially limit the validity of results.
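One common way to move beyond the MVN assumption is to impose a target correlation on arbitrary marginal distributions via a Gaussian-copula-style construction. The sketch below illustrates that general idea with exponential marginals; it is not necessarily the report's exact method, and all numbers are illustrative:

```python
import math
import random

random.seed(7)

def std_normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Step 1: draw pairs of standard normals with correlation rho.
# Step 2: push each through the normal CDF to get correlated uniforms.
# Step 3: apply the inverse CDF of any target marginal -- here exponential,
# which is clearly non-normal.
rho = 0.8
samples = []
for _ in range(50_000):
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
    u1, u2 = std_normal_cdf(z1), std_normal_cdf(z2)
    samples.append((-math.log(1 - u1), -math.log(1 - u2)))

# Measure the correlation actually achieved between the exponential marginals.
xs, ys = zip(*samples)
n = len(samples)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in samples) / n
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
r = cov / (sx * sy)
print(f"achieved correlation between exponential marginals: {r:.2f}")
```

Note that the achieved Pearson correlation is somewhat attenuated relative to the underlying normal correlation, one of the "lesser-known properties of the correlation coefficient" that such methods have to account for.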
In this report an existing, more flexible, correlation-based approach (in which just the marginal distributions and their correlations are specified) is proposed, and it is demonstrated that, in simulation studies, this approach generalizes the MVN assumption. The need for such a generalization is particularly critical in the transportation network modeling literature, where there is often no data, or insufficient data, to estimate probability distributions, so that sensitivity analyses assuming different dependence structures could be extremely valuable. However, the proposed method has its own drawbacks: it again cannot exhaust all possible forms of dependence, and it relies on some lesser-known properties of the correlation coefficient.

Item: Measuring liquefaction-induced deformation from optical satellite imagery (2014-05). Authors: Martin, Jonathan Grant; Rathje, Ellen M.

Liquefaction-induced deformations associated with lateral spreading represent a significant hazard that can cause substantial damage during earthquakes. The ability to accurately predict lateral-spreading displacement is hampered by a lack of field data from previous earthquakes. Remote sensing via optical image correlation can fill this gap and provide data on liquefaction-induced lateral-spreading displacements. In this thesis, deformations from three earthquakes (the 2010 Darfield, February 2011 Christchurch, and 2011 Tohoku earthquakes) are measured using optical image correlation applied to 0.5-m resolution satellite imagery. The resulting deformations are compared to the geologic conditions as well as to field observations and measurements of liquefaction.
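Optical image correlation can be illustrated in miniature: slide a template patch from the "before" image over the "after" image and take the offset that maximizes normalized cross-correlation. This toy version is integer-pixel only; real workflows on satellite imagery add sub-pixel refinement.

```python
# Toy optical image correlation on two tiny synthetic "images".

def ncc(a, b):
    """Normalized cross-correlation of two equal-sized patches (lists of rows)."""
    flat_a = [v for row in a for v in row]
    flat_b = [v for row in b for v in row]
    ma = sum(flat_a) / len(flat_a)
    mb = sum(flat_b) / len(flat_b)
    num = sum((x - ma) * (y - mb) for x, y in zip(flat_a, flat_b))
    da = sum((x - ma) ** 2 for x in flat_a) ** 0.5
    db = sum((y - mb) ** 2 for y in flat_b) ** 0.5
    return num / (da * db) if da and db else 0.0

def crop(img, r, c, h, w):
    return [row[c:c + w] for row in img[r:r + h]]

# "Before" image with a distinctive bright 2x2 feature at (row 2, col 3) ...
before = [[0] * 8 for _ in range(8)]
before[2][3] = before[2][4] = before[3][3] = before[3][4] = 9
# ... and an "after" image where the ground shifted by (1 row, 2 cols).
after = [[0] * 8 for _ in range(8)]
after[3][5] = after[3][6] = after[4][5] = after[4][6] = 9

# 4x4 template around the feature; search every offset for the best NCC score.
template = crop(before, 1, 2, 4, 4)
best = max(((ncc(template, crop(after, r, c, 4, 4)), (r - 1, c - 2))
            for r in range(5) for c in range(5)), key=lambda t: t[0])
print("estimated displacement (drow, dcol):", best[1])
```

The recovered offset is the (1, 2) pixel shift built into the synthetic pair; at 0.5-m resolution each pixel of offset would correspond to 0.5 m of ground displacement.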
Measurements from optical image correlation are found to have a precision within 0.40 m in all three cases, and the results agree well with field measurements.

Item: Modeling correlation in binary count data with application to fragile site identification (Texas A&M University, 2006-10-30). Author: Hintze, Christopher Jerry.

Available fragile site identification software packages (FSM and FSM3) assume that all chromosomal breaks occur independently. However, under a Mendelian model of inheritance, homozygosity at fragile loci implies pairwise correlation between homologous sites. We construct correlation models for chromosomal breakage data in situations where either partitioned break count totals (per-site single-break and double-break totals) are known or only overall break count totals are known. We derive a likelihood ratio test and Neyman's C(α) test for correlation between homologs when partitioned break count totals are known, and outline a likelihood ratio test for correlation using only break count totals. Our simulation studies indicate that the C(α) test using partitioned break count totals outperforms the other two tests for correlation in terms of both power and level. These studies further suggest that the power for detecting correlation is low when only break count totals are reported. Results of the C(α) test for correlation applied to chromosomal breakage data from 14 human subjects indicate that detection of correlation between homologous fragile sites is problematic due to the sparseness of the breakage data. Simulation studies of the FSM and FSM3 algorithms using parameter values typical for fragile site data demonstrate that neither algorithm is significantly affected by fragile site correlation.
Comparison of simulated fragile site misclassification rates in the presence of zero-breakage data supports previous studies (Olmsted 1999) suggesting that FSM has lower false-negative rates and FSM3 has lower false-positive rates.

Item: Molecular Basis of Heterosis in Maize: Genetic Correlation and 3-Dimensional Network Between Gene Expression and Grain Yield Trait Heterosis (2012-02-14). Author: Zhi, Hui.

Heterosis, or hybrid vigor, refers to the superiority of F1 hybrid performance over the mean of its parents (mid-parent heterosis) or over the better parent. It has been discovered in many species of plants and animals as well as in humans, and has played an important role in enhanced agricultural production, especially in maize, rice, and sorghum, although its mechanisms have not been elucidated. We studied the molecular basis of heterosis with a combined genomics and systems biology approach using maize as a model organism. We profiled the expression of 39 genes that were most differentially expressed (DG) between the mid-parents and their F1 hybrid (Mo17 x B73) in the V13-staged, developed whole ear shoots of 13 inbred lines and their 22 F1 hybrids grown in field trials, and phenotyped 13 of their traits significant for grain yield. The results showed that gene expression varies significantly among inbreds, among hybrids, and in heterosis. Gene clustering heat maps and gene action networks were constructed for the inbreds and the hybrids, respectively, based on their gene expression profiles. These pattern analyses reveal dramatic differences between inbreds and their hybrids, although the differential expression varies across hybrids. Our results also suggest that gene networks are altered from inbreds to hybrids, including their gene contents and wiring structures.
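Gene networks of the kind described are commonly built by thresholding pairwise correlations between expression profiles; a generic sketch with made-up expression values (the gene names and numbers are illustrative, not from the study):

```python
import math

# Made-up expression profiles: one list per gene, one value per genotype.
expr = {
    "gene_a": [1.0, 2.1, 3.2, 4.0, 5.1],
    "gene_b": [1.2, 1.9, 3.0, 4.2, 4.9],   # tracks gene_a closely
    "gene_c": [5.0, 1.0, 4.0, 2.0, 3.0],   # unrelated profile
}

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Connect two genes whenever |r| exceeds a chosen threshold.
THRESHOLD = 0.9
genes = sorted(expr)
edges = [(g1, g2) for i, g1 in enumerate(genes) for g2 in genes[i + 1:]
         if abs(pearson(expr[g1], expr[g2])) > THRESHOLD]
print(edges)
```

Comparing the edge sets obtained from inbred profiles versus hybrid profiles is one simple way to see the "rewiring" the abstract refers to.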
Last but not least, we determined the genetic correlations between gene expression and trait performance and constructed gene networks for the development of 12 of the 13 traits that varied significantly among genotypes. This led to the identification of genes significantly contributing to the performance of the traits, with 1 to 16 genes per trait. These results indicate that heterosis results not only from altered expression levels of the corresponding genes between inbreds and their hybrids but, importantly, also from altered gene action networks and expression patterns. These alterations could derive from gene action in a manner of additivity, dominance, overdominance, pseudo-overdominance, epistasis, and/or their combinations. Therefore, our findings provide a better understanding of the underlying molecular basis of heterosis. The genes identified for the traits will provide tools for advanced studies of trait heterosis and could be used in heterosis breeding in maize. The strategy developed in this study will provide an effective tool for studies of other complicated quantitative traits in maize and other species.

Item: Novel tools for ultrafast spectroscopy (2011-12). Authors: Jarvis, Thomas William; Li, Elaine; Fink, Manfred; Keto, John; Lim, Sang-Hyun; Shih, Chih-Kang; Sitz, Greg.

Exciton dynamics in semiconductor nanostructures are dominated by the effects of many-body physics. The application of coherent spectroscopic tools, such as two-dimensional Fourier transform spectroscopy (2dFTS), to these systems can reveal signatures of these effects and, in combination with sophisticated theoretical modeling, can lead to a more complete understanding of their behaviour. 2dFTS has previously been applied to the study of GaAs quantum well samples. In this thesis, we give a précis of the technique before describing our own experiments using 2dFTS in a partially collinear geometry.
This geometry has previously been used to study chemical systems, but we believe these experiments to be the first such performed on semiconductor samples. We extend this technique to a reflection-mode 2dFTS experiment, which we believe to be the first such measurement. In order to extend the techniques of coherent spectroscopy to structured systems, we construct an experimental apparatus that permits us to control the beam geometry used to perform four-wave-mixing reflection measurements. To isolate extremely weak signals from intense background fields, we extend a conventional lock-in detection scheme to one that treats the optical fields exciting the sample on an unequal footing. To the best of our knowledge, these measurements represent a novel spectroscopic tool that has not previously been described.

Item: Prediction of gas-hydrate formation conditions in production and surface facilities (Texas A&M University, 2006-10-30). Author: Ameripour, Sharareh.

Gas hydrates are a well-known problem in the oil and gas industry and cost millions of dollars in production and transmission pipelines. To prevent this problem, it is important to predict the temperature and pressure at which gas hydrates will form. Of the thermodynamic models in the literature, only a couple can predict the hydrate-formation temperature or pressure for complex systems including inhibitors. I developed two simple correlations for calculating the hydrate-formation pressure or temperature for single components or gas mixtures. These correlations are based on over 1,100 published data points of gas-hydrate formation temperatures and pressures with and without inhibitors. The data include samples ranging from pure hydrate formers such as methane, ethane, propane, carbon dioxide, and hydrogen sulfide to binary, ternary, and natural gas mixtures.
I used the Statistical Analysis Software (SAS) to find the best correlations among variables such as the specific gravity and pseudoreduced pressure and temperature of gas mixtures, the vapor pressure and liquid viscosity of water, and the concentrations of electrolytes and thermodynamic inhibitors. These correlations are applicable at temperatures up to 90°F and pressures up to 12,000 psi. I tested the capability of the correlations for aqueous solutions containing electrolytes such as sodium, potassium, and calcium chlorides at less than 20 wt%, and inhibitors such as methanol at less than 20 wt% and ethylene glycol, triethylene glycol, and glycerol at less than 40 wt%. The results show an average absolute percentage deviation of 15.93 in pressure and an average absolute temperature difference of 2.97°F. Portability and simplicity are further advantages of these correlations, since they can be applied even with a simple calculator. The results are in excellent agreement with the experimental data in most cases, and in some cases are even better than the results from commercial simulators. These correlations provide guidelines to help users forecast gas-hydrate-forming conditions for most systems of hydrate formers, with and without inhibitors, and to design remediation schemes such as:
- increasing the operating temperature by insulating the pipelines or applying heat;
- decreasing the operating pressure when possible;
- adding a required amount of an appropriate inhibitor to reduce the hydrate-formation temperature and/or increase the hydrate-formation pressure.

Item: Probabilistic assessment of wind loads on a full scale low rise building (Texas Tech University, 2006-05). Authors: Bi, Anjing; Smith, Douglas A.; Letchford, Christopher W.; Mehta, Kishor C.

The damage to low-rise buildings caused by wind has been significant in recent years. Specifying appropriate wind loads on low-rise buildings that balance the requirements of economical and safe design is crucial.
The full-scale experiments conducted at TTU have set a widely accepted benchmark for wind loads on low-rise buildings. The large amount of data recently collected from the full-scale measurements carried out at WERFL, TTU, enables a systematic investigation of the area-averaged wind pressures and wind forces on low-rise buildings. In this study, the correlations between the wind and its actions were investigated. The assessment methods for wind loads on static structures were calibrated, and the techniques employed to predict the peak pressure (force) coefficients were examined. It was found that Type I extreme distributions fitted the pseudo-steady pressure (force) coefficients very well. The findings of this study were incorporated into a probabilistic framework in the context of LRFD design to examine their effects on the wind load factors for ultimate limit state design. The primary results indicate that the inconsistent specification of extreme wind loads can be partly attributed to ignoring the distributions of the extreme pressure (force) coefficients and their correlations with the wind speeds.

Item: Project Bidding Strategy Considering Correlations between Bidders (2012-10-19). Author: Kim, Minsoo.

One of the most important considerations in winning a competitive bid is the determination of an optimum strategy developed by predicting the competitors' most probable actions. There may be common factors across contractors in establishing their bid prices, such as references for cost estimating, construction materials, site conditions, or labor prices. These dependencies in past bids can be used to improve the strategy for predicting future bids. By identifying the interrelationships between bidders through statistical correlations, this study provides an overview of how correlations among bidders influence a bidder's winning probability. With data available for over 7,000 Michigan Department of Transportation highway projects that can be used to calculate correlations between the different contractors, a Monte Carlo simulation is used to generate correlated random variables and to estimate the probability of winning from the simulation results.
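A correlated-bid Monte Carlo simulation of this general kind can be sketched briefly; the price distributions, means, and correlation below are invented for illustration, not values from the study:

```python
import math
import random

random.seed(42)

# Hypothetical bid model: our contractor and one competitor submit bids whose
# log-prices share a correlation (common market factors). All numbers invented.
RHO = 0.6                          # assumed correlation between log-bids
OUR_MEAN, COMP_MEAN = 1.00, 1.02   # mean bid-to-estimate ratios
SIGMA = 0.05                       # log-price dispersion
TRIALS = 200_000

wins = 0
for _ in range(TRIALS):
    # correlated standard normals via a 2x2 Cholesky factor
    z1 = random.gauss(0, 1)
    z2 = RHO * z1 + math.sqrt(1 - RHO * RHO) * random.gauss(0, 1)
    our_bid = OUR_MEAN * math.exp(SIGMA * z1)
    comp_bid = COMP_MEAN * math.exp(SIGMA * z2)
    wins += our_bid < comp_bid     # lowest bid wins

p_win = wins / TRIALS
print(f"estimated P(win) = {p_win:.3f}")
```

Raising RHO shrinks the spread between the two bids, which is exactly why the winning probability against a highly correlated competitor behaves differently from that against an uncorrelated one.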
This study focuses on the use of conditional probability to predict the probability of winning, establishing a contractor's strategy for remaining bids given the contractor's estimated bid price and known information about competitors from past data. If a contractor estimates his or her bid price to be lower than his or her average bid, a higher probability of winning is achieved against competitors who have a low correlation with the contractor. Conversely, when the bid price is estimated to be higher than the average bid, the probability of winning decreases against highly correlated competitors.

Item: Resolving discrepancies in predicting critical rates in low pressure stripper gas wells (Texas Tech University, 2005-08). Authors: Awolusi, Olufemi S.; Oetama, Teddy; Lea, James F.

The minimum gas rate for unloading liquids from a gas well has been the subject of much interest, especially in old gas-producing fields with declining reservoir pressures. For low-pressure stripper gas wells, liquid accumulating in the tubing is a pivotal factor that can lead to premature well abandonment and a hugely detrimental difference in the economic viability of the well. Notable correlations for predicting the critical rate required for liquid unloading in gas wells include Turner et al. (1969), Coleman et al. (1991), Nosseir et al. (1997), Li et al. (2001), and Veeken et al. (2003). However, these correlations offer divergent views on the critical rates needed for liquid unloading, particularly for some correlations at low wellhead pressures below 50 psia. The objective of this research is to evaluate discrepancies in the previous work on the critical gas velocities required to keep liquid from accumulating in the tubing. During the course of the work, data were also collected using a flow test facility at Texas Tech University.
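The divergence between the Turner et al. and Coleman et al. correlations comes down largely to a coefficient. The sketch below uses the field-unit forms as they are commonly quoted in the literature (sigma in dyne/cm, densities in lbm/ft^3, result in ft/s); the input values are illustrative, and the coefficients should be verified against the original papers before any real use:

```python
# Commonly quoted field-unit droplet-model critical velocity:
#   v = C * (sigma * (rho_liquid - rho_gas))**0.25 / rho_gas**0.5
# with C ~ 1.92 for Turner et al. (including their +20% adjustment)
# and C ~ 1.593 for Coleman et al. (no adjustment, low-pressure wells).

def critical_velocity(sigma, rho_liquid, rho_gas, coefficient):
    return coefficient * (sigma * (rho_liquid - rho_gas)) ** 0.25 / rho_gas ** 0.5

# Illustrative inputs for water in a low-pressure gas well.
sigma = 60.0   # gas/water interfacial tension, dyne/cm
rho_l = 62.4   # water density, lbm/ft^3
rho_g = 0.15   # gas density at low pressure, lbm/ft^3

v_turner = critical_velocity(sigma, rho_l, rho_g, 1.92)
v_coleman = critical_velocity(sigma, rho_l, rho_g, 1.593)
print(f"Turner: {v_turner:.1f} ft/s, Coleman: {v_coleman:.1f} ft/s")
```

The roughly 20% gap between the two predictions at identical conditions is one of the discrepancies the research set out to resolve experimentally.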
The critical gas rates were experimentally measured in order to determine an improved correlation with specific application to low-pressure stripper gas wells below 50 psia and at an average temperature of 64°F.

Item: Thermo-Hydrological-Mechanical Analysis of a Clay Barrier for Radioactive Waste Isolation: Probabilistic Calibration and Advanced Modeling (2012-07-16). Author: Dontha, Lakshman.

The engineered barrier system is a basic element in the design of a repository to isolate high-level radioactive waste (HLW).
In this system, the clay barrier plays a prominent role in dispersing the heat generated by the waste, reducing the flow of pore water from the host rock, and maintaining the structural stability of the waste canister. The compacted expansive clay (generally bentonite blocks) is initially in an unsaturated state. During the lifetime of the repository, the barrier will undergo different coupled thermal, hydrological, and mechanical (THM) phenomena due to heating (from the heat-emitting nuclear waste) and hydration (from the saturated host rock). The design of nuclear waste disposal requires prediction of the long-term barrier behavior (i.e., over hundreds or thousands of years), so numerical modeling is a basic component of repository design. The numerical analyses are performed using a mathematical THM formulation and the associated numerical code. Constitutive models are an essential part of the numerical simulations; they represent the intrinsic behavior of the material for each individual physical phenomenon (i.e., thermal, hydraulic, and mechanical). Deterministic analyses have shown the potential of such mathematical formulations to describe the physical behavior of the engineered barrier system. However, the effect of the inherent uncertainties associated with the different constitutive models on the global behavior of the isolation system has not yet been explored. The first part of this thesis applies recent probabilistic methods to understand and assess the impact of uncertainties on the global THM model response. Experimental data associated with the FEBEX project are adopted for the case study presented in this thesis. CODE_BRIGHT, a fully coupled THM finite element program, is used to perform the numerical THM analysis.
The second part of this thesis focuses on the complex mechanical behavior observed in a barrier material subjected (for 5 years) to heating and hydration under actual repository conditions. The studied experiment is the (ongoing) full-scale in-situ FEBEX test at the Grimsel test site, Switzerland. A partial dismantling of this experiment has allowed the inspection of the barrier material subjected to varying stresses due to hydration and heating. The clay underwent both elastic and plastic volumetric deformations at different suction and temperature levels, with changes in the pre-consolidation pressure and void ratio that are difficult to explain with conventional models. In this thesis a double-structure elasto-plastic model is proposed to study the mechanical behavior of this barrier material. The numerical modeling was performed with CODE_BRIGHT. The study shows that the double-structure model satisfactorily explains the observed changes in the mechanical behavior of the clay material.

Item: Using correlation analysis to identify possible device sensitivities (2012-12). Author: Olvera, Juan.

One of the semiconductor industry's biggest concerns is yield. Product development engineers are responsible for improving yield and must use various tools to ensure that any yield issues are promptly corrected. This study outlines some of the tools that product engineers use, namely identifying possible device sensitivities through correlation and regression analysis.
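Sensitivity screening of the kind the last item describes typically means ranking parametric test measurements by the strength of their correlation with yield, then following up on the strongest with regression. A generic sketch with fabricated lot data (parameter names and values are invented):

```python
import math

# Fabricated per-lot data: two parametric measurements and a yield figure.
lots = [
    # (threshold_voltage, contact_resistance, yield_pct) -- invented values
    (0.48, 10.2, 91.0),
    (0.50, 10.8, 90.5),
    (0.52, 12.5, 86.0),
    (0.49, 11.0, 89.8),
    (0.53, 13.4, 84.2),
    (0.51, 11.9, 88.0),
]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

yields = [row[2] for row in lots]
params = {"threshold_voltage": [row[0] for row in lots],
          "contact_resistance": [row[1] for row in lots]}

# Rank parameters by |r| with yield; the strongest is the prime suspect
# for a device sensitivity worth a follow-up regression.
ranked = sorted(params, key=lambda p: abs(pearson(params[p], yields)),
                reverse=True)
for p in ranked:
    print(f"{p}: r = {pearson(params[p], yields):+.2f}")
```

With only a handful of lots, a strong correlation flags a candidate sensitivity rather than proving causation; that is why the abstract pairs correlation with regression analysis as follow-up.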