Browsing by Subject "Validation"
Now showing 1 - 12 of 12
Item Characterization and evaluation of Escherichia coli biotype I strains for use as surrogates for enteric pathogens in validation of beef carcass interventions (2009-05-15) Cabrera-Diaz, Elisa
Antimicrobial interventions implemented in slaughter establishments for the reduction of enteric pathogens on beef carcasses must be validated to demonstrate efficacy under commercial operating conditions. Validation studies can be conducted using surrogates, which are nonpathogenic organisms that respond to a particular treatment in a manner equivalent to a target pathogen. The purpose of this study was to identify surrogates for enteric pathogens to validate antimicrobial interventions on beef carcasses. The growth, attachment, and resistance properties, as well as the responses to interventions on beef carcasses, of nonpathogenic fluorescent protein-marked E. coli strains were evaluated and compared to E. coli O157:H7 and Salmonella strains. Growth curves were generated in tryptic soy broth at 37°C, and in general, growth parameters did not differ between surrogates and target pathogens. Thermal resistance was compared in phosphate buffered saline (PBS) at 55, 60 and 65°C; D-values of surrogates were not different from, or were higher than, those of target pathogens. The acid resistance of surrogates was not different from that of E. coli O157:H7 in PBS acidified with lactic acid at pH 2.5, 3.0 and 3.5. Some Salmonella serotypes were found to be less acid resistant than the surrogates. Survival of surrogates after storage at low temperatures (4°C and -18°C) was not different from, or was longer than, survival of E. coli O157:H7 and Salmonella. Additionally, cell surface hydrophobicity and attachment to beef carcass surfaces did not differ between surrogates and pathogens. Antimicrobial interventions were applied to carcass surfaces under controlled laboratory conditions.
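The D-values used in these thermal-resistance comparisons have a simple operational definition: the time at a fixed temperature needed for a one-log10 reduction in viable counts, i.e. the negative reciprocal of the slope of a log-linear survival curve. A minimal sketch with invented, illustrative counts (not study data):

```python
import numpy as np

# Hypothetical survivor counts (log10 CFU/mL) during heating at one fixed
# temperature; times in minutes. The D-value is the time required for a
# 1-log10 reduction: the negative reciprocal of the regression slope.
times = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
log_counts = np.array([7.0, 6.1, 5.2, 4.3, 3.4])  # ~0.9 log10 lost per minute

slope, intercept = np.polyfit(times, log_counts, 1)
d_value = -1.0 / slope
print(f"D-value: {d_value:.2f} min")  # D-value: 1.11 min
```

Comparing such D-values between a surrogate and a pathogen at the same temperature is exactly the "not different or higher" criterion the abstract describes.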
After application of hot water washes, D-values were not different among surrogates and pathogens, and no differences were observed in log reductions (CFU/cm²) between surrogates and pathogens when 2% L-lactic acid sprays were applied at 25 and 55°C, regardless of the temperature and volume of the acid solution. The response of surrogates to water washes and lactic acid sprays on beef carcasses was also evaluated in commercial slaughter facilities. Reductions of surrogates were not different from those of aerobic plate counts, coliforms and E. coli. However, the surrogates showed less variation and provided more consistent results than traditional indicators.

Item Designs and methodologies for post-silicon timing characterization (2013-05) Jang, Eun Jung; Abraham, Jacob A.
Timing analysis is a key sign-off step in the design of today's chips, but technology scaling introduces many sources of variability and uncertainty that are difficult to model and predict. The result of these uncertainties is a degradation in our ability to predict the performance of fabricated chips, i.e., a lack of model-to-hardware matching. The prediction of circuit performance is the result of a complex hierarchy of models, ranging from the basic MOSFET device model to full-chip models of important performance metrics such as power and frequency of operation. Assessing the quality of such models is an important activity, but it is becoming harder and more complex with rising levels of variability and the increasing number of systematic effects observed in modern CMOS processes. The purpose of this research is (i) to introduce special-purpose test structures that specifically focus on ensuring the accuracy of gate timing models, and (ii) to introduce methods that analyze the extracted information, in the form of path delay measurements, using the proposed test structures.
The certification of digital design correctness (the so-called signoff) is based largely on the results of Static Timing Analysis (STA), which, in turn, is based entirely on the gate timing models. The proposed test structures compare favorably to alternative approaches: they are far easier to measure than direct delay measurement, and they are much more general than simple ring-oscillator structures. Furthermore, the structures are specified at a high level, allowing them to be synthesized using a standard ASIC place-and-route flow, thus capturing the local layout systematic effects which can sometimes be lost by simpler (e.g., ring oscillator) structures. For the silicon timing analysis, we propose methods that deduce segment delays from the path delay measurements. The segment delays estimated with our methods can be compared directly with the timing models, making it easy to identify the cause of timing mismatches. Deducing segment delays from path delays, however, is not an easy problem: the difficulty of deconvolving segment delays from measured path delays comes from an insufficient number of sampling points. To overcome this limitation, we first group the segments based on certain characteristics and adapt the Moore-Penrose pseudo-inverse method to approximately solve for the segment delays. Second, we use equality-constrained least squares methods, which enable us to find a unique, optimized solution for the segment delays from underdetermined systems. We also propose another, improved test structure that has a built-in test pattern generator and hence does not require ATPG (Automatic Test Pattern Generation). It is a self-timed circuit, a feature that lets the test structure run as fast as it can, so measurements can be made under high-speed switching conditions.
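The pseudo-inverse step described above can be sketched in a few lines: each measured path delay is (ideally) the sum of the delays of the segment groups it traverses, giving a linear system that the Moore-Penrose pseudo-inverse solves in the least-squares sense. A toy, noise-free example (the real systems are underdetermined and noisy; the numbers here are invented):

```python
import numpy as np

# Hypothetical example: 4 measured paths over 3 segment groups. Row i of A
# counts how many segments from each group lie on path i; b holds the
# measured path delays (ps). The Moore-Penrose pseudo-inverse returns the
# minimum-norm least-squares estimate of the group delays.
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]], dtype=float)
true_d = np.array([10.0, 15.0, 12.0])   # ps, for illustration only
b = A @ true_d                          # noise-free path delay measurements

d_est = np.linalg.pinv(A) @ b
print(np.round(d_est, 3))               # recovers [10. 15. 12.]
```

In the underdetermined case (more unknown segments than independent paths), the same call still returns a unique answer, the minimum-norm solution, which is what motivates the grouping and the equality-constrained variant described in the abstract.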
Finally, the new test structure lets us study dynamic effects such as the timing effects of different levels of switching activity and voltage drop.

Item Dynamic Modeling and Wavelet-Based Multi-Parametric Tuning and Validation for HVAC Systems (2014-07-10) Liang, Shuangshuang
Dynamic Heating, Ventilation, and Air-Conditioning (HVAC) system models are used for control design, fault detection and diagnosis, system analysis, design, and optimization. Ensuring the accuracy and reliability of these dynamic models is therefore important before they are applied, and parameter tuning and model validation are crucial means of improving both. Traditional parameter tuning and validation methods are generally time-consuming and inaccurate, and can handle only a limited number of tuning parameters. This is especially true for multiple-input-multiple-output (MIMO) models due to their intrinsic complexity. This dissertation proposes a new automatic parameter tuning and validation approach to address this problem. In this approach, a fast and accurate model is first derived using linearization. Discrete-time convolution is then applied to this linearized model to generate the model outputs. These outputs and the measured data are then processed through wavelet decomposition, and the resulting wavelet coefficients are used to establish the objective function; wavelets are advantageous in capturing the dynamic information hidden in a time series. The objective function is then optimized iteratively using a hybrid method consisting of a global-search genetic algorithm (GA) and a local gradient search. To demonstrate the feasibility and robustness of the proposed approach, it is applied to several dynamic models.
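The wavelet-coefficient objective described above can be illustrated with the simplest wavelet, the Haar transform. This is only a stand-in sketch (the dissertation does not specify this wavelet or this exact objective): the model output and the data are each decomposed, and the objective is the squared distance between their coefficients.

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: approximation and detail."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def wavelet_objective(model_out, data, levels=2):
    """Sum of squared differences between the Haar wavelet coefficients of
    the model output and those of the measured data (a stand-in for the
    wavelet-based objective function described in the abstract)."""
    err = 0.0
    m, d = np.asarray(model_out, float), np.asarray(data, float)
    for _ in range(levels):
        m, md = haar_step(m)
        d, dd = haar_step(d)
        err += np.sum((md - dd) ** 2)
    err += np.sum((m - d) ** 2)  # coarsest-level approximation coefficients
    return err

# Identical signals give a zero objective; any mismatch gives a positive one.
t = np.linspace(0, 1, 8)
sig = np.sin(2 * np.pi * t)
print(wavelet_objective(sig, sig))  # 0.0
```

An optimizer (GA plus gradient refinement, as in the abstract) would then adjust the model parameters to drive this objective down.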
These models include an HVAC system model with moving boundary (MB) heat exchanger models, a heat pump model with finite control volume (FCV) heat exchanger models, and a lumped-parameter residential conditioned space model. These models generally have a large number of parameters which need tuning. The proposed method proves efficient when tuning against a single data set, and can also tune the models using multiple experimental or field data sets with different operating conditions. The tuned parameters are further cross-validated using other data sets with different operating conditions. The results also indicate the proposed method can effectively tune a model using both static and transient data simultaneously.

Item Measuring the validity of self-monitoring heart rate and activity tracking wearables (2016-05) Dooley, Erin Elizabeth; Bartholomew, John B.; Jowers, Esbelle
PURPOSE: To examine the validity of wearable physical activity tracking devices. METHODS: Participants were 62 students. Participants wore a Polar HR chest strap, an Actigraph GT3X+ accelerometer, an Apple Watch, a Fitbit Charge HR, and a Garmin Forerunner 225, and were connected to a metabolic cart. Participants completed a seated 10-min baseline period, 4-min stages of light, moderate and vigorous intensities, and a 10-min seated recovery. Heart rate (HR), energy expenditure (EE) and step count were examined for each bout of exercise. ANALYSIS: Two-way repeated-measures ANOVAs were performed to compare the ability of the wearable devices to accurately measure each outcome relative to the criterion. Paired-samples t-tests compared the number of steps counted in observed videos with those reported by the Fitbit. RESULTS: For HR, the Apple Watch was accurate at all stages except the light- and moderate-intensity stages, at which the device measured lower HR. The Fitbit Charge HR produced accurate results during moderate PA, but measured significantly higher HR readings at baseline and light activity and lower HR readings at vigorous intensity.
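The paired-samples t-test used in the analysis above reduces to a one-line statistic on the per-participant differences between device and criterion. A sketch with invented heart-rate readings (not the study's data):

```python
import numpy as np

def paired_t(device, criterion):
    """Paired-samples t statistic for device-minus-criterion differences."""
    d = np.asarray(device, float) - np.asarray(criterion, float)
    n = d.size
    return d.mean() / (d.std(ddof=1) / np.sqrt(n))

# Hypothetical heart-rate readings (bpm) for one exercise stage.
device_hr    = [92, 101, 97, 105, 99, 103]
criterion_hr = [95, 104, 99, 108, 101, 107]
t_stat = paired_t(device_hr, criterion_hr)
print(round(t_stat, 2))
```

A large-magnitude negative t (compared against the critical value for n-1 degrees of freedom) is what underlies statements like "measured significantly lower HR" in the results.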
The Garmin Forerunner 225 was accurate only at vigorous-intensity PA and measured significantly higher HR readings at all other intensities. For EE, the Fitbit measured significantly higher EE at all stages except the vigorous-intensity and recovery stages. The Apple Watch and Garmin measured significantly higher EE at all PA intensities. The Fitbit measured significantly lower step counts than the criterion at all PA intensities. DISCUSSION: This study provides novel findings for the Apple Watch and Garmin devices and new information regarding Fitbit accuracy. No previous studies have reported the accuracy of these devices in measuring HR. Future studies should investigate why differences between the devices exist.

Item Mocking embedded hardware for software validation (2016-08) Kim, Steve Seunghwan; Khurshid, Sarfraz; Bard, William
This report makes the case for unit testing embedded systems software, a procedure traditionally found in application software development. While the challenges of developing and executing unit tests on embedded software are acknowledged, multiple solutions are presented. The GNU toolchain and a Texas Instruments microcontroller are used as an example embedded target. Two applications, one introductory and one more realistic, were developed for this embedded target in the C programming language. The report details the procedure required to apply the open-source frameworks Unity and CMock to the two embedded applications. These frameworks, combined with the techniques outlined in the report, accomplished several goals of unit testing: automated validation of the embedded applications, increased code coverage, and protection against regression defects. In addition, it is shown how unit tests led to a more modular software architecture.
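The report's Unity/CMock approach targets C, but the core idea it describes, replacing the hardware-access layer with a mock so application logic can be tested off-target, can be sketched in any language. Below is a hedged Python analogue using the standard library's unittest.mock; the names (read_temp_register, fan_controller) are hypothetical, not from the report.

```python
from unittest import mock

def fan_controller(read_temp_register):
    """Turn the fan on when the raw temperature register reads above 80."""
    return "FAN_ON" if read_temp_register() > 80 else "FAN_OFF"

# Replace the hardware-access function with a mock so the control logic can
# be unit-tested on a development host instead of the microcontroller.
hw = mock.Mock(return_value=85)
assert fan_controller(hw) == "FAN_ON"

hw.return_value = 60
assert fan_controller(hw) == "FAN_OFF"

hw.assert_called()  # verify the code under test exercised the "hardware"
print("mock-based unit checks passed")
```

This is the same pattern CMock automates for C: the test configures what the fake hardware call returns, then asserts on both the logic's output and the interactions with the mocked interface.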
Potential ideas for extending this research to other tools, environments, and frameworks are also discussed.

Item Numerical Simulation of Three-Dimensional Tsunami Generation by Subaerial Landslides (2012-09-19) Kim, Gyeongbo, 1978-
Tsunamis, most often generated by undersea earthquakes, are among the most catastrophic natural events impacting coastal regions. In enclosed basins (fjords, reservoirs, and lakes), however, subaerial or submarine landslides can initiate devastating tsunamis with similar consequences. Although either type of landslide impinging into a large water body can generate a tsunami, subaerial landslides are much more efficient tsunami generators than their submarine counterparts. In this study we aim to integrate laboratory-scale experiments of tsunami generation by subaerial landslides with numerical models. The work focuses on the numerical validation of two three-dimensional Navier-Stokes (3D-NS) models: FLOW-3D and our developed model, TSUNAMI3D. The models are validated against previous large-scale laboratory experiments performed by a tsunami research team led by Dr. Hermann Fritz, Georgia Institute of Technology. Three large-scale landslide scenarios were selected from the set of laboratory experiments: fjord-like, headland, and far-field coastline. These scenarios showed that complex wave fields can be generated by subaerial landslides, and the correct definition and evolution of the wave field are key to accurately modeling the ensuing tsunami and its effects in coastal regions. In this study, comparisons are performed between numerical results and laboratory experiments, and a methodology and key parameters for soil rheology are defined for the model validations. Results of the models are expected to fall within the allowable errors indicated by the National Tsunami Hazard Mitigation Program (NTHMP) and National Oceanic and Atmospheric Administration (NOAA) guidelines for validation of tsunami numerical models.
The ultimate goal of this research is to obtain better tsunami calculation tools for real-world application of 3-D models to landslide tsunamis, which are necessary for the construction of inundation maps in the Gulf of Mexico and the Caribbean regions.

Item Reconstruction of 3D Neuronal Structures from Densely Packed Electron Microscopy Data Stacks (2012-10-19) Yang, Huei-Fang
The goal of fully decoding how the brain works requires a detailed wiring diagram of the brain network that reveals the complete connectivity matrix. Recent advances in high-throughput 3D electron microscopy (EM) image acquisition have made it possible to obtain high-resolution 3D imaging data that allows researchers to follow axons and dendrites and to identify pre-synaptic and post-synaptic sites, enabling the reconstruction of detailed neural circuits of the nervous system at the level of synapses. However, these massive data sets pose unique challenges to structural reconstruction because the inevitable staining noise, incomplete boundaries, and inhomogeneous staining intensities increase the difficulty of 3D reconstruction and visualization. In this dissertation, a new set of algorithms is provided for reconstructing neuronal morphology from stacks of serial EM images. These algorithms include (1) segmentation algorithms for obtaining the full geometry of neural circuits, (2) interactive segmentation tools for manual correction of erroneous segmentations, and (3) a validation method for obtaining a topologically correct segmentation when a set of segmentation alternatives is available. Experimental results obtained using EM images containing densely packed cells demonstrate that (1) the proposed segmentation methods can successfully reconstruct full anatomical structures from EM images, (2) the editing tools provide a way for the user to easily and quickly refine incorrect segmentations, and (3) the validation method is effective in combining multiple segmentation results.
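The simplest way to combine several segmentation alternatives for the same image is a per-pixel majority vote; the dissertation's validation method is topology-aware and more sophisticated, so the sketch below (with invented label maps) illustrates only the basic fusion idea.

```python
import numpy as np

# Three candidate label maps for the same 2x3 image region.
seg_a = np.array([[1, 1, 0], [0, 2, 2]])
seg_b = np.array([[1, 0, 0], [0, 2, 2]])
seg_c = np.array([[1, 1, 0], [2, 2, 2]])

stack = np.stack([seg_a, seg_b, seg_c])   # shape (n_segmentations, H, W)
labels = np.arange(stack.max() + 1)

# For each label, count how many segmentations assigned it to each pixel,
# then keep the label with the most votes per pixel.
votes = (stack[None, ...] == labels[:, None, None, None]).sum(axis=1)
fused = votes.argmax(axis=0)
print(fused)
```

Each pixel of the fused map carries the label that most of the candidate segmentations agree on, which smooths out isolated disagreements between alternatives.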
The algorithms presented in this dissertation are expected to contribute to the reconstruction of the connectome and to open new directions in the development of reconstruction methods.

Item Simulation and Validation of Vapor Compression System Faults and Start-up/Shut-down Transients (2012-10-19) Ayyagari, Balakrishna
Statistics from the US Department of Energy show that about one-third of total electricity consumption in households and industry is due to Air Conditioning and Refrigeration (AC&R) systems. This wide usage has prompted many researchers to develop models for each of the components of vapor compression systems. However, there has been very little work on simulation models validated for start-up/shutdown operation or for vapor compression system faults. This thesis addresses these concerns and enhances the existing modeling library to capture the transients related to the above-mentioned conditions. In this thesis, the various faults occurring in a vapor compression cycle (VCC) are identified along with the parameters affecting them. The transients of the refrigerant are also studied with respect to the start-up/shutdown of a vapor compression system. All simulations related to faults and start-up/shutdown were performed using vapor compression system models developed in the MATLAB/Simulink environment and validated against the 3-ton air conditioning unit in the Thermo-Fluids Control Laboratory at Texas A&M University. The simulation and validation results presented in this thesis can be used to lay out rules of thumb that identify a particular fault from the unusual behavior of the system, aiding the creation of fault diagnostic algorithms, and they emphasize the importance of studying start-up/shutdown transient characteristics from the standpoint of actual system energy efficiency.
Also, these results demonstrate the capability and validity of the finite control volume models for describing VCC system faults and start-up/shutdown transients.

Item The Method of Manufactured Universes for Testing Uncertainty Quantification Methods (2011-02-22) Stripling, Hayes Franklin
The Method of Manufactured Universes (MMU) is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which "experimental" data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented for this research manufactures a particle-transport "universe," models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new "experiments" within the manufactured reality. To test the responses of these UQ methods further, we conduct exercises with "experimental" replicates, "measurement" error, and choices of physical inputs that reduce the accuracy of the diffusion model's approximation of our manufactured laws. Our first application of MMU was rich in areas for exploration and highly informative. In the case of the Gaussian process code, we found that the fundamental statistical formulation was not appropriate for our functional data, but that the code allows a knowledgeable user to vary parameters within this formulation to tailor its behavior for a specific problem.
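The framework's four ingredients, manufactured reality, noisy "experiments", an imperfect model, and an assessment of the claimed uncertainties, can be mimicked in a toy universe. Everything below is invented for illustration (the thesis uses particle transport and diffusion theory, not these functions):

```python
import numpy as np

# A toy "manufactured universe": reality is a known law, experiments sample
# it with noise, a deliberately imperfect model makes predictions, and the
# UQ claim (an uncertainty band) is judged by how often it covers the data.
rng = np.random.default_rng(0)

truth = lambda x: np.sin(x)              # the manufactured law
model = lambda x: x - x**3 / 6.0         # imperfect (truncated-series) model

x = rng.uniform(0.0, 1.5, 200)
experiments = truth(x) + rng.normal(0.0, 0.02, x.size)  # "measurement" error

pred = model(x)
sigma = 0.08                             # the model's claimed 1-sigma uncertainty
covered = np.abs(experiments - pred) < 2.0 * sigma
print(f"fraction of experiments inside the 2-sigma band: {covered.mean():.2f}")
```

Because the manufactured law is known exactly, the coverage fraction directly measures whether the claimed uncertainty is honest, too tight, or too loose, which is the assessment step MMU formalizes.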
The Bayesian MARS formulation was a more natural emulator given our manufactured laws, and we used the MMU framework to further develop a calibration method and to characterize the diffusion model discrepancy. Overall, we conclude that an MMU exercise with a properly designed universe (that is, one that is an adequate representation of some real-world problem) will provide the modeler with an added understanding of the interaction between a given UQ method and his/her more complex problem of interest. The modeler can then apply this added understanding to make more informed predictive statements.

Item Toward a predictive model of tumor growth (2011-05) Hawkins-Daarud, Andrea Jeanine; Oden, J. Tinsley (John Tinsley), 1936-; Babuska, Ivo; Ghattas, Omar; Zaman, Muhammad; Cristini, Vittorio; Prudhomme, Serge
In this work, an attempt is made to lay out a framework in which models of tumor growth can be built, calibrated, validated, and differentiated in their level of goodness, in such a manner that all the uncertainties associated with each step of the modeling process can be accounted for in the final model prediction. The study can be divided into four basic parts. The first involves the development of a general family of mathematical models of interacting species representing the various constituents of living tissue, which generalizes those previously available in the literature. In this theory, surface effects are introduced by incorporating in the Helmholtz free energy gradients of the volume fractions of the interacting species, thus providing a generalization of the Cahn-Hilliard theory of phase change in binary media and leading to fourth-order, coupled systems of nonlinear evolution equations. A subset of these governing equations is selected as the primary class of models of tumor growth considered in this work.
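As context for the Cahn-Hilliard generalization mentioned above, the classical single-species form (a standard result, not the dissertation's full multi-species system) couples a free energy with a gradient term to a fourth-order evolution equation:

```latex
% Classical Cahn-Hilliard theory: free energy with a volume-fraction
% gradient term, mass-conserving evolution, and chemical potential.
E[\varphi] = \int_\Omega \Big( \Psi(\varphi)
           + \tfrac{\varepsilon^2}{2}\,\lvert \nabla \varphi \rvert^2 \Big)\,dx,
\qquad
\frac{\partial \varphi}{\partial t}
  = \nabla \cdot \big( M(\varphi)\,\nabla \mu \big),
\qquad
\mu = \Psi'(\varphi) - \varepsilon^2 \Delta \varphi .
```

Here \varphi is a volume fraction, \Psi a double-well potential, M a mobility, and \mu = \delta E / \delta \varphi; substituting \mu into the evolution equation yields the fourth-order structure that the dissertation's coupled multi-species systems generalize.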
The second component of this study focuses on the emerging and fundamentally important issue of predictive modeling: the study of model calibration, validation, and the quantification of uncertainty in predictions of target outputs of models. The Bayesian framework suggested by Babuska, Nobile, and Tempone is employed to embed the calibration and validation processes within the framework of statistical inverse theory, and extensions of the theory regarded as necessary for applying these methods to models of tumor growth in certain scenarios are developed. The third part of the study focuses on the numerical approximation of the diffuse-interface models of tumor growth and on the numerical implementations of the statistical inverse methods at the core of the validation process. A class of mixed finite element models is developed for the considered mass-conservation models of tumor growth, and a family of time-marching schemes is developed and applied to representative problems of tumor evolution. Finally, in the fourth component of this investigation, a collection of synthetic examples, mostly in two dimensions, is considered to provide a proof of concept of the theory and methods developed in this work.

Item Validating the 1-cm orbit (2015-12) McWilliams, Hannah Elizabeth; Bettadpur, Srinivas Viswanath, 1963-; Ries, John
Determination of three-dimensional orbit accuracy at the 1-cm level is a difficult problem for even today's most well-tracked satellites. Gravity fields extracted from low Earth orbit (LEO) satellites operating near the 1-cm accuracy level provide a better understanding of Earth's systems, and the importance of the 1-cm orbit requires a closer look at the means of orbit error validation for these LEO satellites. The focus of this analysis is on the orbits of the Gravity Recovery and Climate Experiment (GRACE) satellite pair. The main methods of validation used on GRACE are the analysis of SLR residuals and the generation of statistics on orbit overlaps.
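The basic statistic behind SLR-residual validation is a root-mean-square, and per-axis RMS values combine in quadrature into a three-dimensional orbit error. A sketch with invented residuals (illustrative only, not GRACE results):

```python
import numpy as np

# Hypothetical radial SLR residuals (observed minus computed ranges), in cm.
residuals_cm = np.array([1.2, -0.8, 1.9, -1.4, 0.6, -1.7])
rms_radial = np.sqrt(np.mean(residuals_cm ** 2))

# If the radial, along-track, and cross-track components each had this RMS,
# the 3D error would combine in quadrature to sqrt(3) times the per-axis RMS.
rms_3d = np.sqrt(3.0) * rms_radial
print(f"radial RMS: {rms_radial:.2f} cm, 3D RMS: {rms_3d:.2f} cm")
```

This quadrature relationship is why a radial RMS well above 1 cm, as reported below for GRACE, implies a 3D error several times the 1-cm benchmark.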
The derivation of a method based on the Guier plane analysis of range residuals is presented along with the results of its application. By combining the analyses of the various methods for determining orbit accuracy, the processes for validating the 1-cm orbit are assessed. The results of the three methodologies applied to the SLR residuals for a dynamic orbit indicate that GRACE has a radial orbit error of 1.5-cm root-mean-square (RMS) and a three-dimensional orbit error of roughly 3-cm RMS; it is therefore highly unlikely that GRACE has achieved the 1-cm benchmark. The orbit overlaps study resulted in overly optimistic statistics and cannot be used as a measure of orbit accuracy.

Item Validation of Hot Water and Lactic Acid Sprays for the Reduction of Enteric Pathogens on the Surface of Beef Carcasses (2011-02-22) Wright, Kyle D.
Escherichia coli O157:H7 and Salmonella have emerged as the most common foodborne enteric pathogens causing human illness from the consumption of beef. By mandate of the U.S. Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS), the industry has implemented a Hazard Analysis and Critical Control Points (HACCP) system that utilizes intervention technologies for controlling, preventing, and/or reducing enteric pathogens. In addition, USDA-FSIS has mandated that each facility validate, monitor, and verify the effectiveness of each intervention implemented to eliminate E. coli O157:H7 and Salmonella. For this study, microbial decontamination interventions at two beef slaughter facilities were validated to demonstrate effectiveness in eliminating or reducing enteric pathogens. The facilities selected utilized either a lactic acid spray treatment or a combination of hot water followed by a lactic acid treatment. At both facilities, mesophilic plate counts (MPC) were significantly (P < 0.05) reduced, and E. coli and coliforms were eliminated below detectable limits.
No Salmonella positive samples were detected after either facility's intervention sequence. The framework used in this research to validate interventions can also be utilized in the future for yearly verification of the effectiveness of each intervention.
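Reductions like those reported in these validation studies are conventionally expressed on a log10 scale: the difference between log10 counts before and after an intervention. A minimal sketch with invented counts (not study data):

```python
import math

# Hypothetical surface counts before and after an intervention sequence.
before_cfu_per_cm2 = 5.0e4
after_cfu_per_cm2 = 2.0e2

log_reduction = math.log10(before_cfu_per_cm2) - math.log10(after_cfu_per_cm2)
print(f"{log_reduction:.2f} log10 reduction")  # 2.40 log10 reduction
```

Expressing efficacy this way is what makes interventions at different facilities, with different starting contamination levels, directly comparable.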