Browsing by Subject "Statistical analysis"
Now showing 1 - 5 of 5
Item: Detection and Diagnosis of Out-of-Specification Failures in Mixed-Signal Circuits (2014-12-03)
Mukherjee, Parijat

Verifying whether a circuit meets its intended specifications, and diagnosing the circuits that do not, is indispensable at every stage of integrated circuit design. Otherwise, a significant portion of fabricated circuits could fail or behave correctly only under certain conditions. Shrinking process technologies and increased integration have further complicated this task. This is especially true of mixed-signal circuits, where a slight parametric shift in an analog component can change the output significantly. We are thus rapidly approaching a proverbial wall, where migrating existing circuits to advanced technology nodes and/or designing next-generation circuits may not be possible without suitable verification and debug strategies. Traditional approaches target accuracy rather than scalability, which rules out their use on high-dimensional systems. Relaxing the accuracy requirement mitigates the computational cost, while quantifying the level of inaccuracy preserves the usefulness of the resulting metrics. We exercise this accuracy vs. turn-around-time trade-off to address multiple mixed-signal problems across both the pre- and post-silicon domains. We first obtain approximate failure probability estimates along with their confidence bands using limited simulation budgets. We then generate "failure regions" that naturally explain the parametric interactions resulting in predicted failures. These two pre-silicon contributions together enable us to estimate and reduce the failure probability, which we demonstrate on a high-dimensional phase-locked loop test case. We leverage this pre-silicon knowledge for test-set selection and post-silicon debug to alleviate the limited controllability and observability of the post-silicon domain. We select a set of test points that maximizes the probability of observing failures. We then use post-silicon measurements at these test points to identify systematic deviations from pre-silicon belief. This is demonstrated on the phase-locked loop test case, where we boost the number of failures to observable levels and use the obtained measurements to root-cause the underlying parametric shifts. The pre-silicon contributions can also be extended to perform equivalence checking and to help diagnose detected model mismatches. The resulting calibrated model allows us to apply our work at the system level as well. The equivalence checking and model-mismatch diagnosis are successfully demonstrated using a high-level abstraction model of the phase-locked loop test case.
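The abstract does not give the estimator itself; as a minimal illustration of pairing a failure-probability estimate with a confidence band under a limited simulation budget, the following Python sketch uses plain Monte Carlo with a Wilson score interval. The stand-in simulator simulate_circuit, the Gaussian process parameters, and the spec limit are all hypothetical.

    import numpy as np
    from scipy.stats import norm

    def estimate_failure_probability(simulate, n_sims, spec_limit, alpha=0.05, seed=0):
        """Monte Carlo failure-probability estimate with a Wilson confidence band."""
        rng = np.random.default_rng(seed)
        params = rng.standard_normal(n_sims)        # hypothetical process parameters
        k = int(sum(simulate(p) > spec_limit for p in params))  # out-of-spec count
        n = n_sims
        z = norm.ppf(1 - alpha / 2)
        # Wilson score interval: better behaved than the normal approximation
        # when failures are rare and the simulation budget is small.
        center = (k + z**2 / 2) / (n + z**2)
        half = (z / (n + z**2)) * np.sqrt(k * (n - k) / n + z**2 / 4)
        return k / n, (center - half, center + half)

    # Hypothetical stand-in for a transistor-level simulation of one sample.
    simulate_circuit = lambda p: 1.0 + 0.3 * p      # e.g., an output voltage
    p_hat, (lo, hi) = estimate_failure_probability(simulate_circuit, 2000, 1.6)
    print(f"failure probability {p_hat:.4f}, 95% band [{lo:.4f}, {hi:.4f}]")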
Item: Genomics analysis on the responses of E. coli cells to varying environmental conditions (2016-05)
Yan, Xiwei; Wilke, C. (Claus); Lin, Lizhen

The natural living environments of E. coli cells are diverse, ranging from mammalian gastrointestinal tracts to soil. Each environment may require distinct metabolic pathways and transporter systems, and long-term evolution has established elaborate regulatory systems that let E. coli cells adapt quickly to changing conditions. Sensing external stresses and then adopting a different phenotype enables them to take advantage of any available nutrients and to defend against hostile environments. Many regulatory mechanisms have been identified by genetic, biochemical, and molecular biology methods; our study aims to build a systematic view of the whole-genome response to four different environmental conditions.

We used statistical tests, including Pearson's and Spearman's correlation tests with multiple-testing adjustments, to identify feature genes that are significantly induced or repressed across treatment levels. The feature genes identified were partially supported by the previous literature, and some novel genes not reported in earlier studies may point to a research blind spot. Additionally, we compared the correlation tests with machine learning algorithms and discussed the advantages and drawbacks of each method.
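The abstract names the test family but not the pipeline; a minimal sketch of such a per-gene screen, on a hypothetical expression matrix and with Benjamini-Hochberg as one common multiple-testing adjustment, might look like this:

    import numpy as np
    from scipy.stats import spearmanr
    from statsmodels.stats.multitest import multipletests

    rng = np.random.default_rng(1)
    n_genes, n_samples = 500, 24                  # hypothetical dimensions
    treatment = np.repeat([0, 1, 2, 3], 6)        # four treatment levels, 6 replicates
    expression = rng.normal(size=(n_genes, n_samples))

    pvals = np.empty(n_genes)
    for g in range(n_genes):
        # Spearman is rank-based, so it also catches monotone non-linear trends;
        # scipy.stats.pearsonr would test for a linear trend instead.
        rho, pvals[g] = spearmanr(treatment, expression[g])

    # Benjamini-Hochberg adjustment controls the false discovery rate
    reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
    print(f"{reject.sum()} genes significant after FDR correction")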
Item: Large-scale statistical analysis of NLDAS variables and hydrologic web applications (2016-05)
Espinoza Dávalos, Gonzalo Enrique; Maidment, David R.; McKinney, Daene C.; Passalacqua, Paola; Hodges, Ben R.; Yang, Zong-Liang

The Land Data Assimilation System (LDAS) is a model developed by the National Aeronautics and Space Administration (NASA) to quantify the heat and water fluxes between the atmosphere and the land-surface hydrology. LDAS has two forms: National (NLDAS) and Global (GLDAS). The NLDAS grid is 1/8°, with hourly and monthly estimates since 1979. The LDAS model output provides a comprehensive time-space dataset. A statistical analysis is necessary to obtain descriptive information and to understand the seasonal patterns, spatial distribution, and frequency distribution of the model output. Current conditions can be compared with those in the past by using statistical distributions for each variable, unique to each time interval and spatial grid point. The objectives of this dissertation are to: (1) perform a statistical analysis of the time series of NLDAS variables and model their spatial-temporal probability distributions; (2) improve data exposure through web applications that compare current values with the past; and (3) evaluate the framework for access to NLDAS data. The methodology consists of: (1) estimating the NLDAS cumulative distribution functions (CDFs) on daily and monthly time steps and developing probability models for five variables: precipitation, runoff, soil moisture, evapotranspiration, and temperature; (2) creating dynamic websites that display the maps, time series, and latest values in the NLDAS model and their relation to the historic distributions; and (3) implementing time-indexed and space-indexed data access procedures. The methodology is implemented using the latest technologies in high-performance computing (HPC), cloud storage and deployment, and Geographic Information Systems (GIS), which allow this analysis to be performed on a large dataset (NLDAS) at a national scale, using the United States as a case study. A statistical analysis of the NLDAS model output and the comparison of current values with the historic distribution provide thorough insight into the ranges, extremes, and seasonal variation of the hydrologic variables. The exposure of large scientific datasets such as NLDAS through the use of standards and web applications can enhance their use in hydrologic science and engineering.

Item: On the solution of rank deficient least squares problems (2011-08)
Lira, Mark J.; Iyer, Ram V.; Trindade, A. Alexandre; Howle, Victoria E.

In this thesis, we introduce a new method for solving minimum norm least squares problems. The method involves a QR decomposition followed by a Cholesky decomposition (QC). The existing methods in the literature are the Complete Orthogonal Factorization, which involves two QR decompositions, and the SVD method.

We compare the computational requirements of our method with those of the Complete Orthogonal Factorization method and show that QC requires fewer flops as long as the matrix is rank deficient. We also compare the sensitivity of the solutions obtained by our method and by the Complete Orthogonal Factorization method to parameter perturbations for generic matrices. A Kolmogorov-Smirnov test was run on the results of numerical experiments using normally distributed parameter perturbations. The results showed that the null hypothesis that the solutions from both algorithms have the same continuous underlying distribution cannot be rejected at a significance level of 0.05. The same numerical experiments showed that for the full rank case, the normal equation method using a Cholesky decomposition is significantly faster than the QR method.
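The abstract only names the factorization steps, so the following is a sketch of one plausible reading of "QR followed by Cholesky" (not necessarily the thesis's exact algorithm): factor A with column pivoting, truncate to the numerical rank, and recover the minimum-norm solution of the resulting full-row-rank system through a Cholesky solve.

    import numpy as np
    from scipy.linalg import qr, cho_factor, cho_solve

    def min_norm_lstsq_qc(A, b, tol=1e-10):
        """Minimum-norm least-squares solution of A x ~ b for rank-deficient A,
        via pivoted QR followed by a Cholesky solve (one reading of 'QC')."""
        m, n = A.shape
        Q, R, piv = qr(A, mode="economic", pivoting=True)
        r = int(np.sum(np.abs(np.diag(R)) > tol * abs(R[0, 0])))  # numerical rank
        S = R[:r, :]                        # r x n block with full row rank
        c = Q[:, :r].T @ b                  # projected right-hand side
        # Minimum-norm solution of the full-row-rank system S y = c:
        # y = S^T (S S^T)^{-1} c, with S S^T factored by Cholesky.
        y = S.T @ cho_solve(cho_factor(S @ S.T), c)
        x = np.empty(n)
        x[piv] = y                          # undo the column pivoting
        return x

    A = np.array([[1., 2., 3.], [2., 4., 6.], [1., 0., 1.]])   # rank 2
    b = np.array([6., 12., 2.])
    print(min_norm_lstsq_qc(A, b))
    print(np.linalg.lstsq(A, b, rcond=None)[0])                # SVD-based reference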
Item: Simulation and optimization techniques applied in semiconductor assembly and test operations (2016-05)
Jia, Shihui; Bard, Jonathan F.; Morrice, Douglas J.; Hasenbein, John; Khajavirad, Aida; Gao, Zhufeng

The importance of back-end operations in semiconductor manufacturing has been growing steadily in the face of higher customer expectations and stronger competition in the industry. To achieve low cycle times, high throughput, and high utilization while improving due-date performance, more effective tools are needed to support machine setup and lot dispatching decisions. In previous work, the problem of maximizing the weighted throughput of lots undergoing assembly and test (AT), while ensuring that critical lots are given priority, was investigated, and a greedy randomized adaptive search procedure (GRASP) was developed to find solutions. Optimization techniques have long been used for scheduling manufacturing operations on a daily basis. Solutions provide a prescription for machine setups and job processing over a finite planning horizon. In contrast, simulation provides more detail, but in a descriptive sense: it tells you how the system will evolve in real time for a given demand, a given set of resources, and rules for using them. A simulation model can also accommodate changeovers, initial setups, and multi-pass requirements easily. The first part of the research shows how the results of an optimization model can be integrated with the decisions made within a simulation model. The problem addressed is defined in terms of four hierarchical objectives: minimize the weighted sum of key device shortages, maximize weighted throughput, minimize the number of machines used, and minimize makespan, for a given set of lots in queue and a set of resources that includes machines and tooling. The facility can be viewed as a reentrant flow shop. The basic simulation was written in AutoSched AP (ASAP) and then enhanced with the help of the customization features available in the software. Several new dispatch rules were developed. Rule_First_setup initializes the simulation with the setups obtained with the GRASP. Rule_All_setups enables a machine to select the setup provided by the optimization solution whenever a decision is about to be made on which setup to choose subsequent to the initial setup. Rule_Hotlot was also proposed to prioritize the processing of hot lots that contain key devices. The objective of the second part of the research is to design and implement heuristics within the simulation model to schedule back-end operations in a semiconductor AT facility. Rule_Setupnum lets the machines determine which key device to process according to a machine setup frequency table constructed from the GRASP solution. GRASP_asap embeds the more robust selection features of GRASP in the ASAP model through customization; this allows ASAP to explore a larger portion of the feasible region at each decision point by randomizing machine setups using adaptive probability distributions that are a function of solution quality. Rule_Greedy, a simplification of GRASP_asap, always picks the setup for a particular machine that gives the greatest marginal improvement in the objective function among all candidates. The purpose of the third part of the research is to statistically validate the relative effectiveness of our top six dispatch rules by comparing their performance on 30 real and randomly generated data sets. Using both GRASP and our ASAP discrete event simulation model, we have (1) identified the general order of dispatch rule performance, (2) investigated the impact of having setups installed on machines at time zero on rule performance, (3) determined the conditions under which restricting the maximum number of changeovers affects rule performance, and (4) studied the factors that might simultaneously affect rule performance with the help of a common random numbers experimental design. In the analysis, the first two objectives, weighted key device shortages and weighted throughput, are used to measure outcomes.
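The dispatch rules above live inside the ASAP model; purely as an illustration of the randomized, quality-weighted setup selection that GRASP_asap is described as embedding, a generic GRASP-style candidate selection could be sketched as follows. The function name, the score callback, and the assumption of non-negative scores are all hypothetical.

    import random

    def grasp_pick_setup(machine, candidate_setups, score, alpha=0.3, rng=random):
        """GRASP-style selection: keep a restricted candidate list (RCL) of
        near-best setups, then sample from it with probability proportional
        to solution quality. alpha=0 reduces to a pure greedy pick, which
        mirrors a rule like Rule_Greedy; alpha=1 admits every candidate."""
        scored = [(score(machine, s), s) for s in candidate_setups]
        best = max(v for v, _ in scored)
        worst = min(v for v, _ in scored)
        threshold = best - alpha * (best - worst)   # RCL cutoff
        rcl = [(v, s) for v, s in scored if v >= threshold]
        total = sum(v for v, _ in rcl)
        # Adaptive distribution: better setups are proportionally more likely
        # (assumes non-negative scores, e.g., marginal weighted throughput).
        weights = [v / total for v, _ in rcl] if total > 0 else None
        return rng.choices([s for _, s in rcl], weights=weights, k=1)[0]

    # Hypothetical usage: score(m, s) = marginal weighted throughput of setup s.
    pick = grasp_pick_setup("M1", ["setup_A", "setup_B", "setup_C"],
                            score=lambda m, s: {"setup_A": 5.0, "setup_B": 4.5,
                                                "setup_C": 1.0}[s])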