Browsing by Subject "Quality control"
Now showing 1 - 15 of 15
Item
A profit-based lot-sized model for the N-job, M-machine job shop: incorporating quality, capacity, and cycle time (Texas Tech University, 1997-12) Kenyon, George N.

Existing economic order quantity models base their calculations upon operations management principles established in the early 1900s. These principles focused primarily on the control and reduction of the firm's variable costs. Total Quality Management has shifted this focus from costs and local optimization to quality and systems optimization. The marketplace has also changed, expanding from a primarily domestic market into a globally competitive one. In this new business environment, quality, market share, and profits must be primary elements in all of the firm's operating policies. Recent operations management theories, such as Goldratt's (1980) Theory of Constraints, not only address these concerns but redefine how operations managers should think about the production system. This research evaluates the classical economic order quantity model and proposes a new model that addresses the lot-sizing decision for shop floor operations. A profit-maximizing (rather than a cost-minimizing) perspective is taken. A model is derived that considers cycle time and quality issues in addition to the traditional cost issues of production, holding, and setup. To validate this model, empirical data and a simulation model are developed to parameterize and corroborate the findings of the theorized model.
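For context, a minimal sketch of the classical economic order quantity formula this abstract critiques, which minimizes only setup and holding costs. All cost and demand values below are hypothetical; the profit-based, quality- and cycle-time-aware model the dissertation derives is not reproduced here.

```python
import math

def eoq(annual_demand, setup_cost, holding_cost_per_unit):
    """Classical economic order quantity: minimizes setup plus holding cost,
    ignoring the quality and cycle-time effects this research targets."""
    return math.sqrt(2 * annual_demand * setup_cost / holding_cost_per_unit)

# Hypothetical values for illustration only.
print(eoq(annual_demand=12000, setup_cost=150.0, holding_cost_per_unit=2.5))
# -> 1200.0
```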
Item
A study on the profit-based quality-productivity relationship model and its verification in manufacturing industries (Texas Tech University, 1997-05) Lee, Wen-Ruey

For years, quality and productivity have been regarded as two important indexes of company performance, and many companies hope to pursue high quality and high productivity at the same time. In most cases, however, these two variables are not linked together in the production system, mainly because of the variety of their definitions (Human Resources Productivity, 1983; Garvin, 1988; Belcher, 1987; Hoffherr, Moran, & Nadler, 1994; Smith, 1990). Additionally, these two variables are not considered together because: (1) the objectives of quality management and productivity management are viewed as contradictory; (2) quality and productivity are difficult to define; (3) the factors affecting quality or productivity are too numerous; (4) quality and profit seemingly have no direct connection; and (5) many companies believe that they have distinct characteristics which may not be subject to any model.

Item
An analysis of the effect of confirmation bias on industrial radiography (Texas Tech University, 1997-05) Romero, Henry Abraham

This experiment was undertaken to ascertain the current performance, the variables affecting performance, and possible performance improvements of a dynamic visual inspection process, real-time radiography, used at a Department of Energy laboratory. The system was being used to determine the contents of 55-gallon mixed-waste drums prior to storage. The experiment started with a task analysis in which the critical tasks necessary for successful completion of a barrel assessment were determined. One of the factors expected to shape performance was the provision of a shipper's manifest, because the manifest induces a confirmation bias on the part of the operator concerning what to expect in the barrel. A literature review was performed concerning perception and industrial inspection processes. From this review, it was determined that this process had not been adequately studied in the literature. One of the main differences was that confirmation bias had been elicited in the literature by the use of tachistoscopic cues, which were not feasible in this process. A repeated-measures experiment was developed that used specifically created drums to test the confirmation bias as well as the effects of experience and mental fatigue. The responses from this experiment were categorized into three binomial distributions: correct identifications, misclassifications, and detection failures. The experiment failed to note an effect due to the confirmation bias, and it also failed to note any impact from the mental fatigue or experience variables. In examining experience effects, it appears that between three and nine months of time on task, operator performance leveled off and did not significantly improve. However, individual differences alone could have accounted for these results.

Item
An economic determination of single sampling plans under inspection error (Texas Tech University, 1972-05) Hailey, Max Lavoyd

Previous research efforts relating to inspector error have been mostly confined to an examination of the factors which influence inspector error and the development of techniques to compensate for its presence. That research has given little attention to cost as a criterion for the selection of sampling plans. Hence, the purpose of this work is to develop a solution procedure for selecting a single sampling plan that satisfies a minimum-cost criterion, with or without inspector error.
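For context, a minimal sketch of how inspector error distorts a single sampling plan's operating characteristic: a miss rate (failing to flag a defective) and a false-alarm rate (flagging a good item) change the apparent fraction defective seen by the plan. The plan parameters and error rates below are hypothetical, and the thesis's cost-based selection procedure is not reproduced.

```python
from math import comb

def apparent_defect_rate(p, miss_rate, false_alarm_rate):
    """Fraction of sampled items classified defective by an imperfect
    inspector: caught true defectives plus falsely rejected good items."""
    return p * (1 - miss_rate) + (1 - p) * false_alarm_rate

def accept_probability(n, c, p_classified):
    """P(accept) for a single sampling plan (n, c): accept the lot if at
    most c items in the sample of n are classified defective (binomial)."""
    return sum(comb(n, d) * p_classified**d * (1 - p_classified)**(n - d)
               for d in range(c + 1))

# Hypothetical plan and error rates for illustration only.
p_true = 0.02
p_seen = apparent_defect_rate(p_true, miss_rate=0.10, false_alarm_rate=0.01)
print(accept_probability(80, 3, p_true), accept_probability(80, 3, p_seen))
# Inspector error shifts the OC curve, so a cost-optimal (n, c) shifts too.
```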
Item
Defect detection, design comprehension, and improved productivity with software gauges (Texas Tech University, 1996-12) Butsch, David C.

Software development and maintenance are expensive. Therefore, management pushes to cut costs up front by building the product as rapidly as possible. Defect prevention and removal strategies add cost during the development phase but return the investment during the maintenance phase. Rapid development without concern for defect prevention and removal simply confuses speed with progress; too much focus on defect-free software without concern for the schedule is unrealistic. A balanced approach to the development of reliable software is therefore the goal. This thesis considers an approach to defect prevention and removal through the use of software gauges, which offer visibility into software design, automation of lessons learned, and improved productivity.

Item
Defect rate estimation and cost minimization using imperfect acceptance sampling with rectification (1997-05) Wadhwa, Neerja, 1966-; George, Edward I.

An important aspect of any quality control program is estimation of the quality of outgoing products. This dissertation applies acceptance sampling with rectification to the problem of quality assurance when the inspection procedure is imperfect. The objective is to develop effective rectification sampling plans, and estimators based on these plans, without assuming a perfect inspection procedure. We develop estimators, under two different sampling plans, for the number of undetected defects remaining after a set of lots has been passed. We compare, by extensive simulation, the proposed estimators with existing ones in terms of root mean squared error (RMSE). One of our estimators, an empirical Bayes estimator, consistently obtains substantially lower RMSE overall. We also construct expected cost functions for sampling plans based on fixed sample sizes, and then show how intermediate empirical Bayes estimates of population characteristics can be used to obtain adaptive acceptance sampling plans that vary the sample sizes in order to reduce expected cost. We also compare the two sampling plans on the basis of RMSE and expected cost. Finally, we show that RMSE comparisons across different levels of machine imperfection can be misleading and propose a measure which accounts for MSE and expected cost simultaneously.
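A rough Monte Carlo sketch of the setting this dissertation studies: rectification sampling with an imperfect inspector, counting the defects that escape a set of passed lots. All rates and plan parameters are hypothetical, the assignment of defectives to the sample is approximated binomially, and the dissertation's estimators (including the empirical Bayes estimator) are not reproduced.

```python
import random

def simulate_escapes(num_lots, lot_size, n, c, p, miss_rate, seed=1):
    """Monte Carlo count of defects remaining undetected after imperfect
    rectification sampling: sample n items per lot, rectify what is found,
    and screen the rest of the lot when more than c defectives are caught."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(num_lots):
        defects = sum(rng.random() < p for _ in range(lot_size))
        # Defectives landing in the sample, then those the inspector catches.
        in_sample = sum(rng.random() < n / lot_size for _ in range(defects))
        caught = sum(rng.random() < 1 - miss_rate for _ in range(in_sample))
        remaining = defects - caught          # caught items are rectified
        if caught > c:                        # reject: screen the remainder,
            remaining = sum(rng.random() < miss_rate  # missed again w.p. miss
                            for _ in range(remaining))
        escaped += remaining
    return escaped

print(simulate_escapes(num_lots=500, lot_size=1000, n=50, c=1,
                       p=0.01, miss_rate=0.15))
```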
Item
Design of an automated inkless wafermap system (Texas Tech University, 2001-05) Zhou, Xuelin

This project develops a fully automated inkless wafermap system to implement an inkless process for Texas Instruments (TI) foundry wafers. A TI-to-TSMC (Taiwan Semiconductor Manufacturing Company) automated inkless wafermap system was designed in accordance with TI computer-integrated-manufacturing inkless standards. The system developed here can also be used for other foundries and subcontractors. The result is evaluated as successful when a fully automatic process of uploading wafermaps from the TSMC server to the TI WISH (Wafermap Inkless System Host) system is completed. Conclusions and recommendations for future work are discussed.

Item
Experimental designs and analyses to measure process variation capabilities (Texas Tech University, 1986-12) Raju, P

Not available

Item
Graded structure of defect categories in automated defect classification (Texas Tech University, 1996-05) Wong, Wan Sang

Defect review and classification are time-consuming, monotonous, fatiguing, and very subjective tasks. A large amount of variability has been observed across operators, for the same operator on the same wafer, from wafer to wafer, and from day to day, depending on the type of defects observed. This research hypothesizes that the problem is due mainly to the graded structure of semiconductor defect categories. Every category exhibits a graded structure: a degree-of-membership representation ranging from the most typical members of a category, through atypical members, to nonmembers least similar to the category's members. Two levels of categorization, and therefore two levels of graded structure, occur in knowledge-based defect classification, and both can adversely affect the performance of the system. The first level of graded structure occurs in the design of the defect knowledge base, when the defect expert describes defect categories in terms of the defects' visual features using linguistic variables. Feature values that are acquired subjectively and represented in natural language are most subject to the limitations associated with graded structure. The second level occurs in the actual defect classification, where feature similarities between the actual defect and the categorical defects are compared. The focus of this research is minimizing the effect of graded structure in automated defect classification for patterned semiconductor wafers. The research provides insights into the use of standardization and feature combination as ways to minimize the effect of graded structure on categorization. The first-level graded structure can be minimized by standardizing defect features, e.g., by representing them as fuzzy sets; the second level can be minimized by combining human-based and computationally derived defect features in categorization. Human-generated object features are directly visible and recognizable to a human observer, while computationally derived features are not directly perceptible by humans.
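A minimal sketch of the kind of standardization the abstract mentions for the first-level graded structure: representing a linguistically described defect feature as fuzzy sets. The feature, linguistic terms, and breakpoints below are hypothetical, not taken from the thesis's knowledge base.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for a defect-size feature (in microns).
size_terms = {
    "small":  lambda x: triangular(x, 0.0, 0.5, 1.5),
    "medium": lambda x: triangular(x, 1.0, 2.0, 3.0),
    "large":  lambda x: triangular(x, 2.5, 5.0, 10.0),
}
observed = 1.2
print({term: round(mu(observed), 2) for term, mu in size_terms.items()})
# A defect can belong to several terms with graded degrees of membership,
# which makes the subjective boundaries explicit instead of implicit.
```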
Item
Integrated performance prediction and quality control in manufacturing systems (2014-12) Bleakie, Alexander Q.; Djurdjanovic, Dragan

Predicting the condition of a degrading dynamic system is critical for implementing successful control and for designing optimal operation and maintenance strategies throughout the lifetime of the system. In many situations, especially in manufacturing, systems experience multiple degradation cycles, failures, and maintenance events throughout their lifetimes. In such cases, historical records of sensor readings observed during the lifecycle of a machine can yield vital information about degradation patterns of the monitored machine, which can be used to formulate dynamic models for predicting its future performance. Besides the ability to predict equipment failures, another major component of cost-effective, high-throughput manufacturing is tight control of product quality. Quality is assured by taking periodic measurements of the products at various stages of production. These quality measurements require time, however, and are often executed on costly measurement equipment, which increases the cost of manufacturing and slows down production. One way to remedy this situation is to utilize the inherent link between the condition of the manufacturing equipment, mirrored in the readings of sensors mounted on the machine, and the quality of the products coming out of it. The concept of Virtual Metrology (VM) addresses the quality control problem by using data-driven models that relate product quality to the equipment sensors, enabling continuous estimation of the quality characteristics of the product even when physical measurements of product quality are not available. VM can thus bring significant production benefits, including improved process control, reduced quality losses, and higher productivity. In this dissertation, new methods are formulated that combine long-term performance prediction of sensory signatures from a degrading manufacturing machine with VM quality estimation, integrating predictive condition monitoring (prediction of sensory signatures) with predictive manufacturing process control (a predictive VM model). The recently developed algorithm for prediction of sensory signatures predicts the system condition by comparing the similarity of the most recent performance signatures with the known degradation patterns available in the historical records. The method accomplishes the prediction of non-Gaussian and non-stationary time series of relevant performance signatures with analytical tractability, which enables calculation of predicted signature distributions at significantly greater speed than methods found in the literature. VM quality estimation is implemented using the recently introduced growing structure multiple model system (GSMMS) paradigm, based on the use of local linear dynamic models. The concept of local models enables representation of complex, nonlinear dependencies with non-Gaussian and non-stationary noise characteristics, using a locally tractable model representation. Localized modeling also enables a VM that can detect situations when the VM model is not adequate and needs to be improved, which is one of the main challenges in VM. Finally, uncertainty propagation with Monte Carlo simulation is pursued in order to propagate the predicted distributions of equipment signatures through the VM model, enabling prediction of the distributions of the quality variables from the readily available sensor readings streaming from the monitored manufacturing machine. The newly developed methods are applied to long-term production data from an industrial plasma-enhanced chemical vapor deposition (PECVD) tool operating in a major semiconductor manufacturing fab.
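A minimal sketch of the final step described above: Monte Carlo propagation of a predicted sensor-signature distribution through a VM model to obtain a predicted quality distribution. A single linear map stands in for the GSMMS's local linear dynamic models, and all numbers are hypothetical.

```python
import random
import statistics

def vm_local_linear(x, a=0.8, b=1.5):
    """Placeholder for one local linear model of a GSMMS-style VM mapping
    an equipment sensor signature x to a product quality variable."""
    return a * x + b

def propagate(mu_pred, sigma_pred, n_draws=10000, seed=7):
    """Monte Carlo propagation of a predicted (here Gaussian) signature
    distribution through the VM model to a predicted quality distribution."""
    rng = random.Random(seed)
    quality = [vm_local_linear(rng.gauss(mu_pred, sigma_pred))
               for _ in range(n_draws)]
    return statistics.mean(quality), statistics.stdev(quality)

# Hypothetical predicted signature distribution for illustration only.
print(propagate(mu_pred=10.0, sigma_pred=0.4))
# For one linear model the answer is analytic (mean 9.5, sd 0.32); Monte
# Carlo matters when local models switch and noise is non-Gaussian.
```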
Item
Neural networks and evolutionary computation for real-time quality control (Texas Tech University, 1997-05) Patro, Sanjukta

Quality control in general, and automated quality control in particular, are assuming major importance in modern society as technological systems become increasingly complex and highly interconnected. Traditional statistical process control techniques are inadequate for control problems in automated processes because of the high degree of data correlation that characterizes such processes. Classical process control methods depend on simplifying assumptions of plant linearity and time-invariance to make the problem analytically tractable, and they are therefore of limited effectiveness for the control of complex, nonlinear, multivariable processes. This dissertation attempts to overcome some of these limitations and shortcomings through the integration of two technologies: neural networks and evolutionary computation. An autonomous control system prototype has been developed to control a process (maintain quality variables within desired limits) by providing high-level adaptation to changes in the plant, environment, and control objectives. The technology uses memory and learning techniques to overcome the limitations of traditional control methods, namely data autocorrelation, the need for simplifying assumptions, and the need for a priori information about the process. The robustness and applicability of this integrated technology are demonstrated through results obtained from tests involving simulated processes of varying degrees of complexity.

Item
A new method of data quality control in production data using the capacitance-resistance model (2011-08) Cao, Fei, active 21st century; Lake, Larry W.; Nicot, Jean-Philippe

Production data are the most abundant data in the field. However, they are often of poor quality because of undocumented operational problems, changes in operating conditions, or recording mistakes (Nobakht et al. 2009). If this poor quality or inconsistency is not recognized as such, it can be misinterpreted as a reservoir issue rather than the data-quality problem it is. Quality control of production data is thus a crucial step that must precede any further interpretation of those data. To restore production data, we propose to use the capacitance-resistance model (CRM) for data reconciliation. CRM is a simple reservoir simulation model that characterizes the connectivity between injectors and producers using only production and injection rate data. Because CRM is based on the continuity equation, it can be used to analyze the production response to the injection signal in the reservoir. The problematic production data are put into the CRM model directly, and the resulting CRM output parameters are used to evaluate what the correct production response would be under the current injection scheme. We also perform sensitivity analyses on synthetic fields, heterogeneous idealized reservoir models built in Eclipse with imposed geology and well features, to show how bad data can be misleading and how best to restore the production data. Using the CRM model itself to control data quality is a novel way to obtain clean production data, which can then be applied in reservoir simulators or in any other process where production data quality matters. This quality control process can help engineers better understand the reservoir, analyze its behavior with more confidence, and make more reliable decisions.
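For context, a minimal single-time-constant sketch of a CRM-style rate equation: the producer's rate decays with time constant tau while responding to a connectivity-weighted sum of injection rates. The gains, tau, and rates below are hypothetical placeholders for values that would be fitted to field data, and the abstract's reconciliation workflow is not reproduced.

```python
import math

def crm_rate(q_prev, injections, gains, tau, dt):
    """One time step of a capacitance-resistance model: the producer rate
    decays with time constant tau and is supported by a gain-weighted sum
    of injection rates (gains are injector-producer connectivities)."""
    decay = math.exp(-dt / tau)
    support = sum(f * i for f, i in zip(gains, injections))
    return q_prev * decay + (1 - decay) * support

# Hypothetical two-injector example: fitted gains and tau, monthly steps.
q = 500.0  # producer rate, bbl/day
for inj in [(800.0, 600.0), (820.0, 590.0), (760.0, 640.0)]:
    q = crm_rate(q, inj, gains=(0.35, 0.25), tau=30.0, dt=30.0)
    print(round(q, 1))
# A measured rate far from the CRM-reconstructed rate flags suspect data.
```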
Item
Performance monitoring of run-to-run control systems used in semiconductor manufacturing (2008-08) Prabhu, Amogh V., 1983-; Edgar, Thomas F.

Monitoring and diagnosis of control systems, though widely used in the chemical processing industry, are currently lacking in the semiconductor manufacturing industry. This work provides methods for performance assessment of the most commonly used control system in this industry: run-to-run process control. First, an iterative solution method is derived for calculating the best achievable performance of the widely used run-to-run Exponentially Weighted Moving Average (EWMA) controller, and a normalized performance index is then defined based on that best achievable performance (a minimal sketch of the EWMA update law appears after the last item in this listing). The effects of model mismatch in the process gain and disturbance model parameter, delays, bias changes, and process nonlinearity are then studied, and the utility of the method under manufacturing conditions is tested by analyzing three processes from the semiconductor industry. Missing measurements due to delay are estimated using the disturbance model for the process, and a minimum-norm estimation method coupled with Tikhonov regularization is developed. Simulations are carried out to investigate disturbance model mismatch, gain mismatch, and different sampling rates. Next, forward and backward Kalman filters are applied to obtain the missing values and compared with the previous examples, and manufacturing data from three processes are analyzed at different sampling rates. Existing methods are then compared with a new method for state estimation in high-mix manufacturing, based on a random walk model for the context states and combined with the recursive equations of the Kalman filter; the method is applied to an industrial exposure process by extending the random walk model into an integrated moving average model, with weights used to give preference to more frequent contexts. Finally, a performance metric is derived for PID controllers when they are used to control nonlinear processes. Techniques to identify nonlinearity in a process are introduced, and polynomial NARX models are proposed to represent a nonlinear process. A performance monitoring technique used for MIMO processes is then applied, and the method is demonstrated on an EWMA control case used earlier, a P/PI control case from the literature, and two cases from the semiconductor industry.

Item
Quality control test for carbon fiber reinforced polymer (CFRP) anchors for rehabilitation (2009-12) Huaco Cárdenas, Guillermo David; Jirsa, J. O. (James Otis); Bayrak, Oguzhan

Different strategies can be used to repair, rehabilitate, and strengthen existing structures. Techniques based on fiber reinforced polymer (FRP) materials are innovative alternatives to traditional solutions because of their high tensile strength, light weight, and ease of installation. One of the most common and useful FRPs is carbon fiber reinforced polymer (CFRP), used in sheets and anchors attached to the concrete surface to strengthen the section through added tensile capacity. The purpose of this study was to develop a technique for assessing the strength of anchors for quality control purposes. To transfer tensile capacity to a concrete surface, the sheets are bonded to the surface with epoxy adhesive; as tension increases, CFRP sheets lose adhesion to the concrete surface and finally debond. To avoid this failure, CFRP anchors are applied in addition to the epoxy. The anchors allow the CFRP sheets to develop their full tensile capacity and maximize the material efficiency of the CFRP retrofit, so the number and size of anchors play a critical role; however, the capacity of CFRP anchors has not been investigated extensively. A methodology for assessing the quality of CFRP anchors was developed using plain concrete beams reinforced externally with CFRP sheets attached with epoxy and CFRP anchors. Applying load to a beam developed a tensile force in the CFRP sheets and a shear force on the CFRP anchors; the shear forces in the anchors were defined by the load applied to the beam and compared with forces based on measured stress in the CFRP sheets.

Item
The sensitivity of sampling inspection to inspector error (Texas Tech University, 1966-05) Davis, Allan Stevens

Not available
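As referenced in the run-to-run entry above (Prabhu, 2008-08), a minimal sketch of the standard EWMA run-to-run control loop that work analyzes: an EWMA of the model error updates an intercept estimate, which sets the next run's recipe. The gain mismatch, drift, noise level, and smoothing factor below are hypothetical, and the thesis's performance-index derivation is not reproduced.

```python
import random

def run_to_run_ewma(target, gain_model, gain_true, lam, runs=25, seed=3):
    """Minimal EWMA run-to-run loop: after each run, the intercept estimate
    is updated with an EWMA of the model error, and the next recipe is set
    to drive the predicted output to target."""
    rng = random.Random(seed)
    a = 0.0                                   # intercept/disturbance estimate
    drift = 0.0
    for k in range(runs):
        u = (target - a) / gain_model         # recipe for this run
        drift += 0.05                         # slow tool drift (hypothetical)
        y = gain_true * u + drift + rng.gauss(0, 0.1)
        a = lam * (y - gain_model * u) + (1 - lam) * a   # EWMA update
        print(f"run {k:2d}  u={u:6.3f}  y={y:6.3f}")

# Deliberate gain mismatch (model 2.0 vs. true 1.8), one of the mismatch
# cases the abstract mentions studying.
run_to_run_ewma(target=10.0, gain_model=2.0, gain_true=1.8, lam=0.3)
```

With lam near 1 the estimate tracks drift quickly but passes more measurement noise; with lam near 0 it smooths heavily but lags. Balancing that trade-off is what a performance index for this controller must capture.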