Browsing by Subject "Automation"
Now showing 1 - 13 of 13
Item
A generic real-time process control system
(Texas Tech University, 1989-12) Arrant, Edwin Keith
A major problem facing manufacturers is the design and implementation of flexible automation systems. This problem is complicated by the many differing requirements for automated systems. Every facility has its own specific requirements; therefore, a generic factory control system design could provide the flexibility and adaptability to solve a wide variety of automation needs. Since factory automation systems are implemented from the bottom up, it is necessary to ensure that the initial automated subsystems will be compatible with future system enhancements. This thesis describes the various levels of factory automation and establishes a functional specification for each subsystem in the automated facility. These specifications provide the general information required to define the design objectives for the facility subsystems. The first step toward obtaining a solution to the automation problem is to develop a generic software control system for process equipment. A first-generation control system has been developed and applied to a four-module HF vapor etcher system used in the processing of semiconductor wafers.

Item
A Research Center for RAI (Robotics and Automated Technologies, Inc.)
(Texas Tech University, 1985-04) Cober, James M
Not Available.

Item
Attitudes toward computer technology between nursing and medical educators
(Texas Tech University, 1988-05) Harsanyi, Bennie E.
This descriptive study compared nursing and medical educators' attitudes toward computer technology. The primary assumption was that an individual's perception of, acceptance of, and resistance to automation were related to attitudes toward automation. The effect of demographic characteristics, previous experience and education regarding computer technology, and the usage of computer technology in educational and clinical environments was also addressed.
A belief exists that biomedical technological innovations are being diffused at a slow rate. The educator's mission is to prepare professionals to function in technologically based health care delivery systems. Attitudes regarding the human-machine interface can affect the diffusion process and this mission. Reported research regarding attitudes toward computer technology does not compare nursing and medical educators' attitudes. One hundred seventy-seven Texas nursing and medical educators responded to the Attitudes Toward Computers in General questionnaire and the Computer Attitude Profile. The results indicated no significant difference between nursing and medical educators' attitudes toward computer technology. Demographic variables were not significant. Previous experience with computer technology was significant but negatively correlated, whereas education regarding computer technology was not. Usage of computer technology in educational and clinical practice environments was not significant. However, word processing and record keeping (educational environment) were negatively correlated with attitudes toward computers in general. In the clinical environment, diagnosing was positively correlated, and patient assessment and network systems were negatively correlated, with attitudes toward computers in general. Recommendations for further study included investigation of: (1) the relationship of peer influence and the educators' role in professional networks to attitudes toward computer technology in educational and clinical practice environments; (2) educators' attitudes toward specific technological innovations perceived as enhancing or threatening traditional professional roles in educational and clinical practice environments; and (3) the use of attitude assessment data as they affect technological innovation in the development of education, training, orientation, implementation, and attitude modification strategies.
Other recommendations were also included.

Item
Automation of in vitro selections
(2004) Sooter, Letha Jane; Ellington, Andrew D.
Automation is a powerful tool that may be used to increase the throughput of many otherwise laborious manual manipulations. Aptamer and deoxyribozyme selections are prime examples of processes that require substantial time at the bench but are amenable to automation. Double-stranded DNA binding sites that bound with high affinity to the nuclear factor kappa B (NFκB) p50 homodimer were selected using a Tecan Genesis workstation. This was followed by selections against whole-cell lysates. The resultant sequences represented an array of transcription factor binding sites within the E. coli genome. Finally, a Biomek 2000 was used to perform a deoxyribozyme ligase selection, which formed an unnatural phosphorothioate linkage.

Item
Characterization of aggregate shape properties using a computer automated system
(Texas A&M University, 2005-02-17) Al Rousan, Taleb Mustafa
Shape, texture, and angularity are among the properties of aggregates that have a significant effect on the performance of hot-mix asphalt, hydraulic cement concrete, and unbound base and subbase layers. Consequently, there is a need to develop methods that can quantify aggregate shape properties rapidly and accurately. In this study, an improved version of the Aggregate Imaging System (AIMS) was developed to measure the shape characteristics of both fine and coarse aggregates. Improvements were made in the design of the hardware and software components of AIMS to enhance its operational characteristics, reduce human error, and increase the automation of the test procedure. AIMS was compared against other test methods that have been used for measuring aggregate shape characteristics. The comparison was based on statistical analysis of the accuracy, repeatability, reproducibility, cost, and operational characteristics (e.g.
ease of use and interpretation of the results) of these tests. Aggregates representing a wide range of geographic locations, rock types, and shape characteristics were used in this evaluation. The comparative analysis among the different test methods was conducted using the Analytical Hierarchy Process (AHP), a process of developing a numerical score to rank test methods based on how well each method meets certain criteria of desirable characteristics. The outcomes of the AHP analysis clearly demonstrated the advantages of AIMS over other test methods as a unified system for measuring the shape characteristics of both fine and coarse aggregates. A new aggregate classification methodology based on the distribution of shape characteristics was also developed in this study. This methodology offers several advantages over current methods used in practice. It is based on the distribution of shape characteristics rather than on average indices of these characteristics. Coarse aggregate form is determined from three-dimensional analysis of particles, and the gradient and wavelet methods are used to quantify angularity and surface texture, respectively. The classification methodology can be used for the development of aggregate shape specifications.

Item
Development of Systems to Improve Cotton Module Shape
(2011-10-21) Hardin, Robert Glen
Properly constructed modules will prevent reduced lint value and increased ginning costs when significant rainfall occurs. Additionally, cotton producers often have difficulty finding adequate labor during harvest. These issues were addressed by developing a graphical operator feedback system, a biomass package measurement system, a powered tramper, and an autonomous module forming system. A system that provided feedback on module shape by recording the position of the tramper and carriage was used to direct the operator to move cotton to appropriate locations.
The system correctly predicted the height of 67% of data points. Use of the feedback system resulted in a 55% reduction in the water collection area of the modules. The module builder operators indicated that the system was useful. The module builder feedback system is a simple, useful, and inexpensive tool that can have a rapid payback for producers. A powered tramper, with an auger to move cotton to the center of the module, was developed to replace the conventional tramper. The powered tramper operated automatically without affecting the operating speed or pressure of the tramper cylinder. During testing, the powered tramper was observed moving cotton to the center, and crowned modules were produced. A biomass package measurement system was developed to record the height at multiple points on the top surface of modules. The system was found to produce repeatable measurements with an error of 5 cm. Data collected with this system did not indicate a difference in module shape when using the powered tramper; however, during these tests the powered tramper was turned off prematurely due to an improperly sized valve on the module builder. An automated module building system capable of both moving and tramping cotton was developed. This system utilized the feedback system sensors and photoelectric sensors to determine the location of cotton in the builder. A wireless display allowed the boll buggy operator to control the automatic system. The automatic system constructed modules with 64% less water collection area in an average time of 37.4 min. Cotton producers indicated that the system was easy to use and of significant value in reducing labor requirements.

Item
A first-principles directional drilling simulator for control design
(2014-12) Leonard, Rebecca Leigh; Van Oort, Eric; Pryor, Mitchell Wayne
A directional drilling simulator was constructed using a re-formulation of first-principles classical mechanics to serve as a platform for advanced control design.
Dedicated focus was placed on building a modular solution that would interface with an existing Supervisory Control And Data Acquisition (SCADA) architecture. Model complexity was restricted to include only the features required to make an immediate step change in tool face control performance through more accurate determination of torsional dead time and time constant values. Development of this simulator advanced the art of drilling automation by building a foundation upon which developers may design novel control schemes using big data gathered in the modern oilfield. This first-principles model is supported by a theoretical formulation of equations of motion that capture the fundamental behavior of the drill string during both rotary and slide drilling operations. Wellbore trajectory was interpolated between survey points using the Minimum Curvature Method, and a semi-soft-string drill string model was assumed. Equations of motion were derived using the energy methods captured in both Hamiltonian and Lagrangian mechanics and solved using the finite-element method. Transient dynamic solutions were obtained using Newmark integration methods. A sensitivity analysis was conducted to determine which parameters played the most influential roles in dynamic drill string behavior for various operational scenarios and to what extent those parameters influenced torsional dead time and time constant calculations. The torsional time constant was chosen as the measure of correlation between case studies due to the significant role this value plays in state-of-the-art tool face control algorithms. Simulation results were validated using field data collected from rigs using a SCADA system to operate in various shale plays in North America. Results from field tests were used to compare torsional time constant values calculated using manually determined, simulation-based, and analytical methods and to investigate directional drilling performance over a range of operational scenarios.
Simulation-based time constant calculations were consistently more accurate than analytically determined values when compared to manually tuned values. The first-principles directional drilling simulator developed for this study will be adopted by the existing SCADA system to standardize and improve slide drilling performance.

Item
Industrial automation and control in hazardous nuclear environments
(2015-05) Peterson, Clinton Dean II; Landsberger, Sheldon; Pryor, Mitchell Wayne
This report discusses the design and implementation of an automated system for use in geometrically constrained, hazardous glovebox environments. The system's purpose is to reduce a hemispherical plutonium pit into smaller pieces that fit inside a crucible. The size reduction of plutonium pits supports stockpile stewardship efforts by the United States Department of Energy. Automating this process increases the safety of radiation workers by taking over the handling of radioactive nuclear material, decreasing glovebox worker dose and exposure to tools, sharps, and fines. This effort examines the hardware and software framework developed to support the use of a Port Deployed Manipulator (PDM) for a contact task. The research uses a 7 Degree-of-Freedom (DOF) PDM and a micropunch to reduce hemispherical pit surrogates. Formulation of the material reduction execution algorithm involved addressing a variety of topics related to industrial automation:
1. Collision detection and object recognition based on user-specified parameters
2. Joint torque monitoring
3. Online motion planning for contact tasks
4. Object-in-hand industrial manufacturing
5. Grasping and handling of nuclear material
6. Software compliance via robust nonlinear control methods
A high-bandwidth collision detection algorithm involving joint torque monitoring was developed to increase robot safety during operation.
The motion planning algorithm developed for this effort accepts variable geometric properties so that it can be used with a range of hemishells. The algorithm's feasibility was validated on a hardware test bed in a laboratory setting. Hardware cold tests conclude that mechanical compliance is sufficient for task completion; however, software compliance would increase performance, efficiency, and safety during task execution. Two different nonlinear force control laws (feedback linearization and sliding mode control) that minimize object shear forces were developed using a simplified material reduction simulation. It is recommended that glovebox automation research continue in order to further increase worker safety throughout the DOE complex.

Item
Investigations Into Using Machine Learning Models to Automate the Sorting of Digitized Texas State Publications
(Texas Digital Library, 2023-05-16) Rikka, Praneeth
Over the past ten years, the UNT Libraries has been digitizing the Texas State Publications it receives from the Texas State Library and Archives Commission as part of the Texas State Depository program. During this time, over 19,000 items have been digitized and made available in The Portal to Texas History's Texas State Publications Collection (https://texashistory.unt.edu/explore/collections/TXPUB/). Each year, batches of publications are sent to a digitization vendor, digitized, and returned to UNT, where each publication is sorted so that similar items are grouped together to assist in metadata creation. This sorting usually happens with sets of over 1,000 publications at a time. The manual sorting process is time-consuming and requires expert knowledge of the subject matter. Recent advances in machine learning offer an automated approach to this manual sorting. This poster presents a research project to build and test a classification model to assist librarians in sorting digitized Texas State Publications into groups.
It discusses the labeled dataset that was created to test different machine learning approaches and presents the findings for text-based and image-based classification models. We hope that this poster encourages others in two specific ways: first, to build datasets that highlight specific problems in the library and archives space that can be worked on by students interested in real-world problems; and second, to think about processes at their institutions that might benefit from judicious use of machine learning to complement human decisions in making resources available to users.

Item
Manual alignment of IVS sequences and its implication in multiple sequence alignment
(2011-12) Jiang, Yanan, master of cellular and molecular biology; Gutell, Robin; Miranker, Daniel P.
It is recognized that iterative comparative analysis of large-scale homologous RNAs significantly promotes the understanding of an RNA family. The Gutell lab is renowned for maintaining high-quality RNA sequence alignments and accurately predicted RNA secondary structures using this approach. While currently available alignment and structure data are obtained mainly by trained domain experts through extensive manual effort, automating this process and making it replicable is highly desirable, given the exponentially growing volume of RNA sequence data and the time required for expert training. In this thesis, we study the processes involved in comparative analysis by manually aligning a non-coding RNA family, the IVS sequences, under the supervision of Dr. Gutell. Each process is then simulated with mathematical objective functions and algorithms. We also evaluate the currently available RNA analysis packages that address each of these processes.
Finally, a new RNA sequence alignment algorithm that incorporates structure information and can be extended to different alignment tasks is proposed.

Item
Methodology for prototyping increased levels of automation for spacecraft rendezvous functions
(2009-05-15) Hart, Jeremy Jay
The Crew Exploration Vehicle (CEV) necessitates higher levels of automation than previous NASA vehicles due to program requirements for automation, including Automated Rendezvous and Docking (AR&D). Studies of spacecraft development often point to the locus of decision-making authority between humans and computers (i.e., automation) as a prime driver of cost, safety, and mission success. Therefore, a critical component of CEV development is the determination of the correct level of automation. To identify the appropriate levels of automation and autonomy to design into a human space flight vehicle, NASA has created the Function-specific Level of Autonomy and Automation Tool (FLOAAT). This research develops a methodology for prototyping increased levels of automation for spacecraft rendezvous functions. The methodology was used to evaluate, via prototyping, the accuracy of the FLOAAT-specified levels of automation. Two spacecraft rendezvous planning tasks were selected and prototyped in Matlab using Fuzzy Logic (FL) techniques and existing Shuttle rendezvous trajectory algorithms. The prototyped functions are the determination of the maximum allowable Time-of-IGnition (TIG) slip for a rendezvous phasing burn and the evaluation of vehicle position relative to Transition initiation (Ti) position constraints. The methodology for prototyping rendezvous functions at higher levels of automation is judged to be a promising technique. The results of the prototype indicate that the FLOAAT-recommended level of automation is reasonably accurate and that FL can effectively model the human decision-making used in spacecraft rendezvous.
FL has many desirable attributes for modeling human decision-making, which makes it an excellent candidate for additional spaceflight automation applications. These conclusions are described in detail, along with recommendations for future improvements to the FLOAAT method and the prototyped rendezvous functions.

Item
Out of the Woods: Charting Metadata with Digital Tools
(Texas Digital Library, 2022-05-23) Ramirez, Ada Laura; Smith, Marian; Bowaniya, Salima; Weidner, Andrew
In the fall of 2021, a metadata working group was created and charged with streamlining the process of evaluating and refining metadata for a retrospective theses and dissertations digitization (TDD) project at the University of Houston Libraries. The group approached its task by improving existing workflows and increasing scalability through the introduction of an updated automated tool kit created for the team by another member of the TDD project. Using MARC records as an existing foundation, metadata was transformed into Dublin Core-formatted records with MarcEdit and OpenRefine. Group members then evaluated each Dublin Core metadata record, editing and enhancing the metadata as needed. As part of the workflow, copyright status was also evaluated and noted in the metadata record. The automated tool kit aids in scaling production by allowing for batch metadata verification, file sorting, and writing EXIF data to the PDF files.
This poster highlights the MARC-to-Dublin Core metadata transformation and the use of the automation tool kit to streamline the metadata process, a necessary step in a large-scale digitization project that promotes accessibility to scholarly materials.

Item
Standardization for intelligent detection and autonomous operation of non-structured hardware, and its application on railcar brake release operation
(2015-05) Hammel, Christopher Scott; Tesar, Delbert; Ashok, Pradeepkumar
This thesis introduces a standard framework for evaluating and planning desired autonomous (or semi-autonomous) operations, then applies the framework in detail to the task of automating emergency brake release before railcar decoupling. A significant hurdle to be accounted for is the lack of standardization of much of the hardware of interest in industry. Non-standardized railcar components must be formally structured as fully as possible to improve the reliability of the robotic automation. The brake release task requires either pushing or pulling a “bleed rod” that protrudes from the side of each railcar. The requirements for each step of the evaluation and planning process are laid out in this thesis as an example of how the framework should be applied to future automation tasks.