Browsing by Subject "Particle filter"
Now showing 1 - 5 of 5
Item: Analysis of circular data in the dynamic model and mixture of von Mises distributions (2013-05)
Lan, Tian, active 2013; Carvalho, Carlos Marinho, 1978-

Analysis of circular data has become increasingly popular in many fields of study. In this report, I present two statistical analyses of circular data using von Mises distributions. First, the expectation-maximization algorithm is reviewed and used to classify and estimate circular data drawn from a mixture of von Mises distributions. Second, the forward filtering backward smoothing method via particle filtering is reviewed and implemented for circular data arising in dynamic state-space models.

Item: Bayesian inference methods for next generation DNA sequencing (2014-08)
Shen, Xiaohu, active 21st century; Vikalo, Haris

Recently developed next-generation sequencing systems are capable of rapid and cost-effective DNA sequencing, enabling routine sequencing tasks and taking us one step closer to personalized medicine. To provide a blueprint of a target genome, next-generation sequencing systems typically employ the so-called shotgun sequencing strategy and oversample the genome with a library of relatively short overlapping reads. The order of nucleotides in the short reads is determined by processing the noisy signals generated by the sequencing platforms, and the overlaps between reads are exploited to assemble the long target genome. Next-generation sequencing uses massively parallel array-based technology to speed up sequencing and reduce its cost. However, the accuracy and lengths of the short reads have yet to surpass those of the conventional, slower and costlier Sanger sequencing method. In this thesis, we first focus on Illumina's sequencing-by-synthesis platform, which relies on reversible terminator chemistry, and describe the acquired signal by a hidden Markov model.
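The hidden Markov model view of the acquired signal can be illustrated with a toy discrete-state forward filter. This is a minimal sketch with made-up states, transition matrix, and emission probabilities, not the thesis's actual sequencing model:

```python
import numpy as np

def forward_filter(obs, pi, A, B):
    """Normalized HMM forward filter.
    obs: sequence of observation indices; pi: initial state probabilities;
    A[i, j]: P(next state j | state i); B[i, k]: P(observation k | state i)."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()                # normalize to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # predict, then weight by likelihood
        alpha /= alpha.sum()
    return alpha                        # filtered distribution over states

# Illustrative example: two hidden states, three observation symbols.
pi = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])
alpha = forward_filter([0, 1, 2, 2], pi, A, B)
```

The filtered distribution `alpha` is what a base caller would threshold or marginalize to decide the most likely state after each observation.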
Relying on this model and sequential Monte Carlo methods, we develop a parameter estimation and base-calling scheme called ParticleCall. ParticleCall is tested on an experimental data set obtained by sequencing the phiX174 bacteriophage with Illumina's Genome Analyzer II. The results show that the ParticleCall scheme is significantly more computationally efficient than the best-performing unsupervised base-calling method currently available, while achieving the same accuracy. Having addressed base calling of short reads, we turn our attention to genome assembly. Assembling a genome from acquired short reads is a computationally daunting task even when a reference genome exists. Errors and gaps in the reference, and perfect repeat regions in the target, make the assembly still more challenging and cause inaccuracies. We formulate reference-guided assembly as an inference problem on a bipartite graph and solve it with a message-passing algorithm. The proposed algorithm can be interpreted as the classical belief propagation scheme under a certain prior. Unlike existing state-of-the-art methods, it combines the information provided by the reads without needing to know their reliability (the so-called quality scores). The relation of the message-passing algorithm to a provably convergent power iteration scheme is discussed. Results on both simulated and experimental data demonstrate that the proposed message-passing algorithm outperforms commonly used state-of-the-art tools and nearly achieves the performance of a genie-aided maximum a posteriori (MAP) scheme. We then consider the reference-free genome assembly problem, i.e., de novo assembly. Various methods for de novo assembly have been proposed in the literature, all of which are very sensitive to errors in the short reads. We develop a novel error-correction method that improves the performance of de novo assembly.
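The power iteration connection mentioned above can be illustrated in isolation. The following is a generic sketch on an arbitrary symmetric matrix, not the thesis's read-data formulation:

```python
import numpy as np

def power_iteration(M, iters=200):
    """Converges to the dominant eigenvector of M (up to sign),
    provided the dominant eigenvalue is strictly larger in magnitude."""
    v = np.ones(M.shape[0]) / np.sqrt(M.shape[0])
    for _ in range(iters):
        v = M @ v
        v /= np.linalg.norm(v)   # renormalize each step to prevent overflow
    return v

# Toy symmetric matrix; v @ M @ v then approximates the largest eigenvalue.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
v = power_iteration(M)
```

The appeal of casting message passing in this form is that power iteration comes with a standard convergence guarantee governed by the eigenvalue gap.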
The new method relies on a suffix array structure built on the short-read data. It incorporates a hypothesis testing procedure that uses the sum of quality information as the test statistic to improve the accuracy of overlap detection. Finally, we consider an inference problem in gene regulatory networks. Gene regulatory networks are highly complex dynamical systems comprising biomolecular components that interact with each other and, through those interactions, determine gene expression levels, i.e., the rate of gene transcription. In this thesis, a particle filter with a Markov chain Monte Carlo move step is employed to estimate reaction rate constants in gene regulatory networks modeled by chemical Langevin equations. Simulation studies demonstrate that the proposed technique outperforms previously considered methods while being computationally more efficient. The dynamic behavior of gene regulatory networks averaged over a large number of cells can be modeled by ordinary differential equations. For this scenario, we compute an approximation to the Cramér-Rao lower bound on the mean-square error of estimating reaction rates and demonstrate that, when the number of unknown parameters is small, the proposed particle filter can be nearly optimal. In summary, this thesis presents a set of Bayesian inference methods for base calling and sequence assembly in next-generation DNA sequencing. Experimental studies show the advantage of the proposed algorithms over traditional methods.

Item: Cyber-enabled manufacturing systems (CeMS) : model-based estimation and control of a solidification process (2014-12)
Lopez, Luis Felipe, active 21st century; Beaman, Joseph J.; Williamson, Rodney L.

Vacuum arc remelting (VAR) is a secondary melting process used to produce a variety of segregation-sensitive and reactive metal alloys. Present-day VAR practice for superalloys typically involves melting electrodes of 17'' diameter into ingots of 20'' diameter.
Even larger diameter forging stock is desirable. However, beyond 20'', superalloy ingots are increasingly prone to segregation defects if solidification is not adequately controlled. In recent years, a new generation of model-based controllers was developed to prevent segregation in VAR by controlling the melt rate, or the total amount of power flowing into the liquid pool. These controllers were significant advances for the remelting industry, but they still focused on the melting sub-process and ignored ingot solidification. Accurate control of the liquid pool profile is expected to yield segregation-free ingots, but a controller capable of stabilizing the solidification front in VAR is not yet available. The goal of the proposed research is to develop a cyber-enabled controller for VAR pool-depth control that enhances the capabilities of current technologies. More specifically, the objectives of this research are threefold. First, a control-friendly model is proposed, based on a high-fidelity ingot solidification model and coupled to a thermal model of electrode melting. Second, sequential Monte Carlo estimators are proposed to replace the traditional Kalman filter used in previous VAR controllers. Finally, a model predictive controller (MPC) is designed based on the proposed reduced-order model. The time-critical characteristics of these methods are studied, and the feasibility of their real-time implementation is reported.

Item: Energy storage-aware prediction/control for mobile systems with unstructured loads (2013-08)
LeSage, Jonathan Robert, 1985-; Longoria, Raul G.

Mobile systems, such as ground robots and electric vehicles, inherently operate in stochastic environments where load demands are largely unknown. Onboard energy storage, most commonly an electrochemical battery system, can significantly constrain operation.
As such, mission planning and control of mobile systems can benefit from a priori knowledge of battery dynamics and constraints, especially the rate-capacity and recovery effects. To help overcome the overly conservative predictions common to most existing battery remaining-run-time algorithms, a prediction scheme is proposed. To characterize a priori unknown power loads, an unsupervised Gaussian mixture routine identifies and clusters the measured power loads, and a jump-Markov chain characterizes the load transients. With the jump-Markov load forecasts, a model-based particle filter scheme predicts the battery's remaining run-time. Monte Carlo simulation studies demonstrate the marked improvement of the proposed technique. The increase in computational complexity from using a particle filter was found to be justified for power-load transient jumps greater than 13.4% of total system power. A multivariable reliability method was developed to assess the feasibility of a planned mission. The probability of mission completion is computed as the reliability integral of mission time exceeding the battery run-time. Because these random variables are inherently dependent, a bivariate characterization was necessary, and a method is presented for online estimation of the process correlation via Bayesian updating. Finally, to prevent transient shutdown of mobile systems, a model predictive control scheme is proposed that enforces battery terminal-voltage constraints under stochastic loading conditions. A Monte Carlo simulation study of a small ground vehicle showed significant improvements in both travel time and distance traveled. To evaluate the proposed methodologies, a laboratory terrain environment was designed and constructed for repeated mobile-system discharge studies. The test environment consists of three distinct terrains.
For each discharge study, a small unmanned ground vehicle traversed the stochastic terrain environment until battery exhaustion. Results from field tests with a Packbot ground vehicle in generic desert terrain were also used. Evaluation of the proposed prediction algorithms on these experimental studies, via relative-accuracy and α-λ prognostic metrics, indicated significant gains over existing methods.

Item: Suitability of FPGA-based computing for cyber-physical systems (2009-12)
Lauzon, Thomas Charles; Chiou, Derek; Mok, Aloysius

Cyber-physical systems theory is a new concept poised to revolutionize the way computers interact with the physical world, by integrating physical knowledge into computing systems and tailoring those systems to be more compatible with the way processes happen in the physical world. In this master's thesis, field-programmable gate arrays (FPGAs) are studied as a technological asset that may contribute to enabling the cyber-physical paradigm. As an example application that may benefit from cyber-physical system support, the electro-slag remelting process - a process for remelting metals into better alloys - was chosen for the maturity of its physical models and controller designs. In particular, the particle filter that estimates the state of the process is studied as a candidate for FPGA-based computing enhancements. Through the designs and experiments carried out in this study, the FPGA reveals itself as a serious contender, in comparison with CPUs, in the arsenal of computing means for cyber-physical systems, due to its capacity to mimic the ubiquitous parallelism of physical processes.
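The particle filter that recurs throughout these items can be sketched generically. The following bootstrap filter for a scalar linear-Gaussian state-space model is an illustrative example only, with arbitrary parameters, not any one thesis's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(ys, n=1000, a=0.9, q=0.5, r=0.5):
    """Bootstrap particle filter for x_t = a*x_{t-1} + N(0, q^2),
    observed as y_t = x_t + N(0, r^2). Returns the filtered means."""
    particles = rng.normal(0.0, 1.0, n)     # initial particle cloud
    estimates = []
    for y in ys:
        particles = a * particles + rng.normal(0.0, q, n)  # propagate
        w = np.exp(-0.5 * ((y - particles) / r) ** 2)      # likelihood weights
        w /= w.sum()
        idx = rng.choice(n, size=n, p=w)                   # multinomial resampling
        particles = particles[idx]
        estimates.append(particles.mean())                 # filtered state estimate
    return np.array(estimates)

# Usage: filter a slowly varying observation sequence.
ys = np.sin(np.linspace(0.0, 4.0, 30))
est = particle_filter(ys)
```

The propagate/weight/resample loop is embarrassingly parallel across particles, which is exactly the structure the FPGA work above seeks to exploit.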