Browsing by Subject "Functional data"
Item: Fréchet-differentiation of functions of operators with application to functional data analysis (Texas Tech University, 2008-05)
Ji, Xiao Y.; Ruymgaart, Frits; Gilliam, David S.; Allen, Edward J.; Wang, Alex

It is well-known that the sample covariance operator converges in distribution in the Hilbert space of Hilbert-Schmidt operators, and that this result entails the asymptotic distribution of simple eigenvalues and their corresponding eigenvectors. Several estimators and test statistics for the analysis of functional data require the asymptotic distribution of eigenvalues and eigenvectors of certain functions of sample covariance operators, so the asymptotic distribution of such a function of the sample covariance operator is a prerequisite. The main result of this dissertation is the determination of the Fréchet derivative of an analytic function of a bounded operator, tangentially to the space of all bounded operators, together with an ensuing delta-method that solves this problem. In the Hermitian case, moreover, results on the perturbation of an isolated eigenvalue, its eigenprojection, and, when the eigenvalue is simple, its eigenvector are also included. The results are applied to obtain the asymptotic distribution of a test statistic for testing the equality of two covariance operators.

Item: Functional data analysis: classification and regression (Texas A&M University, 2005-11-01)
Lee, Ho-Jin

Functional data refer to data consisting of observed functions or curves evaluated at a finite subset of some interval. In this dissertation, we discuss statistical analysis, especially classification and regression, when data are available in functional form. Due to the nature of functional data, one works in function spaces, and each functional observation is viewed as a realization generated by a random mechanism in those spaces.
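Several abstracts in this listing center on the eigendecomposition of a sample covariance operator. As a minimal illustration only, the following Python sketch discretizes curves on a common grid and eigendecomposes their sample covariance matrix; every name and number here is illustrative and not taken from any of the dissertations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample: n curves observed on a common grid of p points.
n, p = 200, 100
t = np.linspace(0.0, 1.0, p)

# Smooth random curves: random combinations of three sine basis functions.
scores = rng.normal(size=(n, 3)) * np.array([3.0, 2.0, 1.0])
basis = np.stack([np.sin((k + 1) * np.pi * t) for k in range(3)])
X = scores @ basis  # shape (n, p): one discretized curve per row

# Discretized sample covariance operator: a p x p matrix acting on grid values.
Xc = X - X.mean(axis=0)
cov = (Xc.T @ Xc) / n

# Eigendecomposition: eigenvalues (variances) and eigenvectors
# (discretized eigenfunctions), reordered to decreasing eigenvalue.
evals, evecs = np.linalg.eigh(cov)
evals, evecs = evals[::-1], evecs[:, ::-1]

print(evals[:4])  # leading eigenvalues dominate; the rest are near zero here
```

Because the curves in this toy example are exact combinations of three basis functions, the covariance has rank three and only the first three eigenvalues are substantially nonzero.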
The classification procedure in this dissertation is based on dimension-reduction techniques for these spaces. One commonly used method is Functional Principal Component Analysis (Functional PCA), in which an eigendecomposition of the covariance function is used to find the directions of highest variability of the data in the function space. The reduced space spanned by a few eigenfunctions is thought of as a space containing most of the features of the functional data. We also propose a functional regression model for scalar responses. The infinite dimensionality of the predictor space causes many problems; one such problem is that there are infinitely many solutions. We restrict the parameter function to a Sobolev-Hilbert space and employ the so-called ε-insensitive loss function. As a robust function-estimation technique, we present a way to find a function that deviates from the observed values by at most ε and at the same time is as smooth as possible.

Item: Infinite dimensional discrimination and classification (Texas A&M University, 2007-09-17)
Shin, Hyejin

Modern data collection methods now frequently return observations that should be viewed as the result of digitized recording or sampling from stochastic processes, rather than as vectors of finite length. In spite of great demand, only a few classification methodologies for such data have been suggested, and the supporting theory is quite limited. The focus of this dissertation is discrimination and classification in this infinite dimensional setting. The methodology and theory we develop are based on the abstract canonical correlation concept of Eubank and Hsing (2005), and are motivated by the fact that Fisher's discriminant analysis method is intimately tied to canonical correlation analysis.
Specifically, we have developed a theoretical framework for the discrimination and classification of sample paths from stochastic processes through the Loève-Parzen isomorphism, which connects a second-order process to the reproducing kernel Hilbert space generated by its covariance kernel. This approach provides a seamless transition between the finite and infinite dimensional settings and lends itself well to computation via smoothing and regularization. In addition, we have developed a new computational procedure and illustrated it with simulated data and Canadian weather data.

Item: Perturbations of Operators with Application to Testing Equality of Covariance Operators (2011-07)
Kaphle, Krishna; Ruymgaart, Frits; Allen, Linda J. S.; Hadjicostas, Petros

The generalization of multivariate statistical procedures to infinite dimension naturally requires extra theoretical work. In this dissertation, we focus on testing the equality of covariance operators. We derive a procedure from the union-intersection principle in conjunction with a likelihood ratio test. This procedure leads to a statistic which is the largest eigenvalue of a product of operators. We generalize the procedure by using a test statistic based on the first m ∈ ℕ largest eigenvalues. Perturbation theory of operators and the functional calculus of covariance operators are used extensively to derive the required asymptotics. It is shown that the power of the test improves as more eigenvalues are included. We perform simulations to corroborate the testing procedure, using samples from two Gaussian distributions.

Item: Some statistical methods for directly and indirectly observed functional data (Texas Tech University, 2008-08)
Pang, Johnny; Ruymgaart, Frits; Wang, Alex; Paige, Robert

In this dissertation, we are concerned with statistical inference for linear models with functional data.
For the sake of generality, these functional data are considered as sample elements in an abstract infinite dimensional Hilbert space. In the special instance of the one-sample problem, both directly and indirectly observed functions are included. It should be stressed that in the linear model mentioned above each sample element is itself a function, so that we have more information than in cases where the data consist of a number of sampled function values. In Chapter 1, we review some useful properties and formulas for arbitrary random variables and Gaussian random variables in Hilbert spaces. It should be noted that a Gaussian measure is employed as a dominating measure, because no shift-invariant (i.e. Lebesgue) measure exists on an infinite dimensional Hilbert space. In Chapter 2, a linear model in Hilbert space is considered. We borrow notation from the univariate linear model and use matrices to arrive at a convenient notation for linear models in Hilbert spaces. We show that our estimator of the function parameter has approximately a Gaussian distribution for large sample sizes. In Chapter 3, three special cases of the main model introduced in Chapter 2 are considered. First, the simplest version of the one-sample problem in Hilbert spaces is introduced, together with an application of neighborhood hypotheses. Second, the indirect one-sample problem in Hilbert spaces is considered. We exploit a spectral-cut-off type regularized inverse and consider the MISE of the estimator as a means of investigating its quality. In fact, we prove that the estimator is rate-optimal. Finally, the multi-sample problem is briefly considered along the same lines as the direct one-sample problem.
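The spectral-cut-off regularized inverse mentioned in the last abstract can be illustrated, in a heavily simplified finite-dimensional sketch, as truncating the singular value decomposition of a discretized integral operator before inverting it. Everything below (grid size, kernel, noise level, threshold) is an illustrative assumption, not the dissertation's actual setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical indirect observation: y = K f + noise, with K a smoothing
# (integral) operator discretized on a grid, i.e. an ill-posed inverse problem.
p = 100
t = np.linspace(0.0, 1.0, p)
K = np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.01) / p  # discretized kernel
f_true = np.sin(2 * np.pi * t)
y = K @ f_true + rng.normal(scale=1e-4, size=p)

# Naive inversion would amplify the noise through the tiny singular values;
# the spectral cut-off keeps only singular values above a threshold.
U, s, Vt = np.linalg.svd(K)
m = int(np.sum(s > 1e-3))  # cut-off level: a tuning parameter
f_hat = Vt[:m].T @ ((U[:, :m].T @ y) / s[:m])

# Relative reconstruction error of the regularized estimator.
err_cut = np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true)
print(m, err_cut)
```

The cut-off level trades bias (too few singular values kept) against variance (small singular values letting noise blow up); the MISE analysis summarized above is what makes that trade-off precise in the infinite dimensional setting.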