Topics in multivariate covariance estimation and time series analysis.
Beeson, John D. (John David)
In this dissertation we discuss two topics relevant to statistical analysis. The first is a new test of linearity for a stationary time series that extends the bootstrap methods of Berg et al. (2010) to the goodness-of-fit (GoF) statistics specified in Harvill (1999) and Jahan and Harvill (2008). Berg's bootstrap method utilizes the statistics specified in Hinich (1982) in the framework of an autoregressive bootstrap procedure; we show that by utilizing GoF methods, we can increase the power of the test.

In Chapter three we discuss an alternative way of approaching the Friedman (1989) regularized discriminant method. Regularized discriminant analysis (RDA) is a well-known method of covariance regularization for the multivariate normal-based discriminant function. RDA generalizes the ideas of linear (LDA), quadratic (QDA), and mean-eigenvalue covariance regularization methods into one framework. The original idea and its known extensions involve cross-validating in potentially high dimensions and can be computationally expensive. We propose using the Kullback-Leibler divergence as an optimization criterion to estimate a linear combination of class covariance structures, which increases the accuracy of the RDA method and limits the use of leave-one-out cross-validation.
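The RDA framework mentioned above blends each class covariance with a pooled estimate and then shrinks toward a multiple of the identity. A minimal sketch of Friedman's (1989) two-parameter regularized covariance (the function name and parameter names `lam`, `gamma` are illustrative, not from the dissertation):

```python
import numpy as np

def rda_covariance(S_k, S_pooled, lam, gamma):
    """Friedman (1989) RDA regularized covariance for class k.

    lam in [0, 1]: blends the class covariance S_k with the pooled
    covariance S_pooled (lam=0 recovers QDA, lam=1 recovers LDA).
    gamma in [0, 1]: shrinks the blended matrix toward a multiple of
    the identity with the same average eigenvalue (trace preserved).
    """
    p = S_k.shape[0]
    S_lam = (1.0 - lam) * S_k + lam * S_pooled
    return (1.0 - gamma) * S_lam + gamma * (np.trace(S_lam) / p) * np.eye(p)
```

At the corner settings this recovers the familiar special cases, which is why RDA is described as generalizing LDA, QDA, and mean-eigenvalue regularization into one framework; the dissertation's contribution is choosing the blending weights via a Kullback-Leibler criterion rather than by cross-validating over this grid.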