Browsing by Subject "Longitudinal method"
Item: A comparison of latent growth models for constructs measured by multiple indicators (2005)
Leite, Walter Lana; Stapleton, Laura M.

Latent growth modeling (LGM) of composites of multiple items (for example, means or sums of items) has frequently been used to analyze the growth of latent constructs. However, composites are equivalent to latent constructs only if the items' factor loadings are equal to one and there is no measurement error (Bollen & Lennox, 1991). In this study, the adequacy of using univariate LGM to model composites of multiple items, as well as three alternative methods, was evaluated through a Monte Carlo simulation study. The four methods evaluated were the univariate LGM, the univariate LGM with fixed error variances, the univariate LGM with the correction for attenuation, and the curve-of-factors model (McArdle, 1988; Tisak & Meredith, 1990). The simulation manipulated the number of items per construct, the number of measurement times, the sample size, the reliability of the composites, the invariance of item parameters, and whether the items were essentially tau-equivalent or essentially congeneric. One thousand datasets were simulated for each condition. The results indicate that univariate LGM applied to composites of multiple items produces unbiased parameter estimates and standard errors only if the items are essentially tau-equivalent. The univariate LGM with fixed error variances performed identically to the univariate LGM. The univariate LGM with the correction for attenuation produced unbiased parameter estimates when the items were essentially tau-equivalent, but produced negatively biased estimates of standard errors. The curve-of-factors model was found to be the most appropriate method for analyzing the growth of latent constructs measured by multiple items: it provided unbiased parameter estimates and standard errors under all conditions evaluated in this study. However, with sample sizes of 100 or 200, a large percentage of chi-square statistics were positively biased and the fit indices indicated inadequate model fit. This study's recommendation is that the curve-of-factors model be preferred for analyzing the growth of latent variables measured by multiple items, but sample sizes larger than 200 are strongly recommended to help ensure that adequate fit statistics and fit indices are obtained for appropriate models.

Item: The performance of missing data treatments for longitudinal data with a time-varying covariate (2005)
Adachi, Eishi; Pituch, Keenan A.

Item: Rank regression in longitudinal data analysis (Texas Tech University, 2001-05)
Barefield, Eric W.

An important problem that arises frequently in medical research is the analysis of data from a repeated measures design. One of the distinctions of this design is the dependencies among the data. Generally, there are a number of subjects that are assumed to be independent, and several measurements are taken on each subject, usually corresponding to different time points. These are the repeated measurements, and they are generally correlated. Longitudinal data models arise when a vector of observations is taken for each subject at each time point. These situations have been studied extensively in the literature; the parametric approach is described in several papers and summarized nicely by Diggle, Liang, and Zeger (1994). One of the unique features of repeated measures data is the correlation structure. This is generally not the focus of the study, but the more accurately the correlation structure is specified, the better the conclusions will be. In this setting, we typically test linear hypotheses on the slope parameters or on the correlation parameters. Diggle, Liang, and Zeger describe several common assumptions for the covariance structure. The simplest is called the uniform correlation model.
This model corresponds to the assumption that all the observations have equal variances and equal correlations. When the observations within a subject are assumed to be exchangeable, the uniform correlation model is implied. Another common assumption is the exponential correlation model, which allows for the more reasonable assumption that correlation decreases as the time between observations increases. These first two assumptions specify the covariance structure with only a few parameters to estimate. A third option is the unstructured covariance model, which allows a separate parameter for each element of the covariance matrix. While this makes us more confident that the covariance structure is properly modeled, the additional parameters have several drawbacks, including increased standard errors.

With repeated measures data, ordinary least squares is not preferred. To take the correlation structure into account, weighted least squares is used. Weighted least squares is similar to ordinary least squares except that the quantity it minimizes involves a weight matrix; if the weight matrix is taken to be the identity matrix, weighted least squares reduces to ordinary least squares. The most efficient choice of weight matrix is the inverse of the covariance matrix. A problem with weighted least squares is that when interval estimates are desired, the variance parameters must be estimated, and the method of least squares does not do an adequate job of this. Maximum likelihood estimation (MLE) of the slope parameters coincides with least squares under Gaussian assumptions, and an estimate of the variance can be obtained, though it is biased. The method of restricted MLE solves the problem of biased estimates by first linearly transforming the data to remove the slope parameters. These methods generally work well when the data are "nice."
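The covariance structures and the weighted least squares estimator described above can be sketched in a few lines of Python. This is an illustrative sketch only; the function names and the numerical values at the bottom are assumptions for demonstration, not taken from the thesis.

```python
import numpy as np

def uniform_cov(n_times, sigma2, rho):
    """Uniform (compound-symmetry) covariance: equal variances and
    equal correlation rho between every pair of time points."""
    R = np.full((n_times, n_times), rho)
    np.fill_diagonal(R, 1.0)
    return sigma2 * R

def exponential_cov(times, sigma2, rho):
    """Exponential correlation: corr(t_j, t_k) = rho**|t_j - t_k|,
    so correlation decays as the gap between observations grows."""
    times = np.asarray(times, dtype=float)
    lags = np.abs(times[:, None] - times[None, :])
    return sigma2 * rho ** lags

def gls(X, y, V):
    """Weighted least squares with weight matrix W = V^{-1};
    with V = I this reduces to ordinary least squares."""
    W = np.linalg.inv(V)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Hypothetical single-subject illustration: intercept-plus-slope design
times = np.array([0.0, 1.0, 2.0, 3.0])
V = exponential_cov(times, sigma2=2.0, rho=0.6)
X = np.column_stack([np.ones_like(times), times])
y = np.array([1.0, 2.1, 2.9, 4.2])
beta = gls(X, y, V)  # efficient slope estimate under the assumed V
```

Note that the uniform and exponential models each need only two parameters (a variance and a correlation), whereas the unstructured model would require a parameter for every distinct element of V.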
However, when outlying and influential observations are present, these methods are no longer valid, because they put too much weight on those observations. Thus, other methods are desired that can resist the effects of outlying observations.
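One standard rank-based alternative of the kind this abstract motivates is to minimize Jaeckel's dispersion of the residuals under Wilcoxon scores, which bounds the influence of outlying responses. The sketch below is a generic illustration of that criterion, using a crude one-parameter grid search rather than a proper optimizer, and is not the thesis's own method; the data at the bottom are hypothetical.

```python
import numpy as np

def wilcoxon_scores(n):
    """Wilcoxon score function a(i) = sqrt(12) * (i/(n+1) - 1/2)."""
    i = np.arange(1, n + 1)
    return np.sqrt(12.0) * (i / (n + 1.0) - 0.5)

def jaeckel_dispersion(beta, x, y):
    """Jaeckel's rank-based dispersion of the residuals y - beta*x.
    Residuals enter through bounded scores of their ranks, so a
    gross outlier cannot dominate the criterion the way it
    dominates a sum of squares."""
    e = y - beta * x
    ranks = np.argsort(np.argsort(e)) + 1  # ranks of residuals, 1..n
    a = wilcoxon_scores(len(e))
    return np.sum(a[ranks - 1] * e)

def rank_slope(x, y, grid):
    """Slope estimate: the grid value minimizing Jaeckel's dispersion
    (a stand-in for the derivative-free optimizers used in practice)."""
    d = np.array([jaeckel_dispersion(b, x, y) for b in grid])
    return grid[np.argmin(d)]

# Hypothetical data: true slope 2, with one gross outlier in y
x = np.arange(10, dtype=float)
noise = np.array([0.1, -0.2, 0.0, 0.1, -0.1, 0.2, 0.0, -0.1, 0.1, 25.0])
y = 2.0 * x + noise
beta_hat = rank_slope(x, y, np.linspace(0.0, 4.0, 401))
```

Here the single corrupted observation pulls the least squares slope well above 2, while the rank-based estimate stays close to the true value, which is exactly the resistance to outlying observations that motivates these methods.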