Browsing by Subject "MIMIC"
Item: Testing Measurement Invariance Using MIMIC: Likelihood Ratio Test and Modification Indices with a Critical Value Adjustment
Kim, Eun Sook (2012-10-19)

Multiple-indicators multiple-causes (MIMIC) modeling is often employed for measurement invariance testing under the structural equation modeling framework. This Monte Carlo study explored the behavior of MIMIC as a measurement invariance testing method across different research situations. First, the performance of MIMIC under factor loading noninvariance was investigated through model fit evaluations and likelihood ratio tests. The study demonstrated that violations of factor loading invariance were not detected by any of the typically reported model fit indices. Consistently, likelihood ratio tests for MIMIC models performed poorly in identifying noninvariance in factor loadings. That is, MIMIC was insensitive to the presence of factor loading noninvariance, which implies that factor loading invariance should be examined with other measurement invariance testing techniques. To control Type I error inflation in detecting noninvariance of intercepts or thresholds, this simulation study, which included both continuous and categorical variables, employed the likelihood ratio test with two critical value adjustment strategies: the Oort adjustment and the Bonferroni correction. The simulation results showed that the likelihood ratio test with the Oort adjustment not only kept Type I error rates below the baseline Type I error rates but also maintained high power across study conditions. However, power to detect the noninvariant variables was slightly attenuated when a model contained multiple (i.e., two) noninvariant variables.
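The familywise error control described above can be illustrated with a small sketch. This is not the study's code: the number of per-variable tests (6), the alpha level, and the independence of the simulated chi-square(1) difference statistics are all illustrative assumptions, and only the Bonferroni correction is shown (the Oort adjustment depends on model-specific quantities described in the dissertation).

```python
import numpy as np
from scipy.stats import chi2

def bonferroni_critical_value(alpha, n_tests, df=1):
    """Chi-square critical value at a Bonferroni-corrected alpha level."""
    return chi2.ppf(1 - alpha / n_tests, df)

# Unadjusted vs. Bonferroni-adjusted critical values for 6 per-variable tests
plain = chi2.ppf(1 - 0.05, 1)                   # about 3.84
adjusted = bonferroni_critical_value(0.05, 6)   # larger, guards familywise error

# Empirical familywise Type I error under the null: each replication yields
# 6 independent chi-square(1) difference statistics, and the family "rejects"
# if any statistic exceeds the critical value
rng = np.random.default_rng(0)
stats = rng.chisquare(df=1, size=(10_000, 6))
fwer_plain = np.mean((stats > plain).any(axis=1))     # inflated, near 0.26
fwer_adjusted = np.mean((stats > adjusted).any(axis=1))  # held near 0.05
```

With six tests at an unadjusted 0.05 level, the familywise error rate climbs toward 1 - 0.95^6, which is the inflation the adjustment strategies in the study are meant to control.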
Given that the modification index is the chi-square difference after relaxing one parameter for estimation, the study investigated modification indices under four research scenarios based on combinations of modification index cutoffs and model modification procedures: (a) the noniterative method (i.e., modification indices at the initial stage of model modification) with the conventional critical value, (b) the noniterative method with the Oort-adjusted critical value, (c) the iterative model modification procedure with the conventional critical value, and (d) the iterative procedure with the Oort adjustment. The iterative model search procedure using modification indices detected noninvariant variables well even without a critical value adjustment, which indicates that an iterative specification search does not require a critical value adjustment to identify noninvariance correctly. When the noniterative procedure was used, on the other hand, the Oort adjustment yielded adequate results.

Item: The Effects of Parceling on Testing Group Differences in Second-Order CFA Models: A Comparison between Multi-Group CFA and MIMIC Models
Zou, Yuanyuan (2010-10-12)

The use of multi-group confirmatory factor analysis (MCFA) and multiple-indicators multiple-causes (MIMIC) modeling to investigate group differences in second-order factor models, with either unparceled or parceled data, had never been thoroughly examined. The present study investigated (1) the difference between MCFA and MIMIC in Type I error rate and power when testing the mean difference of the higher-order latent factor (delta kappa) in a second-order confirmatory factor analysis (CFA) model, and (2) the impact of data parceling on the test of delta kappa between groups using the two approaches.
The methods were introduced, including the design of the models, the design of the Monte Carlo simulation, the calculation of empirical Type I error and empirical power, the two parceling strategies, and the adjustment of the random error variance. The results suggested that MCFA should be favored when different group sizes were paired with different generalized variances, and that MIMIC should be favored when the groups were balanced (i.e., had equal group sizes) in social science and education disciplines. The study also provided evidence that parceling could improve power for both MCFA and MIMIC when the factor loadings were low, without biasing the solution when the first-order factors were collapsed. However, parceling strategies might not be necessary when the factor loadings were high. The results also indicated that the two approaches were equally favored when the domain-representative parceling strategy was applied.
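As a rough illustration of why parceling can raise power when factor loadings are low, the sketch below uses a hypothetical setup rather than the study's design: one common factor, nine items with loadings of 0.4 and unit error variance, and items averaged in consecutive triplets (one common parceling scheme; the study's exact parcel assignments are described in the dissertation). Averaging items pools their errors, so a parcel correlates more strongly with the latent factor than any single item does.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_items = 500, 9

# One common factor, low loadings (the condition under which parceling helped)
factor = rng.normal(size=(n, 1))
loadings = np.full((1, n_items), 0.4)
items = factor @ loadings + rng.normal(size=(n, n_items))

# Parceling: average items 1-3, 4-6, and 7-9 into three parcels
parcels = items.reshape(n, 3, 3).mean(axis=2)

# A parcel is a more reliable indicator of the factor than a single item
item_r = np.corrcoef(items[:, 0], factor[:, 0])[0, 1]
parcel_r = np.corrcoef(parcels[:, 0], factor[:, 0])[0, 1]
```

With a loading of 0.4 and unit error variance, a single item's population correlation with the factor is about 0.37, while a three-item parcel's is about 0.57, since the parcel's error variance shrinks to one third. This higher indicator reliability is the mechanism behind the power gain reported for the low-loading conditions.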