Applied researchers often encounter sparse data when few respondents endorse certain item categories, and collapsing categories is a common strategy for addressing this issue. This study simulated data from a confirmatory factor analysis (CFA) model under sparse-data conditions and then recoded the responses to collapse sparse categories. The estimators evaluated were MLR, MLMV, WLSMV, and Bayesian estimation. Collapsing categories generally improved model fit under correctly specified models across all estimators; in misspecified models, however, collapsing increased the risk of selecting an incorrect model. Bayesian estimation showed the highest rejection rates under misspecification, while WLSMV had the lowest rejection rates in correctly specified conditions. These findings offer practical guidance for handling sparse data in CFA.
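As a rough illustration of the recoding step described above, the sketch below collapses sparsely endorsed ordinal categories into adjacent ones before a CFA would be fit. The 5-point scale, the minimum-count threshold of 10, and the merge-into-adjacent-category rule are illustrative assumptions and are not taken from the study itself.

```python
# Minimal sketch of collapsing sparse ordinal categories prior to CFA.
# The scale, threshold, and merge rule are illustrative assumptions only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Simulated 5-point item with a sparsely endorsed top category.
item = pd.Series(
    rng.choice([1, 2, 3, 4, 5], size=300, p=[0.30, 0.30, 0.25, 0.13, 0.02]),
    name="item1",
)

def collapse_sparse(responses: pd.Series, min_count: int = 10) -> pd.Series:
    """Merge categories endorsed fewer than `min_count` times into the
    nearest adjacent category, working inward from the scale endpoints."""
    recoded = responses.copy()
    # Collapse sparse categories from the top of the scale downward.
    for cat in sorted(recoded.unique(), reverse=True):
        if recoded.value_counts().get(cat, 0) < min_count:
            lower = [c for c in sorted(recoded.unique()) if c < cat]
            if lower:
                recoded = recoded.replace(cat, max(lower))
    # Then collapse sparse categories from the bottom of the scale upward.
    for cat in sorted(recoded.unique()):
        if recoded.value_counts().get(cat, 0) < min_count:
            higher = [c for c in sorted(recoded.unique()) if c > cat]
            if higher:
                recoded = recoded.replace(cat, min(higher))
    return recoded

print(item.value_counts().sort_index())                    # before collapsing
print(collapse_sparse(item).value_counts().sort_index())   # after collapsing
```

In this sketch, a category with too few endorsements is absorbed into its nearest neighbor, reducing the number of response categories while preserving the ordinal structure; the recoded items would then be passed to whichever estimator (e.g., MLR, WLSMV) is being evaluated.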