This study investigates the impact of response styles on psychological testing outcomes using mixture Item Response Theory (IRT) models. Synthetic data with diverse response patterns, including dominant and ideal-response styles, are generated to reflect real-world variability. The mixture IRT models successfully identify latent response-style classes: Dominant Responders, Ideal-Response Seekers, and Balanced Responders. Significant differences in test scores are observed between these groups, with Dominant Responders showing inflated scores and Ideal-Response Seekers showing deflated scores. The study highlights the importance of accounting for response styles in psychological testing and their implications for test validity. By disentangling latent response styles from the latent traits that traditional IRT models target, researchers can improve the precision and accuracy of psychological assessments.
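
To make the score-inflation and score-deflation mechanism described above concrete, the following Python sketch simulates dichotomous 2PL item responses for three latent classes whose response style shifts the response probability on the logit scale. All parameter values, class labels, sample sizes, and the use of a simple additive logit shift are illustrative assumptions for this sketch, not the study's actual data-generating model or estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(42)

n_per_class = 500          # respondents per latent class (assumed)
n_items = 20               # test length (assumed)

# 2PL item parameters: discriminations a_j and difficulties b_j
a = rng.uniform(0.8, 2.0, n_items)
b = rng.normal(0.0, 1.0, n_items)

# Class-specific response-style shifts on the logit scale (assumed values):
#   Dominant Responders endorse items more often   -> positive shift (score inflation)
#   Ideal-Response Seekers under-endorse           -> negative shift (score deflation)
#   Balanced Responders answer according to trait  -> no shift
style_shift = {"Dominant": 0.8, "Ideal-Response": -0.8, "Balanced": 0.0}

scores = {}
for label, shift in style_shift.items():
    theta = rng.normal(0.0, 1.0, n_per_class)      # latent trait
    logits = a * (theta[:, None] - b) + shift       # 2PL logit plus style shift
    p = 1.0 / (1.0 + np.exp(-logits))               # item response probabilities
    responses = rng.binomial(1, p)                  # simulated 0/1 responses
    scores[label] = responses.sum(axis=1)           # observed sum scores

for label, s in scores.items():
    print(f"{label:15s} mean sum score = {s.mean():.2f}")
```

In an actual mixture IRT analysis, class memberships and class-specific item parameters would be estimated jointly from the observed responses (for example, via marginal maximum likelihood with an EM algorithm) rather than being fixed in advance as in this simulation.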