Brian Ripley (1987) famously argued that “Simulation is best regarded as mathematical experimentation, and needs all the care and planning that are regarded as a normal part of training in experimental sciences” (p. 1). This paper builds on that foundational idea by introducing the 6 C’s of Simulation Studies as a pedagogical scaffold that guides students through the design, implementation, and interpretation of simulation-based inquiry. The 6 C’s (Conceptualize, Comprehend, Conduct, Calculate, Chart, and Communicate) form a developmental sequence for simulation studies that aligns methodological training with deeper epistemological reflection.
To illustrate the framework, this paper presents a didactic example using a Monte Carlo simulation study of a 3-factor confirmatory factor analysis (CFA) model. The population structure is specified with simple structure (no cross-loadings) and equal loadings within each factor. The simulation manipulates three primary design conditions: (1) loading magnitude (.4, .6, .8), (2) number of indicators per factor (3, 5, 7), and (3) sample size (N = 100, 250, 500). A fully crossed factorial design is used, with 1,000 replications per condition combination. For each cell, the model is fit using maximum likelihood estimation, and key outcomes such as parameter bias, standard error accuracy, model convergence, and RMSEA fit statistics are recorded.
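The data-generating process for one design cell can be sketched as follows. This is a minimal illustration in Python/numpy rather than the R (lavaan) or Mplus workflow the paper describes, and it assumes a standardized metric (unit factor and indicator variances) and uncorrelated factors, since the abstract does not state the population factor correlations:

```python
import numpy as np

def generate_cfa_data(n, loading, p_per_factor, n_factors=3, seed=None):
    """Simulate one sample from a simple-structure CFA population model.

    Standardized metric is assumed: factor and indicator variances are 1,
    so each residual variance is 1 - loading**2. Factors are assumed
    uncorrelated here (an illustrative assumption, not stated in the paper).
    """
    rng = np.random.default_rng(seed)
    p = n_factors * p_per_factor
    # Loading matrix with simple structure: each indicator loads on one factor
    Lambda = np.zeros((p, n_factors))
    for k in range(n_factors):
        Lambda[k * p_per_factor:(k + 1) * p_per_factor, k] = loading
    eta = rng.standard_normal((n, n_factors))               # latent factor scores
    eps = rng.standard_normal((n, p)) * np.sqrt(1 - loading**2)  # residuals
    return eta @ Lambda.T + eps

# One replication for a single cell: loading = .6, 5 indicators per factor, N = 250
y = generate_cfa_data(n=250, loading=0.6, p_per_factor=5, seed=1)
```

Looping this generator over the fully crossed conditions and replications, and fitting each sample with the CFA estimator, yields the replication-level results summarized in the Calculate stage.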
Each stage of the 6 C’s framework is tied to explicit tasks in the CFA simulation:
• Conceptualize: Identify how factor complexity and indicator strength may influence estimation.
• Comprehend: Review theoretical expectations for parameter recovery under varying sample sizes and factor loadings.
• Conduct: Implement data-generating processes and automate CFA model fitting in R using lavaan or in Mplus via automated scripts.
• Calculate: Extract condition-level summary metrics such as percent bias, RMSEA deviation, and convergence rates.
• Chart: Visualize results using boxplots, bias heatmaps, and condition-by-metric graphs to interpret performance tradeoffs.
• Communicate: Draft reports interpreting simulation outcomes and translating methodological insight to applied audiences.
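The Calculate stage's condition-level summaries can be sketched with two small helper functions. This is an illustrative Python sketch (the helper names are hypothetical, not from the paper), showing percent bias and a common standard-error calibration check, the ratio of the mean estimated SE to the empirical SD of the estimates:

```python
import numpy as np

def percent_bias(estimates, true_value):
    """Percent bias of a parameter across converged replications."""
    return 100 * (np.mean(estimates) - true_value) / true_value

def se_ratio(estimates, se_estimates):
    """Mean estimated SE divided by the empirical SD of the estimates.

    Values near 1 indicate well-calibrated standard errors; values below 1
    suggest the estimated SEs understate sampling variability.
    """
    return np.mean(se_estimates) / np.std(estimates, ddof=1)

# Hypothetical replication-level results for one cell (true loading = .6)
rng = np.random.default_rng(0)
loading_hats = rng.normal(0.6, 0.05, size=1000)
se_hats = rng.normal(0.05, 0.005, size=1000)
cell_summary = {
    "pct_bias": percent_bias(loading_hats, 0.6),
    "se_ratio": se_ratio(loading_hats, se_hats),
}
```

Computing these metrics per design cell produces the condition-by-metric tables that feed the Chart and Communicate stages.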
This example illustrates simulation not as a mechanical process but as a methodological reasoning task. The 6 C’s framework encourages students to develop metacognitive skills about model behavior, assumption testing, and result interpretation. Moreover, the reproducibility of this simulation design, facilitated through script-based workflows and version control, demonstrates how simulation pedagogy can reinforce broader values of transparency, replicability, and computational rigor.
By treating simulation studies as structured intellectual arguments rather than technical chores, the 6 C’s framework helps students develop the tools to reason about statistical methods and their limitations. This paper contributes a replicable model for embedding simulation into quantitative methods education, grounded in computational epistemology and reflective inquiry.
References
Ripley, B. D. (1987). Stochastic simulation. Wiley.