Session Submission Summary

What Would It Take to Change Your Inference? Informing Collaborative Evidence Generation and Evaluation

Friday, November 14, 1:45 to 3:15 pm, Hyatt Regency Seattle, 5th Floor, Room 502 (Cowlitz)

Session Submission Type: Workshop

Abstract

The workshop informs discussion about the strength of evidence supporting a policy or practice by quantifying the conditions necessary to change an inference. In Part I, we use the potential outcomes and counterfactual framework to interpret how much bias there must be to nullify an inference. This supports statements such as “To nullify the inference, __% of the data would have to be replaced with counterfactual cases for which there was no effect.” In Part II, we quantify the robustness of causal inferences in terms of correlations associated with unobserved variables. This supports statements such as “To nullify the inference, an omitted variable would have to be correlated at __ with both the focal predictor and the outcome.” By leveraging intuition based on potential outcomes and correlations with omitted variables, these techniques inform collaborative evidence generation and evaluation among a broad set of stakeholders. Furthermore, because the techniques are functions of quantities commonly reported in published studies (e.g., the estimated effect and its standard error), they can be broadly applied and extended (e.g., to logistic regression, hazard models, and propensity-score-based designs). The techniques have already been widely employed in policy, education, sociology, epidemiology, and management, as well as in general science (e.g., Proceedings of the National Academy of Sciences).
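The first statement can be made concrete with a short calculation. Below is a minimal Python sketch of the replacement-of-cases logic; the estimated effect, standard error, and sample size are illustrative assumptions, not from any particular study, and a large-sample critical value of 1.96 stands in for the exact t quantile:

```python
# Sketch: % bias to invalidate an inference, replacement-of-cases framing.
# Illustrative inputs (assumptions, not from any study):
est_eff = 0.5   # estimated effect
std_err = 0.2   # standard error of the estimate
n_obs = 100     # sample size

t_crit = 1.96   # approximate two-tailed critical value (alpha = .05)

# Smallest estimate that would still be statistically significant:
threshold = t_crit * std_err

# Fraction of the estimate that could be pure bias before the
# inference is nullified:
pct_bias = 1 - threshold / est_eff

# Under the assumption that replacement cases carry zero effect,
# this maps to a number of data points to replace:
n_replace = round(pct_bias * n_obs)

print(f"{pct_bias:.1%} of the data would have to be replaced "
      f"({n_replace} of {n_obs} cases).")
```

The konfound modules for Stata and R perform this calculation with exact critical values, alongside the correlation-based analysis.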


We introduce participants to the calculations using the konfound modules in Stata and R, as well as the web app (https://konfound-project.shinyapps.io/konfound-it/). There will be a mixture of presentation and individual hands-on exploration. Participants will also have an opportunity to compare with alternative techniques (e.g., Oster’s coefficient of proportionality; Cinelli and Hazlett’s robustness value). This workshop is intended for anyone familiar with the general linear model, but it also informs more sophisticated techniques (e.g., propensity scores, logistic regression, multilevel models).


 


WORKSHOP OUTLINE


I. Just the Basics [5 mins]


II. Motivation: pragmatic social science to inform policy or practice [5 mins]


III. Introduction to Replacement Cases Framework [20 mins]


· Thresholds for inference and % bias to invalidate an inference


· The counterfactual paradigm


· Application to concerns about non-random assignment to treatments


· Application to concerns about a non-random sample


IV. Reflection and Hands-on Exercise I [15 mins]


V. Presentation: Introduction to Correlational Framework for Sensitivity Analysis [25 mins]


· Defining the impact of a confounding variable


· “Absorption” of Control Variables
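The impact of a confounding variable, and the threshold it must exceed to nullify an inference, can be sketched from the same summary statistics. A minimal Python sketch of the correlational formulation; the t statistic, degrees of freedom, and critical value below are illustrative assumptions:

```python
import math

# Sketch: impact threshold for a confounding variable (ITCV).
# Illustrative inputs (assumptions, not from any study):
t_stat = 2.5    # t statistic of the focal predictor
df = 98         # residual degrees of freedom
t_crit = 1.96   # approximate two-tailed critical value (alpha = .05)

# Convert t statistics to correlations:
r_xy = t_stat / math.sqrt(t_stat**2 + df)    # observed predictor-outcome r
r_thr = t_crit / math.sqrt(t_crit**2 + df)   # r at the significance threshold

# Impact threshold: the product r(cv, x) * r(cv, y) an omitted variable
# would need to nullify the inference (positive-estimate case):
itcv = (r_xy - r_thr) / (1 - r_thr)

# If the omitted variable is correlated equally with the predictor
# and the outcome, each correlation must reach:
r_each = math.sqrt(itcv)

print(f"ITCV = {itcv:.3f}; each correlation about {r_each:.2f}")
```

This supports statements of the form “to nullify the inference, an omitted variable would have to be correlated at about 0.25 with both the focal predictor and the outcome.”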


VI. Reflection and Hands-on Exercise II [15 mins]


VII. Conclusion [5 mins]


REFERENCES


Frank, K. A., Lin, Q., Xu, R., Maroulis, S. J., & Mueller, A. (2023). Quantifying the robustness of causal inferences: Sensitivity analysis for pragmatic social science. Social Science Research, 110, 102815.


Xu, R., Frank, K. A., Maroulis, S. J., & Rosenberg, J. M. (2019). konfound: Command to quantify robustness of causal inferences. The Stata Journal, 19(3), 523-550.


Frank, K. A., Maroulis, S., Duong, M., & Kelcey, B. (2013). What would it take to change an inference? Using Rubin’s causal model to interpret the robustness of causal inferences. Educational Evaluation and Policy Analysis, 35(4), 437-460.


 
