Individual Submission Summary

Cultural Explanations and Policy Effects: What can we learn from PISA?

Tue, March 10, 9:45 to 11:15am, Washington Hilton, Floor: Lobby Level, Jay

Abstract

Large-scale international standardized tests have led increasing numbers of policy makers to compare the achievement of children in other countries with that of children in their own (Sahlberg, 2011; OECD, 2011a; McKinsey & Company, 2010). Politicians and policy makers have often cited their countries' relatively poor standing in such international test-score comparisons as evidence to justify domestic reform agendas (Carnoy & Rothstein, 2013; Sellar & Lingard, 2013). Such logic rests on "PISA reasoning," which treats superior test performance as testimony to superior policies (Feniger & Lefstein, 2014, p. 1). To problematize and challenge PISA reasoning with the cultural influence hypothesis, the view that cultural background rather than schooling policy drives achievement, Feniger and Lefstein (2014) compared immigrant students of Chinese origin in Australia and New Zealand with non-Chinese students in those countries. In broad strokes, their result supports the cultural influence hypothesis and challenges the policy assumption behind PISA reasoning, as cultural background appears to explain more of the variation in student achievement.

This extension paper complements Feniger and Lefstein's (2014) analysis by further problematizing the dichotomy between culture and policy that they propose. In a nutshell, immigration by nature does not happen randomly. Based on this premise, this paper seeks to answer two questions. First, immigration may occur for many reasons, and selection operates at many levels, so it is important to unpack immigrants' schooling decisions and their achievement outcomes. Can cultural roots alone explain variation in schooling outcomes? Second, Chinese immigration to Australia and New Zealand is what I consider a "downstream" flow, in which children move from "high-performer" systems to "average-performer" systems. Would the same cultural effects still hold if the flow were reversed, with children moving from "low-performer" systems to "average- or high-performer" systems? These are important questions to pursue given the key findings presented in the original 2014 study.

Using identical sample-restriction strategies, this paper first replicates Feniger and Lefstein's (2014) ordinary least squares (OLS) analysis and compares their model with ones that include school fixed effects. Preliminarily, adding school fixed effects reduces the coefficients on the cultural-origin dummies by half, indicating that differences in test performance cannot be attributed solely to cultural roots; the cultural-origin dummies may be absorbing other confounding factors, raising a concern about omitted-variable bias in the previous analysis. Second, pooling all existing waves of PISA data from 2000 to 2012 and controlling for year fixed effects, I replicate the same study design for other major diaspora groups, such as Koreans, Thais, and Russians. Findings from this exercise are contradictory and inconsistent with the results generated from Chinese immigrants alone.
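The fixed-effects comparison described above can be illustrated with a minimal simulated sketch (all variable names and numbers here are hypothetical illustrations, not PISA data): when school quality is correlated with a cultural-origin dummy, pooled OLS attributes the school-level confounding to the dummy, while within-school demeaning, which is numerically equivalent to including school fixed effects, removes it and shrinks the coefficient toward the true value.

```python
import numpy as np

rng = np.random.default_rng(0)
n_schools, per_school = 50, 40
school = np.repeat(np.arange(n_schools), per_school)

# Hypothetical selection: better schools enroll a higher share of
# students from the cultural-origin group of interest.
school_quality = rng.normal(0, 20, n_schools)
p_origin = 1 / (1 + np.exp(-school_quality / 20))
origin = rng.binomial(1, p_origin[school])

true_cultural_effect = 15.0
score = (500 + true_cultural_effect * origin
         + school_quality[school] + rng.normal(0, 30, origin.size))

def ols(X, y):
    """Least-squares coefficients for design matrix X."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Pooled OLS: intercept plus cultural-origin dummy (no school controls).
beta_pooled = ols(np.column_stack([np.ones_like(score), origin]), score)[1]

def demean(v):
    """Subtract each school's mean (the within transformation)."""
    means = np.bincount(school, weights=v) / np.bincount(school)
    return v - means[school]

# School fixed effects via within-school demeaning: school_quality is absorbed.
beta_fe = ols(demean(origin.astype(float)).reshape(-1, 1), demean(score))[0]

print(f"pooled OLS: {beta_pooled:.1f}, school FE: {beta_fe:.1f}")
```

In this simulation the pooled coefficient overstates the cultural effect because it also captures the school-quality channel, whereas the fixed-effects estimate lands near the true value of 15, mirroring the halving of the cultural-origin coefficients reported above.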

To sum up, this paper hopes to contribute to the comparative education literature on international large-scale assessments (ILSAs) in several ways. First, it is important to continue the discussion of the problematic inferences drawn from PISA and other ILSA results, especially given limited knowledge of the schooling mechanisms that affect achievement. Second, this paper demonstrates that immigrant schooling patterns can be highly inconsistent across diaspora groups. Third, the current results are mostly correlational, and future research on this topic should consider quasi-experimental designs to draw causal inferences on this important issue.
