Since the mid-1990s, cross-national studies conducted by the Organisation for Economic Co-operation and Development (OECD) and the International Association for the Evaluation of Educational Achievement (IEA) have been inextricably linked to the externalization of educational policies and reforms around the world. With results reported publicly in the form of global league tables, the OECD's PISA and the IEA's TIMSS, PIRLS, and ICCS have been interpreted by policymakers as objective markers of how countries' educational systems stack up against one another. Thanks in part to media sensationalization across the globe (Steiner-Khamsi, 2003), the rankings produced by these international large-scale assessments (ILSAs) have spurred the global transformation toward testing for accountability (Smith, 2014). And despite methodological, theoretical, and ethical criticisms (Creemers, 2006; Crossley, 2008), researchers and policymakers continue to draw on data collected through ILSAs.
Focusing on the enormous consequences that ILSAs carry for stakeholders vested in the learning process, this paper invokes the humanism of Ubuntu philosophy and Freirean pedagogy to conceptualize the need for ILSA literacy. Through the lenses of these two frameworks, we are reminded of the moral purpose of education and can thus deconstruct the phenomenon of global testing. Given the tremendous expenditure of fiscal and human resources by countries to participate in ILSAs, the reality that ILSA results function as an accountability mechanism for teachers (who are increasingly targeted in reform trends), and the test makers' own acknowledgment of certain "cautions and caveats" (Feuer, 2012) in the interpretation of ILSAs, we cannot ethically separate the validity of ILSAs from their effects on human beings: students and parents, teachers and administrators. Just as Ubuntu "opens up possibilities for more nuanced understandings of assessment theory and practice" (Beets & le Grange, 2005) through improved technical validity, psychosocial validity, value validity, and fairness (Muwanga-Zake, 2009), it also illustrates the Freirean concept of "reading the world" (Freire & Macedo, 1987), which demands that we seek deeper meaning in how we make sense of ILSAs and their global effects.
Indeed, the concept of ILSA literacy establishes how crucial it is for researchers and policymakers to interpret ILSA results as validly as possible. Through three cases, this paper then models ILSA literacy as an extension of general assessment literacy (Stiggins, 1991; Linn, 2000; Inbar-Lourie, 2008) in relation to factors of globalization and policy externalization (Schriewer, 1989; Rappleye, 2006; Steiner-Khamsi, 2006). Through these cases, the paper conveys how ILSA literacy requires acknowledgment of the tremendous investment of resources poured into standardized testing worldwide, scrutiny of the interpretation of ILSA results by education stakeholders and media, and, ultimately, advocacy for innovative and more meaningful interpretation of ILSA data.
In the first case, ILSA literacy is demonstrated through meta-interpretation of PISA, TIMSS, PIRLS, and ICCS (Galczynski, 2014), which expands the narrow scope of any one study to offer a clearer impression of national education systems' effectiveness. This not only facilitates comparisons of student achievement in humanistic versus STEM fields and at earlier versus later ages and grades, as well as checks for alignment between OECD- and IEA-sponsored ILSAs; it also makes discrepant trends in country performance highly visible and reaffirms the "best practices" of consistent top performers. From this, we see that teachers are key to student achievement in any country, and that the countries that perform best elevate teachers' professional status through selective recruitment, competitive salaries, extensive professional development, and job autonomy (Carnoy et al., 2009; American Federation of Teachers, 2012).
In the second case, ILSA literacy is used to challenge accountability policy trends in North America. Whereas historical analysis of American and Canadian educational reforms, including the recent Common Core State Standards Initiative, confirms the diminished power and independence of teachers (Tatto, 2006), meta-interpretation of PISA, TIMSS, PIRLS, and ICCS makes clear that ILSA top performers all elevate the status of teaching through selective recruitment, high classroom autonomy, accessible professional development, and equitable compensation (Galczynski, 2013). This finding therefore calls for a reversal of recent policy trends.
In the third case, ILSA literacy is achieved by moving beyond contentious student achievement scores to glean new insights from the rich data amassed through questionnaires completed by students, teachers, and schools as part of ILSA administrations, specifically by investigating how multicultural values are embedded in student responses to ICCS 2009 items (Galczynski, 2013). Through the lens of a broadly defined and inclusive multiculturalism (Taylor, 1994; Ghosh, 2002), the student questionnaires reveal how attitudes toward multicultural values vary globally. Such insights are highly valuable for the effective integration of multiculturalism into curriculum and pedagogy.