Individual Submission Summary

Students' performance in year-on-year high-stakes examinations: could it be a case of Comparing the Incomparable?

Thu, March 26, 11:45am to 1:15pm EDT, Hyatt Regency Miami, Floor: Terrace (Level 0), Gardenia C

Proposal

School examinations certify student achievement, whether to select students transparently for admission into academic secondary schools or to screen candidates for tertiary education and the job market. In many countries, examination certificates are awarded on the basis of competitive, uniform, nationwide examinations. Achieving global mandates such as SDG 4 for education is only possible if there is a measure of the extent to which the education system is meeting society's performance expectations.

This paper investigates year-on-year variation in examination performance in the West African Examinations Council's high-stakes secondary school completion examinations (the 'WASSCE'). By administering an exam that combines items from multiple years' assessments to 2,000 students in Ghana, a common scale of difficulties can be created for all items. Armed with this information, the 'real' changes in examination difficulty can be established and used to update current interpretations of, and justifications for, the large performance fluctuations.
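The proposal does not name a calibration model or software, but the common-scale design it describes is standard item response theory linking. The sketch below is a minimal illustration assuming a Rasch model, complete binary response data, and illustrative item counts; the function `rasch_jml` and all parameter values are hypothetical, and an operational program would use dedicated IRT software with more careful estimation (for example, handling of extreme scores).

```python
import numpy as np

def rasch_jml(responses, n_iter=300, lr=1.0):
    """Place all items on a common difficulty scale under a Rasch model.

    responses : binary array (n_persons x n_items), 1 = correct answer.
    Returns (item difficulties, person abilities) in logits, with item
    difficulties centred at zero to fix the scale. Joint person-and-item
    estimation by simple gradient ascent; extreme response patterns
    (all correct / all wrong) are not handled specially in this sketch.
    """
    n_persons, n_items = responses.shape
    theta = np.zeros(n_persons)            # person abilities
    beta = np.zeros(n_items)               # item difficulties

    for _ in range(n_iter):
        # Rasch probability of a correct response for every person-item pair
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
        resid = responses - p              # observed minus expected
        theta += lr * resid.mean(axis=1)   # ability rises if person over-performs
        beta -= lr * resid.mean(axis=0)    # difficulty falls if item over-performs
        beta -= beta.mean()                # identification: mean difficulty = 0

    return beta, theta

# Illustrative data: 2,000 candidates sitting a combined paper with 30 items
# from each of two exam years; the second year's items are drawn to be
# 0.4 logits harder on average.
rng = np.random.default_rng(0)
true_theta = rng.normal(size=2000)
true_beta = np.concatenate([rng.normal(0.0, 1.0, 30),   # 'year A' items
                            rng.normal(0.4, 1.0, 30)])  # 'year B' items
p_true = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_beta[None, :])))
data = (rng.random(p_true.shape) < p_true).astype(int)

beta_hat, _ = rasch_jml(data)
# The estimated difficulty gap between the two years' item sets should
# track the gap in the simulated difficulties.
print(beta_hat[30:].mean() - beta_hat[:30].mean(),
      true_beta[30:].mean() - true_beta[:30].mean())
```

Once all items sit on one scale in this way, the shift in average item difficulty between years can be separated from changes in cohort ability, which is what allows 'real' changes in examination difficulty to be identified.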

This study is important for at least two reasons. Firstly, in West Africa, a particular concern is the apparent variability of the exam over time (Abreh et al., 2018). In Ghana, the proportion of candidates achieving a minimum score of 50% (the standard required for university entry, equivalent to grades A1-C6) in Mathematics increased from 32.8% in 2016 to 42.7% in 2017, while Integrated Science results fell five percentage points over the same period. Subsequent declines in 2018, a year in which 62% of Ghana's WASSCE candidates failed to achieve the pass mark, led the Ministry of Education to 'probe unsatisfactory results for Mathematics and English Language', with a view to identifying reforms that may improve future performance. With the abolition of user fees in Ghana's secondary schools, the Ministry of Education requires information with which to monitor system performance and decide on course adjustments; reliable WASSCE examination results are central to this.

Secondly, if the test is not comparable over time, this has obvious implications for fairness to candidates. The stakes are huge. Pupils collectively spend millions of hours each year cramming for tests that may have limited validity, and variations in content or errors in marking lead to arbitrary outcomes affecting the career fates of hundreds of thousands of students. Unreliable exams blunt the incentives for human capital generation (why study if the exam is largely luck?), incentivize bad pedagogy in schools (cramming based on past items rather than learning curriculum content), and undermine employers' faith in school qualifications as a signal of human capital, with potential macro implications for the labor market and the economy as a whole. Identifying such discrepancies would be a first step toward improving the content and reliability of the exams, making them fairer for candidates and more efficient for society as a whole.
