Individual Submission Summary
The impact of language on data - exacerbating exclusion in emergencies

Tue, March 24, 10:00 to 11:30am EDT, Hyatt Regency Miami, Floor: Terrace (Level 0), Orchid A

Proposal

Education in emergencies is failing to meet the needs of minority language speakers, according to research from Translators without Borders (TWB).

We know too little about the languages crisis-affected communities speak and understand. This is linked to language-blind data in the EiE sector, exacerbating educational inequalities.

People in the 42 countries currently experiencing humanitarian emergencies speak over 3,200 languages. Yet neither development nor humanitarian actors routinely collect information on the languages of the people they work with. We record gender, age, and household size, but we don't systematically ask about language and literacy. And the organizations that do collect language data on occasion don't routinely share it.

That means we make important choices about the languages used for surveys, training, classroom instruction, and teacher and pupil assessments without the necessary information about which languages people understand.

Not only are we unsure what languages people speak; we also underestimate the effect this has on our wider data gathering.

Humanitarian organizations tend to expect their local staff, enumerators, and partners to meet their language needs at no cost. They rely on national colleagues to do most of the interpreting and translating, largely without training or guidance.

TWB's findings show that poor language preparation for data collectors affects data quality. In northeast Nigeria, surveys are typically sight-translated from English into one of the 30 local languages spoken, and the replies translated back into English on the survey form. Enumerator teams tested on key survey terms understood at best 80% and at worst just 10% of them. 84% of male and 95% of female enumerators interviewed felt that language was a major barrier in their work.

Back-translations of survey questions and responses reveal further language problems and a need for quality assurance. Informed consent may not be explained, and answers coded by enumerators may not reflect actual responses. TWB has found that when enumerators struggle to understand all the options in a menu of possible responses, they often select the one they understand best. The respondent’s views and experiences are sometimes lost. Humanitarian organizations, research institutes and think tanks around the world conduct thousands of surveys every year. How many suffer from language challenges? And how many substandard survey results fall below the radar for lack of effective quality assurance?

TWB’s research in Cox’s Bazar in June this year highlighted how language blindness also negatively impacts other types of data gathering in education.

Rohingya refugee children were graded on an ASER-Plus test administered in Myanmar language and English, two languages little spoken in the community. They were hugely frustrated at being assessed as lower-level students simply because they didn't understand the languages used for the assessment.

Teachers in Cox’s Bazar struggle to assimilate training delivered in a language other than Rohingya. Their professional development assessments aren’t conducted in Rohingya either.

Language is a much-neglected factor of inclusion in education. Not until we routinely collect and disaggregate language data can we address it effectively.
