NJCIE 2021, Vol. 5(3), 74–95

http://doi.org/10.7577/njcie.4054

Effectiveness of I’m Learning! for achieving quality learning environments: Impact and implications from three country pilots

Lisa Zook[1]

InformEd International

Cameron Ryall

InformEd International

Copyright the author

Peer-reviewed article; received 16 November 2020; accepted 22 June 2021

Abstract

From 2013 to 2017, Save the Children Norway tested the hypothesis that a global framework could be used to empower locally driven solutions within the education sector. It did so by galvanizing support and aligning stakeholders to common goals articulated through the Quality Learning Environment Framework while allowing each community context to determine its own path for achieving those goals. This paper explores the effectiveness and impact of these projects across the three pilot countries of Cambodia, Uganda, and Zimbabwe as defined by the original project goals, outcomes, and objectives. Reflecting on both qualitative and quantitative data gathered throughout the life of the project, the paper speaks to project impact and achievements as well as operational findings, including commonalities and differences between the three pilot projects and keys to success. It outlines lessons learned across the programming sites and, in doing so, explores the role of a large International Non-Governmental Organization as a catalyst for change. Finally, it discusses the rigorous research and reporting framework driven by funders and development agencies, the rigidity of which struggled to capture the emergent nature of locally driven solutions.

 

Keywords: international development; SDGs; global governance; school governance; quality education

Introduction

For many years, national and international development actors in education focused on access to school, including student enrolment, attendance and school completion. However, a growing body of evidence, exemplified by the Education for All Global Monitoring Report (see UNESCO, 2004) and the World Bank’s evaluation of its education programming in over 700 projects (Nielsen, 2006), found that while student access to school had increased, learning outcomes had declined. This has become known as the global learning crisis (Education Commission, 2016), recently quantified as more than half of school-aged children around the world not achieving minimum learning proficiency (UIS, 2017). These developments influenced a leading international development actor, Save the Children, to review its programming. Finding similar gaps, it responded by creating a framework to define quality in education from early childhood through the end of secondary school, known as the Quality Learning Environment (QLE) framework. The goal of the QLE is to guide Save the Children’s programme design and implementation to focus on the quality of learning opportunity for children, targeting improved outcomes in children’s literacy, numeracy and holistic wellbeing: physical, social, emotional, cognitive and mental wellness (Maranto, 2017).

The QLE framework outlines twenty-eight sub-standards across four guiding principles that Save the Children believes should be present in a school or learning environment to support children’s learning and wellbeing, and thereby respond to the global learning crisis. The four Guiding Principles (GPs) are:

GP 1: Emotional and psychological protection

GP 2: Physical protection

GP 3: Active learning process, improved learning outcomes

GP 4: Close collaboration between school & parents/community

To operationalize the QLE framework, Save the Children Norway (SCN) developed I’m Learning!, a programming approach which uses the QLE framework to help Save the Children’s country offices, as well as the schools and communities it works with, to think about quality education in a structured and holistic way, thereby creating enabling learning environments for effective teaching and learning. From 2013 to 2017, SCN implemented a pilot of I’m Learning! in a total of 32 schools: 15 schools in Cambodia, 11 schools in Uganda, and 6 schools in Zimbabwe. The total number of primary school-aged children who participated in the I’m Learning! pilot project across the three pilot countries is estimated to be 26,492 students (13,005 girls). While it is more difficult to track the number of teachers involved in the project, due to high teacher turnover and transfers, it is estimated that a total of 533 teachers participated in the project. Additionally, through training and awareness raising activities, many government officials, school management committee members, parents and community members were engaged by the project.

Purpose

The purpose of this paper is to contribute to the global education community’s knowledge and understanding of effective approaches for improving education outcomes in low-income countries through foreign aid and international development programming. This paper does so by exploring the programming approach of I’m Learning!, a programme model which uses the QLE framework to facilitate conversations and encourage local solutions to education challenges. The programme’s rationale, design, implementation, and approach are detailed in “I’m Learning! Intervention methodology for quality learning environments in developing country contexts” (Ryall & Zook, 2021).

This paper explores how each country in the pilot programme applied the QLE framework to its local context and examines how effectively this approach addressed the learning needs of students and the support that schools and communities require to mitigate the learning crisis.

To examine the effectiveness of I’m Learning!, a programme evaluation was undertaken to answer the following key questions:

1.     Did the I’m Learning! pilot achieve its goals/objectives? Why or why not?

2.     What best practices emerged from the three countries? Why?

3.     What did not work well across the three countries? Why?

4.     What areas need further investigation or development?

5.     What is recommended for Save the Children’s continued programme development toward a common approach for participatory school improvement linked to the Quality Learning Framework?

This paper builds upon that evaluation report to explore these findings and, more importantly, discuss the following research questions:

·       Did I’m Learning! improve children’s learning outcomes and wellbeing?

·       What programme components of the I’m Learning! pilot, and in what respect, contributed to improving children’s learning outcomes and wellbeing?

·       How should the results of the I’m Learning! pilot inform future international development interventions to address the global learning crisis?

Methodology

The research design took a realist approach using mixed methods to collect, analyse and interpret both quantitative and qualitative data. The study aimed to understand the outcomes that result from implementing different mechanisms in different contexts and the relationship between them (Pawson & Tilley, 2004). The study pairs quantitative research findings with qualitative findings from local stakeholders to explore the tension between varying definitions of impact and the role of an international non-profit organization.

Given that project interventions had finished in the three implementing countries a year prior, an ex-post evaluation was undertaken, enabling enquiry into performance, implementation relevance, impact and sustainability, including what factors and contexts helped or hindered interventions. Data for the evaluation were compiled through a document review, key stakeholder interviews, reflection workshops among programme staff and stakeholders, and project site visits in each of the three pilot countries.

The document review was carried out first, covering project documentation as well as the longitudinal research carried out simultaneously with implementation in each country. The project documentation included annual reports, internal evaluation reports, training reports, monitoring data (including on children’s learning outcomes), the QLE database, presentations and expense reports. This documentation was paired with the longitudinal research studies completed in each country: in Cambodia by Kampuchean Action for Primary Education (KAPE), in Uganda by Gulu University and in Zimbabwe by the University of Zimbabwe. Furthermore, the three-country longitudinal study was coordinated by the University of Oslo, which compiled findings into a consolidated report that was also reviewed. An important feature of the longitudinal study was the twice-yearly use of learning assessment tools (literacy, numeracy and life skills), with this data being a core source for the learning outcomes analysis. Where gaps in data or findings against the evaluation questions were identified, additional investigation was made through key stakeholder interviews, reflection workshops, and project site visits.

Visits to project sites focused on schools targeted by the project. Given the dispersed implementation localities, purposive sampling was used, selecting 3-5 schools in each country to cover different types of schools based on criteria of remoteness (distance from the main road) and student population size. School visits included observation of the school grounds, buildings, programmes and resources, as well as classroom-based teaching and learning activities.

Key informant interviews and focus group discussions were carried out among key programme stakeholders in each country. Tools and protocols were developed for each stakeholder type, with one instance of each tool per sampled school (and one per district for the Ministry of Education). At each school (or district visited), data was collected from each of the stakeholders using the tools outlined in Table 1.

Table 1: Summary of qualitative tools

 

1. Headmaster questionnaire: In each school sampled for the ex-post evaluation, the school headmaster was interviewed using a 10-question questionnaire, collecting information about the school on the guiding principle indicators relevant to the headmaster, from the headmaster’s perspective.

2. Ministry of Education official (district level) questionnaire: A 10-question questionnaire implemented in interview format during a visit to the district education office. The purpose was to collect information about the education/school context in the area, teacher training and school monitoring, comparative reflections on project versus non-project schools, and the current use of the QLE framework within the district.

3. Student focus group discussion: 8-10 children from grades 4-6 participated from each school. The facilitator used six questions to prompt children’s discussion of their perceptions, attitudes and opinions on the QLE guiding principle indicators relevant to children. Each group discussion lasted about one hour.

4. School Management Committee (SMC) focus group discussion: In each school sampled for the ex-post evaluation, 5-8 members of the SMC participated in a group discussion. An eight-question discussion guide was used to gather insights on SMC members’ knowledge of the school’s activities and on the SMC’s contribution to the development of the school, including the community contribution on the relevant guiding principle indicators.

5. Teachers focus group discussion: 5-10 teachers at each sampled school participated in a facilitated discussion. An eight-question discussion guide was used to collect information on the QLE guiding principle indicators from the teachers’ perspective.

6. Primary school observation checklist: Measured the physical environment of the school and the school facilities that support children’s quality learning, as assessed against the indicators from an observer’s perspective.

7. Classroom/lesson observation: Collected information on the guiding principle indicators related to children’s learning and what happens in the classroom.

A sub-set of these stakeholders, including local programme staff, Country Office staff, Ministry of Education officials, headmasters and parent representatives, also participated in a two-day reflection workshop in each country, the objective of which was to discuss and document their reflections on the project.

Framework analysis was employed as a method to analyse the qualitative data. Analysis was carried out using NVivo, overlaying the QLE as a thematic framework to facilitate identification of themes, sub-themes and trends in relation to the research questions across the three countries and data sources.

Student learning outcome data was collected during project implementation by both Save the Children (for its project monitoring) and the research partners in each country who were tasked with the longitudinal research study. Learning outcome assessments were implemented to assess literacy, numeracy and life skills for two cohorts of children. Within each country, the same learning assessment tools were used for both the project monitoring and the longitudinal study but carried out independently. While the tools differed between countries, all tools were validated to local contexts.

Importantly, Save the Children’s data focused on intervention schools within the I’m Learning! project, while the longitudinal study had a quasi-experimental, mixed methods design which sampled from intervention and comparison schools. The key purpose of this study “was to examine the interrelationship between the learning environment as understood in the QLE and learning outcomes and child development” (University of Oslo, 2018). The methodology for this study is summarised in Annex 1.

The research outlined in this paper reanalysed the learning outcome data from both Save the Children’s programme monitoring and the learning outcome data collected by the longitudinal studies conducted in each country. Excel and STATA were used to complete this analysis. No new learning outcome data was collected for the ex-post evaluation, which instead relied on the extensive data sets from the aforementioned sources.
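To illustrate the kind of reanalysis described above, the sketch below compares endline literacy scores between intervention and comparison schools using Welch’s t-test. The file name and column names are hypothetical assumptions; the actual analysis was carried out in Excel and STATA rather than Python.

```python
# Minimal sketch of a between-group comparison of learning outcome data.
# All names here (file, columns) are illustrative assumptions, not the
# project's actual data structure.
import pandas as pd
from scipy import stats

df = pd.read_csv("learning_outcomes.csv")  # hypothetical per-pupil export

intervention = df.loc[df["group"] == "intervention", "literacy_score"]
comparison = df.loc[df["group"] == "comparison", "literacy_score"]

# Welch's t-test (does not assume equal variances between the two groups)
t_stat, p_value = stats.ttest_ind(intervention, comparison, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.4f}")
```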

Country Snapshots

I’m Learning! took different forms in each country, embodying the project’s intention to encourage contextualisation. What resulted from this flexibility was three country projects with different characters, emphases, and interventions, while maintaining the same objectives and goals. While each Country Office gave the project different defining characteristics, each embraced the rights-based holistic approach to education outlined in the QLE framework. Basic information regarding each country’s pilot project, such as duration, location, number of schools, number of students, and number of teachers, is presented below. Through the facilitated reflection workshop, stakeholders developed a three-sentence description of I’m Learning! within their country.

Cambodia started designing its I’m Learning! programme in 2013. Implementation was carried out from 2014 to 2017 in 15 schools located in three provinces in central Cambodia: Kampong Cham, Kampong Chhnang, and Kratie. During this time, 9,353 students and 197 teachers took part in the programme. Participants in the programme collaboratively developed the following description of I’m Learning! in their country:

I’m Learning! in Cambodia fosters community engagement and creates community structures that help the school achieve the Ministry of Education, Youth and Sport (MoEYS) policy for Child-Friendly Schools. The programme empowers community members, parents, students, and teachers to routinely assess their environment using the QLE assessment and to plan interventions that enhance the safety, WASH, child participation, child rights, learning aids, and teaching methodologies within the schools. Partnership with local and national governments throughout the life of the pilot, coupled with strong evidence of programme effectiveness, has ensured nationwide governmental uptake and support of the programme, the first step for sustained commitment to I’m Learning!’s approach for children’s learning and development.

Programme design in Uganda also took place in 2013, and implementation was carried out from 2014 to 2016. Programming took place in 11 schools located in Gulu and Omoro Districts in Northern Uganda. Over the length of the programme, 13,548 students and 242 teachers participated in activities. Programme participants within Uganda collaboratively developed the following description of their programme:

I’m Learning! in Uganda works with all school stakeholders through an action research approach to ensure effective teaching, community engagement and pupils’ wellbeing. Psycho-social support, disaster risk reduction, nutrition, hygiene and sanitation are key interventions to establish foundations for a quality learning environment. The project builds upon this base by strengthening the capability of teachers, school management committees (SMC) and local authorities to ensure sustained conditions for pupil acquisition of relevant skills and knowledge in a safe and engaging learning environment.

Zimbabwe’s I’m Learning! timeline matched that of Uganda. Programming took place in 6 schools located in the Rushinga and Matobo Districts. Rushinga is located northeast of Harare, adjoining the border with Mozambique, whereas Matobo District is located south of Bulawayo, adjoining the border with Botswana. The programme had 3,591 students and 94 teachers participating. Stakeholders explained their programme in Zimbabwe as follows:

I’m Learning! in Zimbabwe works hand-in-hand with children, parents, teachers, and the School Development Committee to holistically create a quality learning environment in schools. All stakeholders work together to foster inclusive schools where children are not only safe, but actively participate in their own learning. Integrating WASH, Disaster Risk Reduction, child rights, and safety along with child-centred methodologies ensures improved learning outcomes for all.

Looking across the local manifestations of I’m Learning!, there were significant differences in programme activities, although all projects aimed to achieve the same outcomes, articulated by the QLE framework. Cambodia’s project was characterized by strengthening the existing School Management Committee structure into a broader, more inclusive School Development Committee (SDC) that empowered learners and parents to take a more active role in their own schooling. Accompanied by intensive teacher professional development (TPD), increased accountability of teachers was an important focus. Uganda’s project focused heavily on school infrastructure (especially WASH related) and children’s psychosocial needs, ensuring children have access to appropriate mental health services through the school and community systems. Zimbabwe, while supporting TPD and SMC engagement, also emphasized inclusive education, thereby ensuring children with disabilities had access to school facilities including classrooms and latrines.

All three projects aimed to amplify stakeholder voices and ensure meaningful children’s participation in their own learning. The work produced numerous teaching and learning materials across all three countries – including educational games, teaching materials packs, book cabinets, and storybooks. Handwashing stations were constructed, water purifiers and systems were installed, and latrines were constructed. Additionally, classrooms and playgrounds in every school were renovated. Across the implementation schools, suggestion boxes now enable pupils and parents to communicate with school management. Class committees were formed to monitor key aspects of quality learning at the classroom level, as well as to facilitate school level accountability among the teachers, parents, and pupils.

Findings

Quality Learning Environment

Each Country Office developed a contextualized QLE assessment tool to help schools carry out self-assessment against the QLE framework principles and sub-standards. These tools were used with two different intentions during the pilot project. Firstly, all schools used the assessment tools to prioritize school improvement planning activities and routinely measure progress against the QLE sub-standards; the data collected through this school self-assessment process served not only as a programme intervention but also for monitoring progress. Secondly, the QLE assessments were used by the research teams in the research sites (a sub-set of the intervention schools as well as comparison sites). The research teams used these assessments to answer the question: How do learning environments change over time in intervention and comparison schools?

The analysis presented here examines the changes in QLE for each of the countries using both the project monitoring data and the research data. Note that while the assessment tools were consistent between monitoring and research, the method of calculating results differed. The longitudinal research examined each country’s achievement of the Guiding Principles (GPs) by the percentage of sub-standards schools achieved. In this regard, 100% achievement of a GP required all sub-standards to be rated either 3 or 4. This differed from the project’s own monitoring approach, which had a lower threshold of GP achievement: only 50% of sub-standards had to be rated 3 or 4 in a school for a GP to be 100% achieved. To make the data comparable, the analysis presented here aligned the reporting of the monitoring data to the calculation used by the longitudinal study, that is, to report the percentage of sub-standards achieved. It is useful to note an implication of this decision: while the trends for Cambodia and Zimbabwe are similar under both ways of calculating results (albeit with slightly lower values), Uganda’s achievement appears much lower and more variable when the sub-standards calculation is used. A further difference between the two data sources is that the Country Office data covers all intervention schools, while the longitudinal study had a smaller sample for Cambodia (4 schools) and Uganda (3 schools). Data was collected at similar times in the school year by both the CO and the researchers.
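To make the difference between the two calculations concrete, the sketch below recomputes GP achievement both ways for a single hypothetical school. The ratings and function names are illustrative assumptions, not drawn from the project data.

```python
# Each sub-standard is rated 1-4; a sub-standard counts as achieved if rated 3 or 4.

def pct_substandards_achieved(ratings):
    """Research calculation: percentage of sub-standards rated 3 or 4."""
    achieved = sum(1 for r in ratings if r >= 3)
    return 100 * achieved / len(ratings)

def gp_achieved_monitoring(ratings, threshold=0.5):
    """Original monitoring rule: a GP counts as 100% achieved once at
    least 50% of its sub-standards are rated 3 or 4."""
    achieved = sum(1 for r in ratings if r >= 3)
    return 100 if achieved / len(ratings) >= threshold else 0

gp_ratings = [4, 3, 2, 3, 1, 2, 3]  # hypothetical ratings for one GP's sub-standards

print(f"{pct_substandards_achieved(gp_ratings):.1f}%")  # 57.1% (research calculation)
print(f"{gp_achieved_monitoring(gp_ratings)}%")         # 100%  (monitoring rule)
```

Under the monitoring rule this hypothetical school would report the GP as fully achieved, while the sub-standards calculation used in this paper reports 57.1%, which is why results such as Uganda’s appear lower and more variable after the recalculation.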

Cambodia

The research team found a trend of substantial and positive progress toward achieving QLE sub-standards in intervention schools, while comparison schools had very low results with little or no change over the three years, as shown in Figure 1. While intervention schools started in 2014 with a higher level of achievement in three of four GPs, the achievement gap in the quality of the learning environment between intervention and comparison schools widened dramatically for all GPs by 2016.

Figure 2 shows the same trends in the monitoring data, suggesting that significant progress was made in the achievement of sub-standards over the life of the programme. However, it is useful to note that in 2016 the measurements for all Guiding Principles are lower than the research findings. This is because the research focused on one province only (Kampong Cham), which has background factors like lower poverty rates (OPHI, 2018) as well as a much higher per-school project expenditure than the other two provinces.

Figure 1: Cambodia's QLE GP Sub-Standard Achievement - Research Data


Figure 2: Cambodia's QLE GP Sub-Standard Achievement - Monitoring Data


Uganda

Figure 3 shows the sampled intervention schools achieving variable but declining results for GP sub-standards between 2014 and 2016 in Uganda. Comparison schools’ results were also variable, but by 2016 their GP achievement was either equal to or outperforming that of intervention schools. Note that QLE results for 2014 had validity issues and were dropped from the analysis in the research report. Using 2015 as a baseline, three of four GPs show improvement in intervention schools, but this finding is attenuated when results for comparison schools are taken into account. It is important to keep in mind the small sample size of three schools for the longitudinal research in Uganda.

Figure 3: Uganda's QLE GP Sub-Standard Achievement - Research Data


 

Figure 4 shows the project monitoring results, with gains for all GPs over the duration of the project. The biggest gains were seen in the last year of the project. While this trend was similar to Cambodia’s, a number of sub-standards were still not achieved in Uganda, especially for GP 2 and GP 3. Importantly, achievement may not be fully represented in the data. For instance, despite the project building 22 latrine blocks, 80% of schools still did not achieve sub-standard 2.3 for sanitation; the criteria for achievement were set unnecessarily high given the contextual situation. This also applies to some other indicators in GP 2 and GP 3.

Figure 4: Uganda's QLE GP Sub-Standard Achievement - Monitoring Data


Zimbabwe

Figure 5 shows intervention schools making incremental progress toward achieving the QLE sub-standards, with comparison schools showing little or no change. While intervention schools started in 2014 with a higher level of achievement in three of four GPs, the achievement gap in the quality of the learning environment widened for GP 1 through GP 3. GP 4, with its three sub-standards, was already highly rated in 2014 and maintained that status by the end of the research phase.

Figure 5: Zimbabwe's QLE GP Sub-Standard Achievement - Research Data


The pattern of achievement for all GPs in the monitoring data (Figure 6) mimics the findings from the research data. By 2016, most indicators were being achieved in the six intervention schools. Two indicators where schools struggled were safe playgrounds (2.4) and children’s participation in decision making. The lowest performing schools at the end of the project were the two satellite schools, but having started at a lower level of QLE achievement, they actually made the strongest gains in their respective districts.

Figure 6: Zimbabwe's QLE GP Sub-Standard Achievement - Monitoring Data


 

Summary of short-term results

Bringing together the trends observed in the research and monitoring data, the project in Cambodia had a substantial impact on all four Guiding Principles. Uganda’s team was challenged by research data validity issues, but a slight to modest impact on all four QLE Guiding Principles is observed. Finally, in Zimbabwe, there was a modest impact observed on the first three QLE GPs but no impact observed on GP 4.

Learning Outcomes

Each research team investigated the following question: How do pupils in intervention schools perform in terms of learning outcomes and child development compared to those in comparison schools?

As illustrated in Table 2, there are both improvements and declines in the three learning outcomes (literacy, numeracy, and life skills) across the life of the project in both intervention and comparison schools.

Table 2: Changes in learning outcomes in intervention (IS) and comparison schools (CS), by country

Learning outcome | Cambodia IS | Cambodia CS | Uganda IS | Uganda CS | Zimbabwe IS | Zimbabwe CS
Literacy | Improved | Improved | Improved | Improved | Improved | Improved
Numeracy | Declined* | Declined* | Improved | Improved | Declined | Declined*
Life skills | Declined | Declined | Improved* | Improved* | Improved* | Improved

Source: University of Oslo, 2018, p. 55

Key: * = insignificant per cent improvement or decline.

 

Literacy improved in intervention schools in all countries over time. While comparison schools also improved in literacy in all countries, there is evidence of project impact on literacy scores in Uganda and Cambodia, where intervention schools significantly outperformed comparison schools. Project monitoring data in Cambodia also shows large gains in literacy outcomes. Uganda and Zimbabwe’s literacy monitoring data, however, are inconsistent with the research findings. Uganda’s monitoring data shows a small decline in literacy outcomes whereas Zimbabwe’s monitoring data shows large increases in literacy outcomes from 2014 to 2016 (among both cohorts the Country Office followed).

Regarding numeracy outcomes, there were no consistent trends across the three countries. In Cambodia, there was no change in numeracy performance among intervention schools. Uganda observed significant improvement in numeracy outcomes, whereas Zimbabwe observed significant decreases in numeracy outcomes. Comparing research data to monitoring data, the effect of I’m Learning! on numeracy outcomes becomes even less clear. Within Cambodia’s intervention schools, monitoring data actually showed an improvement in numeracy outcomes, whereas Uganda and Zimbabwe’s monitoring data showed decreases in numeracy outcomes. By the end of the project, comparison schools in Zimbabwe and Cambodia outperformed intervention schools in numeracy. As such, there is no consistent evidence that the project impacted numeracy outcomes.

Uganda’s project carried out supplemental research to understand whether the project contributed to significantly higher performance in mother-tongue literacy and numeracy, and in basic English taught as a subject, among children in intervention schools compared to their peers in comparison schools. The supplementary study used cross-sectional data collected among Primary 3 pupils at the end of term three of the 2016 school year. Primary 3 pupils were selected as the focus of this study because Primary 3 is the highest education level at which the thematic curriculum, which emphasizes mother tongue as the medium of instruction, is used, as per the Uganda Ministry of Education policy of 2007. The study found that pupils in intervention schools performed better in mother tongue literacy, numeracy, and English than their peers in comparison schools, indicating a significant contribution of the I’m Learning! project.

While the project clearly had an impact on literacy outcomes in Cambodia and Uganda, the lack of impact on literacy in Zimbabwe may be explained by the presence of several other development organizations working on literacy in the area, including in the comparison schools. Qualitative data collected by the research team in Zimbabwe describes the child-centred training that teachers underwent, and parents and learners report that pupils now participate in partner and group work. Interviews and focus group discussions within the intervention schools found that parents and learners were convinced that the Save the Children interventions that improved learning environments were having a positive effect on learning outcomes and child development. As one pupil said, “Having received most of our needs it has motivated our learning interest.”

Qualitative findings from the research shed light on why improvements in literacy were observed, whereas numeracy outcomes were inconsistent. Teachers in Cambodia acknowledged that their students demonstrate better proficiency in reading than other subjects. Specifically, teachers noted that students have difficulty remembering the multiplication tables, easily ‘forgetting’ what they learned in math. One teacher explained that math requires the teachers to make a strong pedagogical effort by producing and using teaching materials and inventing exercises to teach various mathematical algorithms. But according to her observations, most teachers simply follow the textbook. Furthermore, there is evidence that children are better able to receive help at home with literacy homework as opposed to math homework. Teachers explained that while some parents can help their child with reading and writing at home, many of them have weaker skills in math.

In Uganda, a similar finding arose in interviews and focus group discussions with learners and teachers, who affirmed that all basic numeracy operations and concepts were taught in both intervention and comparison schools. Furthermore, the methodology for teaching these topics was the same in the two school groups. Teachers in both intervention and comparison schools reported challenges with conceptualising numeracy topics.

Child development was measured through life skills assessment in each of the countries. The research team in Cambodia found that life skills outcomes decreased among intervention schools during the life of the project. There were no significant changes in life skills outcomes in Uganda and Zimbabwe. Similar to numeracy outcomes, comparison schools in Zimbabwe and Cambodia actually outperformed intervention schools in life skills.

While the quantitative life skills assessment results did not show project impact, qualitative data provides useful insight into life skills achievement in the three countries. Cambodia’s longitudinal research report explains that there were significant gains in life skills observed among both comparison and intervention schools:

The observations conducted in the classrooms, on the playground and at home (i.e., case study students) did not show any difference in behaviours regarding communication among peers, conflict and its resolution, self-regulation, ethics or citizenship, etc. In both groups of schools, students ask permission before leaving the classroom, and greet the teacher when he/she enters the classroom. Self-regulation behaviours (e.g., persevering in an activity, putting one’s hand down if the teacher does not invite the student to speak, not leaving one’s seat before the teacher has asked a student to do so, etc.) appear to occur at the same level and frequency. Furthermore, observations and interviews do not show any difference regarding conflicts among students and the ways they resolve them when they occur.

While no difference between intervention and comparison students was observed for the life skills above, intervention students were observed to be spontaneously more polite towards adults. Through strengthening the educational structure (GP 4), the project has empowered these children to engage constructively and politely with teachers and community adults. This suggests that after four years of project implementation, intervention school students are more empowered, but within the framework of a strengthened hierarchical educational structure.

In Uganda, interviews and focus group discussions among learners, parents, and teachers found that cordial interpersonal relationships and skills appeared stronger in intervention schools than in comparison schools in terms of interpersonal interaction, cooperation and interpersonal communication. Additionally, intervention schools were experiencing improving moral and ethical standards while comparison schools were experiencing declining moral and ethical standards.

Zimbabwe’s qualitative data also found strong signs of improvements in life skills, although mostly connected to the Guiding Principles rather than directly to life skills. Workshops on psychosocial support facilitated by Save the Children have had a strong impact in addressing the emotional and psychosocial needs of the learners (GP1). Parents, teachers and learners indicated that there was no longer use of violence, threats, or corporal punishment in intervention schools. In relation to the protection of learners’ physical well-being (GP2), the participants in intervention schools acknowledged improved health and hygiene practices.

Overall, the project demonstrated strong results in literacy, especially in Cambodia and Uganda. Both monitoring and research findings are limited and sporadic in terms of numeracy and life skills outcomes. Quantitative and qualitative data show that numeracy remains a challenge in all three implementation countries, with learners scoring extremely low on the assessments and teachers explaining that teaching numeracy is an ongoing challenge for them. Quantitative life skills assessments showed little or no improvement over the life of the project, but qualitative data describe improvement in several life skills areas within each country. This could point to challenges with the life skills assessment tool in terms of the construct it uses to define child development (life skills). Measuring behaviour change is extremely challenging, and life skills assessment remains a challenge globally. It is therefore encouraging that qualitative data highlight some of the observed improvements in life skills within intervention schools.

QLE and Learning Outcomes

Each research team also examined the relationship between the QLE Guiding Principles and the learning outcomes. Table 3 shows the results of these analyses. While there is not a consistent pattern in the interrelationships between learning environment and learning outcomes, GP 2 and GP 3 are generally positively related to literacy, numeracy and life skills in all three countries in the lower grades, and GP 1 is positively related to literacy and numeracy. Where negative relationships appear, they almost always involve life skills.

Setting aside life skills, the analyses of interrelationships generally support the I’m Learning! programme logic: that enhancing the school environment improves learning outcomes. This finding is most evident in the early grades. It is important to note, however, that it is extremely difficult to separate the Guiding Principles, as many interventions cut across GPs and the GPs are interrelated and support one another.

Table 3: Interrelationships between QLE and learning outcomes in intervention schools, by country

Relationship | Cambodia, lower grades | Cambodia, upper grades | Uganda, lower grades | Uganda, upper grades | Zimbabwe, lower grades | Zimbabwe, upper grades
Positive | GP 2/3 & Lit/Num/LS; GP 1 & Lit/Num (Grade 1) | GP 1-4 & Lit & Num (Grade 6); GP 4 & Num (Grade 6); GP 4 & LS (Grade 4) | GP 1/2 & Lit/Num/LS; GP 3 & Lit/Num/LS | (none) | GP 1/2 & Lit/Num/LS; GP 1 & Lit/LS; GP 2 & Lit/Num; GP 3 & LS | GP 3 & Lit
Negative | (none) | GP 2-4 & LS (Grade 5) | GP 1-4 & LS (Grade 4) | GP 4 & LS | GP 3/4 & Lit/Num/LS | (none)

Source: University of Oslo, 2018, p. 59

Note: Lit: literacy; Num: numeracy; LS: life skills.
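To illustrate the kind of interrelationship analysis summarised in Table 3, the sketch below computes Pearson correlations between school-level GP achievement and mean learning outcome scores. The data frame and all of its values are hypothetical; the actual analyses used the methods listed in Annex 1 (t-tests, ANOVA, Pearson correlation, regression).

```python
# Hypothetical school-level data: GP achievement (%) and mean outcome scores.
import pandas as pd

df = pd.DataFrame({
    "gp1": [55, 70, 62, 80], "gp2": [40, 65, 58, 75],
    "gp3": [50, 72, 66, 81], "gp4": [90, 88, 92, 85],
    "literacy": [48, 61, 55, 70], "numeracy": [30, 38, 33, 41],
    "life_skills": [52, 50, 54, 49],
})

# Pearson correlation of each Guiding Principle with each learning outcome,
# taken from the full correlation matrix.
corr = df.corr(method="pearson").loc[
    ["gp1", "gp2", "gp3", "gp4"],
    ["literacy", "numeracy", "life_skills"],
]
print(corr)
```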

Results Interpretation

While the quantitative data is mixed in terms of impact, there are some encouraging findings. Firstly, the longitudinal research shows evidence of significant improvement in literacy outcomes in Cambodia and Uganda. Monitoring data further supports this finding in Cambodia, but contradicts the findings in Uganda and Zimbabwe, with Uganda’s monitoring data showing decreases in literacy outcomes and Zimbabwe’s monitoring data showing large increases in literacy among both lower and upper grades.

Numeracy data is similarly inconsistent, with no clear trend across the three countries. Cambodia’s longitudinal research showed no change in numeracy performance among intervention schools whereas the monitoring data showed improvement. Uganda’s longitudinal research observed significant improvement in numeracy outcomes, whereas its monitoring data showed decreases in numeracy outcomes. Zimbabwe observed significant decreases in numeracy outcomes within the longitudinal research, which was confirmed by the project monitoring data. As such, there is no consistent evidence that the project impacted numeracy outcomes.

The inconsistent quantitative findings suggest that there may be methodological issues with the way literacy and numeracy assessments were carried out. Given these inconsistencies, the project relies heavily on qualitative data, which, across all three countries, describes improvement in literacy outcomes resulting from I’m Learning!.

While qualitative data strongly supported impact on literacy outcomes within the project sites, qualitative findings highlight the challenges the project faced in numeracy. Learners and teachers alike described numeracy as an ongoing struggle, suggesting that if the I’m Learning! project focused more heavily on numeracy interventions moving forward, there is potential to see project impact in this area.

While there was little to no impact observed on life skills assessments across the three countries, qualitative data across all three countries show improvements in life skills. This perhaps points to issues with either the conceptual framework for life skills or the quantitative tools used to measure life skills within I’m Learning!.

Importantly, through the interrelationship analysis presented by the research team, the project logic holds: the Guiding Principles are associated with learning outcomes. While the project did not necessarily exhibit consistent results on learning outcomes, it did observe some impact on QLE indicators, and this impact increased over the length of the project. This is especially important considering that the project sites often took at least a year or two to fully implement the project. Therefore, it is possible that I’m Learning!’s impact would be seen over a period longer than that covered by the presented research. Ongoing longitudinal research into learning outcomes is highly recommended.

In addition to the overall examination of results, it is necessary to look at different types of schools. Each country in the I’m Learning! pilot chose to intervene in a range of schools: those that performed reasonably well but clearly needed additional support, and those that were struggling severely (or even at the point of closing). The project was extremely effective at helping to raise the struggling schools to a point of functioning again. Thus, the project was quite effective at achieving Save the Children’s mandate to target the most marginalized students.

Operational Approach

While interventions aimed at establishing quality learning environments were contextualized to each country’s needs, each Country Office also had similarities and differences in terms of project operations.

All Country Offices worked closely with local education authorities, either at the district or provincial level. District education officers supported functions like training of teachers or SMC/SDC members, classroom observation/supervision, and broader monitoring of the school compound. In the case of Cambodia, the project used working groups at the national and sub-national level to develop the content of the project. As such, there was early buy-in nationally by the Ministry of Education, with a broad range of departments contributing to and signing off on the I’m Learning! approach, training modules and tools. Furthermore, the engagement of a core group of technical staff from provincial teacher training colleges and provincial/district offices of education as project implementers built know-how within the Ministry of Education structures for taking on I’m Learning! interventions.

In Uganda, Save the Children staff worked alongside the District Education Department (DED) to integrate the QLE framework into the DED’s monitoring tool. Critical items from both tools were picked and merged into one Joint Monitoring and Support Supervision (JMSS) tool. The components of QLE included in the JMSS related to sub-standards 1.2, 1.3, 1.4 and 1.5 under GP 1, and 2.2, 2.3, 2.6 and 2.8 under GP 2. The DEDs in these two districts fully adopted the JMSS tool and have maintained its use beyond the end of the project.

Regarding Save the Children’s staffing of the project, Cambodia and Uganda had dedicated I’m Learning! staff based in the field, while Zimbabwe did not. Within Zimbabwe, the project was staffed primarily by four staff members, each contributing 25% of their time, a combined allocation equivalent to about one full-time staff member for six schools. Cambodia, on the other hand, had seven dedicated staff, a ratio of one staff member for every two schools. Moreover, government partners supporting follow-up and monitoring in schools provided more intensive support in Cambodia than in Uganda and Zimbabwe.

All pilot sites used the QLE assessment for project design, review, and adaptation. Uganda and Zimbabwe had similar designs for their contextualized QLE assessment tools, using six different tools across their stakeholders and a discussion-based consolidation to determine indicator scoring. Cambodia, on the other hand, used one assessment tool for all stakeholders, allowing each stakeholder to directly ‘rate’ each indicator.

A strong difference between the countries arose when stakeholders were asked to recall QLE assessment results from the time of the project. Cambodian stakeholders could recount QLE assessment results for each indicator in each year of the project for their school, showing that the project truly emphasized, and aligned stakeholders around, the results for project planning and monitoring. Zimbabwean and Ugandan stakeholders, however, were not as familiar with their assessment scores. There are a couple of probable reasons for this difference. SC Cambodia encouraged schools to post and display assessment results at the school each year, providing all stakeholders with documentation of their results and progress; this did not occur in Uganda or Zimbabwe. School annual planning also differed between countries. Although all countries used the QLE results to inform annual planning, in Uganda and Zimbabwe only the school heads and the SMC chairpersons used the data for developing plans, while in Cambodia the annual school planning was characterised by a consultative, open and transparent process during which even learners had an opportunity to contribute. This may reflect the level of engagement Save the Children had with each school, with Cambodia’s staff working very closely with the schools and facilitating stakeholders’ openness to share successes and challenges.

Finally, there were differences in the distribution of funding, and therefore the prioritization of interventions, for each Country Office. Funding differed based on school (and school need) in Cambodia and Uganda. Cambodia phased the prioritization of Guiding Principles with the idea that Guiding Principle 3 should be emphasized first: upon seeing improved literacy results, the community was more likely to offer support (Guiding Principle 4) and therefore facilitate the establishment of Guiding Principles 1 and 2. Zimbabwe, however, allocated the same amount of funding to each school regardless of need, and addressed all four Guiding Principles at each school in each year.

Discussion and Recommendations

The summative review of the project found that the strongest aspect of the I’m Learning! project is that it empowers communities to identify local priorities and solutions to problems within the school. It therefore encourages a bottom-up approach. As shown elsewhere in the literature (Marzano et al., 2005; Hattie, 1999; Dearing et al., 2006), promotion of parental and community engagement in school has a significant positive effect on children’s learning and school success. Thus, project interventions that strengthen bottom-up approaches increase the likelihood of sustained improvements in school performance. That said, further research into I’m Learning!’s sustainability is needed.

The QLE framework sub-standards align stakeholders to a common understanding of quality education and common objectives, thereby setting boundaries for what to emphasize and support in terms of school development. This process is carried out through the effective use of the QLE assessment tool.

Alongside the empowerment factor, the project supported schools in identifying and securing their own resourcing for school improvements. Within this paradigm, I’m Learning! is a valued funding source, but one that can be replicated, not situated as a one-off opportunity for intervention schools.

As this was a pilot, countries needed time to become familiar with the programming approach, resulting in a longer start-up phase than had initially been planned. While the project was meant to start in 2014, it was not until 2015 that interventions were truly being implemented.

Two of the pilot sites also used cascade training models, which led to deteriorating quality of project implementation. Stakeholders described a dilution of training inputs resulting from this approach, which was not the original intent of the project.

Across the three pilot sites, funding differed significantly, leaving Uganda with insufficient resourcing and Cambodia with abundant resourcing. It is possible that, with a different funding allocation, Uganda could have seen greater project impact. Given these differences, it is challenging to compare project impact and results across the three pilot sites.

Rigorous longitudinal research was implemented alongside the pilot projects. While this provided useful insights into project implementation and impact, numerous challenges resulted from a project being designed and adapted alongside a rigorous research framework. There was therefore a misalignment between a bottom-up, contextualized project design and a rigid research design that did not effectively capture some of the emergent project developments. It is time that researchers within the development sector embrace the complexities of programming and develop better assessment and measurement strategies for more effectively defining, measuring, and determining impact.

Finally, while it is recognized that a defining characteristic and strength of I’m Learning! is that the QLE framework allows for country-specific contextualisation, interventions, and innovations, this also poses a significant challenge for the project model development and documentation traditionally used by NGOs. The project struggled to provide enough structure to ensure rigour and fidelity of implementation while also allowing the project to be tailored to the needs of each country.

I’m Learning! examines root causes of poor enrolment, attendance, and performance, and works to address those causes. Most education initiatives today focus on learning outcomes, which is admirable given the global learning crisis, but often these initiatives target learning outcomes to the exclusion of more holistic programming. While the intent of the pilot project was to enhance learning outcomes, and there are initial indications of improvement in this area, there are also clear and encouraging improvements in learning environments.

Often a missing piece in the global and national literacy initiatives is the development of community support which not only encourages home support and involvement in school but also helps build the sustainability of the project. I’m Learning! has done this in a unique way – by galvanizing support and aligning stakeholders to the QLE framework indicators. The success of this approach through I’m Learning! is extremely encouraging – suggesting that communities can do quite a lot to solve their own problems, although they may just need the catalyst of a common framework or an external party such as a non-profit. This could add a lot to the global development sector’s knowledge regarding how best to work with communities.

Additionally, there is the potential for the project to transform the most deprived schools in the targeted countries. This works in two ways. In the three countries, schools rely on parent levies to invest in and improve the school. At times, enrolment is so low that funds for that investment are extremely limited. Save the Children can help bridge that gap, sparking school improvements and attracting more learners to enrol. In other settings (such as Zimbabwe), the government requires a school to reach a certain minimum standard before it is formally recognized and receives any government support. In this case, Save the Children can help schools achieve this minimum standard.

Finally, it is important to recognize the role international non-profits play in the schools and communities. Project stakeholders described the benefit of having Save the Children’s voice active in school and community conversations as an ‘outside’ and often unbiased civil society voice. In relationships where there are assumptions and tensions, this external voice can be extremely beneficial in facilitating better relations between stakeholders. This can be an opportunity for non-profits and development agencies to play an important role in catalysing change without imposing Western ideals, measurement frameworks, and restrictive programming.

References

Christensen, S.D. (2016, December 6-8). Save the Children’s quality learning environment framework [PowerPoint presentation]. Save the Children’s Inclusive Education Learning Event, Bangkok, Thailand. https://resourcecentre.savethechildren.net/node/10305/pdf/qle_ppt_sine_christensen_iele_bangkok_dec._2016.pdf

Dearing, E., Kreider, H., Simpkins, S., & Weiss, H. (2006). Family involvement in school and low-income children’s literacy: longitudinal associations between and within families. Journal of Educational Psychology, 98(4), 653–664. https://doi.org/10.1037/0022-0663.98.4.653

Education Commission (2016). The Learning Generation: Investing in education for a changing world. The International Commission on Financing Global Education Opportunity. https://report.educationcommission.org/wp-content/uploads/2016/09/Learning_Generation_Full_Report.pdf

Hang, P., Khorn, D., Prigent, S., & Yuth, K. (2017). “I’m Learning!!” pilot in Cambodian Primary Schools (2013-2017): Community-based monitoring and connection to educational praxis. Kampuchean Action for Primary Education. https://resourcecentre.savethechildren.net/node/13432/pdf/im_learning_longitudinal_reserach_report_final_cambodia_2018.pdf

Hapanyengwi, O., Chataika, T., & Dirwai, C. (2018). Quality of education: Interrelationships between learning environments and learning outcomes and child development in basic education in Zimbabwe. University of Zimbabwe. https://resourcecentre.savethechildren.net/node/13434/pdf/im_learning_longitudinal_research_report_final_zimbabwe_2018_0.pdf

Hattie, J. (1999). Influences on Student Learning. Transcript of inaugural lecture to the University of Auckland. Research Gate. https://www.researchgate.net/publication/237248564_Influences_on_Student_Learning

Heijinen-Maathuis, E., & Christensen, S. D. (2016). Save the Children’s Quality Learning Environment (QLE) framework: What have we learned and how will this inform our future work. Save the Children.

Maranto, R. (2017). The quality learning framework. Save the Children International. https://resourcecentre.savethechildren.net/node/12460/pdf/00_quality_learning_framework_september_2017.pdf

Marzano, R., Waters, T., & McNulty, B. (2005). School leadership that works: from research to results. Association for Supervision and Curriculum Development.

Nielsen, H. D. (2006). From schooling access to learning outcomes: An unfinished agenda. An evaluation of World Bank support to primary education. World Bank. https://doi.org/10.1596/978-0-8213-6792-6

Ofoyuru, D. T., & Abola, B. (2017). "I’m Learning!" research report: Longitudinal study on learning environments, and learning outcomes and child development in Gulu district. Gulu University.

Oxford Poverty and Human Development Initiative (OPHI) (2018). Global Multidimensional Poverty Index 2018: The Most Detailed Picture to Date of the World’s Poorest People [Report]. OPHI, University of Oxford.

Pawson, R., & Tilley, N. (2004). Realist evaluation. British Cabinet Office. http://www.communitymatters.com.au/RE_chapter.pdf

Ryall, C., & Zook, L. (2017). I’m Learning! cost analysis: Cambodia [Internal report]. Save the Children.

Ryall, C., & Zook, L. (2021). I’m Learning! intervention methodology for quality learning environments in developing country contexts. Nordic Journal of Comparative and International Education, 5(3).

UNESCO Institute for Statistics (UIS) (2017). More than one-half of children and adolescents are not learning worldwide (Report No. UIS/FS/2017/ED/46). http://uis.unesco.org/sites/default/files/documents/fs46-more-than-half-children-not-learning-en-2017.pdf

United Nations Educational, Scientific and Cultural Organization (UNESCO) (2004). Education for All: The quality imperative. UNESCO. https://unesdoc.unesco.org/ark:/48223/pf0000137333

University of Oslo (2018). “I’m Learning!” longitudinal study: a synthesis report. Department of Education, University of Oslo.

Annex 1: Research Design for the I’m Learning! Longitudinal Study

Location
- Cambodia: 1 province
- Uganda: 1 district, 2 counties
- Zimbabwe: 2 provinces, 4 districts

No. of schools
- Cambodia: intervention schools = 4; comparison schools = 4
- Uganda: intervention schools = 3 and comparison schools = 3 (2015); intervention schools = 5 and comparison schools = 5 (2016)
- Zimbabwe: intervention schools = 6; comparison schools = 6

Sampling of schools
- Cambodia: Non-random, purposive. Criteria: size (student enrolment), remoteness/location, school characteristics like student-teacher ratio, student background (socio-economic status), peri-urban/rural.
- Uganda: Schools selected by district education officials.
- Zimbabwe: Schools selected by district education officials against 3 criteria: a) highest achieving school; b) average achieving school; c) satellite school (low resource and low achievement).

Sampling technique (Level 1)
- Cambodia: 100% (small schools), 95% confidence level. Two student cohorts per school: Grade 1 (2, 3); Grade 4 (5, 6).
- Uganda: Stratified and simple random sampling, 95% confidence level. Two student cohorts per school: Grade 3 (4); Grade 5 (6).
- Zimbabwe: Stratified random sampling, 95% confidence level; 100% (satellite schools). Two student cohorts per school: Grade 3 (4, 5); Grade 5 (6, 7).

Sample size (Level 1*)
- Cambodia: 2014 (Year 1): intervention, Gr 1 = 491, Gr 4 = 265. 2017 (Year 3): intervention, Gr 3 = 414, Gr 6 = 362; comparison = 266 and 168.
- Uganda: 2015: 1,011 total students for intervention and comparison, with two cohorts from grades 3 and 5. 2016: 869 total students for intervention and comparison, with two cohorts from grades 4 and 6.
- Zimbabwe: 2014 (Year 1): 74 total students (Gr 3) and 106 total students (Gr 5). 2016 (Year 3): 121 total students (Gr 5) and 126 total students (Gr 7).

Sample size (Level 2)
- Cambodia: 140
- Uganda: 196 (2016)
- Zimbabwe: 414 (2014); 405 (2016)

Sample size (Level 3, intervention schools only)
- Cambodia: 24 learners; 48 parents
- Uganda: 6 learners and 6 parents per school
- Zimbabwe: 67 learners, 62 parents (2014); 56 learners, 49 parents (2016)

Data collection methods
- Mixed methods in all three countries.

Analysis techniques (Level 1)
- Cambodia: Descriptive/inferential statistical analysis (t-test, ANOVA with Tukey, Pearson correlation, multiple regression); reliability (Cronbach’s alpha).
- Uganda: Descriptive/inferential statistical analysis (t-test, ANOVA, regression analysis).
- Zimbabwe: Descriptive/inferential statistical analysis (t-test, ANOVA with Tukey, multiple linear regression); reliability (Cronbach’s alpha).

Analysis techniques (Level 2)
- Cambodia: Credibility (trust, conversation, triangulation), confirmability, dependability; thematic constant comparative analysis.
- Uganda: Credibility (triangulation), transferability, dependability, confirmability; deductive content analysis.
- Zimbabwe: Credibility (triangulation, member checking, peer review), transferability (thick descriptions), dependability (overlapping methods), confirmability (triangulation).

Analysis techniques (Level 3)
- Cambodia: Triangulation
- Uganda: Case study stories
- Zimbabwe: Deductive content analysis

* Level 1 data include only the learners who took the literacy, numeracy and life skills tests. Level 2 data include other participants (teachers, headmasters, parents, district officials, SMC members), with the data collected through qualitative methods. Full description in University of Oslo, 2018.



[1] Corresponding author: lisazook@informedinternational.org