Investments aligned with this Strategic Goal aim to improve the transparency and accountability of education management systems by ensuring that they have the necessary technological and institutional infrastructure, human and technical capacity, and accountability mechanisms.
Collecting and analyzing data is critical for promoting educational equity and improving all students' learning outcomes; otherwise, as the World Bank notes, governments can “ignore or obscure the poor quality of education, especially for disadvantaged groups” (1, p. 91). Participatory data collection and analysis help all levels of policymakers, teachers, and the community to target resources based on need, mobilize support, and hold stakeholders accountable. Common challenges of education management information systems (EMIS) that inhibit their ability to monitor and report students' educational progress include poor governance and coordination, a lack of human capital and technical capacity, and an absence of technological infrastructure, often resulting from a lack of sustained investment and support (2, 3, 4).
According to a 2016 report by the Education Commission (5), most children in the developing world are not tested. Only about half of developing countries have a systematic national learning assessment at the primary school level, and even fewer have assessments for secondary school students. Only half of all countries report their government expenditures on education.
Besides working with governments at all levels to address these issues, investors may also consider school-level interventions, such as investments related to school report cards and community-based assessments. Investors can strengthen education systems and improve data coverage through various approaches, outlined below. Investments can establish institutional and technological infrastructure by:
Through investments in relevant business models, products, or services, investors can collaborate with service providers and capacity builders to build technical and human capacity by:
Investments can establish appropriate, effective, and inclusive governance and accountability mechanisms by:
The data needed to monitor progress towards inclusive, equitable, and quality education and lifelong learning are still lacking (4). Enrollment and attendance rates continue to be the most widely reported indicators, while indicators on education quality and expenditure are often unavailable. Coverage of data is especially low for informal technical, vocational, and adult education; data availability is most consistent for primary education (3). Certain populations also have less data available, including orphans and vulnerable children (OVC), children with disabilities or special learning needs, children in conflict-affected situations, and out-of-school children (OOSC). As of 2017, only about 30% of countries reported on OVC, and only about 70% of primary schools and 60% of secondary schools reported on students with disabilities (3). Since OOSC are typically excluded from EMIS, and because they can be difficult to reach with household surveys, counts of OOSC are likely significantly underestimated. Even when OOSC are counted in household surveys, if they are excluded from EMIS, they risk becoming “invisible” to education administrators and not receiving targeted support (3).
Education ministry officials and policymakers: This strategy provides education stakeholders, including policymakers and government officials, with the data they need to make decisions.
Students: Improved policies lead to better learning environments and improved learning outcomes.
Teachers: Increased access to information on teaching and learning can help teachers to better tailor their teaching strategies to student needs.
Parents and community members: Increased access to information on educational quality can help parents and community members make decisions about where to send children to school.
Data-management processes in low- and middle-income countries are often underfunded, ad hoc, and dependent on donor engagement (4). Coverage of learning assessments offers one way to assess the efficacy of these processes. While participation in international, regional, and citizen-led learning assessments has grown in these countries over the past two decades, notable gaps in data coverage remain. For example, between 2010 and 2015, national learning assessments of reading and math were administered in approximately one-half of developing countries (7).
Education management systems help stakeholders to design, prioritize, and implement cost-effective, data-driven education policies.
While this strategy can theoretically benefit all children and youth, it particularly benefits the approximately 262 million out-of-school children and youth, the approximately 93 million children under 14 with a moderate or severe disability, and orphans and vulnerable children.
Improving the transparency and accountability of EMIS can lead to improved educational quality, equity, and learning outcomes (7). Some examples of impact aligned with this strategy include the following:
External Risk: Weak political and regulatory frameworks can hinder the start-up, adoption, formalization, and scaling of EMIS. To mitigate this risk, investors should understand the regulatory frameworks of investment target countries and connect investees to experts on specific political and regulatory environments.
Endurance Risk: Whenever government control passes from one political faction to another, the long-term funding needed to build technical capacity in national governments and ministries and to establish institutional and technological infrastructure for education is at risk. To mitigate this risk, investors can build strategic alliances for training and infrastructure development with local governments or with INGOs that have relationships with local governments.
Execution Risk: Weak sector-wide education practices—such as inadequate infrastructure, technological literacy, education systems, or education assessments—can delay the formalization of EMISs, and unsophisticated practices and skills can hinder the business performance of EMIS providers. Also, poor access to electricity and other resources in low-income countries can present challenges for some technological solutions. Investors should make sure such solutions fit the geography or demographic to be served.
Stakeholder Participation Risk: Inappropriate tailoring of products, inadequate technical literacy, insufficient understanding of the objectives and experience of those affected by the EMIS, or lack of trust in government or technology service providers could each lead to comparatively low technology adoption rates and reduced positive impact on clients. To mitigate this risk, investors should strive to deeply understand the sociopolitical environments in which investees operate, in particular the likelihood that targeted stakeholders will adopt the product or service. Promoting community participation in assessment initiatives can also foster a sense of shared ownership of and accountability for the proper use of EMIS data.
Such risks could lead to clients’ inability to effectively use the services provided. Opportunity costs from the use of products and services that are not effectively tailored to meet clients’ needs can be one negative impact. In cases involving harmful sociopolitical practices, such as corruption, possible negative effects on targeted or affected stakeholders could be considerable.
Agenda Edu provides a school management and communication platform in Brazil intended to improve communication between parents and schools. The app tracks students' daily activities and benchmarks progress in real time, strengthening engagement among school officials, students, and educators and enabling schools and parents to monitor students' progress more transparently through clear indicators. The solution has reached over 1,300 schools to date and has received investment from Omidyar Network, among other investors.
Chrysalis provides education and learning assessment services to schools and children in India, supplying schools with child-focused curricular materials, teacher-assistance tools, student assessments, and tech-enabled teaching and home-learning applications. Chrysalis has reached over 200,000 students across 11 states and more than 500 schools to date. Menterra Social Impact Investing Fund has invested in this solution, among other investors.
Research on Improving Systems of Education (RISE) is a large-scale, multi-country research program to understand how school systems in the developing world can deliver better learning for all. By supporting, facilitating, synthesizing, and harnessing education systems research, the RISE Programme aims to (a) provide an analytical framework to describe and understand how education systems function; (b) generate research that evaluates large-scale efforts at system reform on the basis of their impact on student learning and equity in learning across genders and socio-economic classes; (c) explain why reforms succeed or fail; (d) collect and disseminate new quantitative and qualitative data on education in general; and (e) build a community of practice of local and international researchers, policymakers, and education practitioners to ensure they have access to the most relevant, up-to-date research. RISE is a partnership between UK Aid, Australia Aid and the Bill & Melinda Gates Foundation.
Filmer, Deon, Halsey Rogers, Samer Al-Samarrai, Magdalena Bendini, Tara Béteille, David Evans, Märt Kivine, Shwetlena Sabarwal, Alexandria Valerio et al. World Development Report 2018: Learning to Realize Education’s Promise. Washington, DC: World Bank, 2018.
Abdul-Hamid, Husein. Data for Learning: Building a Smart Education Data System. Washington, DC: World Bank, 2017.
UNESCO Institute for Statistics. The Data Revolution in Education. Information Paper No. 39. Montreal: UNESCO Institute for Statistics, March 2017.
Subosa, Miguel, and Mark West. Re-orienting Education Management Information Systems (EMIS) towards Inclusive and Equitable Quality Education and Lifelong Learning. Paris: UNESCO, 2018.
Steer, Liesbet, Justin W. van Fleet, Gila Sacks, Nicholas Burnett, Paul Isenman, Elizabeth King, Annababette Wils et al. The Learning Generation: Investing in Education for a Changing World. New York: International Commission on Financing Global Education Opportunity, 2016.
Cheng, Xuejiao Joy and Kurt Moses. Promoting Transparency through Information: A Global Review of School Report Cards. Paris: UNESCO, 2016.
Custer, Samantha, Elizabeth M. King, Tamar Manuelyan Atinc, Lindsay Read, and Tanya Sethi. Toward Data-Driven Education Systems: Insights into Using Information to Measure Results and Manage Change. Washington, DC: Center for Universal Education at Brookings, February 2018.
De Hoyos Navarro, Rafael E., Alejandro J. Ganimian, and Peter A. Holland. "Teaching with the Test: Experimental Evidence on Diagnostic Feedback and Capacity Building for Public Schools in Argentina." Policy Research Working Paper No. 8261, Washington, DC, World Bank Education Global Practice Group, November 2017.
Barr, Abigail, Frederick Mugisha, Pieter Serneels, and Andrew Zeitlin. "Information and Collective Action in Community-Based Monitoring of Schools: Field and Lab Experimental Evidence from Uganda." Unpublished preliminary paper, McCourt School of Public Policy, Georgetown University, August 2012.
Airola, Denise T. and Karee E. Dunn. Oregon DATA Project Final Evaluation Report. Salem, Oregon: Oregon Department of Education, 2011. https://www.researchgate.net/publication/268631049_Becoming_data-driven_Exploring_teacher_efficacy_and_concerns_related_to_data_driven_decision_making
Andrabi, Tahir, Jishnu Das, and Asim Ijaz Khwaja. “Report Cards: The Impact of Providing School and Child Test Scores on Educational Markets.” American Economic Review 107, No. 6 (2017): 1535–63.
Makwati, Glory, Bernard Audinos, and Thierry Lairez. “The Role of Statistics in Improving the Quality of Basic Education in Sub-Saharan Africa.” Working Document, Association for the Development of Education in Africa, African Development Bank, Tunis, Tunisia, 2003.
OECD. Education Policy Outlook. OECD, 2013. http://www.oecd.org/education/EDUCATION%20POLICY%20OUTLOOK%20AUSTRALIA_EN.pdf
Nayyar‐Stone, Ritu. “Using National Education Management Information Systems to Make Local Service Improvements: The Case of Pakistan.” PREM Note, Special Series on the Nuts and Bolts of M&E Systems, Poverty Reduction and Economic Management Network (PREM), World Bank, Washington, DC, 2013.
Fairfax County Public School District. “FCPS Superintendent Garza Proposes FY 2015 Budget of $2.5 Billion.” News release, January 9, 2014, Fairfax County, Commonwealth of Virginia. http://commweb.fcps.edu/newsreleases/newsrelease.cfm?newsid=2426.
This mapped evidence shows what outcomes and impacts this strategy can have, based on academic and field research.
Andrabi, Tahir, Jishnu Das, and Asim Ijaz Khwaja. “Report Cards: The Impact of Providing School and Child Test Scores on Educational Markets.” American Economic Review 107, No. 6 (2017): 1535–63.
A study on the impact of providing school report cards with test scores on subsequent test scores, prices, and enrollment in markets with multiple public and private providers. A randomly selected half of the sample villages (markets) received report cards. This increased test scores by 0.11 standard deviations, decreased private school fees by 17 percent, and increased primary enrollment by 4.5 percent. Information provision facilitates better comparisons across providers and improves market efficiency and child welfare through higher test scores, higher enrollment, and lower fees.

De Hoyos Navarro, Rafael E., Alejandro J. Ganimian, and Peter A. Holland. “Teaching with the Test: Experimental Evidence on Diagnostic Feedback and Capacity Building for Public Schools in Argentina.” Policy Research Working Paper No. 8261, World Bank, 2017.
This study conducted an experiment in the Province of La Rioja, Argentina, randomly assigning 105 public primary schools to: (a) a “diagnostic feedback” group, in which standardized tests in math and reading comprehension were administered at baseline and two follow-ups and the results were made available to the schools through user-friendly reports; (b) a “capacity-building” group, in which schools received the reports as well as workshops and school visits for supervisors, principals, and teachers; or (c) a control group, in which the tests were administered only at the second follow-up.

Barr, Abigail, Frederick Mugisha, Pieter Serneels, and Andrew Zeitlin. “Information and Collective Action in Community-Based Monitoring of Schools: Field and Lab Experimental Evidence from Uganda.” Unpublished paper, Georgetown University, 2012.
Are community-monitoring interventions successful because they improve information alone, or do they also need to overcome collective action problems? We investigate this question by implementing a combined field and lab experiment in 100 Ugandan primary schools, which randomly assigns schools and their Management Committees (SMCs) either to standard community-based monitoring, to a participatory variation that addresses coordination problems, or to a control group.

Rockoff, Jonah, Douglas Staiger, Thomas Kane, and Eric Taylor. “Information and Employee Evaluation: Evidence from a Randomized Intervention in Public Schools.” NBER Working Paper 16240, National Bureau of Economic Research, Cambridge, MA, 2010.
We examine how employers learn about worker productivity in the context of a randomized pilot experiment which provided objective estimates of teacher performance to school principals. We test several hypotheses that provide support for a simple Bayesian learning model with imperfect information. First, the correlation between performance estimates and prior beliefs rises with more precise objective estimates and more precise subjective priors. Second, new information exerts greater influence on posterior beliefs when it is more precise and when priors are less precise. Employer learning also affects job separation and productivity in schools, increasing turnover for teachers with low performance estimates and producing small test score improvements.

Hastings, J., and J. Weinstein. “Information, School Choice, and Academic Achievement: Evidence from Two Experiments.” Quarterly Journal of Economics 123, No. 4 (2008): 1373–1414.
We analyze two experiments that provided direct information on school test scores to lower-income families in a public school choice plan. We find that receiving information significantly increases the fraction of parents choosing higher-performing schools. Parents with high-scoring alternatives nearby were more likely to choose non-guaranteed schools with higher test scores. Using random variation from each experiment, we find evidence that attending a higher-scoring school increases student test scores. The results imply that school choice will most effectively increase academic achievement for disadvantaged students when parents have easy access to test score information and have good options to choose from.

Pandey, P., S. Goyal, and V. Sundararaman. “Community Participation in Public Schools: Impact of Information Campaigns in Three Indian States.” Education Economics 17, No. 3 (2009): 355–375.
This study evaluates the impact of a community-based information campaign on school performance from a cluster randomized control trial in 610 villages. The campaign consisted of eight or nine public meetings in each of 340 treatment villages across three Indian states to disseminate information to the community about its state-mandated roles and responsibilities in school management.

Piper, B., and M. Korda. “EGRA Plus: Liberia.” Program Evaluation Report, RTI International, 2010.
Building on the success of the Early Grade Reading Assessment (EGRA) as a measurement tool, many countries have begun to show interest in moving away from assessments alone and toward interventions focused on changing teacher pedagogy, and as a result, increasing student reading achievement. Liberia, for example, began an EGRA-based intervention, called EGRA Plus: Liberia, in 2008. The results from the EGRA Plus midterm evaluation showed very promising results on a variety of learning outcomes. This report is an impact evaluation of the EGRA Plus program at project completion, and it presents compelling evidence that a targeted reading intervention focused on improving the quality of reading instruction in primary schools can have a remarkably large impact on student achievement in a relatively limited amount of time.

Next Level Evaluation, Incorporated. Oregon DATA Project Final Evaluation Report. Salem, Oregon: Oregon Department of Education, 2011.
The Oregon Direct Access to Achievement (DATA) Project was designed to increase classroom-level data-driven decision making. Professional development included job-embedded training and traditional seminar training. The impact of this training on teachers was monitored utilizing a triadic assessment framework. This assessment framework examined teacher concerns (emotional response), efficacy (motivational response), and knowledge. Results of this assessment framework were used to provide recommendations for training efforts in order to create “prescriptive professional development.” The results of this prescriptive professional development were dramatic at the teacher level and student level.

Zain, M. Z. M., H. Atan, and R. M. Idrus. “The Impact of Information and Communication Technology (ICT) on the Management Practices of Malaysian Smart Schools.” International Journal of Educational Development 24, No. 2 (2004): 201–11.
The impact of Information and Communication Technology (ICT) on the management practices in the Malaysian Smart Schools was investigated. The analysis revealed that the impact has resulted in changes that include the enrichment of the ICT culture among students and teachers, more efficient student and teacher administration, better accessibility to information and a higher utilisation of school resources. This analysis also revealed that time constraints, higher administrative costs, negative acceptance/support from some untrained staff, abuse of the ICT facilities and problems related to the imposed rigid procedural requirements are among the challenges encountered by the schools.

Moore, Audrey-Marie Schuh, Joseph DeStefano, and Elizabeth Adelman. Opportunity to Learn: A High Impact Strategy for Improving Educational Outcomes in Developing Countries. Washington, DC: Education Policy and Data Center (EPDC), 2012.
This study assesses whether students are learning to read by Grade 3. School effectiveness is measured and evaluated in terms of both specific student learning outcomes and the opportunity to learn provided by the school. To evaluate students’ opportunity to learn, data were gathered to determine whether schools consistently provide opportunities for students to learn and, in particular, to learn to read.

Galab, S., C. Jones, M. Latham, and R. Churches. “Community-Based Accountability for School Improvement: A Case Study for Rural India.” Washington, DC: Center for Education Innovations, 2013.
This report looks at community-based accountability and parental participation as a lever for school improvement in rural India.

Nayyar‐Stone, Ritu. “Using National Education Management Information Systems to Make Local Service Improvements: The Case of Pakistan.” PREM Note, Special Series on the Nuts and Bolts of M&E Systems, Poverty Reduction and Economic Management Network (PREM), World Bank, Washington, DC, 2013.
Education management information systems (EMISs), usually located within the ministry of education, are tools that can help governments improve education system administration by providing information that can be used in strategic planning, resource allocation, and monitoring and evaluation. Frequently, however, they are underutilized and become merely a reporting mechanism. Using the data at the point of collection—usually individual schools in a decentralized environment—and feeding them into service improvement action plans can circumvent problems with the national EMIS, and allow the data to become instrumental in improving local education service delivery outcomes.
Each resource is assigned a rating of rigor according to the NESTA Standards of Evidence.
(Number of absentee days of students during the reporting period) / (Number of school days in the reporting period × School Enrollment: Total (PI2389))
(Number of enrolled students who passed standardized test) / (Number of enrolled students who took standardized test)
(Number of school students enrolling in the next level of schooling for the upcoming year) / (Number of students who completed the previous level of schooling during the preceding year)
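The three metrics above are simple ratios. A minimal sketch of how they might be computed follows; the function and parameter names are illustrative assumptions, not part of any standard, with only the IRIS metric ID (PI2389) taken from the formulas above.

```python
def absentee_rate(absentee_days: int, school_days: int, total_enrollment: int) -> float:
    """Absentee days per student-day: absentee days divided by
    (school days in the period x total enrolled students, per PI2389)."""
    return absentee_days / (school_days * total_enrollment)

def pass_rate(students_passed: int, students_tested: int) -> float:
    """Share of enrolled students who passed the standardized test,
    out of those who took it."""
    return students_passed / students_tested

def progression_rate(enrolled_next_level: int, completed_previous_level: int) -> float:
    """Share of students who completed the previous level last year
    and enrolled in the next level for the upcoming year."""
    return enrolled_next_level / completed_previous_level

# Example with illustrative figures: 420 of 500 tested students passed.
print(round(pass_rate(420, 500), 2))  # 0.84
```

In practice these inputs would come from EMIS records for a defined reporting period, and the same denominator conventions must be applied consistently across schools for the ratios to be comparable.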