EPDC National Learning Assessments Mapping Project: Key Findings

By Xuejiao Cheng and Carina Omoeva

From July 2012 to September 2013, the Center for Universal Education of the Brookings Institution, together with the UNESCO Institute for Statistics, convened a global consultation group to form the Learning Metrics Task Force (LMTF), with the aim of shifting the post-2015 global education focus from access to access plus learning. In its final recommendation report, the Task Force proposed a global framework of seven key learning domains deemed important for the educational experience of children in all countries. These learning domains are: literacy & communication, numeracy & math, science & technology, social & emotional learning, culture & the arts, physical well-being, and learning approaches & cognition.

To contribute to the global knowledge base on learning metrics and measurements, and to understand the extent to which the learning domains are reflected in existing national exams and assessments, the FHI 360 Education Policy and Data Center (EPDC) embarked on an effort to gather metadata on national assessments from publicly available online sources. The EPDC National Learning Assessment Mapping Project (N-LAMP) asks the question: to what extent are the seven domains identified by the LMTF reflected in current assessments and exams? This brief presents the results of the EPDC N-LAMP project.

N-LAMP Mapping Process

In Phases I and II of N-LAMP, EPDC selected 125 countries from 6 regions of the world[1] and reviewed available metadata on standardized exams and assessments administered at the national level from primary to upper secondary education. Data sources included the International Bureau of Education (IBE) World Data on Education 2010-2011 edition, national education policy documents from the International Institute for Educational Planning (IIEP)'s Planipolis portal, and, where available, Ministry of Education and national student examination agency websites. With the goal of providing the most recent information possible, the mapping project only considered assessments implemented from 2004 onwards. Where information on high-stakes exams (exit and entrance exams) existed for multiple years, only the most recent year of the exam was included in the review[2].

In mapping the subjects tested in each assessment to the LMTF learning domains, EPDC referenced the definitions and subdomains proposed for each competency in the LMTF report. Where a subject seemed to fall within multiple domains, EPDC used its discretion to assign it to a single learning domain, applying the same rule consistently across the mapping exercise. A minimal sketch of this lookup-and-review logic appears below.
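For illustration, the core of this mapping step can be thought of as a lookup from exam subjects to LMTF domains, with ambiguous or unknown subjects set aside for a manual decision. The Python sketch below is a minimal rendering of that logic; the subject names and domain assignments are hypothetical, not EPDC's actual coding scheme.

```python
# Toy subject-to-domain lookup; these assignments are illustrative,
# not EPDC's actual coding scheme.
SUBJECT_TO_DOMAIN = {
    "english": "Literacy & Communication",
    "mathematics": "Numeracy & Maths",
    "biology": "Science & Technology",
    "civic education": "Social & Emotional",
    "music": "Culture & the Arts",
    "physical education": "Physical Well-Being",
    "general studies": "Learning Approaches & Cognition",
}

def map_subjects(subjects):
    """Return the LMTF domains covered by a list of exam subjects.

    Subjects without a known mapping are flagged for manual review,
    mirroring the discretionary step described above.
    """
    domains, unmapped = set(), []
    for subject in subjects:
        domain = SUBJECT_TO_DOMAIN.get(subject.strip().lower())
        if domain:
            domains.add(domain)
        else:
            unmapped.append(subject)  # needs a judgment call
    return domains, unmapped

# Example: a hypothetical exit exam covering three domains.
print(map_subjects(["English", "Mathematics", "Biology"]))
```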

Illustrative Example of Subjects Mapped to the LMTF Learning Domains

Scope of EPDC N-LAMP Mapping

Information on national learning assessments is not uniformly available across all countries. The following is a breakdown of the assessments identified in the N-LAMP project.

·         403 national-level learning assessments from 105 countries were identified, which we distinguish by type: high-stakes exams, including primary and secondary school exit exams and college entrance exams; and low-stakes national large-scale student assessments (NLSAs).

·         NLSAs are sample-based assessments designed to provide formative information about the state of learning outcomes in a given country; they carry no personal stakes for the students taking them.

·         High-stakes exams are mandatory, census-based assessments required for completing a given level of schooling or gaining admission to the next level.

·         307 assessments from 85 countries had information available on the subjects tested, which formed the basis of the LMTF learning domain mapping analysis (see graph below for a breakdown of assessments by region).

·         In terms of assessment types, NLSAs account for more than 55% of the 307 national learning assessments, followed by upper secondary exit exams and lower secondary exit exams (see graphs below)[3]. While there are only 10 college and university entrance exams, it should be noted that many upper secondary exit exams also serve the purpose of college and university selection.

Findings

1. The availability of information on assessments and subjects varies by country and region.

Since EPDC collects data from publicly available sources, the N-LAMP findings largely reflect countries' openness in sharing information on national-level learning assessments, which differs by country and region. Some countries maintain examination agency websites with comprehensive assessment information, including content, timetables, preparation tips, and a dedicated online portal for students to log in and access exam results; other countries do not currently make exam information readily available on their official websites. In some cases, even the basic structure of the education system is difficult to locate on Ministry of Education websites.

The graph below compares the availability of information on the subject matter tested in assessments, by region. As can be seen, the percentage of assessments with subject information available ranges from 95% in Latin America and the Caribbean to only 63% in Europe and Central Asia.

2. Literacy and numeracy competencies still dominate in national learning assessments.

As the graph below demonstrates, Literacy & Communication and Numeracy & Maths are, unsurprisingly, the two most common LMTF domains, covered by almost all assessments. Science & Technology also appears quite frequently, in more than half of the assessments. In stark contrast are the domains of Physical Well-Being and Learning Approaches & Cognition, which appear in only 33 and 11 assessments, respectively. The Social & Emotional domain also appears quite infrequently, as it is mostly reflected in subjects such as Religion or Civic Education, which are tested in only a few countries.

3. It is fairly rare for assessments to cover more than 4 learning domains.

Overall, current assessment practices fall far short of the LMTF recommendation that comprehensive learning opportunities be created across seven key domains; in practice, only a subset of the learning domains is commonly tested (see below for the most frequently tested domains).

On average, each assessment tests knowledge in about 3.4 learning domains. As can be seen from the graph on the right, it is uncommon for an assessment to cover more than 4 learning domains: 78% of the assessments cover 1 to 4 domains, while only 22% cover 5 to 7 domains[4].
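These statistics are simple aggregates over the mapped assessment records. The short Python sketch below shows the computation on a handful of hypothetical records; the values are illustrative, not the actual N-LAMP data.

```python
# Hypothetical records: assessment id -> number of LMTF domains covered.
# The real N-LAMP analysis covers 307 assessments.
domain_counts = {"exam_a": 2, "exam_b": 4, "exam_c": 3, "exam_d": 6}

average = sum(domain_counts.values()) / len(domain_counts)
narrow = sum(1 for n in domain_counts.values() if n <= 4)

print(f"average domains per assessment: {average:.1f}")
print(f"share covering 1-4 domains: {narrow / len(domain_counts):.0%}")
```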

 

4. The number of learning domains assessed increases with each school level.

It is noteworthy that, across all assessments, the number of learning domains covered increases at higher levels of schooling. As the graph below illustrates, the average number of learning domains per assessment rises from 2.8 at the primary level to 4.2 at the upper secondary level. This seems indicative of the increasing breadth of students' learning as they proceed to higher levels of education.
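The per-level breakdown is the same kind of aggregation, grouped by school level. Again, a minimal sketch over hypothetical records, not the actual N-LAMP data:

```python
from collections import defaultdict

# Hypothetical (school level, domains covered) records.
records = [
    ("primary", 3), ("primary", 2),
    ("lower secondary", 4),
    ("upper secondary", 5), ("upper secondary", 4),
]

# Group domain counts by school level, then average within each group.
by_level = defaultdict(list)
for level, n_domains in records:
    by_level[level].append(n_domains)

for level, counts in by_level.items():
    print(f"{level}: {sum(counts) / len(counts):.1f} domains on average")
```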

5. NLSAs cover noticeably fewer domains than high-stakes exams.

In terms of the number of learning domains assessed by different assessment types, the coverage of low-stakes NLSAs is noticeably narrower than that of high-stakes exams. Countries typically focus NLSAs on students' literacy and numeracy performance, in order to understand learning levels and gauge the effectiveness of the education system.

6. International and regional exam agencies exist to administer high-stakes exams for multiple countries.

During the mapping exercise, several regional exam agencies were found administering exams for countries in the same region. For example, the West African Examinations Council (WAEC) conducts several national exams, such as the West African Senior School Certificate Examination (WASSCE), for almost all member countries, including The Gambia, Sierra Leone, Liberia, and Ghana. The Caribbean Examinations Council likewise administers the Caribbean Secondary Education Certificate Exam in Caribbean countries such as Belize and Barbados. Furthermore, Cambridge International Education, a provider of international education qualifications, administers General Certificate of Education Ordinary and Advanced Level Exams in multiple countries, including Sri Lanka, Mauritius, Cameroon, Namibia, Zimbabwe, Guyana, and Lesotho.

The existence of these regional and international exam agencies appears conducive to the convergence of exam practices across countries. Their practices therefore deserve equal attention, as they shape exam systems beyond any single country.

 

7. Some countries are placing emphasis on less common LMTF learning domains.

Although some learning domains appear only rarely in national learning assessments, there are exceptions: some countries are actively placing emphasis on the less common learning domains in their exams and assessments. For example, below is a list of assessments that test the Learning Approaches & Cognition domain by translating it into explicit test items or subjects.

As can be seen in the table above, cognition is tested mainly at the upper secondary level. It is noteworthy that countries approach the Learning Approaches & Cognition domain from different angles, each testing varying aspects of the same domain, including critical thinking, problem solving, and cognitive skills. The degree of emphasis on the domain also varies: countries such as Egypt have designed a specific NLSA assessing critical thinking and problem solving skills for students in grades 4, 8, and 10, while other countries may include the domain in only one subsection of their exams. In Malaysia, for example, critical thinking and analytical skills are included in the "General Studies" subject of the upper secondary exit exam.

Conclusions

The N-LAMP project was the first attempt to "ground-truth" the framework recommended by the Learning Metrics Task Force, and to examine to what extent the seven key learning domains reflect the assessment priorities of national governments around the world. In this respect, the project can be viewed as a "baseline assessment" for the LMTF learning domains framework. Through N-LAMP, EPDC surveyed the assessment contexts in 125 countries and identified 403 national-level high-stakes exams and low-stakes NLSAs. The following key takeaways have emerged from this analysis:

 

·         Varying degrees of data availability: Countries share exam and subject information through publicly available online platforms with differing degrees of transparency. The disparity in exam data availability also partly reflects the different stages of development of countries' digital infrastructure for education.

·         Limited exposure to learning domains: Substantial variability exists in the extent to which the seven learning domains are reflected in national learning assessments. More traditional learning domains, including Literacy & Communication, Numeracy & Maths, and Science & Technology, appear frequently in national-level learning assessments. In contrast, the domains of Learning Approaches & Cognition, Physical Well-Being, and Social & Emotional receive scant attention. Although the number of learning domains increases with school level, overall there is an apparent discrepancy between the LMTF's vision of broad exposure to learning domains and countries' current exam practices.

·         Challenges in comparability across domains: Even when countries appear to assess the same learning domain, it is hard to compare students' exposure to that domain across countries. This is because countries often assess different facets of the same domain, and because the importance attached to each domain and subdomain varies. Moreover, although N-LAMP assigns each exam subject to a specific learning domain, some subjects tested in the learning assessments may fall into multiple domains, further complicating the interpretation of N-LAMP's findings.

·         Differences between high-stakes exams and NLSAs: Compared with high-stakes national exams, the lower-stakes formative NLSAs assess fewer learning domains, with most focused on student competency in literacy and numeracy. NLSAs also appear to concentrate on students in the lower levels of schooling.

In sum, it is evident that the LMTF learning domains framework provides a useful lens for examining the priorities national governments place on the types of skills and competencies of their students. Contrary to the popular support for "non-traditional" foci in learning and assessment, basic literacy and numeracy dwarf other subjects in assessments across the board, in national high-stakes exams as well as in low-stakes national formative learning assessments. Further, the breadth of the student learning experience expands as students progress through the system, with the more complex types of content, such as cognition, sometimes tested at the end of secondary schooling rather than at the start of the primary schooling cycle.

As we note above, this effort can be viewed as a "baseline assessment" for the LMTF. There is hope that, as the international dialogue around learning progresses, we will begin to see greater convergence around the seven learning domains identified by the LMTF, as well as better articulation of the types of subject matter that form the content of each specific domain. We also hope to see a growth of reliable, nationally representative formative assessments that can be linked to international and regional student achievement studies, forming a comparative knowledge base on the skills and competencies mastered by students at different levels of their education systems. Most importantly, we hope that this process will stimulate greater information sharing and exchange of experiences, frameworks, instruments, and analytic methods among international and national educators, policy makers, and professionals working in the realm of data for education.

As an immediate next step, EPDC will begin integrating available data on national assessments into its online database, accessible at www.epdc.org. Users may also access the metadata on all of the assessments reviewed for the N-LAMP project from our documents search page, or by clicking on the links below.

 

Limitations and Caveats

1.    Data collection is restricted to publicly available sources

As noted above, EPDC gathered metadata for the N-LAMP project from publicly available sources. Consequently, the findings presented by N-LAMP reflect the varying degree of information availability and inevitably over-represent exam practices from countries that publish more information on their national exam systems.

 

2.    List of learning domains may mask the actual assessment experience of each student

Some exams allow students to choose from a list of optional subjects in addition to compulsory exam subjects. This is especially common for exams at the upper secondary level, as it is often at this stage that students enter different academic streams. In these cases, EPDC documented all available subjects, and the corresponding learning domains, as offered by the exams; in reality, however, each student is tested in only a fraction of the learning domains an exam offers.

 

3.    N-LAMP examined an expanded universe of assessments

Although the learning domains proposed by the LMTF are meant for students in primary and lower secondary education, N-LAMP surveyed exams and learning domains at the upper secondary level as well. The LMTF rationale for limiting its recommendation to the lower secondary level is that students pursue diverse areas of specialization at the upper secondary level, which suggests that a comprehensive range of learning domains may be incompatible with education experiences at this level. In practice, however, countries seem to expect students to master a wider, not narrower, range of learning domains as they progress through schooling, and it is at the upper secondary level that exams cover the largest number of learning domains. Consequently, the seven LMTF learning domains may be just as relevant for upper secondary as for primary and lower secondary schooling.

 

  • Click here for a list of the national assessments reviewed as part of the mapping exercise.
  • Click here for the assessment metadata, including information on grades and subjects tested, as well as the years of administration.

 


[1] The majority of the countries are low-income and lower-middle income countries; and the six regions are: East Asia & the Pacific, Europe & Central Asia, Latin America & the Caribbean, Middle East & North Africa, South Asia, Sub-Saharan Africa.

[2] Unlike high-stakes exams, all national large-scale student assessments implemented between 2004 and the present were recorded when information was available.

[3] NLSAs that assess multiple grade levels were counted as separate data points – one NLSA for each grade level.

[4] There is a caveat, though: public information may not comprehensively reflect all the subjects covered by the assessments.

 
