The National Learning Assessments Mapping Project: An Update

What should students be expected to know and do by the time they finish school? Since 2012, the Learning Metrics Task Force (LMTF), convened by UNESCO’s Institute for Statistics and the Center for Universal Education at the Brookings Institution and comprising 30 member organizations, has set out to answer this question. The LMTF has focused attention on measuring the quality of students’ learning, attention that is especially relevant given the recent focus on quality and equity in the post-2015 global education agenda. In 2013, the LMTF recommended seven “essential” domains of learning, essential in that they “prepare children and youth for their future lives and livelihoods.” The seven domains, and examples of subdomains within each, are shown in Table 1.

To map the landscape of national assessments and understand how the seven LMTF domains are reflected in the current priorities of national education systems, FHI 360’s Education Policy and Data Center (EPDC) launched the National Learning Assessment Mapping Project (NLAMP) in 2014 (see the related blog post and policy brief). The most recent update of the project (NLAMP 2) includes more assessments and refines the mapping methodology, leading to a more comprehensive and representative database, as well as several changes in the findings.

Methodology
NLAMP 2 collected national assessment data from 105 countries and 6 regions[1] of the world (see Figure 1 for a comparison of coverage by NLAMP 1 and NLAMP 2), reviewing publicly available metadata on national standardized exams and assessments from primary to upper secondary school and creating a database of 349 assessments. NLAMP 2 thus covers 20 more countries and 40 more assessments than NLAMP 1. Key data sources include the 2010-11 International Bureau of Education (IBE) World Data on Education reports; national education policy documents from the IIEP Planipolis portal; and national Ministry of Education websites. Key information collected on each assessment includes the school level at which it is administered (i.e., primary, lower secondary, or upper secondary), the most recent year of administration and/or data collection, and, where available, the subjects tested. Where subject information was available, subjects were mapped to their corresponding LMTF domains.
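To make the mapping step concrete, the sketch below shows one simple way a subject-to-domain lookup could be represented in Python. The subject names and their domain assignments are hypothetical examples rather than the actual NLAMP coding scheme; only the dual mapping of religious subjects follows the NLAMP 2 convention described in note [2].

```python
# Illustrative subject-to-LMTF-domain lookup. Subject names and mappings
# are hypothetical examples, not the actual NLAMP coding scheme.
SUBJECT_TO_DOMAINS = {
    "mathematics": ["numeracy & maths"],
    "reading": ["literacy & communication"],
    "national language": ["literacy & communication"],
    "biology": ["science & technology"],
    "music": ["culture & the arts"],
    # Religious subjects map to two domains in NLAMP 2 (see note [2]).
    "religious education": ["culture & the arts", "social & emotional"],
}

def map_subjects(subjects):
    """Return the set of LMTF domains covered by a list of tested subjects."""
    domains = set()
    for subject in subjects:
        domains.update(SUBJECT_TO_DOMAINS.get(subject.lower(), []))
    return domains

# Example: an exam testing reading and mathematics maps to two domains.
print(map_subjects(["Reading", "Mathematics"]))
```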

The large majority of assessments are one of two types: 1) high-stakes exams or 2) national large-scale student assessments (NLSAs). Students take high-stakes exams to officially exit a grade or school level (e.g., a primary exit exam) or to enter another grade or school level (e.g., a university entrance exam). NLSAs are low stakes: they are intended to provide information about the state of learning but carry no weight for individual students. High-stakes exams account for 37.8% of assessments with subject matter information available, while NLSAs account for 62.2% (see Figure 2 for a breakdown of assessment coverage by exam type and school level).
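As an illustration of how such a breakdown can be computed, the sketch below tallies a handful of hypothetical assessment records by type and prints each type’s share; the records, and therefore the shares, are invented for illustration and do not reproduce the NLAMP 2 figures.

```python
from collections import Counter

# Hypothetical records: (assessment name, type). Illustrative only,
# not actual NLAMP 2 data.
assessments = [
    ("Primary school leaving exam", "high-stakes exam"),
    ("University entrance exam", "high-stakes exam"),
    ("Grade 4 national learning assessment", "NLSA"),
    ("Grade 6 national learning assessment", "NLSA"),
    ("Grade 8 national learning assessment", "NLSA"),
]

type_counts = Counter(kind for _, kind in assessments)
total = sum(type_counts.values())

# Share of each assessment type among assessments with subject information.
for kind, count in type_counts.items():
    print(f"{kind}: {count / total:.1%}")
```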

Following NLAMP 1, the data presented and analyzed here cover assessments from 2004-2015 for which subject matter information is available; vocational exams are excluded. (The full NLAMP 2 database includes a larger range of assessments, including vocational exams and assessments without subject matter information.) Although the LMTF emphasizes learning at the primary and lower secondary levels, NLAMP 1 and 2 also include upper secondary assessments, as higher levels tend to test more domains and are useful for comparison purposes.

The NLAMP 2 findings cover more countries and assessments because NLAMP 2 expanded the “known universe” of assessments, locating additional publicly available information on existing assessments. NLAMP 2 largely follows the NLAMP 1 mapping methodology, referencing the definitions and subdomains proposed for each domain in the LMTF report (see Table 1). However, there are several methodological differences, related to the mapping of “religion” and religious subjects[2], the mapping of optional subjects[3], and the classification of certain higher secondary exit exams[4].

It should also be emphasized that the findings represent the “known universe” of assessments, reflecting not only the existence of exams but also the availability of data on them. Frequently, for example, an exam is mentioned in source documents, but no subject information can be found.

Literacy and Numeracy Are the Most Common Domains
Many of the findings in NLAMP 2 echo those in NLAMP 1. Latin America and Sub-Saharan Africa remain the regions with the largest number of countries and assessments. Unsurprisingly, literacy & communication and numeracy & maths remain the two domains included in the highest proportion of assessments; in fact, it is rare to encounter an assessment that does not include both. Domains with medium-level coverage are science & technology and culture & the arts; domains with low-level coverage are learning approaches & cognition, physical well-being, and social & emotional. As pointed out in the NLAMP Policy Brief, this focus on basic skills and knowledge is expected, as literacy, numeracy, and content knowledge of science and social science are much easier to assess in a formal school setting than other important skills endorsed by the LMTF, including resilience, leadership, and moral and ethical values.

See Table 2 and Figure 3 for a breakdown of domain coverage in NLAMP 1 and NLAMP 2.

The Relative Weight of the Literacy and Numeracy Domains Decreases as School Level Increases
NLAMP 2 further disaggregates domain coverage by school level and exam type. The four domains of literacy & communication, numeracy & maths, culture & the arts, and science & technology are present at all school levels. Literacy & communication and numeracy & maths dwarf all other domains at the primary level, accounting for 65% of all domains mapped. As the school level increases, the share of these two domains declines relative to the others: at the lower secondary level they account for 58% of all domains mapped, and at the upper secondary level, 55%. This makes sense, considering that students are expected to master a wider range of content at higher levels of schooling, so literacy and numeracy make way for other domains. At all levels, the three domains of physical well-being, social & emotional, and learning approaches & cognition account for a small share of all domains mapped, though the exact share varies by level.

Breaking down the analysis by assessment type, more literacy & communication and numeracy & maths domains are mapped for NLSAs than for high-stakes exams at every school level. This is especially striking among primary-level NLSAs, where 124 numeracy & maths domains and 132 literacy & communication domains are mapped, compared with 41 of each among primary-level high-stakes exams. This might result from the fact that literacy and numeracy are easier to standardize than other domains such as science & technology.


Average Number of Domains per Assessment Increases by School Level
Considering the average number of domains per assessment provides a gauge of how ‘well’ assessments cover the full set of domains. Overall, domain coverage increases with school level, echoing findings from NLAMP 1 and consistent with students being expected to master a wider range of content at higher levels: the average number of domains per assessment is 2.7 at the primary level, 3.1 at the lower secondary level, and 3.4 at the upper secondary level. Students are clearly tested on more subjects at the secondary level than at the primary level, even given that NLAMP 2 documented but did not map “optional subjects.” Still, the average number of domains is low overall.
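To show how this metric is computed, the short sketch below derives average domains per assessment by school level from a few hypothetical assessment records; the records are invented for illustration, and the resulting averages are not the NLAMP 2 figures.

```python
from collections import defaultdict

# Hypothetical records: (school level, set of LMTF domains mapped).
# Invented for illustration; not actual NLAMP 2 data.
assessments = [
    ("primary", {"literacy & communication", "numeracy & maths"}),
    ("primary", {"literacy & communication", "numeracy & maths",
                 "science & technology"}),
    ("lower secondary", {"literacy & communication", "numeracy & maths",
                         "science & technology", "culture & the arts"}),
    ("upper secondary", {"literacy & communication", "numeracy & maths",
                         "science & technology", "culture & the arts",
                         "physical well-being"}),
]

domain_counts = defaultdict(list)
for level, domains in assessments:
    domain_counts[level].append(len(domains))

# Average number of domains per assessment, by school level.
for level, counts in domain_counts.items():
    print(f"{level}: {sum(counts) / len(counts):.1f}")
```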

At the regional level, coverage is highest among Latin American & Caribbean and Sub-Saharan African countries. This is to be expected, since these regions also contribute the most countries and assessments to the database. However, there are also some striking cases of assessments with high domain coverage, for example in Botswana (where three of four exams cover more than 5 domains, and one covers all 7) and Cameroon (where two of three exams cover more than 5 domains, and one covers 6). The lowest domain coverage is found among East Asian & Pacific countries. Overall, however, domain coverage is low, with the highest regional average at 3.6 domains per assessment, among Latin American countries.

Moving Forward

Incorporating domains such as learning approaches & cognition and physical well-being into curricula and assessments will remain a challenge. How should these domains be tested? It may also be worth considering the weighted or relative importance of each domain: are all domains equal, or should literacy and numeracy remain at the forefront? Indeed, as the NLAMP 1 Policy Brief suggested, should domains be weighted differently in different country contexts? And should high-stakes exams and NLSAs play equal roles in assessing students’ learning?

Another major task will be making data on assessments available. It is very possible that many assessments exist that we simply do not know about. Assessments without subject matter information also complicate the analysis; this analysis may well underestimate the number of high-stakes exams because information on the subjects tested is often missing. And where no assessment information is available for a country, why is that the case? Is it an absence of data, or an absence of assessments?

EPDC anticipates continuing to provide data on the world’s “state of assessments” and contributing to the discussion around these and other critical questions related to the robustness of assessment systems and the use of assessment data.

[1] The majority of the countries covered are low-income and lower-middle-income countries.

[2] The mapping of “religion” and religious subjects (e.g., Islamic education or Christianity): these were mapped in NLAMP 1 to the “social and emotional” domain. NLAMP 2, however, maps them to both “culture and the arts” and “social and emotional,” on the rationale that it is not possible to determine from publicly available sources whether these subjects provide a religious education, teach about religion, or function as a mix of both.

[3] Mapping of optional subjects: in some cases, especially at the upper secondary level, exams offer students a list of optional subjects to choose from. NLAMP 1 mapped all of these optional subjects, as well as the corresponding learning domains covered by the exams. However, because students in practice sit only a fraction of these optional exams, and the subjects selected vary, NLAMP 2 documented these subjects but did not map them to domains.

[4] Several countries in South Asia, including Bhutan and Bangladesh, document “higher secondary exit exams” which, upon further investigation, appear to be tertiary rather than part of the secondary curriculum. They were therefore excluded from NLAMP 2. In other regions, however, upper secondary exit exams are classified as upper secondary exams, as they are attached to the secondary schooling system.

