U.S. students’ math performance fell on the 2015 Programme for International Student Assessment (PISA), while students in Singapore, Finland, Japan, and Estonia emerged as top performers, according to test results released today.
The international exam—administered every three years by the Organisation for Economic Co-operation and Development (OECD)—compared more than half a million 15-year-old students from 72 countries on their math, reading, and science knowledge. This was the first time PISA was administered on computers.
U.S. students remained relatively steady in science and reading in 2015 compared with 2012, when the exam was last administered, but dropped below the global average in math.
They were not alone. Though education spending by countries participating in PISA has increased by nearly 20 percent in the last decade, only 12 out of the 72 countries saw their science performance improve in that time period, according to the OECD.
“Every country has room for improvement, even the top performers,” said Ángel Gurría, the OECD’s secretary general, in a statement. “With high levels of youth unemployment, rising inequality, a significant gender gap, and an urgent need to boost inclusive growth in many countries, more must be done to ensure every child has the best education possible.”
The Nuts and Bolts
Based in Paris, the OECD is an international organization with 35 member nations (including the U.S.) that focuses on fostering global economic growth. The OECD administered the first PISA survey in 2000 with 32 countries, and in 2015, participation rose to 72 countries, including OECD members and other countries ranging from Chile to Tunisia and Malaysia.
According to the OECD, the two-hour test measures students’ real-world application of reading, math, and science against seven levels of proficiency. Each triennial administration of PISA gives priority to one of the three disciplines—in 2015, it was science—and introduces a new special subject. In 2015, that subject was collaborative problem-solving, which examines students’ ability to work with two or more people to solve a challenging problem; countries could also include an optional financial literacy component.
For the 2015 results, the U.S. performance was compared with that of the 71 other participating countries, and was separately compared with the performance of the 34 other OECD member countries, including Finland, South Korea, Germany, and the United Kingdom.
What About Math?
Of the 35 OECD member countries, the U.S. was among the lowest-performing in math—finishing 31st—continuing a downward trend that started in 2009. Japanese students emerged in 2015 as the strongest math performers.
The PISA math test measures students on their use of mathematical concepts, procedures, facts, tools, and reasoning to solve problems. Unlike traditional tests that focus on facts and formulas, PISA is based on the application of math, or how it can be used in real-world contexts. On the 2015 PISA, students were presented with scenarios like measuring the square footage of an apartment or estimating the area of an oil spill.
The PISA results in math and the other disciplines in recent years have prompted concern from educators and policy makers and a flurry of media headlines suggesting that the U.S. is failing to deliver a high-quality education to its students.
In response to the 2015 rankings this morning, U.S. Secretary of Education John B. King Jr. said the U.S. was “losing ground,” and expressed concern over the future of the workforce. After the dip in U.S. PISA rankings in 2012, then–Secretary of Education Arne Duncan called the results a “wake-up call against educational complacency and low expectations,” in a speech about the nation’s performance.
The U.S. outspends all but two countries on education. Like Norway and Switzerland, which also fund their school systems generously, it isn’t seeing the high performance achieved by some countries that spend less.
The results echo findings from a 2012 OECD analysis, which showed that countries that invested in their schools in more targeted ways—such as through teacher salaries or early childhood programs, or by supporting struggling students—were the ones with the highest gains on PISA, not countries that spent the most overall.
But Does PISA Really Matter?
While PISA may be a wake-up call for some, for others, like Stanford University Professor Martin Carnoy, the international comparisons are deceptive.
Carnoy, who has spent more than 20 years researching international education, says the rise and fall of PISA scores typically has less to do with a country’s education system and more to do with demographic changes. He warns educators and policy makers against comparing countries with completely different populations, cultures, economies, and attitudes toward education.
“The biggest problem is presenting test scores without correcting for social class differences, both between countries and within a country over time,” said Carnoy. “One should be very careful about jumping to conclusions about what these test score comparisons mean: These tests were never meant to be a race, and we weren’t supposed to be the horses.”
Carnoy says U.S. educators trying to make sense of PISA can learn more from what’s happening in some of our states’ most successful education systems, like the one in Massachusetts, rather than from what’s happening in South Korea.
The next PISA exam will be administered in 2018, with results published the following year. The 2018 assessment will put a larger emphasis on reading literacy and add global competence as a subject to measure whether students have the skills and attitudes needed “to interact effectively and appropriately with people in different countries and with people of different cultures in their local context,” according to the OECD.
“[So] where do we go from here?” asks Laura Engel, an assistant professor of international education and international affairs at George Washington University in Washington, DC, about what’s next for the U.S. and international comparisons.
Engel, who has researched PISA for the last seven years, says it’s important for U.S. educators to look past PISA’s “winners and losers” and focus more on addressing disparities between states in our country.
“In the U.S. it’s important to look at variations across states and how we are doing with traditionally disadvantaged students,” said Engel. “Moving forward, I hope we look beyond rankings with other countries and use cross-national data to build deeper understanding around issues of equity within the U.S.”