PISA Data Analysis
After reading Performance of U.S. 15-Year-Old Students in Science, Reading, and Mathematics Literacy
in an International Context: First Look at PISA 2015 (Kastberg, Ying Chan, & Murray, 2016), consider the
statistical comparisons for the U.S. and the Organization for Economic Cooperation and Development
(OECD, n.d.).
Page 14 of the PISA 2015 report (Kastberg, Ying Chan, & Murray, 2016) shows a breakdown of 15-year-
old students' mathematics achievement in the U.S. Pages 37-38 provide an explanation of how people
were sampled for this study.
Considering the samples compared in this study, write a 2- to 3-paragraph explanation that addresses the
following:
• Are the samples for each country and the OECD average good groups to compare with the U.S.?
Explain.
• Are the test items presented in the report similar to or different than tenth-grade math items presented
for Smarter Balanced, PARCC, or another large-scale state test?
o Explore the testing websites to verify your response.
• Are the samples for each country and the OECD average good groups to compare with
the U.S.? Explain.
No, I do not think that the PISA test findings are an accurate estimation of the performance of
U.S. students in mathematics literacy. I have reached this conclusion based on the following
context:
The PISA study evaluated students on their ability to formulate real-world problems in
mathematical terms, that is, on their mathematical modeling skills. U.S. students, however, are
more proficient at interpreting the results of a mathematical model than at building the model
itself. It is also a well-observed trend that U.S. students do better on tasks involving general
algebra and on problems of functions and relationships, whereas the PISA study focused its
evaluation on geometry-related content. Hence the PISA study implies that more attention is
needed at higher learning levels that involve an in-depth understanding of complex real-life
problems, their mathematical modeling, and the interpretation of results based on fundamental
principles. (Kastberg et al., 2016, p. 14)
• Are the test items presented in the report similar to or different than tenth-grade math
items presented for Smarter Balanced, PARCC, or another large-scale state test?
No, the test items presented in the PISA assessment and those on Smarter Balanced or other
Common Core-aligned state tests are different, because the two serve different purposes. PISA is
an assessment framework that aims to specify a measurement construct rather than to serve the
academic functions of a school system. State standards such as the Common Core, on the other
hand, aim to bring coherence to the vision of the education system upon which curriculum,
instruction, and assessment are based.
Since their purposes differ, it is not appropriate to compare the items on the two tests directly. To a
certain degree, however, there are common points, because the two complement each other
implicitly. For example, assessment programs do not aim to draw conclusions about the aligned
curricula, but their findings implicitly indicate what is most fundamental and essential for students
to learn and what should be emphasized in teaching. (Kastberg et al., 2016, pp. 37-38)
References:
Kastberg, D., Ying Chan, J., & Murray, D. (2016). Performance of U.S. 15-Year-Old Students in
Science, Reading, and Mathematics Literacy in an International Context: First Look at PISA 2015.
National Center for Education Statistics. pp. 14, 37-38.