PISA 2009 Results: Learning Trends
CHANGES IN STUDENT PERFORMANCE SINCE 2000 – VOLUME V
Please cite this publication as:
OECD (2010), PISA 2009 Results: Learning Trends: Changes in Student Performance Since 2000
(Volume V), PISA, OECD Publishing.
http://dx.doi.org/10.1787/9789264091580-en
This work is published on the OECD iLibrary, which gathers all OECD books, periodicals and statistical
databases. Visit www.oecd-ilibrary.org, and do not hesitate to contact us for more information.
PISA 2009 Results:
Learning Trends
CHANGES IN STUDENT PERFORMANCE SINCE 2000
VOLUME V
Are students well prepared to meet the challenges of the future? Can they analyse, reason and communicate
their ideas effectively? Have they found the kinds of interests they can pursue throughout their lives as productive
members of the economy and society? The OECD Programme for International Student Assessment (PISA) seeks
to answer these questions through the most comprehensive and rigorous international assessment of student
knowledge and skills. Together, the group of countries and economies participating in PISA represents nearly 90%
of the world economy.
PISA 2009 Results presents the findings from the most recent PISA survey, which focused on reading and also
assessed mathematics and science performance. The report comprises six volumes:
• Volume I, What Students Know and Can Do: Student Performance in Reading, Mathematics and Science,
compares the knowledge and skills of students across countries.
• Volume II, Overcoming Social Background: Equity in Learning Opportunities and Outcomes, looks at how
successful education systems moderate the impact of social background and immigrant status on student and
school performance.
• Volume III, Learning to Learn: Student Engagement, Strategies and Practices, examines 15-year-olds’ motivation,
their engagement with reading and their use of effective learning strategies.
• Volume IV, What Makes a School Successful? Resources, Policies and Practices, examines how human,
financial and material resources, and education policies and practices shape learning outcomes.
• Volume V, Learning Trends: Changes in Student Performance Since 2000, looks at the progress countries have
made in raising student performance and improving equity in the distribution of learning opportunities.
• Volume VI, Students on Line: Reading and Using Digital Information, explores students’ use of information
technologies to learn.
PISA 2009 marks the beginning of the second cycle of surveys, with an assessment in mathematics scheduled
for 2012 and one in science for 2015.
THE OECD PROGRAMME FOR INTERNATIONAL STUDENT ASSESSMENT (PISA)
PISA focuses on young people’s ability to use their knowledge and skills to meet real-life challenges. This orientation reflects a change
in the goals and objectives of curricula themselves, which are increasingly concerned with what students can do with what they learn
at school and not merely with whether they have mastered specific curricular content. PISA’s unique features include its:
– Policy orientation, which highlights differences in performance patterns and identifies features common to high-performing students,
schools and education systems by linking data on learning outcomes with data on student characteristics and other key factors that
shape learning in and outside of school.
– Innovative concept of “literacy”, which refers both to students’ capacity to apply knowledge and skills in key subject areas and to their
ability to analyse, reason and communicate effectively as they pose, interpret and solve problems in a variety of situations.
– Relevance to lifelong learning, which goes beyond assessing students’ competencies in school subjects by asking them to report on
their motivation to learn, their beliefs about themselves and their learning strategies.
– Regularity, which enables countries to monitor their progress in meeting key learning objectives.
– Breadth of geographical coverage and collaborative nature, which, in PISA 2009, encompasses the 34 OECD member countries and
41 partner countries and economies.
Programme for International Student Assessment
PISA 2009 Results:
Learning Trends
CHANGES IN STUDENT
PERFORMANCE SINCE 2000
(Volume V)
The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data
by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank
under the terms of international law.
Photo credits:
Getty Images © Ariel Skelley
Getty Images © Geostock
Getty Images © Jack Hollingsworth
Stocklib Image Bank © Yuri Arcurs
Corrigenda to OECD publications may be found on line at: www.oecd.org/publishing/corrigenda.
PISA™, OECD/PISA™ and the PISA logo are trademarks of the Organisation for Economic Co-operation and Development (OECD).
All use of OECD trademarks is prohibited without written permission from the OECD.
© OECD 2010
You can copy, download or print OECD content for your own use, and you can include excerpts from OECD publications, databases and multimedia
products in your own documents, presentations, blogs, websites and teaching materials, provided that suitable acknowledgment of OECD as source
and copyright owner is given. All requests for public or commercial use and translation rights should be submitted to rights@oecd.org. Requests for
permission to photocopy portions of this material for public or commercial use shall be addressed directly to the Copyright Clearance Center (CCC)
at info@copyright.com or the Centre français d’exploitation du droit de copie (CFC) at contact@cfcopies.com.
This work is published on the responsibility of the Secretary-General of the OECD. The opinions
expressed and arguments employed herein do not necessarily reflect the official views of the
Organisation or of the governments of its member countries.
Please cite this publication as:
OECD (2010), PISA 2009 Results: Learning Trends: Changes in Student Performance Since 2000 (Volume V)
http://dx.doi.org/10.1787/9789264091580-en
ISBN 978-92-64-09149-8 (print)
ISBN 978-92-64-09158-0 (PDF)
Foreword
3
PISA 2009 Results: LEARNING TRENDS – VOLUME V © OECD 2010
One of the ultimate goals of policy makers is to enable citizens to take advantage of a globalised world economy.
This is leading them to focus on the improvement of education policies, ensuring the quality of service provision,
a more equitable distribution of learning opportunities and stronger incentives for greater efficiency in schooling.
Such policies hinge on reliable information on how well education systems prepare students for life. Most countries
monitor students’ learning and the performance of schools. But in a global economy, the yardstick for success
is no longer improvement by national standards alone, but how education systems perform internationally. The
OECD has taken up that challenge by developing PISA, the Programme for International Student Assessment, which
evaluates the quality, equity and efficiency of school systems in some 70 countries that, together, make up nine-
tenths of the world economy. PISA represents a commitment by governments to monitor the outcomes of education
systems regularly within an internationally agreed framework and it provides a basis for international collaboration
in defining and implementing educational policies.
The results from the PISA 2009 assessment reveal wide differences in educational outcomes, both within and
across countries. The education systems that have been able to secure strong and equitable learning outcomes,
and to mobilise rapid improvements, show others what is possible to achieve. Naturally, GDP per capita influences
educational success, but this only explains 6% of the differences in average student performance. The other 94%
reflect the potential for public policy to make a difference. The stunning success of Shanghai-China, which tops
every league table in this assessment by a clear margin, shows what can be achieved with moderate economic
resources in a diverse social context. In mathematics, more than a quarter of Shanghai-China’s 15-year-olds can
conceptualise, generalise, and creatively use information based on their own investigations and modelling of
complex problem situations. They can apply insight and understanding and develop new approaches and strategies
when addressing novel situations. In the OECD area, just 3% of students reach this level of performance.
While better educational outcomes are a strong predictor of economic growth, wealth and spending on education
alone are no guarantee for better educational outcomes. Overall, PISA shows that an image of a world divided
neatly into rich and well-educated countries and poor and badly-educated countries is out of date.
This finding represents both a warning and an opportunity. It is a warning to advanced economies that they cannot
take for granted that they will forever have “human capital” superior to that in other parts of the world. At a time of
intensified global competition, they will need to work hard to maintain a knowledge and skill base that keeps up
with changing demands.
PISA underlines, in particular, the need for many advanced countries to tackle educational underperformance so
that as many members of their future workforces as possible are equipped with at least the baseline competencies
that enable them to participate in social and economic development. Otherwise, the high social and economic
cost of poor educational performance in advanced economies risks becoming a significant drag on economic
development. At the same time, the findings show that poor skills are not an inevitable consequence of low national
income – an important outcome for countries that need to achieve more with less.
But PISA also shows that there is no reason for despair. Countries from a variety of starting points have shown the
potential to raise the quality of educational outcomes substantially. Korea’s average performance was already high
in 2000, but Korean policy makers were concerned that only a narrow elite achieved levels of excellence in PISA.
Within less than a decade, Korea was able to double the share of students demonstrating excellence in reading
literacy. A major overhaul of Poland’s school system helped to dramatically reduce performance variability among
schools, reduce the share of poorly performing students and raise overall performance by the equivalent of more
than half a school year. Germany was jolted into action when PISA 2000 revealed a below-average performance and
large social disparities in results, and has been able to make progress on both fronts. Israel, Italy and Portugal have
moved closer to the OECD average and Brazil, Chile, Mexico and Turkey are among the countries with impressive
gains from very low levels of performance.
But the greatest value of PISA lies in inspiring national efforts to help students to learn better, teachers to teach better,
and school systems to become more effective.
A closer look at high-performing and rapidly improving education systems shows that these systems have many
commonalities that transcend differences in their history, culture and economic evolution.
First, while most nations declare their commitment to education, the test comes when these commitments are
weighed against others. How do they pay teachers compared to the way they pay other highly-skilled workers?
How are education credentials weighed against other qualifications when people are being considered for jobs?
Would you want your child to be a teacher? How much attention do the media pay to schools and schooling? Which
matters more, a community’s standing in the sports leagues or its standing in the student academic achievement
league tables? Are parents more likely to encourage their children to study longer and harder or to spend more time
with their friends or in sports activities?
In the most successful education systems, the political and social leaders have persuaded their citizens to make the
choices needed to show that they value education more than other things. But placing a high value on education
will get a country only so far if the teachers, parents and citizens of that country believe that only some subset of
the nation’s children can or need to achieve world class standards. This report shows clearly that education systems
built around the belief that students have different pre-ordained professional destinies to be met with different
expectations in different school types tend to be fraught with large social disparities. In contrast, the best-performing
education systems embrace the diversity in students’ capacities, interests and social background with individualised
approaches to learning.
Second, high-performing education systems stand out with clear and ambitious standards that are shared across the
system, focus on the acquisition of complex, higher-order thinking skills, and are aligned with high stakes gateways
and instructional systems. In these education systems, everyone knows what is required to get a given qualification,
in terms both of the content studied and the level of performance that has to be demonstrated to earn it. Students
cannot go on to the next stage of their life – be it work or further education – unless they show that they are qualified
to do so. They know what they have to do to realise their dream and they put in the work that is needed to achieve it.
Third, the quality of an education system cannot exceed the quality of its teachers and principals, since student
learning is ultimately the product of what goes on in classrooms. Corporations, professional partnerships and
national governments all know that they have to pay attention to how the pool from which they recruit is established;
how they recruit; the kind of initial training their recruits receive before they present themselves for employment;
how they mentor new recruits and induct them into their service; what kind of continuing training they get; how
their compensation is structured; how they reward their best performers and how they improve the performance of
those who are struggling; and how they provide opportunities for the best performers to acquire more status and
responsibility. Many of the world’s best-performing education systems have moved from bureaucratic “command
and control” environments towards school systems in which the people at the frontline have much more control
of the way resources are used, people are deployed, the work is organised and the way in which the work gets
done. They provide considerable discretion to school heads and school faculties in determining how resources
are allocated, a factor which the report shows to be closely related to school performance when combined with
effective accountability systems. And they provide an environment in which teachers work together to frame what
they believe to be good practice, conduct field-based research to confirm or disprove the approaches they develop,
and then assess their colleagues by the degree to which they use practices proven effective in their classrooms.
Last but not least, the most impressive outcome of world-class education systems is perhaps that they deliver high-
quality learning consistently across the entire education system, such that every student benefits from excellent
learning opportunities. To achieve this, they invest educational resources where they can make the greatest
difference, they attract the most talented teachers into the most challenging classrooms, and they establish effective
spending choices that prioritise the quality of teachers.
These are, of course, not independently conceived and executed policies. They need to be aligned across all aspects
of the system, they need to be coherent over sustained periods of time, and they need to be consistently implemented.
The path of reform can be fraught with political and practical obstacles. Moving away from administrative and
bureaucratic control toward professional norms of control can be counterproductive if a nation does not yet have
teachers and schools with the capacity to implement these policies and practices. Pushing authority down to lower
levels can be as problematic if there is not agreement on what the students need to know and should be able to do.
Recruiting high-quality teachers is not of much use if those who are recruited are so frustrated by what they perceive
to be a mindless system of initial teacher education that they will not participate in it and turn to another profession.
Thus a country’s success in making these transitions depends greatly on the degree to which it is successful in
creating and executing plans that, at any given time, produce the maximum coherence in the system.
These are daunting challenges and thus devising effective education policies will become ever more difficult as
schools need to prepare students to deal with more rapid change than ever before, for jobs that have not yet been
created, to use technologies that have not yet been invented and to solve economic and social challenges that we
do not yet know will arise. But those school systems that do well today, as well as those that have shown rapid
improvement, demonstrate that it can be done. The world is indifferent to tradition and past reputations, unforgiving
of frailty and complacency and ignorant of custom or practice. Success will go to those individuals and countries
that are swift to adapt, slow to complain and open to change. The task of governments will be to ensure that
countries rise to this challenge. The OECD will continue to support their efforts.
***
This report is the product of a collaborative effort between the countries participating in PISA, the experts and
institutions working within the framework of the PISA Consortium, and the OECD Secretariat. The report was
drafted by Andreas Schleicher, Francesca Borgonovi, Michael Davidson, Miyako Ikeda, Maciej Jakubowski,
Guillermo Montt, Sophie Vayssettes and Pablo Zoido of the OECD Directorate for Education, with advice as well as
analytical and editorial support from Marilyn Achiron, Simone Bloem, Marika Boiron, Henry Braun, Nihad Bunar,
Niccolina Clements, Jude Cosgrove, John Cresswell, Aletta Grisay, Donald Hirsch, David Kaplan, Henry Levin,
Juliette Mendelovitz, Christian Monseur, Soojin Park, Pasi Reinikainen, Mebrak Tareke, Elisabeth Villoutreix and
Allan Wigfield. Volume II also draws on the analytic work undertaken by Jaap Scheerens and Douglas Willms in the
context of PISA 2000. Administrative support was provided by Juliet Evans and Diana Morales.
The PISA assessment instruments and the data underlying the report were prepared by the PISA Consortium, under
the direction of Raymond Adams at the Australian Council for Educational Research (ACER) and Henk Moelands
from the Dutch National Institute for Educational Measurement (CITO). The expert group that guided the preparation
of the reading assessment framework and instruments was chaired by Irwin Kirsch.
The development of the report was steered by the PISA Governing Board, which is chaired by Lorna Bertrand
(United Kingdom), with Beno Csapo (Hungary), Daniel McGrath (United States) and Ryo Watanabe (Japan) as vice
chairs. Annex C of the volumes lists the members of the various PISA bodies, as well as the individual experts and
consultants who have contributed to this report and to PISA in general.
Angel Gurría
OECD Secretary-General
Table of Contents
Executive Summary............................................................................................................................................................................................................13
Introduction to PISA....................................................................................................................................................................................................17
The PISA surveys.........................................................................................................................................................................................................................17
The first report from the 2009 assessment..............................................................................................................................................................18
The PISA student population............................................................................................................................................................................................19
Reader’s Guide.........................................................................................................................................................................................................................23
Chapter 1 Comparing Performance over Time.................................................................................................................................25
Chapter 2 Trends in Reading................................................................................................................................................................................37
Continuity and change in the reading literacy framework and assessment................................................................................38
How student performance in reading has changed since 2000.............................................................................................................38
How gender differences in reading have evolved............................................................................................................................................46
Changes in performance and changes in student populations.............................................................................................................49
The impact of changes in the socio-economic composition of student populations on trends in reading
performance.....................................................................................................................................................................................................................................49
Establishing an overall estimate of reading performance trends.........................................................................................................50
Country-by-country comparison of reading trends........................................................................................................................................50
Chapter 3 Trends in Mathematics and Science...............................................................................................................................59
Trends in mathematics ...........................................................................................................................................................................................................60
• How student performance in mathematics has changed since 2003.............................................................................................60
Trends in science.........................................................................................................................................................................................................................64
• How student performance in science has changed since 2006.........................................................................................................64
Chapter 4 Trends in Equity....................................................................................................................................................................................73
Trends in the variation of student performance ...............................................................................................................................................74
Trends in student background factors and their relation to reading performance...............................................................77
• Socio-economic status....................................................................................................................................................................................................77
• Immigrant status and home language...................................................................................................................................................................80
Chapter 5 Trends in Attitudes and Student-School Relations.................................................................................87
Trends in reading engagement........................................................................................................................................................................................88
• Changes in whether students read for enjoyment........................................................................................................................................88
• Changes in how much students enjoy reading..............................................................................................................................................90
• Changes in what students read for enjoyment................................................................................................................................................93
• Changes in socio-economically disadvantaged students’ engagement in reading................................................................96
• Changes in the reading performance of students who read fiction..................................................................................................97
Trends in student views on schools and teachers............................................................................................................................................98
• Changes in teacher-student relations....................................................................................................................................................................98
• Changes in disciplinary climate............................................................................................................................................................................100
Conclusions and Policy Implications..................................................................................................................................................105
Changing conditions for learning..............................................................................................................................................................................105
Progress towards raising performance and levelling the playing field.........................................................................................106
References ................................................................................................................................................................................................................................109
Annex A Technical background.................................................................................................................................................................111
Annex A1: Construction of reading scales and indices from the student context questionnaires .......................................112
Annex A2: The PISA target population, the PISA samples and the definition of schools...........................................................120
Annex A3: Standard errors, significance tests and subgroup comparisons..........................................................................................133
Annex A4: Quality assurance.............................................................................................................................................................................................134
Annex A5: Participation of countries across PISA assessments....................................................................................................................136
Annex A6: Linear and adjusted trends..........................................................................................................................................................................138
Annex B Tables of results......................................................................................................................................................................................145
Annex B1: Results for countries and economies....................................................................................................................................................146
Annex B2: Subnational tables.............................................................................................................................................................................................191
Annex C The development and implementation of PISA – a collaborative effort.....205
This book has... StatLinks2
A service that delivers Excel® files from the printed page!
Look for the StatLinks at the bottom left-hand corner of the tables or graphs in this book. To download the matching Excel® spreadsheet, just type the link into your Internet browser, starting with the http://dx.doi.org prefix.
If you’re reading the PDF e-book edition, and your PC is connected to the Internet, simply click on the link. You’ll find StatLinks appearing in more OECD books.
Table of Contents
9
PISA 2009 Results: LEARNING TRENDS – VOLUME V © OECD 2010
Boxes
Box V. A Key features of PISA 2009...................................................................................................................................................21
Box V.1.1 Interpreting trends requires some caution............................................................................................................................26
Box V.B Korea .............................................................................................................................................................................................................................31
Box V.C Poland ...............................................................................................................................................................................33
Box V.D Portugal .............................................................................................................................................................................68
Box V.E	Turkey ................................................................................................................................................................................70
Box V.F Chile ..................................................................................................................................................................................85
Box V.G Brazil ...............................................................................................................................................................................102
Figures
Figure V. A A map of PISA countries and economies.............................................................................................................................19
Figure V.1.1 A summary of changes in reading performance...................................................................................................................27
Figure V.1.2 A summary of annualised performance trends in reading, mathematics and science...........................................................29
Figure V.2.1 Change in reading performance between 2000 and 2009 ..................................................................................................39
Figure V.2.2 How countries perform in reading and how reading performance has changed since 2000................................................40
Figure V.2.3 Multiple comparisons between 2000 and 2009 .................................................................................................................41
Figure V.2.4 Percentage of students below proficiency Level 2 in reading in 2000 and 2009 .................................................................43
Figure V.2.5 Percentage of top performers in reading in 2000 and 2009 ................................................................................................44
Figure V.2.6 Performance changes among the lowest- and highest-achieving students in reading between 2000 and 2009 ..................45
Figure V.2.7 Comparison of gender differences in reading between 2000 and 2009 ..............................................................................47
Figure V.2.8 Change in the share of boys and girls who are low performers in reading between 2000 and 2009 ...................................48
Figure V.2.9 Changes in reading performance between 2000 and 2009 .................................................................................................49
Figure V.2.10 Linear trends and performance differences between 2000 and 2009 ..................................................................................51
Figure V.2.11	Trends in reading performance: countries above the OECD average ...................................................................................52
Figure V.2.12	Trends in reading performance: countries at the OECD average .........................................................................................54
Figure V.2.13	Trends in reading performance: countries below the OECD average .................................................................................55
Figure V.3.1 Change in mathematics performance between 2003 and 2009...........................................................................................60
Figure V.3.2 How countries perform in mathematics and how mathematics performance has changed since 2003 ...............................61
Figure V.3.3 Percentage of students performing below proficiency Level 2 in mathematics in 2003 and 2009 .......................................62
Figure V.3.4 Percentage of top performers in mathematics in 2003 and 2009.........................................................................................63
Figure V.3.5 Change in science performance between 2006 and 2009 ..................................................................................................64
Figure V.3.6 How countries perform in science and how science performance has changed since 2006 ...............................................65
Figure V.3.7 Percentage of students performing below proficiency Level 2 in science in 2006 and 2009 ...............................................66
Figure V.3.8 Percentage of top performers in science in 2006 and 2009.................................................................................................67
Figure V.4.1 Comparison of the variation in student performance in reading between 2000 and 2009...................................................74
Figure V.4.2 Change in variation and change in reading performance between 2000 and 2009..............................................................76
Figure V.4.3 Variation in reading performance between and within schools in 2000 and 2009...............................................................77
Figure V.4.4 Relationship between students’ socio-economic background and their reading performance in 2000 and 2009.................78
Figure V.4.5 Relationship between socio-economic background and reading performance between and within schools in 2000 and 2009...79
Figure V.4.6 Percentage of students with an immigrant background in 2000 and 2009...........................................................................80
Figure V.4.7 Immigrant background and reading performance in 2000 and 2009...................................................................................81
Figure V.4.8 Percentage of students who speak a language at home that is different from the language of assessment in 2000 and 2009....83
Figure V.4.9 Home language and reading performance in 2000 and 2009..............................................................................................83
Figure V.5.1 Percentage of students who read for enjoyment in 2000 and 2009......................................................................................88
Figure V.5.2 Changes in the percentage of boys and girls who read for enjoyment between 2000 and 2009...........................................89
Figure V.5.3 Percentage of students who read only if they have to and percentage of students who enjoy going to a bookstore or a
library in 2000 and 2009....................................................................................................................................................91
Figure V.5.4 Index of enjoyment of reading in 2000 and 2009................................................................................................................92
Figure V.5.5 Change in the index of enjoyment of reading for boys and girls between 2000 and 2009...................................................92
Figure V.5.6 Change in the index of enjoyment of reading and the proportion of students who read for enjoyment
between 2000 and 2009.....................................................................................................................................................93
Figure V.5.7 Percentage of students who read fiction in 2000 and 2009.................................................................................................94
Figure V.5.8 Percentage of students who read comic books in 2000 and 2009.......................................................................................95
Figure V.5.9 Percentage of students who read for enjoyment in 2000 and 2009, by socio-economic background...................................96
Figure V.5.10 Change in the percentage of boys and girls who read for enjoyment between 2000 and 2009, by socio-economic
background.........................................................................................................................................................................97
Figure V.5.11	Teacher-student relations in PISA 2000 and 2009................................................................................................................99
Figure V.5.12 Disciplinary climate in PISA 2000 and 2009.....................................................................................................................101
Figure A6.1 Observed score change and score point change adjusted for sampling differences between 2000 and 2009.....................140
Tables
Table A1.1 Link Error Estimates...........................................................................................................................................................113
Table A1.2 Levels of parental education converted into years of schooling.........................................................................................116
Table A1.3 A multilevel model to estimate grade effects in reading, accounting for some background variables.................................117
Table A2.1 PISA target populations and samples................................................................................................................................................................122
Table A2.2 Exclusions.........................................................................................................................................................................124
Table A2.3 Response rates..................................................................................................................................................................126
Table A2.4a Percentage of students at each grade level.........................................................................................................................129
Table A2.4b Percentage of students at each grade level, by gender.......................................................................................................130
Table A2.5 Percentage of students and mean scores in reading, mathematics and science, according to whether students are in or out
of the regular education system in Argentina.....................................................................................................................132
Table A5.1 Participation of countries in different PISA assessments..............................................................................................................................137
Table A6.1 Student background characteristics in PISA 2000 and 2009.....................................................................................................................141
Table A6.2 Trends adjusted for sampling differences...........................................................................................................................144
Table V.2.1 Mean reading performance in PISA 2000, 2003, 2006 and 2009...........................................................................................................146
Table V.2.2 Percentage of students below Level 2 and at Level 5 or above on the reading scale in PISA 2000 and 2009.....................147
Table V.2.3 Percentiles on the reading scale in PISA 2000 and 2009...................................................................................................148
Table V.2.4 Gender differences in reading performance in PISA 2000 and 2009.................................................................................150
Table V.2.5 Percentage of boys below Level 2 and at Level 5 or above on the reading scale in PISA 2000 and 2009...........................151
Table V.2.6 Percentage of girls below Level 2 and at Level 5 or above on the reading scale in PISA 2000 and 2009............................152
Table V.2.7 Trends in reading performance adjusted for demographic changes....................................................................................153
Table V.2.8 Linear trends and annual changes in reading performance across all PISA assessments.....................................................154
Table V.2.9 Mean reading score change between 2003 and 2009 and between 2006 and 2009.........................................................155
Table V.3.1 Mean mathematics performance in PISA 2003, 2006 and 2009.............................................................................................................156
Table V.3.2 Percentage of students below Level 2 and at Level 5 or above on the mathematics scale in PISA 2003 and 2009.............157
Table V.3.3 Annualised changes in mathematics since 2003...............................................................................................................158
Table V.3.4 Mean science performance in PISA 2006 and 2009..........................................................................................................159
Table V.3.5 Percentage of students below Level 2 and at Level 5 or above on the science scale in PISA 2006 and 2009.....................160
Table V.4.1 Between- and within-school variance in reading performance in PISA 2000 and 2009..................................................................161
Table V.4.2 Socio-economic background of students in PISA 2000 and 2009 .....................................................................................162
Table V.4.3 Relationship between reading performance and the PISA index of economic, social, and cultural status (ESCS)
in PISA 2000 and 2009 ....................................................................................................................................................163
Table V.4.4 Percentage of students and reading performance by immigrant status in PISA 2000 and 2009 .........................................164
Table V.4.5 Language spoken at home and reading performance in PISA 2000 and 2009 ...................................................................165
Table V.5.1 Percentage of students reading for enjoyment in PISA 2000 and 2009, by gender ..........................................................................166
Table V.5.2 Index of enjoyment of reading in PISA 2000 and 2009, by gender ...................................................................................167
Table V.5.3 Percentage of students for several items in the index of enjoyment of reading in PISA 2000 and 2009 .............................168
Table V.5.4 Percentage of students reading for enjoyment in PISA 2000 and 2009, by socio-economic background and gender ........171
Table V.5.5 Index of enjoyment of reading in PISA 2000 and 2009, by socio-economic background and gender ...............................174
Table V.5.6 Percentage of students who read diverse materials in PISA 2000 and 2009 ......................................................................177
Table V.5.7 Percentage of students who read diverse materials in PISA 2000 and 2009, by gender .....................................................179
Table V.5.8 Reading performance of students who read fiction in PISA 2000 and 2009 ......................................................................183
Table V.5.9 Performance of students who read fiction in PISA 2000 and 2009, by gender ..................................................................184
Table V.5.10 Diversity of reading materials in PISA 2000 and 2009, by gender ....................................................................................186
Table V.5.11 Teacher-student relations in PISA 2000 and 2009 ............................................................................................................187
Table V.5.12 Disciplinary climate in PISA 2000 and 2009 ....................................................................................................................188
Table S.V.a Mean reading performance in PISA 2000, 2003, 2006 and 2009...........................................................................................................191
Table S.V.b Percentage of students below Level 2 and at Level 5 and above on the reading scale in PISA 2000 and 2009...................191
Table S.V.c Percentiles on the reading scale in PISA 2000 and 2009...................................................................................................191
Table S.V.d Percentage of girls below Level 2 and at Level 5 and above on the reading scale in PISA 2000 and 2009.........................192
Table S.V.e Gender differences in reading performance in PISA 2000 and 2009.................................................................................192
Table S.V.f Percentage of boys below Level 2 and at Level 5 and above on the reading scale in PISA 2000 and 2009........................192
Table S.V.g Mean mathematics performance in PISA 2003, 2006 and 2009........................................................................................193
Table S.V.h Percentage of students below Level 2 and at Level 5 and above on the mathematics scale in PISA 2000 and 2009...........194
Table S.V.i Mean science performance in PISA 2006 and 2009..........................................................................................................195
Table S.V.j Mean mathematics performance in PISA 2003, 2006 and 2009........................................................................................196
Table S.V.k Between- and within-school variance in reading performance in PISA 2000 and 2009.....................................................197
Table S.V.l Socio-economic background of students in PISA 2000 and 2009......................................................................................197
Table S.V.m Relationship between reading performance and the PISA index of economic, social and cultural status (ESCS)
in PISA 2000 and 2009.....................................................................................................................................................198
Table S.V.n Percentage of students and reading performance by immigrant status in PISA 2000 and 2009..........................................199
Table S.V.o Language spoken at home and reading performance in PISA 2000 and 2009....................................................................199
Table S.V.p Between- and within-school variance in reading performance in PISA 2000 and 2009.....................................................200
Table S.V.q Index of enjoyment of reading in PISA 2000 and 2009, by gender (results based on students’ self-reports)........................200
Table S.V.r Percentage of students who read diverse materials in PISA 2000 and 2009.......................................................................201
Table S.V.s Relationship between reading performance and the PISA index of economic, social and cultural status (ESCS)
in PISA 2000 and 2009....................................................................................................................................................202
Table S.V.t Teacher-student relations in PISA 2000 and 2009..............................................................................................................202
Table S.V.u Disciplinary climate in PISA 2000 and 2009.....................................................................................................................203
Executive Summary
The design of PISA does not just allow for a comparison of the relative standing of countries in terms of their learning
outcomes; it also enables each country to monitor changes in those outcomes over time. Such changes indicate how
successful education systems have been in developing the knowledge and skills of 15-year-olds.
Indeed, some countries have seen impressive improvements in performance over the past decade, sometimes
exceeding the equivalent of an average school year’s progress for the entire 15-year-old student population. Some of
these countries have been catching up from comparatively low performance levels while others have been advancing
further from already high levels. All countries seeking to improve their results can draw encouragement – and learn
lessons – from those that have succeeded in doing so in a relatively short period of time.
Changes in student performance over time show that a country’s performance in reading is not set in stone. In both
absolute and relative terms, educational results can improve, and they cannot be regarded either as part of fixed
“cultural” differences between countries or as inevitably linked to each country’s state of economic development.
Since both PISA 2000 and PISA 2009 focused on reading, it is possible to track how student performance in reading
changed over that period. Among the 26 OECD countries with comparable results in both assessments, Chile, Israel,
Poland, Portugal, Korea, Hungary and Germany as well as the partner countries Peru, Albania, Indonesia, Latvia,
Liechtenstein and Brazil all improved their reading performance between 2000 and 2009, while performance
declined in Ireland, Sweden, the Czech Republic and Australia.
Between 2000 and 2009, the percentage of low performers in Chile dropped by more than 17 percentage points, while
the share of top performers in Korea grew by more than 7 percentage points.
In many countries, improvements in results were largely driven by improvements at the bottom end of the
performance distribution, signalling progress towards greater equity in learning outcomes. Among OECD
countries, variation in student performance fell by 3%. On average across the 26 OECD countries with
comparable data for both assessments, 18% of students performed below the baseline reading proficiency Level 2
in 2009, while 19% did so in 2000. Among countries where between 40% and 60% of students performed below
Level 2 in 2000, Chile reduced that proportion by the largest amount, and Mexico and the partner country Brazil
also show important decreases in their share of low performers. Among countries where the proportion of students
performing below Level 2 was smaller than 40% but still above the OECD average of 19%, the partner country
Latvia reduced the proportion by 13 percentage points, while Portugal, Poland, Hungary, Germany, Switzerland
and the partner country Liechtenstein reduced the share by smaller amounts. In Denmark, the percentage of
students below Level 2 fell from an already below-average level.
The share of top performers – those students who attain proficiency Level 5 or 6 in reading – increased in
Japan, Korea and the partner economy Hong Kong-China such that these countries now have the largest proportions
of high-achieving students among the countries participating in the 2009 assessment. Several countries that had
above-average proportions of top performers in 2000 saw those proportions decrease in 2009. Notable among them
was Ireland, where the proportion of top performers fell from 14% to 7%, which is below the OECD average.
Between 2000 and 2009, Poland, Portugal, Germany, Switzerland and the partner countries Latvia and Liechtenstein
raised the performance of their lowest-achieving students while maintaining the performance level among their
highest-achieving students. Korea, Israel and the partner country Brazil raised the performance of their highest-
achieving students while maintaining the performance level among their lowest-achieving students. Chile and the
partner countries Indonesia, Albania and Peru showed improvements in reading performance among students at all
proficiency levels.
On average, OECD countries narrowed the gap in scores between their highest- and lowest-performing students
between 2000 and 2009; some also improved overall performance. In Chile, Germany, Hungary, Poland, Portugal,
and the partner countries Indonesia, Latvia and Liechtenstein, overall performance improved while the variation in
performance decreased. In many cases, this was the result of improvements among low-achieving students.
The gender gap in reading performance did not narrow in any country between 2000 and 2009.
The gender gap in reading performance widened in Israel, Korea, Portugal, France and Sweden, and in the partner
countries and economies Romania, Hong Kong-China, Indonesia and Brazil between 2000 and 2009. The fact that
girls outperform boys in reading is most evident in the proportion of girls and boys who perform below baseline
proficiency Level 2. Across OECD countries, 24% of boys perform below Level 2 compared to only 12% of girls.
The proportion of girls performing below this level decreased by two percentage points between 2000 and 2009,
while the share of low-achieving boys did not change during the period.
Across the OECD area, the percentage of students with an immigrant background increased by an average of two
percentage points between 2000 and 2009. The performance gap between students with and without an immigrant
background remained broadly similar over the period. However, some countries noted large reductions in the
performance advantage of students without an immigrant background. In Belgium, Switzerland and Germany, the
gap narrowed by between 28 and 38 score points due to improvements in reading proficiency among students with
an immigrant background. However, the gap is still relatively wide in these countries.
Across OECD countries, overall performance in mathematics remained unchanged between 2003 and 2009, as did
performance in science between 2006 and 2009.
In mathematics, students in Mexico, Turkey, Greece, Portugal, Italy, Germany and the partner countries Brazil and
Tunisia improved their mathematics scores considerably, while students in the Czech Republic, Ireland, Sweden,
France, Belgium, the Netherlands, Denmark, Australia and Iceland saw declines in their performance. On average
across the 28 OECD countries with comparable results in the 2003 and 2009 assessments, the share of students
below mathematics proficiency Level 2 remained broadly similar over the period, with a minor decrease from
21.6% to 20.8%. Among the OECD countries in which more than half of students performed below mathematics
proficiency Level 2 in 2003, Mexico shrank this proportion by 15 percentage points, from 66% to 51%, by 2009
while Turkey reduced it from 52% to 42% during the same period. Meanwhile, the percentage of top performers
in mathematics in those 28 OECD countries decreased slightly, from 14.7% in 2003 to 13.4% in 2009. Portugal
showed the largest increase – four percentage points – in top performers in mathematics.
In science, 11 of the 56 countries that participated in both the 2006 and 2009 assessments show improvements in
student performance. Turkey, for example, saw a 30 score point increase, nearly half a proficiency level, in just three
years. Turkey also reduced the percentage of students below science proficiency Level 2 by almost 17 percentage
points, from 47% to 30%. Portugal, Chile, the United States, Norway, Korea and Italy all reduced the share of lowest
performers in science by around five percentage points or more, as did the partner countries Qatar, Tunisia, Brazil
and Colombia. Performance in science declined considerably in five countries.
On average across OECD countries, the percentage of students who report reading for enjoyment daily dropped by five
percentage points.
Enjoyment of reading has generally deteriorated, especially among boys, highlighting the challenge schools face in engaging students in reading activities that 15-year-olds find relevant and interesting. On average across OECD
countries, the percentage of students who said they read for enjoyment every day fell from 69% in 2000 to 64%
in 2009. On the other hand, changes in student-teacher relations and classroom climate have generally been favourable or, at least, have not deteriorated, contrary to what many might have expected. Generally, students have become
more confident that they can get help from their teachers. Across the 26 OECD countries that participated in both
assessments, 74% of students in 2000 agreed or strongly agreed with the statements, “If I need extra help, I will
receive it from my teachers” or “Most of my teachers treat me fairly”, while in 2009, 79% of students agreed or
strongly agreed with those statements. Overall, aspects of classroom discipline have also improved. Thus there is no
evidence to justify the notion that students are becoming progressively more disengaged from school.
Introduction to PISA
The PISA surveys
Are students well prepared to meet the challenges of the future? Can they analyse, reason and communicate
their ideas effectively? Have they found the kinds of interests they can pursue throughout their lives as productive
members of the economy and society? The OECD Programme for International Student Assessment (PISA) seeks to
answer these questions through its triennial surveys of key competencies of 15-year-old students in OECD member
countries and partner countries/economies. Together, the group of countries participating in PISA represents nearly
90% of the world economy.1
PISA assesses the extent to which students near the end of compulsory education have acquired some of the
knowledge and skills that are essential for full participation in modern societies, with a focus on reading, mathematics
and science.
PISA has now completed its fourth round of surveys. Following the detailed assessment of each of PISA’s three main
subjects – reading, mathematics and science – in 2000, 2003 and 2006, the 2009 survey marks the beginning of
a new round with a return to a focus on reading, but in ways that reflect the extent to which reading has changed
since 2000, including the prevalence of digital texts.
PISA 2009 offers the most comprehensive and rigorous international measurement of student reading skills to date.
It assesses not only reading knowledge and skills, but also students’ attitudes and their learning strategies in reading.
PISA 2009 updates the assessment of student performance in mathematics and science as well.
The assessment focuses on young people’s ability to use their knowledge and skills to meet real-life challenges. This
orientation reflects a change in the goals and objectives of curricula themselves, which are increasingly concerned
with what students can do with what they learn at school and not merely with whether they have mastered specific
curricular content. PISA’s unique features include its:
• Policy orientation, which connects data on student learning outcomes with data on students’ characteristics
and on key factors shaping their learning in and out of school in order to draw attention to differences in
performance patterns and identify the characteristics of students, schools and education systems that have high
performance standards.
• Innovative concept of “literacy”, which refers to the capacity of students to apply knowledge and skills in key
subject areas and to analyse, reason and communicate effectively as they pose, interpret and solve problems in
a variety of situations.
• Relevance to lifelong learning, which does not limit PISA to assessing students’ competencies in school
subjects, but also asks them to report on their own motivations to learn, their beliefs about themselves and
their learning strategies.
• Regularity, which enables countries to monitor their progress in meeting key learning objectives.
• Breadth of geographical coverage and collaborative nature, which, in PISA 2009, encompasses the 34 OECD
member countries and 41 partner countries and economies.2
The relevance of the knowledge and skills measured by PISA is confirmed by studies tracking young people in the
years after they have been assessed by PISA. Longitudinal studies in Australia, Canada and Switzerland display a
strong relationship between performance in reading on the PISA 2000 assessment at age 15 and future educational
attainment and success in the labour market (see Volume I, Chapter 2).3
The frameworks for assessing reading, mathematics and science in 2009 are described in detail in PISA 2009
Assessment Framework: Key Competencies in Reading, Mathematics and Science (OECD, 2009).
Decisions about the scope and nature of the PISA assessments and the background information to be collected are
made by leading experts in participating countries. Governments guide these decisions based on shared, policy-
driven interests. Considerable efforts and resources are devoted to achieving cultural and linguistic breadth and
balance in the assessment materials. Stringent quality-assurance mechanisms are applied in designing the test, in
translation, sampling and data collection. As a result, PISA findings are valid and highly reliable.
Policy makers around the world use PISA findings to gauge the knowledge and skills of students in their own
country in comparison with those in other countries. PISA reveals what is possible in education by showing
what students in the highest performing countries can do in reading, mathematics and science. PISA is also used to
gauge the pace of educational progress, by allowing policy makers to assess to what extent performance changes
observed nationally are in line with performance changes observed elsewhere. In a growing number of countries,
PISA is also used to set policy targets against measurable goals achieved by other systems, and to initiate research
and peer learning designed to identify policy levers and reform trajectories for improving education. While PISA
cannot identify cause-and-effect relationships between inputs, processes and educational outcomes, it can highlight
the key features in which education systems are similar and different, sharing those findings with educators, policy
makers and the general public.
The first report from the 2009 assessment
This volume is the fifth of six volumes that provide the first international report on results from the PISA 2009
assessment. It provides an overview of trends in student performance in reading, mathematics and science from
PISA 2000 to PISA 2009. It shows educational outcomes over time and tracks changes in factors related to student
and school performance, such as student background and school characteristics and practices.
The other volumes cover the following issues:
• Volume I, What Students Know and Can Do: Student Performance in Reading, Mathematics and Science,
summarises the performance of students in PISA 2009, starting with a focus on reading, and then reporting
on mathematics and science performance. It provides the results in the context of how performance is
defined, measured and reported, and then examines what students are able to do in reading. After a summary of
reading performance, it examines the ways in which this performance varies on subscales representing three
aspects of reading. It then breaks down results by different formats of reading texts and considers gender
differences in reading, both generally and for different reading aspects and text formats. Any comparison
of the outcomes of education systems needs to take into consideration countries’ social and economic
circumstances and the resources they devote to education. To address this, the volume also interprets the
results within countries’ economic and social contexts. The chapter concludes with a description of student
results in mathematics and science.
• Volume II, Overcoming Social Background: Equity in Learning Opportunities and Outcomes, starts by closely
examining the performance variation shown in Volume I, particularly the extent to which the overall variation in
student performance relates to differences in results achieved by different schools. The volume then looks at how
factors such as socio-economic background and immigrant status affect student and school performance, and the
role that education policy can play in moderating the impact of these factors.
• Volume III, Learning to Learn: Student Engagement, Strategies and Practices, explores the information gathered
on students’ levels of engagement in reading activities and attitudes towards reading and learning. It describes
15-year-olds’ motivations, engagement and strategies to learn.
• Volume IV, What Makes a School Successful? Resources, Policies and Practices, explores the relationships between
student-, school- and system-level characteristics, and educational quality and equity. It explores what schools
and school policies can do to raise overall student performance and, at the same time, moderate the impact of
socio-economic background on student performance, with the aim of promoting a more equitable distribution of
learning opportunities.
• Volume VI, Students On Line: Reading and Using Digital Information, (OECD, forthcoming) explains how PISA
measures and reports student performance in digital reading and analyses what students in the 20 countries
participating in this assessment are able to do.
All data tables referred to in the analysis are included at the end of the respective volume. A Reader’s Guide is also
provided in each volume to aid in interpreting the tables and figures accompanying the report.
Technical annexes that describe the construction of the questionnaire indices, sampling issues, quality-assurance
procedures and the process followed for developing the assessment instruments, and information about reliability
of coding are posted on the OECD PISA website (www.pisa.oecd.org). Many of the issues covered in the technical
annexes will be elaborated in greater detail in the PISA 2009 Technical Report (OECD, forthcoming).
The PISA student population
In order to ensure the comparability of the results across countries, PISA devoted a great deal of attention to
assessing comparable target populations. Differences between countries in the nature and extent of pre-primary
education and care, in the age of entry to formal schooling, and in the structure of the education system do not allow
school grade levels to be defined so that they are internationally comparable. Valid international comparisons of
educational performance, therefore, need to define their populations with reference to a target age. PISA covers
students who are aged between 15 years 3 months and 16 years 2 months at the time of the assessment and who
have completed at least 6 years of formal schooling, regardless of the type of institution in which they are enrolled,
whether they are in full-time or part-time education, whether they attend academic or vocational programmes, and
whether they attend public or private schools or foreign schools within the country. (For an operational definition
of this target population, see the PISA 2009 Technical Report [OECD, forthcoming].) The use of this age in PISA,
across countries and over time, allows the performance of students to be compared in a consistent manner before
they complete compulsory education.

• Figure V.A •
A map of PISA countries and economies

OECD countries: Australia, Austria, Belgium, Canada, Chile, Czech Republic, Denmark, Estonia, Finland, France,
Germany, Greece, Hungary, Iceland, Ireland, Israel, Italy, Japan, Korea, Luxembourg, Mexico, Netherlands,
New Zealand, Norway, Poland, Portugal, Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey,
United Kingdom, United States.

Partner countries and economies in PISA 2009: Albania, Argentina, Azerbaijan, Brazil, Bulgaria, Colombia,
Costa Rica*, Croatia, Georgia*, Himachal Pradesh-India*, Hong Kong-China, Indonesia, Jordan, Kazakhstan,
Kyrgyzstan, Latvia, Liechtenstein, Lithuania, Macao-China, Malaysia*, Malta*, Mauritius*, Miranda-Venezuela*,
Montenegro, Netherlands-Antilles*, Panama, Peru, Qatar, Romania, Russian Federation, Serbia, Shanghai-China,
Singapore, Tamil Nadu-India*, Chinese Taipei, Thailand, Trinidad and Tobago, Tunisia, Uruguay,
United Arab Emirates*, Viet Nam*.

Partner countries in previous PISA surveys: Dominican Republic, Macedonia, Moldova.

* These partner countries and economies carried out the assessment in 2010 instead of 2009.
As a result, this report can make statements about the knowledge and skills of individuals born in the same year who
are still at school at 15 years of age, despite having had different educational experiences, both in and outside school.
Stringent technical standards were established to define the national target populations and to identify permissible
exclusions from this definition (for more information, see the PISA website www.pisa.oecd.org). The overall exclusion
rate within a country was required to be below 5% to ensure that, under reasonable assumptions, any distortions in
national mean scores would remain within plus or minus 5 score points, i.e. typically within the order of magnitude of
two standard errors of sampling (see Annex A2). Exclusion could take place either through the schools that participated
or the students who participated within schools. There are several reasons why a school or a student could be excluded
from PISA. Schools might be excluded because they are situated in remote regions and are inaccessible or because
they are very small, or because of organisational or operational factors that precluded participation. Students might be
excluded because of intellectual disability or limited proficiency in the language of the test.
In 29 out of the 65 countries participating in PISA 2009, the percentage of school-level exclusions amounted to less
than 1%; it was less than 5% in all countries. When the exclusion of students who met the internationally established
exclusion criteria is also taken into account, the exclusion rates increase slightly. However, the overall exclusion
rate remains below 2% in 32 participating countries, below 5% in 60 participating countries, and below 7% in
all countries except Luxembourg (7.2%) and Denmark (8.6%). In 15 out of 34 OECD countries, the percentage of
school-level exclusions amounted to less than 1% and was less than 5% in all countries. When student exclusions
within schools are also taken into account, the overall exclusion rate was below 2% in 9 OECD countries and below 5% in 25 OECD countries.
Restrictions on the level of exclusions in PISA 2009 are described in Annex A2.
The specific sample design and size for each country aimed to maximise sampling efficiency for student-level
estimates. In OECD countries, sample sizes ranged from 4 410 students in Iceland to 38 250 students in Mexico.
Countries with large samples have often implemented PISA both at national and regional/state levels (e.g. Australia,
Belgium, Canada, Italy, Mexico, Spain, Switzerland and the United Kingdom). This selection of samples was
monitored internationally and adhered to rigorous standards for the participation rate, both among schools selected
by the international contractor and among students within these schools, to ensure that the PISA results reflect
the skills of the 15-year-old students in participating countries. Countries were also required to administer the test
to students in identical ways to ensure that students receive the same information prior to and during the test (for
details, see Annex A4).
Box V.A Key features of PISA 2009
Content
• The main focus of PISA 2009 was reading. The survey also updated performance assessments in mathematics
and science. PISA considers students’ knowledge in these areas not in isolation, but in relation to their ability to
reflect on their knowledge and experience and to apply them to real-world issues. The emphasis is on mastering
processes, understanding concepts and functioning in various contexts within each assessment area.
• For the first time, the PISA 2009 survey also assessed 15-year-old students’ ability to read, understand and
apply digital texts.
Methods
• Around 470 000 students completed the assessment in 2009, representing about 26 million 15-year-olds in
the schools of the 65 participating countries and economies. Some 50 000 students took part in a second
round of this assessment in 2010, representing about 2 million 15-year-olds from 10 additional partner
countries and economies.
• Each participating student spent two hours carrying out pencil-and-paper tasks in reading, mathematics and
science. In 20 countries, students were given additional questions via computer to assess their capacity to
read digital texts.
• The assessment included tasks requiring students to construct their own answers as well as multiple-choice
questions. The latter were typically organised in units based on a written passage or graphic, much like the
kind of texts or figures that students might encounter in real life.
• Students also answered a questionnaire that took about 30 minutes to complete. This questionnaire focused
on their personal background, their learning habits, their attitudes towards reading, and their engagement
and motivation.
• School principals completed a questionnaire about their school that included demographic characteristics
and an assessment of the quality of the learning environment at school.
Outcomes
PISA 2009 results provide:
• a profile of knowledge and skills among 15-year-olds in 2009, consisting of a detailed profile for reading and
an update for mathematics and science;
• contextual indicators relating performance results to student and school characteristics;
• an assessment of students’ engagement in reading activities, and their knowledge and use of different learning
strategies;
• a knowledge base for policy research and analysis; and
• trend data on changes in student knowledge and skills in reading, mathematics and science, on changes
in student attitudes and socio-economic indicators, and in the impact of some indicators on performance
results.
Future assessments
• The PISA 2012 survey will return to mathematics as the major assessment area; PISA 2015 will focus on
science. Thereafter, PISA will turn to another cycle, beginning with reading again.
• Future tests will place greater emphasis on assessing students’ capacity to read and understand digital texts
and solve problems presented in a digital format, reflecting the importance of information and computer
technologies in modern societies.
Notes
1. The GDP of the countries that participated in PISA 2009 represents 87% of the 2007 world GDP. Some of the entities represented
in this report are referred to as partner economies. This is because they are not strictly national entities.
2. Thirty-one partner countries and economies originally participated in the PISA 2009 assessment and ten additional partner
countries and economies took part in a second round of the assessment.
3. Marks, G.N. (2007); Bertschy, K., M.A. Cattaneo and S.C. Wolter (2009); OECD (2010a).
Reader’s Guide
Data underlying the figures
The data referred to in this volume are presented in Annex B and, in greater detail, on the PISA website
(www.pisa.oecd.org).
Five symbols are used to denote missing data:
a	The category does not apply in the country concerned. Data are therefore missing.
c	There are too few observations or no observations to provide reliable estimates (i.e. there are fewer than
30 students or fewer than five schools with valid data).
m Data are not available. These data were not submitted by the country or were collected but subsequently
removed from the publication for technical reasons.
w Data have been withdrawn or have not been collected at the request of the country concerned.
x Data are included in another category or column of the table.
Country coverage
This publication features data on 65 countries and economies, including all 34 OECD countries and 31 partner
countries and economies (see Figure V.A). The data from another 10 partner countries were collected one year
later and will be published in 2011.
The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities.
The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and
Israeli settlements in the West Bank under the terms of international law.
Calculating international averages
An OECD average was calculated for most indicators presented in this report. The OECD average corresponds
to the arithmetic mean of the respective country estimates.
Readers should keep in mind that the term “OECD average” refers only to the OECD countries included
in the respective comparisons.
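This averaging convention can be sketched in a few lines. The function and country scores below are purely illustrative, not OECD code; the point is only that the OECD average is the unweighted arithmetic mean of the country estimates actually included in a given comparison.

```python
# Illustrative sketch of the "OECD average" convention described above:
# the arithmetic mean of country estimates, computed only over the
# countries included in the comparison. All figures are hypothetical.

def oecd_average(estimates: dict[str, float], included: list[str]) -> float:
    """Unweighted arithmetic mean of the estimates for the included countries."""
    values = [estimates[country] for country in included]
    return sum(values) / len(values)

scores = {"Country A": 500.0, "Country B": 480.0, "Country C": 520.0}

# Averaging over a different subset of countries yields a different "OECD average".
print(oecd_average(scores, ["Country A", "Country B", "Country C"]))  # 500.0
print(oecd_average(scores, ["Country A", "Country B"]))               # 490.0
```

Because each country counts equally regardless of its student population, this is a mean of countries, not a mean of students.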
Rounding figures
Because of rounding, some figures in tables may not exactly add up to the totals. Totals, differences and
averages are always calculated on the basis of exact numbers and are rounded only after calculation.
All standard errors in this publication have been rounded to one or two decimal places. Where the value 0.00
is shown, this does not imply that the standard error is zero, but that it is smaller than 0.005.
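Both conventions above can be demonstrated concretely. This is an illustrative sketch, not the OECD's actual tabulation code, with made-up numbers: totals are computed on exact values and rounded only afterwards, so rounded components need not sum to the rounded total, and a standard error displayed as 0.00 is merely smaller than 0.005.

```python
# Sketch of the rounding conventions described above (illustrative values).

def display_se(se: float, decimals: int = 2) -> str:
    """Format a standard error for display only; a shown "0.00" just means se < 0.005."""
    return f"{se:.{decimals}f}"

# Totals are rounded after calculation, so displayed parts may not add up.
parts = [1.14, 1.14]
total_then_round = round(sum(parts), 1)                   # 2.28 -> 2.3
sum_of_rounded = round(parts[0], 1) + round(parts[1], 1)  # 1.1 + 1.1 = 2.2
print(total_then_round, sum_of_rounded)  # 2.3 2.2

# A tiny but non-zero standard error displays as "0.00".
print(display_se(0.004))  # 0.00
```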
Reporting student data
The report uses “15-year-olds” as shorthand for the PISA target population. PISA covers students who are aged
between 15 years 3 months and 16 years 2 months at the time of assessment and who have completed at least
6 years of formal schooling, regardless of the type of institution in which they are enrolled and of whether
they are in full-time or part-time education, of whether they attend academic or vocational programmes, and
of whether they attend public or private schools or foreign schools within the country.
Reporting school data
The principals of the schools in which students were assessed provided information on their schools’
characteristics by completing a school questionnaire. Where responses from school principals are presented
in this publication, they are weighted so that they are proportionate to the number of 15-year-olds enrolled
in the school.
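The effect of this weighting can be shown with a small sketch. The function and enrolment figures are hypothetical, not from the PISA dataset; they only illustrate that a principal's response counts in proportion to the number of 15-year-olds enrolled, so results describe students rather than schools.

```python
# Illustrative sketch of the school-questionnaire weighting described above:
# each principal's response is weighted by the number of 15-year-olds
# enrolled in that school. All figures are hypothetical.

def weighted_share(responses: list[tuple[bool, int]]) -> float:
    """Share of 15-year-olds in schools whose principal answered 'yes'.

    Each item is (principal_answered_yes, enrolled_15_year_olds).
    """
    total = sum(n for _, n in responses)
    yes = sum(n for answered_yes, n in responses if answered_yes)
    return yes / total

# One large school answering 'yes' outweighs two small schools answering 'no':
# 2 of 3 principals said 'no', but 75% of students attend a 'yes' school.
print(weighted_share([(True, 300), (False, 50), (False, 50)]))  # 0.75
```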
Focusing on statistically significant differences
This volume discusses only statistically significant differences or changes. These are denoted in darker colours
in figures and in bold font in tables. See Annex A3 for further information.
Abbreviations used in this report
ESCS PISA index of economic, social and cultural status
GDP Gross domestic product
ISCED International Standard Classification of Education
PPP Purchasing power parity
S.D. Standard deviation
S.E. Standard error
Further documentation
For further information on the PISA assessment instruments and the methods used in PISA, see the PISA 2009
Technical Report (OECD, forthcoming) and the PISA website (www.pisa.oecd.org).
This report uses the OECD’s StatLinks service. Below each table and chart is a URL leading to a corresponding
Excel workbook containing the underlying data. These URLs are stable and will remain unchanged over time.
In addition, readers of the e-books will be able to click directly on these links and the workbook will open in
a separate window, provided their Internet browser is open and running.
Comparing Performance over Time

This chapter describes how PISA has measured trends in reading performance between the first PISA assessment
in 2000 and the latest in 2009. Since reading was the focus of both assessments, it is possible to obtain detailed
comparisons of how student performance in reading changed between 2000 and 2009. The chapter also discusses
the methods used for tracking trends in student performance in mathematics and science.
PISA 2009 is the fourth full assessment of reading since PISA was launched in 2000, the third assessment of
mathematics since PISA 2003, when the first full assessment of mathematics took place, and the second assessment
of science since PISA 2006, when the first full assessment of science took place.
Both PISA 2000 and PISA 2009 focus on reading, so it is possible to obtain detailed comparisons of how student
performance in reading changed over the 2000-2009 period. Comparisons over time in the areas of mathematics and
science are more limited, since there have not yet been two full assessments of either area in nine years of PISA testing.
Box V.1.1 Interpreting trends requires some caution
• The methodologies underlying the establishment of performance trends in international studies of education
are complex (Gebhardt and Adams, 2007). In order to ensure that the measurement of reading performance
in different surveys is comparable, a number of common assessment items are used in each survey. However,
the limited number of such items increases the risk of measurement errors. Therefore, the confidence band for
comparisons over time is wider than for single-year data, and only changes that are indicated as statistically
significant in this volume should be considered robust.1
• Some countries have not been included in comparisons between 2000, 2003, 2006 and 2009 for
methodological reasons. The PISA 2000 sample for the Netherlands did not meet the PISA response-rate
standards and mean scores for the Netherlands were therefore not reported for 2000. In Luxembourg, the
assessment conditions were changed in substantial ways between the 2000 and 2003 PISA surveys, thus
results are only comparable between 2003, 2006 and 2009.2 The PISA 2000 and PISA 2003 samples for
the United Kingdom did not meet the PISA response-rate standards, so data from the United Kingdom are
not comparable with other countries.3 For the United States, no reading results are available for 2006.4 The
sampling weights for the PISA 2000 assessment in Austria have been adjusted to allow for comparisons with
subsequent PISA assessments.5
For the PISA 2009 assessment in Austria, a dispute between teachers’ unions and the
education minister led to a boycott of PISA, which was withdrawn only after the first week of testing.
The boycott required the OECD to remove identifiable cases from the dataset. Although the Austrian dataset
met the PISA 2009 technical standards after the removal of these cases, the negative atmosphere in regard to
educational assessment has affected the conditions under which the assessment was administered and could
have adversely affected student motivation to respond to the PISA tasks. The comparability of the 2009 data
with data from earlier PISA assessments can therefore not be ensured and data for Austria have therefore been
excluded from trend comparisons.
Some countries did not participate in all PISA assessments. When comparing trends in reading, this volume looks at
the 38 countries with valid results from the 2000 and 2009 assessments.6
When comparing trends in mathematics,
it considers 39 countries with valid results from the 2003 and 2009 assessments. PISA 2000 results in mathematics
are not considered, since the first full assessment in mathematics took place in 2003. Similarly, science performance
in 2009 cannot be compared to that of PISA 2000 or PISA 2003, since the first full science assessment took place
in 2006. Thus, when comparing trends in science, the 56 countries with valid results from the 2006 and 2009
assessments are included. Annex A5 provides a list of countries considered in this trends analysis.
Among OECD countries, the Slovak Republic and Turkey joined PISA in 2003, Chile and Israel did not participate
in the PISA 2003 assessment, and Estonia and Slovenia only participated in 2006 and 2009. The different number
of OECD countries participating in successive PISA assessments is reflected through separate OECD averages that
provide reference points for trend comparisons. For reading, the main reference point is the OECD average for
the 26 OECD countries that participated in both PISA 2000 and PISA 2009, but for comparisons involving all
four assessments, the average for the 23 OECD countries that participated in all of them is also provided. For
mathematics, trends can be calculated for the OECD average in 28 OECD countries that have valid results for both
PISA 2003 and PISA 2009. Thirty-three OECD countries have valid results for the 2006 and 2009 assessments in
science. Annex A5 gives more details on how the OECD average was calculated for different trend comparisons
presented in this volume.
Figure V.1.1 summarises trends in reading performance. The first column provides information on whether reading
performance in PISA 2009 was above (blue), at (no colour) or below (grey) the average for OECD countries.
Countries are sorted by the magnitude of change in reading performance from PISA 2000 to PISA 2009, which is
reported in the second column. Increases in performance are indicated in blue; decreases are indicated in grey.
No colour means that there was no statistically significant change in performance. In addition, the chart highlights
changes in reading performance separately for boys and girls, changes in the proportion of lowest performers (below
proficiency Level 2) and in the proportion of top performers (students at proficiency Level 5 or 6). The last column
shows changes in the relationship between the socio-economic background of students and student performance,
which provides an indication of whether equity in the distribution of educational opportunities has increased (when
the relationship has weakened) or equity has decreased (when the relationship has strengthened).7
In all cases, blue
indicates positive change, grey indicates negative change, and no colour means that there has been no statistically
significant change.
• Figure V.1.1 •
A summary of changes in reading performance
Mean score in reading 2009 is statistically significantly above the OECD average. Changes in reading and in the share of students at proficiency Level 5 or above are
statistically significantly positive. Changes in the share of students below proficiency Level 2 and in the association of socio-economic background with reading are
statistically significantly negative.
Mean score in reading 2009 is not statistically significantly different from the OECD average. Changes in reading, in the share of students at proficiency Level 5
or above, in the share of students below proficiency Level 2 and in the association of socio-economic background with reading are not statistically significant.
Mean score in reading 2009 is statistically significantly below the OECD average. Changes in reading and in the share of students at proficiency Level 5 or above
are statistically significantly negative. Changes in the share of students below proficiency Level 2 and in the association of socio-economic background with
reading are statistically significantly positive.
Columns: country; mean score in reading 2009; change in reading performance between 2000 and 2009 for all students, for boys and for girls; change in the share of students below proficiency Level 2; change in the share of students at proficiency Level 5 or above; change in the association of socio-economic background with reading performance.
Peru 370 43 35 50 -14.8 0.4 0.1
Chile 449 40 42 40 -17.6 0.8 -7.6
Albania 385 36 35 39 -13.7 0.1 -9.9
Indonesia 402 31 23 39 -15.2 0.0 -6.9
Latvia 484 26 28 23 -12.5 -1.2 -11.0
Israel 474 22 9 35 -6.7 3.3 -8.4
Poland 500 21 14 28 -8.2 1.3 -1.5
Portugal 489 19 12 26 -8.6 0.6 -4.7
Liechtenstein 499 17 16 17 -6.4 -0.4 -13.3
Brazil 412 16 9 21 -6.2 0.8 -0.6
Korea 539 15 4 25 0.0 7.2 8.5
Hungary 494 14 11 17 -5.1 1.0 -4.2
Germany 497 13 10 15 -4.2 -1.2 -7.7
Greece 483 9 3 13 -3.1 0.6 2.0
Hong Kong-China 533 8 0 17 -0.8 2.9 -8.6
Switzerland 501 6 1 10 -3.6 -1.1 -2.3
Mexico 425 3 1 6 -4.0 -0.5 -7.3
Belgium 506 -1 0 -5 -1.2 -0.8 0.7
Bulgaria 429 -1 -8 6 0.7 0.6 -4.5
Italy 486 -1 -5 2 2.1 0.5 3.2
Denmark 495 -2 -5 -1 -2.7 -3.4 -3.2
Norway 503 -2 -5 -1 -2.5 -2.8 0.4
Russian Federation 459 -2 -6 1 -0.1 -0.0 1.4
Japan 520 -2 -6 3 3.5 3.6 c
Romania 424 -3 -18 11 -0.9 -1.5 10.7
United States 500 -5 -2 -6 -0.3 -2.4 -9.2
Iceland 500 -7 -10 -6 2.3 -0.5 5.4
New Zealand 521 -8 -8 -8 0.6 -3.0 4.9
France 496 -9 -15 -4 4.6 1.1 7.0
Thailand 421 -9 -6 -10 5.8 -0.2 -0.7
Canada 524 -10 -12 -10 0.7 -4.0 -6.4
Finland 536 -11 -12 -8 1.2 -4.0 5.8
Spain 481 -12 -14 -10 3.3 -0.9 1.5
Australia 515 -13 -17 -13 1.8 -4.9 -1.4
Czech Republic 478 -13 -17 -6 5.6 -1.9 -11.4
Sweden 497 -19 -24 -15 4.9 -2.2 7.7
Argentina 398 -20 -15 -22 7.7 -0.7 -1.7
Ireland 496 -31 -37 -26 6.2 -7.3 5.8
Countries are ranked in descending order of the change in reading performance between 2000 and 2009 for all students.
Source: OECD, PISA 2009 Database, Tables V.2.1, V.2.2, V.2.4 and V.4.3.
StatLink: http://dx.doi.org/10.1787/888932359948
A corrigendum has been issued for this page. See: http://www.oecd.org/dataoecd/43/61/49198566.pdf
In several countries, student achievement has improved markedly across successive PISA assessments since 2000
(Table V.2.1). Each of these countries offers an example of an education system that succeeded in improving its
outcomes (see Chapter 2). This volume includes brief descriptions of some of the education systems that have seen
marked improvements in the performance of their students in PISA. Notes on Korea (Box V.B) and Poland (Box V.C)
appear between Chapters 1 and 2, notes on Portugal (Box V.D) and Turkey (Box V.E) appear between Chapters 3 and 4,
a note on Chile (Box V.F) appears between Chapters 4 and 5, and a note on Brazil (Box V.G) appears after Chapter 5.
School systems differ in many ways, including their overall performance level, the socio-economic background of
students and schools, the learning environment at school and how school systems are organised. Therefore, it is
important to interpret changes in learning outcomes in the context of the underlying characteristics of education
systems. In some of the education systems that have seen improvements or a decline in their performance, some of
the changes can be attributed to changes in the demographic profile of students. For example, in some countries,
student populations have become more socio-economically diverse in recent years, which, as Volume II,
Overcoming Social Background, shows, can be associated with performance disadvantages. A decline in
performance may therefore reflect not a decline in the quality of the educational services provided, but
rather a more challenging socio-economic context. To account for such changes, observed changes in reading
performance are discussed together with trend estimates that have been adjusted for changes in the demographic
and socio-economic profile of students and schools. More detailed descriptions of trends in equity in learning
opportunities and outcomes (see Chapter 4), and trends in the learning environment (see Chapter 5) that have been
observed since 2000 are also presented in this volume.
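The idea behind such an adjustment can be illustrated with a minimal sketch. This is not the procedure used in Annex A6; the linear score-ESCS relationship, the illustrative data and the function name are assumptions made here for illustration only:

```python
def adjusted_mean(scores, escs, reference_escs_mean):
    """Mean score a cohort would be expected to show if its average
    socio-economic (ESCS-like) index equalled a reference cohort's,
    assuming a linear relationship between score and the index."""
    n = len(scores)
    mean_s = sum(scores) / n
    mean_e = sum(escs) / n
    # Slope of a one-covariate least-squares regression of score on ESCS.
    cov = sum((e - mean_e) * (s - mean_s) for e, s in zip(escs, scores)) / n
    var = sum((e - mean_e) ** 2 for e in escs) / n
    slope = cov / var
    # Shift the observed mean to the reference socio-economic profile.
    return mean_s + slope * (reference_escs_mean - mean_e)

# Illustrative data only: a cohort whose mean ESCS sits below the
# reference profile shows an adjusted mean above its observed mean.
cohort_scores = [400.0, 500.0, 600.0]
cohort_escs = [-1.0, 0.0, 1.0]
adjusted = adjusted_mean(cohort_scores, cohort_escs, 0.5)  # roughly 550
```

Under this sketch, a country whose student intake became more disadvantaged between assessments would have an adjusted trend that is less negative (or more positive) than the observed one, which is the direction of reasoning used in the paragraph above.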
Annex A1 provides details on how performance scales were equated and on how trends were computed. Annex A6
provides details on how performance scales were adjusted for demographic and socio-economic context. Overall,
the evidence suggests that the performance trends reported in this volume are not affected by methodological
choices, and that in most countries, they are not driven by changes in the demographic and socio-economic
composition of the student population.
This volume also discusses trends in mathematics and science, although comparisons over time are much more
limited (see Chapter 3). Figure V.1.2 below summarises trends for all three assessment areas. Countries are sorted
by their reading performance in 2009. Since the trends for reading are calculated over a nine-year period for most
of the countries, and over a six-year or a three-year period for some of them, the trends have been annualised to
make them comparable across the three subject areas.8
Similarly, trends for mathematics and science were also
annualised as they are calculated over a six-year or three-year period for mathematics and over a three-year period
for science. Although the annualised figures ensure that the magnitude of changes is comparable across subject
areas, greater variability in reading trends is expected, as the longer reporting period for reading provides more
opportunities to reflect changes in education systems. This has indeed been observed.
Results are reported for all countries that participated in at least two assessments. The number of years for which
reading performance trends were calculated is given after the mean reading performance. Trends in mathematics
were calculated over six years if a country participated from at least 2003, or over three years if a country participated
in the last two assessments. All trends in science were calculated for three years between 2006 and 2009.
Among countries that scored at or above the OECD average, Portugal improved in all assessment areas, Korea
and Poland improved in both reading and science, Germany improved in reading and mathematics, Hungary and
Liechtenstein improved in reading, and Norway and the United States improved in science.
• Figure V.1.2 •
A summary of annualised performance trends in reading, mathematics and science
In the original figure, shading distinguishes three categories of country:
– Mean score in reading in 2009 statistically significantly above the OECD average; annualised score point changes in reading, mathematics and science statistically significantly positive.
– Mean score in reading in 2009 not statistically significantly different from the OECD average; annualised score point changes not statistically significantly different from zero.
– Mean score in reading in 2009 statistically significantly below the OECD average; annualised score point changes statistically significantly negative.
Country | Mean score in reading 2009 | Number of years for which PISA results are available | Reading | Mathematics | Science
Korea 539 9 1.6 0.7 5.3
Finland 536 9 -1.2 -0.6 -3.1
Hong Kong-China 533 8 1.0 0.7 2.3
Canada 524 9 -1.1 -0.9 -1.9
New Zealand 521 9 -0.9 -0.7 0.5
Japan 520 9 -0.3 -0.9 2.7
Australia 515 9 -1.5 -1.7 0.1
Netherlands 508 6 -0.8 -2.0 -0.9
Belgium 506 9 -0.1 -2.3 -1.3
Norway 503 9 -0.2 0.5 4.4
Estonia 501 3 0.1 -0.8 -1.2
Switzerland 501 9 0.7 1.2 1.7
Poland 500 9 2.4 0.8 3.4
Iceland 500 9 -0.7 -1.4 1.6
United States 500 9 -0.5 0.8 4.4
Liechtenstein 499 9 1.9 0.0 -0.7
Sweden 497 9 -2.1 -2.5 -2.7
Germany 497 9 1.5 1.6 1.6
Ireland 496 9 -3.4 -2.6 -0.1
France 496 9 -1.0 -2.3 1.0
Chinese Taipei 495 3 -0.3 -2.1 -4.0
Denmark 495 9 -0.2 -1.8 1.1
United Kingdom 494 3 -0.3 -1.0 -0.4
Hungary 494 9 1.6 0.0 -0.4
Portugal 489 9 2.1 3.5 6.2
Macao-China 487 6 -1.8 -0.3 0.1
Italy 486 9 -0.2 2.9 4.5
Latvia 484 9 2.9 -0.2 1.4
Slovenia 483 3 -3.8 -1.0 -2.4
Greece 483 9 1.0 3.5 -1.1
Spain 481 9 -1.3 -0.3 -0.1
Czech Republic 478 9 -1.5 -3.9 -4.1
Slovak Republic 477 6 1.4 -0.3 0.6
Croatia 476 3 -0.5 -2.4 -2.3
Israel 474 8 2.7 1.7 0.3
Luxembourg 472 6 -1.2 -0.7 -0.8
Lithuania 468 3 -0.5 -3.3 1.2
Turkey 464 6 3.9 3.7 10.0
Russian Federation 459 9 -0.3 -0.1 -0.4
Chile 449 8 5.0 3.2 3.1
Serbia 442 6 5.0 0.9 2.4
Bulgaria 429 8 -0.2 4.9 1.7
Uruguay 426 6 -1.4 0.8 -0.3
Mexico 425 9 0.4 5.5 2.1
Romania 424 7 -0.5 4.1 3.3
Thailand 421 8 -1.2 0.3 1.4
Colombia 413 3 9.3 3.6 4.6
Brazil 412 9 1.7 5.0 5.0
Montenegro 408 3 5.2 1.1 -3.5
Jordan 405 3 1.5 0.9 -2.2
Tunisia 404 6 4.8 2.1 5.1
Indonesia 402 8 3.9 1.9 -3.6
Argentina 398 8 -2.5 2.3 3.2
Albania 385 8 4.5 m m
Qatar 372 3 19.8 16.7 10.0
Peru 370 8 5.3 m m
Azerbaijan 362 3 2.9 -15.0 -3.1
Kyrgyzstan 314 3 9.8 6.9 2.5
Countries are ranked in descending order of the mean score in reading in 2009.
Source: OECD, PISA 2009 Database.
12 http://dx.doi.org/10.1787/888932359948
A corrigendum has been issued for this page. See: http://www.oecd.org/dataoecd/43/61/49198566.pdf
Notes
1. Normally, when making comparisons between two concurrent means, the significance is indicated by calculating the ratio of
the difference of the means to the standard error of the difference of the means. If the absolute value of this ratio is greater than
1.96, then a true difference is indicated with 95% confidence. When comparing two means taken at different times, as in the
different PISA surveys, an extra error term, known as the linking error, is introduced and the resulting statement of significant
difference is more conservative.
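The test described in this note can be sketched as follows; the function name and the illustrative numbers (means, standard errors and linking-error magnitude) are hypothetical, not values from the PISA database:

```python
import math

def significant_change(mean_t1, se_t1, mean_t2, se_t2, linking_error=0.0):
    """Return the score change between two assessments and whether it is
    statistically significant at the 95% level (|ratio| > 1.96).

    Adding the linking error to the sampling variances widens the
    standard error of the difference, making the test more conservative.
    """
    diff = mean_t2 - mean_t1
    se_diff = math.sqrt(se_t1 ** 2 + se_t2 ** 2 + linking_error ** 2)
    return diff, abs(diff / se_diff) > 1.96

# With illustrative numbers, a 10-point gain is significant on sampling
# error alone but not once a 5-point linking error is included:
significant_change(490.0, 3.0, 500.0, 3.0)                     # (10.0, True)
significant_change(490.0, 3.0, 500.0, 3.0, linking_error=5.0)  # (10.0, False)
```

The linking-error values actually applied in PISA are documented in the technical reports; the 1.96 threshold corresponds to a two-sided test at the 95% confidence level.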
2. For Luxembourg, changes were made in the organisational and linguistic aspects of the assessment conditions between PISA
2000 and PISA 2003 in order to improve compliance with OECD standards and to better reflect the national characteristics of the
school system. In PISA 2000, students in Luxembourg had been given one assessment booklet, with the language of assessment
having been chosen by each student one week before the assessment. In practice, however, a lack of familiarity with the language
of assessment was a significant barrier for a large proportion of students in Luxembourg in PISA 2000. In PISA 2003 and PISA
2006, each student was given two assessment booklets – one in each of the two languages of instruction – and the student could
choose his or her preferred language immediately prior to the assessment. This provided for assessment conditions that were
more comparable with those in countries that have only one language of instruction and resulted in a fairer assessment of the
performance of students in mathematics, science, reading and problem-solving. As a result of this change in procedures, the
assessment conditions, and hence the assessment results, for Luxembourg cannot be compared between PISA 2000 and PISA
2003. Assessment conditions between PISA 2003 and PISA 2006 were not changed and therefore those results can be compared.
3. In PISA 2000, the initial response rate for the United Kingdom fell 3.7% short of the minimum requirement. At that time, the
United Kingdom had provided evidence to the PISA Consortium that allowed for an assessment of the expected performance of
the non-participating schools. On the basis of that evidence, the PISA Consortium concluded that the response bias was likely
negligible and the results were included in the international report. In PISA 2003, the United Kingdom’s response rate was such
that sampling standards had not been met, and a further investigation by the PISA Consortium did not confirm that the resulting
response bias was negligible. Therefore, these data were not deemed internationally comparable and were not included in most
types of comparisons. For PISA 2006 and PISA 2009, more stringent standards were applied, and PISA 2000 and PISA 2003 data
for the United Kingdom are therefore not included in comparisons.
4. In the United States, because of an error in printing the test booklets, some of the reading items had incorrect instructions; as
a result, the mean performance in reading cannot be accurately estimated. The impact of the error on the estimates of student
performance is likely to exceed one standard error of sampling. This was not the case for science and mathematics items. For
details, see Annex A3.
5. As noted in the PISA 2000 Technical Report (OECD, 2002a), the Austrian sample for the PISA 2000 assessment did not cover
students enrolled in combined school and work-based vocational programmes as required by the technical standards for PISA.
The published PISA 2000 estimates for Austria were therefore biased (OECD, 2001). This non-conformity was corrected in the
PISA 2003 assessment. To enable reliable comparisons, adjustments and modified student weights were developed to make the
PISA 2000 estimates comparable to those obtained in PISA 2003 (Neuwirth, 2006, available at http://www.oecd-ilibrary.org/education/oecd-education-working-papers_19939019).
6. Albania, Argentina, Bulgaria, Chile, Hong Kong-China, Indonesia, Israel, Peru and Thailand delayed the PISA 2000 assessment
to 2001, while Romania delayed it to 2002. Thus, for these countries, the period of time between PISA 2000 and PISA 2009
assessments is shorter.
7. The relationship between student socio-economic background and performance is captured by a slope co-efficient of the PISA
index of economic, social and cultural status (ESCS) in a regression explaining student reading performance (see
Chapter 4).
8. Annualised trends that are reported here were calculated by dividing the change in performance by the number of years between
two assessments. For example, a change in reading performance between 2000 and 2009 was divided by nine for countries that
participated in the first and in the most recent assessments. For countries that participated in PISA 2003 and PISA 2009 but not
in PISA 2000, the change in reading performance between 2003 and 2009 was divided by six. Similarly, for participants in PISA
2006 and PISA 2009, a change in performance was divided by three. Although annualised trends were calculated for mathematics,
PISA 2000 results were not considered. For science, the change in performance between 2006 and 2009 was divided by three.
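The arithmetic described in this note amounts to dividing the total change by the elapsed years; a minimal sketch with illustrative score changes (not figures from the tables):

```python
def annualised_trend(total_change, first_year, last_year=2009):
    """Score-point change per year between two PISA assessments."""
    return total_change / (last_year - first_year)

# A country tested in PISA 2000 and PISA 2009: divide the change by nine.
reading = annualised_trend(18.0, 2000)       # 2.0 score points per year
# A country that first participated in PISA 2003: divide by six.
mathematics = annualised_trend(-12.0, 2003)  # -2.0 score points per year
# Science trends always run 2006 to 2009: divide by three.
science = annualised_trend(9.0, 2006)        # 3.0 score points per year
```

This is why a nine-year reading trend and a three-year science trend can be compared on the same per-year scale in Figure V.1.2, even though the underlying reporting periods differ.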
Country Boxes
31
PISA 2009 Results: LEARNING TRENDS – VOLUME V © OECD 2010
Box V.B Korea
In 2000, with PISA reading performance at 525 score points, Korea was already performing above the OECD
average. At that time, several countries had similar or even higher performance levels, including Australia,
Canada, Ireland, Japan, New Zealand and Finland, the highest-performing country that year. Nine years later,
Finland has retained its top performance level, but Korea now outperforms all of the other abovementioned
countries. Korea’s experience demonstrates that even at the highest performance level further improvements
are possible.
Despite the country’s strong performance in PISA 2000, Korean policy makers considered that students’ skills
needed further improvement to meet the changing demands of an internationally competitive labour market.
One approach was to shift the focus of the Korean Language Arts Curriculum from proficiency in grammar and
literature to skills and strategies needed for creative and critical understanding and representation, along the
lines of the approach underlying PISA. Diverse teaching methods and materials that reflected those changes
were developed, including investments in related digital and Internet infrastructure.
Recognising reading as a key competence in the 21st century, the government also developed and implemented
reading-related policies. Schools were requested to spend a fixed share of their budgets on reading education.
Training programmes for reading teachers were developed and distributed. Parents were encouraged to
participate more in school activities. They were also given information on how to support their children’s
schoolwork. In addition, socio-economically disadvantaged students were given support through various
after-school reading, writing and mathematics courses that had been put in place at the end of the 1990s.
The new “National Human Resources Development Strategies for Korea” defined policy objectives and
implementation strategies. As part of this, and following experiences with PISA and other instruments, the
government established the National Diagnostic Assessment of Basic Competency (NDABC) and strengthened
the National Assessment of Educational Achievement (NAEA) as measurement tools for monitoring the quality
of educational achievement. These instruments were implemented to ensure that all students had attained basic
competencies. The NDABC was implemented as a diagnostic tool in 2002 to measure basic competency in
reading, writing and mathematics for third-grade students. These measurement tools are now used locally to
diagnose the progress of elementary and middle-school students across different subjects. The NAEA programme
was introduced in 1998. Following changes in educational policy in 2002, the programme has expanded its
subject and grade coverage. NAEA assesses educational achievement and trends for 6th, 9th and 10th grade
students in Korean Language Arts, social studies, mathematics, science and English. With the help of NAEA, the
government monitors individual student performance levels and the accountability of public education.
Since 2000, Korea has seen significant improvements in both reading and science performance (see Figure
V.1.2 and Tables V.2.1 and V.3.4). The proportion of top performers in reading increased in Korea by more than
seven percentage points from 5.7% to 12.9% between 2000 and 2009 (see Figure V.2.5 and Table V.2.2). That
is the highest observed change among countries participating in PISA. Korea also experienced improved scores
in science from an already high level in 2006 (see Figure V.3.5 and Table V.3.4). Moreover, in 2006 11% of its
students scored below Level 2 in science, whereas in 2009 this proportion had been reduced to 6%, nearly the
lowest among the OECD countries (see Figure V.3.7 and Table V.3.5).
On the other hand, Korea is among countries that have seen the highest increase in variation of reading
performance (see Figure V.4.1 and Table V.4.1). A closer look reveals that the increase was driven by
improvements among high-achieving students that were not shared by low-achieving students (see Figure
V.2.11 and Table V.2.3). The 2009 results from Korea also show a modest increase in the association of socio-
economic background with PISA performance.
One factor that may have contributed to an increase in the number of top performers in reading is the introduction
of higher standards and the demand for language literacy. Korean Language Arts has been strengthened as a screening subject in the College Scholastic Ability Test (CSAT), which students must take to get into university,
especially top-ranking institutions. Depending on what subjects they intend to take at university and on their
future careers, students generally select five to seven subjects on the assessment. However, almost all top-
ranking universities focus on Korean Language Arts, mathematics and English. The reading domain of Korean
Language Arts, in particular, is the largest and most important part of this assessment, while NAEA/NDABC tend
to evaluate the six domains of the Korean Language Arts Curriculum – listening, speaking, reading, writing,
literature, and grammar – equally. This provides additional incentives for high-achieving students in Korea to
spend more time studying the language arts and also mathematics and science.
Korea is also one of the countries with the highest number of students participating in after-school lessons. More
than two-thirds of students participate in such lessons for remedial purposes, while nearly half of the students
participate in after-school lessons for enrichment purposes in at least one of the following three subjects:
science, mathematics and reading (see Volume IV, What Makes a School Successful? Resources, Policies and
Practices, Table IV.3.17a). While private lessons are very popular in Korea among those who can afford them,
after-school group classes are often subsidised, so even disadvantaged students enrol frequently. For example, as
of June 2007, 99.8% of all primary and secondary schools were operating after-school programmes and about
50% of all primary and secondary students participated in after-school activities (see MEHRD, 2007). Many
observers suspect that high participation rates in after-school classes in Korea may be due to cultural factors
and an intense focus on preparation for university entrance examinations. PISA 2006 data show that Korean
students attending schools with socio-economically advantaged students are more likely to attend after-school
lessons with private teachers than students in other countries. On the other hand, disadvantaged students in
Korea attend after-school group lessons more often than students in other countries. In both
cases, attending such extra lessons after school is associated with higher performance on PISA (OECD, 2010d).
The gender gap increased by 20 score points in Korea, mainly because of a marked improvement in girls’
performance that was not matched by a similar trend among boys (see Figure V.2.7 and Table V.2.4). The
percentage of top performers increased among girls by more than nine percentage points, while among boys
it rose by slightly less than five percentage points (see Tables V.2.5 and V.2.6). Overall, the average reading
performance improved only among girls, while it remained at similar levels among boys. The remarkable
improvement in girls’ performance was noticed not only in reading, but also in other assessment areas covered
by PISA and other international or national studies. The gender gap in mathematics and science has been
narrowing for a number of years, while PISA 2009 results show that the reading advantage of girls has become
even greater. National assessments demonstrated that the number of girls performing at the highest levels has
been gradually increasing since 2002.
Several changes could be associated with the more positive trend among girls. Since 2000, a more female-
friendly science and mathematics curriculum has been gradually introduced. For instance, women who were
scientists or engineers were promoted and thus became good role models for girls, a more gender-neutral
language was used in textbooks, and learning materials that were considered to be more interesting for girls
were introduced in science teaching. At the same time, national assessments such as the NAEA were re-
developed to better monitor how girls and boys acquire skills differently and to use formats that girls prefer,
including, for example, the constructed-response item format. On the other hand, the trend may also be explained
partly by changes in the society. Over the past few years, the family structure in Korea has changed as the
number of children per household has rapidly decreased and the number of single-child families increased.
While traditionally girls from larger families were unlikely to get a good education, sociologists note that
parents in Korea today tend to value educating their children a great deal, regardless of gender. Smaller families
along with new opportunities and incentives for learning may also explain this trend.
Korean students’ lower performance in the PISA 2006 science assessment compared with the 2003 assessment
prompted policy makers to integrate modern science into school programmes. Although the number of Korean
students who performed below Level 2 in both mathematics and science was very small compared to other
countries, Korean officials considered the overall level of science performance too low. In 2007, the Korean
government decided to merge the Ministry of Science and Technology and the Ministry of Education into
one ministry and to improve and strengthen science education in order to enhance creativity and problem-
solving skills. Measures that have been undertaken cut across different activities, including providing new
mathematics and science textbooks that are more comprehensible and more interesting for students, but
also using teaching methods that encourage experimenting and inquiry-oriented science education. Recent
improvements in science, especially among top-performing students, could be associated with these latest
policy changes. Nevertheless, larger improvements are expected at all performance levels once the new policy
is fully implemented.
Box V.C Poland
In 2000, Poland’s 15-year-old students averaged 479 score points on the PISA reading assessment, well below
the OECD average of 500. More troubling for policy makers in Poland was the fact that over 23% of students
had not reached the baseline Level 2 in reading. The PISA results also showed large disparities in reading
performance between students attending various types of secondary schools. The mean score among students
enrolled in basic vocational schools – who, at that time, constituted more than one-fifth of all students – was 358
score points, while the mean score among students enrolled in general academic schools was 543 score points
and that of students in secondary vocational schools was around 480 score points.
Even prior to the release of the PISA results in 2000, plans were already under way in Poland to improve
student learning outcomes. In 1998, the Polish Ministry of Education presented the outline of a reform agenda
to: i) raise the level of education in Poland by increasing the number of people with secondary and higher-
education qualifications; ii) ensure equal educational opportunities; and iii) support improvements in the quality
of education. The reform was also part of a larger set of changes, including devolving more responsibilities for
education to local authorities, health reforms and pension-system reforms.
The education reform envisaged changes in the structure of the education system, reorganising the school
network and transportation; changes in administration and supervision methods; changes in the curriculum; a
new central examination system with independent student assessments; reorganising school finances through
local government subsidies; and new teacher incentives, such as alternative promotion paths and a revised
remuneration system. Although not all proposed changes were finally implemented as proposed, the reform
clearly changed the way schools in Poland were managed, financed and evaluated, while also affecting teaching
content and methods.
The structural changes resulted in a new type of school: the lower secondary “gymnasium” with the same
general education programme for all students, which became a symbol of the reform. The previous structure,
comprising eight years of primary school followed by four or five years of secondary school or a three-year basic
vocational school, was replaced by a system described as 6+3+3. This meant that education at primary school
was reduced from eight to six years. After completing primary school, a pupil would then continue his or her
education in a comprehensive three-year lower-secondary school. Thus, the period of general education, based
on the same curriculum and standards for all students, was extended by one year. Only after completing three
years of lower-secondary education would he or she move on to a three- or four-year upper-secondary school
that provided access to higher education or to a two- or three-year basic vocational school. In the new system,
each stage of education ends with a standardised national examination, which provides students, parents and
teachers with feedback. Policy makers can also use the results of the examination to monitor the school system
at the local or central level.
The reformers assumed that the lower secondary gymnasia would allow Poland to raise the level of education,
particularly in rural areas where schools were small. The new lower secondary schools would be larger;
they would also be well-equipped, with qualified teachers. Since the number of pupils in each school varies
depending on the catchment area, establishing the lower secondary gymnasia involved reorganising the school
network. Thus, since 2000, a number of small primary schools have been closed, with many more students
travelling to larger lower secondary schools.
The reform postponed choosing between an upper secondary general or vocational curriculum by one year
– giving all students one more year of a general lower secondary programme. The reform did not involve pre-
primary education, nor did it result in lowering the age at which compulsory schooling begins (seven years);
rather, it focused on primary and lower-secondary schools. In the meantime, enrolment in higher education
increased from roughly half a million students before 1993 to nearly two million 15 years later (see GUS,
2009). This also changed the environment in which newly established schools operated, with more parents
keen to provide their children with the best education and more students choosing schools carefully, taking into
consideration future career prospects. Education became highly valued in Poland as the economic returns of a
good education grew (see OECD, 2006a).
The reformers had two main arguments for proposing such changes. First, dividing education into stages would
allow teaching methods and curricula to better meet the specific needs of pupils of various ages. Second,
changing the structure of the education system would require teachers to adapt the curriculum and their
teaching methods, encouraging teachers to change not only what they taught but also how they taught.
After years of complaints about overloaded curricula and disputes about the way forward, the concept of a
core curriculum was adopted. This gave schools extensive autonomy to create their own curricula within a pre-
determined general framework, balancing the three goals of education: imparting knowledge, developing skills
and shaping attitudes. The curricular reform was designed not only to change the content of school education
and to encourage innovative teaching methods, but also to change the teaching philosophy and culture of
schools. Instead of passively following the instructions of the educational authorities, teachers were expected
to develop their own teaching styles, which would be tailored to the needs of their pupils.
Introducing curricular reform based on decentralisation required implementing a system for collecting
information and monitoring the education system at the same time. The reformers thus decided to organise
compulsory assessments to evaluate student achievement at the end of the primary and lower secondary cycles.
The results of the primary school assessments would not affect the students’ school career, as completing the
cycle would not depend on the results of those assessments. However, for admission to upper secondary schools,
the score earned on the lower secondary gymnasium final examination would be considered together with the
pupil’s final marks. Both of those examinations were first administered in 2002. Schooling would culminate
with the matura examination, first administered in 2005, which would be taken at the end of upper secondary
education. All of these examinations would be organised, set and corrected by the central examination board
and regional examination boards, new institutions that had been set up as part of the reform.
Introducing the national examination system not only provided an opportunity to monitor the outcomes of
schools centrally in a partly decentralised system, it also changed incentives for students and teachers. It sent a
clear signal to students that their success depended directly on their externally evaluated outcomes. It also made
it possible to assess teachers and schools on a comparable scale across the whole country. Finally, it provided
local governments with information on the outcomes of schools that were now under their organisational and
financial responsibility.
After the reform, local governments became an even more important part of the Polish school system. Although
by 1996 almost all primary schools were already under the responsibility of local governments, changes in the
financing scheme had been introduced together with the reform. The need to reorganise the school network
in order to create space for lower secondary schools provided additional incentives for local governments to
increase the efficiency and the quality of their local schools. After 2000, school funds were transferred to local
governments using a per-pupil formula. Those funds now constitute a large share of their budgets. After 2002,
some local governments also started using results from national examinations to evaluate their schools and to
shape pre-primary and upper secondary education in their local area.
The reform also introduced a new system of teacher development and evaluation. Initially, many teachers
upgraded their levels of education and professional skills to meet those new requirements. But the changes only
partly affected the remuneration system, which gives local governments and school principals little discretion.
This, together with high employment security and other benefits contained in the so-called Teacher Charter,
limited the impact of changes on the teaching profession (see OECD, 2006a).
The age cohorts covered by PISA in 2000, 2003 and 2006 have been affected by the reform in different ways.
The first group, those assessed in 2000, was not affected by the reform. The group of 15-year-olds in 2003 that
was covered by the second PISA assessment started primary school in the former system, but attended the new
lower secondary gymnasia. Those students all had the same educational curricula and were not divided into
different school types. The group covered by PISA 2006 had been part of the reformed educational system for most of their school careers, while those assessed in 2009 had been part of that system for their entire school careers.
While it is not possible to establish a causal relationship between the reform and the outcomes measured by
PISA, reading performance in Poland has improved by 21 score points since 2000 (see Figure V.2.1 and Table V.2.1). The largest improvement was observed in PISA 2003, right after the reform. The PISA 2009 results suggest
that the lowest-performing students appear to have benefited most from the reform. The share of students below
proficiency Level 2 decreased by eight percentage points and the performance of the lowest-achieving students
increased by 40 score points, while remaining at similar levels for the highest-achieving students (see Figure
V.2.4 and Tables V.2.2 and V.2.3).
Additional analyses suggest that students from former vocational schools benefited most from these reforms (see
Jakubowski, Patrinos, Porta and Wisniewski, 2010). Lower secondary school students assessed in 2006 with the same background as students who were in basic vocational schools in 2000 performed roughly one standard deviation higher on the PISA reading scale. Smaller improvements were also apparent among 2006 lower
secondary school students who had a similar background to those in secondary vocational schools in 2000,
although the benefits to those who were similar to students in general upper secondary schools in 2000 were
negligible. This suggests that the reform improved outcomes for students who would end up in former basic
vocational schools under the old system and who were given a chance to acquire more general skills in newly
created lower secondary schools.
Poland reduced its total variation in reading performance by 20% (see Figure V.4.1 and Table V.4.1). This was
mainly achieved by reducing the differences in performance between schools and improving performance
among the lowest achievers. From a relatively high level in 2000, between-school variation decreased by
three-fourths to a level well below the OECD average. Moreover, by 2009, the association between a school’s
socio-economic background and its mean performance was three times weaker than that in 2000, although
the overall impact of socio-economic background on performance remained unchanged (see Figure V.4.3
and Table V.4.3). This suggests that the school reform in Poland had the effect of distributing students from
different backgrounds more evenly across schools. Nevertheless, the overall improvement in performance,
larger improvements among the lowest-achieving students, and a decrease in the total variation of student
performance, suggest that Poland improved markedly both with regard to its mean performance as well as to
the level of equity in the distribution of learning opportunities.
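The between- and within-school split of total variation referred to above can be sketched with a simple ANOVA-style decomposition. This is a toy illustration with made-up school scores, not the weighted multilevel estimator PISA actually uses:

```python
import statistics

def variance_decomposition(school_scores):
    """Split the total variance of student scores into a between-school
    and a within-school component (simple ANOVA-style decomposition;
    a sketch, not the weighted multilevel estimator used by PISA)."""
    all_scores = [s for school in school_scores for s in school]
    grand_mean = statistics.fmean(all_scores)
    n = len(all_scores)
    between = sum(len(school) * (statistics.fmean(school) - grand_mean) ** 2
                  for school in school_scores) / n
    total = sum((s - grand_mean) ** 2 for s in all_scores) / n
    return between, total - between

# Made-up reading scores for three schools: when school means differ a lot,
# most of the variation lies between schools rather than within them.
between, within = variance_decomposition([[490, 510], [390, 410], [590, 610]])
share_between = between / (between + within)
```

A system that sorts students into schools by ability shows a high between-school share; the Polish reform, by mixing students in comprehensive lower secondary schools, moved that share down.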
2
Trends in Reading

This chapter highlights trends in reading performance between 2000 and 2009. It includes changes in performance among the lowest- and highest-achieving students, boys and girls, students with an immigrant background, socio-economically advantaged and disadvantaged students, and among countries.
Continuity and change in the reading literacy framework and assessment
Reading literacy includes a broad set of cognitive competencies, from basic decoding, through knowledge of
words, grammar and linguistic and textual structures and features, to knowledge about the world. It also includes
metacognitive competencies: the awareness of and ability to use a variety of appropriate strategies when processing
texts. Specifically, PISA defines reading literacy as understanding, using and reflecting on written texts in order to
achieve one’s goals, acquire knowledge, develop one’s potential and participate in society (OECD, 2006b). A more
detailed description of the conceptual framework underlying the PISA reading assessment is provided in Volume I
of this report, What Students Know and Can Do.
The framework and instruments for measuring reading literacy were developed for the PISA 2000 assessment.
The PISA 2000 mean score for reading for 28 OECD countries was set at 500 and the standard deviation was set
at 100, establishing the scale against which reading performance in PISA 2009 was compared. Two countries
that participated in PISA 2000 have joined the OECD since 2000, while results for four OECD countries were
excluded from comparisons over time. Thus, reading performance trends are discussed for the 26 OECD countries
that participated in and had comparable results from both the 2000 and 2009 assessments. The PISA 2000 OECD
average for these 26 OECD countries is now 496, while the reading performance scale remained unchanged.1 In PISA 2003 and PISA 2006, when the focus shifted first to mathematics and then to science, reading was allocated
smaller amounts of assessment time than in 2000, allowing only for an update on overall performance rather than
the kind of in-depth analysis of knowledge and skills that was possible for the PISA 2000 and 2009 assessments. To
ensure comparability across successive reading assessments, 41 out of the 130 PISA reading items used in the 2009
assessment were taken from the PISA 2000 assessment. These items were selected to reflect the overall balance of
the framework so that the proportion of items contained in each type of task was similar. From the 41 items assessed
in both 2000 and 2009, 28 reading items were also used in PISA 2003 and 2006 to assure the comparability of
results for these assessments. Details of the equating methodology for reading performance trends are provided in
Annex A1.
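The balance requirement described above, link items mirroring the task-type proportions of the full pool, can be sketched as a stratified draw. This is a toy sketch over hypothetical item records; the actual PISA item selection is the judgmental and psychometric process documented in Annex A1:

```python
import random
from collections import Counter

def pick_link_items(items, n_link, seed=1):
    """Pick link items so that each task type keeps roughly the share it
    has in the full pool. Rounding may shift the total by an item or two."""
    random.seed(seed)
    by_type = {}
    for item in items:
        by_type.setdefault(item["task_type"], []).append(item)
    chosen = []
    for pool in by_type.values():
        # Each type contributes in proportion to its share of the full pool.
        k = round(n_link * len(pool) / len(items))
        chosen.extend(random.sample(pool, min(k, len(pool))))
    return chosen

# Hypothetical pool of 130 items across three task types; draw ~41 link items.
pool = [{"id": i, "task_type": t}
        for i, t in enumerate(["access"] * 40 + ["integrate"] * 50 + ["reflect"] * 40)]
links = pick_link_items(pool, 41)
shares = Counter(item["task_type"] for item in links)
```

The task-type names and pool sizes here are invented for illustration; only the 41-of-130 ratio comes from the text.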
The scale on which student performance is reported is thus the same as the one used in 2000. It can be compared
across all four cycles. Consequently, the proficiency levels are also the same, although in 2009 the reading scale was
extended with new proficiency levels, at both the top and bottom ends of the performance distribution, to reflect the
capacity of PISA 2009 to provide more detailed information about low- and high-performing students.
How student performance in reading has changed since 2000
Average reading performance across the 26 OECD countries that had comparable results in both the 2000 and 2009 assessments has remained broadly unchanged since 2000. This, in itself, is noteworthy because, in recent years,
most countries have increased their investment in education substantially. Between 1995 and 2007, expenditure per
primary and secondary student increased by 43% in real terms, on average, across OECD countries (OECD, 2010b,
Table B1.5). In the short period between 2000, when the first PISA assessment was undertaken, and 2007, increases
in expenditures on education averaged around 25%; eight OECD countries increased their expenditures by between
35% and 71%. While not all these expenditures were devoted to raising the performance of students assessed in
PISA, it is intriguing that in many countries, such major financial efforts have not yet translated into improvements
in performance.
However, some countries have seen marked improvements in learning outcomes. Among the 38 countries that can
be compared between 2000 and 2009, 13 have seen improvements in reading performance since 2000 (Figure
V.2.1, see also Table V.2.1). Of the 26 OECD countries with comparable results for both assessments, seven countries
have seen improvements: Chile, Israel and Poland all improved their reading performance by more than 20 score
points, and Portugal, Korea, Hungary and Germany by between 10 and 20 score points. Similarly, among the
partner countries, Peru, Albania, Indonesia and Latvia improved their performance by more than 20 score points,
and Liechtenstein and Brazil by between 10 and 20 score points.
Four countries saw a decline in their reading performance between 2000 and 2009. Among those, student
performance in Ireland decreased by 31 score points, in Sweden by 19 score points, and in Australia and the Czech
Republic by 13 score points.
PISA reports a change as statistically significant, and marks it as such, only when the uncertainty in measuring changes in performance implies that an increase or decrease would be identified in fewer than five out
of 100 replications of PISA when, in fact, there is no change. It is possible to calculate the exact percentage of
replications in which a change would be reported when there is no real change. This so-called “p-value” is reported
in Figure V.2.1 (see also the last column in Table V.2.1). The smaller this percentage, the more confidence one can
have that the observed changes are real. The p-value allows readers to assess the reliability of observed performance
differences that are not identified as statistically significant by PISA, using the stringent criteria described above. For
example, the observed increase in performance is nine score points in Greece and eight score points in Hong Kong-China. These are sizeable changes, but the p-values for these estimates suggest that, in 28 out of 100 replications in the case of Greece and in 21 out of 100 replications in the case of Hong Kong-China, PISA could have identified such a change even if there were, in fact, no change. Because of the magnitude of the potential error, PISA does not
identify these changes as statistically significant. However, readers who are satisfied with a lower level of confidence
can still take these values into consideration.
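Under the usual normal approximation, the p-value for an observed change can be computed from the two cycle means and their standard errors. In the sketch below, the standard errors and the linking error are hypothetical values chosen only to reproduce a Greece-like result; the real estimates are in Table V.2.1 and Annex A1:

```python
from math import erfc, sqrt

def change_p_value(mean_2000, mean_2009, se_2000, se_2009, se_link=0.0):
    """Two-sided p-value for a score-point change between two PISA cycles,
    under a normal approximation. The standard error of the change combines
    the sampling errors of both cycles and, in PISA, a linking error for
    equating the two scales."""
    diff = mean_2009 - mean_2000
    se_diff = sqrt(se_2000 ** 2 + se_2009 ** 2 + se_link ** 2)
    z = diff / se_diff
    return erfc(abs(z) / sqrt(2))  # P(|Z| >= |z|) when there is no real change

# Greece-like example: a 9 score-point increase (474 -> 483). The standard
# errors and linking error here are hypothetical, chosen for illustration;
# they yield a p-value close to 0.28, i.e. such a change would be reported
# in roughly 28 of 100 replications even if there were no real change.
p_greece = change_p_value(474, 483, 5.0, 4.3, se_link=5.0)
```

The same formula shows why a 21-point change (as in Poland) is significant while a 9-point change may not be: the difference grows while the combined standard error stays roughly the same.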
• Figure V.2.1 •
Change in reading performance between 2000 and 2009
[Bar chart: score point change in reading performance between 2000 and 2009 (vertical axis, -35 to 50), with the p-value (in %) shown for each country: Peru 0, Chile 0, Albania 0, Indonesia 0, Latvia 0, Israel 4, Poland 0, Portugal 1, Liechtenstein 2, Brazil 1, Korea 3, Hungary 4, Germany 3, Greece 28, Hong Kong-China 21, Switzerland 38, Mexico 60, OECD average-26 90, Belgium 86, Bulgaria 89, Italy 81, Denmark 74, Norway 74, Russian Federation 74, Japan 77, Romania 63, United States 62, Iceland 21, New Zealand 20, France 17, Thailand 15, Canada 6, Finland 8, Spain 5, Australia 4, Czech Republic 3, Sweden 0, Argentina 9, Ireland 0.]
Note: Statistically significant score point changes are marked in a darker tone.
Countries are ranked in descending order of the score point change in reading performance between 2000 and 2009.
Source: OECD, PISA 2009 Database, Table V.2.1.
12 http://dx.doi.org/10.1787/888932359967
Countries differ in their absolute performance levels, so even with improvements in reading performance, some
countries still perform far below the OECD average, while some countries with a decline in reading performance
may still outperform many other countries. It is thus useful to examine both where countries stand and how
performance has changed.
Countries towards the right of Figure V.2.2 improved their performance between 2000 and 2009, while those towards
the left saw a decrease in student scores. Countries towards the top performed above the OECD average in 2009,
while those towards the bottom performed below the OECD average. Countries that improved their performance
between 2000 and 2009 can be classified into three groups, depending on their performance level in 2009. The
first group includes countries that improved their performance but still performed below the OECD average. These
countries are represented in the bottom-right corner of Figure V.2.2. The second group includes countries that
improved their performance so that they now perform close to the OECD average. These countries are represented
in the middle-right of Figure V.2.2. The third group contains countries that had already outperformed most of the
PISA participants but still improved their performance. These countries are on the top-right part of Figure V.2.2. For
countries with a white marker the changes were not statistically significant.
Among countries that scored above the OECD average in 2009, three countries improved their performance. Korea
improved its performance by 15 score points from an already high level in 2000. Poland improved its performance
by 21 score points and, from a country that performed below the OECD average in 2000, became a country that
scored above the OECD average in 2009. The partner country Liechtenstein improved its performance by 17 score
points. More detailed discussions of the school systems in Korea and Poland are provided in Boxes V.B and V.C,
respectively, which appear between Chapters 1 and 2.
Among average-performing countries in 2009, reading performance improved in Portugal, Hungary and Germany.
Box V.D, which appears between Chapters 3 and 4, provides more details on reforms in Portugal.
Several countries with below-average performance in 2009 saw marked improvements. Among OECD countries,
student performance in Chile increased by 40 score points and is now close to 450 score points, while student
performance in Israel increased by 22 score points and is now equal to 474 score points. Chile’s school system is
briefly discussed in Box V.F, which appears after Chapter 4. The partner country Peru saw the largest improvement,
with an increase of 43 score points, although its overall performance is still below 400 score points. Albania and
Indonesia increased their performance by 30 to 40 score points, although both countries still perform at or below
400 score points. Brazil increased its performance by 16 score points and now performs above 400 score points (see
Box V.G, which appears after Chapter 5). Latvia increased its performance by 26 score points and now performs at
484 score points.
A number of countries performing above the average saw a decrease in reading scores. Australia’s performance
declined by 13 score points but the country still ranks among the top performers in reading. Performance in Ireland
and Sweden declined by 31 and 19 score points, respectively, and both countries now perform around the OECD
average. The Czech Republic also saw a decline in performance and now scores below the OECD average.
• Figure V.2.2 •
How countries perform in reading and how reading performance has changed since 2000
[Scatter plot of the 38 countries with comparable results: score point change in reading between 2000 and 2009 (horizontal axis, -40 to 50) against mean score in reading in 2009 (vertical axis, 350 to 600), with the OECD average marked. The four quadrants combine PISA 2009 performance above or below the OECD average with performance improved or declined since 2000.]
Note: Score point changes in reading performance between 2000 and 2009 that are statistically significant are marked in a darker tone.
Source: OECD, PISA 2009 Database, Table V.2.1.
12 http://dx.doi.org/10.1787/888932359967
• Figure V.2.3 •
Multiple comparisons between 2000 and 2009
[Table. For each country, the figure lists mean reading performance in 2000 and 2009, followed by five groups of comparison countries: countries with lower performance in 2000 and similar performance in 2009; countries with lower or similar performance in 2000 and higher performance in 2009; countries with similar performance in 2000 and 2009; countries with similar or higher performance in 2000 and lower performance in 2009; and countries with higher performance in 2000 and similar performance in 2009. Mean reading scores (2000, 2009): Korea 525, 539; Finland 546, 536; Hong Kong-China 525, 533; Canada 534, 524; New Zealand 529, 521; Japan 522, 520; Australia 528, 515; Belgium 507, 506; Norway 505, 503; Switzerland 494, 501; Poland 479, 500; Iceland 507, 500; United States 504, 500; Liechtenstein 483, 499; Sweden 516, 497; Germany 484, 497; Ireland 527, 496; France 505, 496; Denmark 497, 495; Hungary 480, 494; Portugal 470, 489; Italy 487, 486; Latvia 458, 484; Greece 474, 483; Spain 493, 481; Czech Republic 492, 478; Israel 452, 474; Russian Federation 462, 459; Chile 410, 449; Bulgaria 430, 429; Mexico 422, 425; Romania 428, 424; Thailand 431, 421; Brazil 396, 412; Indonesia 371, 402; Argentina 418, 398; Albania 349, 385; Peru 327, 370. The per-country comparison groups are given in the source table.]
Source: OECD, PISA 2009 Database.
12 http://dx.doi.org/10.1787/888932359967
Figure V.2.3. provides multiple comparisons of changes in the relative standing of countries in reading performance
in 2000 and 2009. Countries are sorted by their performance in 2009. For each country the figure identifies a list
of other countries or economies with similar performance. The first group includes comparisons between countries
that had lower scores in 2000 but have similar performance levels in 2009 as the country shown in the first column.
The second group includes countries with lower or similar scores in 2000 but that show higher scores in 2009.
The third group includes countries whose performance was similar in 2000 and 2009. The fourth group includes
countries with similar or higher scores in 2000 and lower scores in 2009. The fifth group includes countries with
higher scores in 2000 and similar scores in 2009. The figure includes all 38 countries that have comparable results from the 2000 and 2009 assessments.
The figure can be used to see how a country's position changed relative to other countries with similar performance.
Mean performance summarises overall student performance in PISA. While it gives a general idea of how countries
perform in comparison to others, mean performance can mask important variations in student performance. For
policy makers, information about the variability of student performance is important. For example, readers interested
in policies and practices relating to the most talented students might be interested in those countries in which the
highest-achieving students improved their performance, or countries in which the share of high-achieving students
grew. Similarly, readers interested in policies and practices relating to lower-performing students might examine
more closely those countries that have seen improvements among the lowest-achieving students, or where the share
of low-achieving students decreased.
Performance trends among low- and high-achieving students can be examined by considering changes in the
percentage of students at each of the PISA proficiency levels. As explained in Volume I, What Students Know and
Can Do, reading scores in 2009 are reported according to different levels of proficiency that correspond to tasks of
varying difficulty. Establishing proficiency levels in reading makes it possible not only to rank students’ performance
but also to describe what students at different levels of the reading scale can do.
As explained in Volume I, reading proficiency Level 2 can be considered a baseline level of proficiency, at which
students have learned to read and begin to demonstrate the kind of competencies that are required to use reading
for learning. Students below this level may still be capable of locating pieces of explicitly stated information that
are prominent in the text, recognising a main idea in a text about a familiar topic, or recognising the connection
between information in the text and their everyday experience. However, they have not acquired the level of
literacy that is required to participate effectively and productively in life. On average across the 26 OECD countries
with comparable results for both assessments, 18.1% of students performed below Level 2 in 2009, while the
corresponding percentage in 2000 was 19.3% (Table V.2.2). Although this percentage changed only slightly between
the two assessments, it varied noticeably among countries.
Reducing the percentage of poorly performing students is considered one of the most important tasks for school
systems in many countries, given the large economic and social costs associated with poor performance in school.
Following up on students who were assessed in PISA 2000, the Canadian Youth in Transitions Survey shows that
students scoring below Level 2 face a disproportionately higher risk of poor participation in post-secondary
education or low labour-market outcomes at age 19, and even worse outcomes at age 21, the latest age for which
these data are available (OECD, 2010a).
Figure V.2.4 shows changes in the share of students below Level 2. For each country, bars represent the percentage of students performing below Level 2 in 2009, while markers denote that share in 2000. Countries are sorted according to the percentage of students below Level 2 in 2009, with those that have the fewest students at this low proficiency level on the left.
To make comparisons of changes in the percentage of students at different proficiency levels more meaningful,
countries can be grouped according to how many students in those countries performed at each level in 2000. In
2000, more than 60% of students in Peru, Albania and Indonesia performed below Level 2 (Table V.2.2). All three
countries have seen a reduction in this share of more than 10 percentage points. The proportion of lower-performing
students remained at relatively high levels in these countries, but this trend shows that real progress has been made
in all the PISA countries where the very highest percentages of 15-year-olds have limited reading skills.
Among countries where between 40% and 60% of students performed below Level 2 in 2000, in Chile that
proportion decreased by 18 percentage points (see Box V.F), while the proportion decreased by smaller amounts in
Mexico and the partner country Brazil (see Box V.G).
Among countries where the proportion of students performing below Level 2 was smaller than 40% but still above
the OECD average of 19%, the partner country Latvia reduced the proportion by 13 percentage points, while Portugal, Poland,
Hungary, Germany, Switzerland, and the partner country Liechtenstein reduced it by smaller amounts (see Boxes
V.D for Portugal and V.C for Poland for examples of policies that might be associated with these trends). In the
partner country Thailand, the proportion of students performing below Level 2 increased by six percentage points
from a relatively high level of 37%. In countries where the proportion of students performing below Level 2 was
already below average in 2000, Denmark further reduced the proportion by three percentage points and now shows
15% of students below Level 2.
The proportion of students below Level 2 increased in Ireland, the Czech Republic, Sweden, France, Spain and
Iceland. While this proportion is still below the OECD average in Iceland, Ireland and Sweden, it is now above
average in France, Spain and the Czech Republic.
Students performing at Level 5 or 6 are frequently referred to as “top performers” in this report. These students can
handle texts that are unfamiliar in either form or content. They can find information in such texts, demonstrate
detailed understanding, and infer which information is relevant to the task. Using such texts, they are also able to
evaluate critically and to build hypotheses, draw on specialised knowledge and accommodate concepts that may
be contrary to expectations. A comparison of the kinds of tasks students at Level 5 or above are capable of suggests
that those who get to this level can be regarded as potential “world class” knowledge workers of tomorrow. Thus,
the proportion of a country’s students reaching this level is a good indicator of its future economic competitiveness.
On average across the 26 OECD countries with comparable results for both assessments, the combined percentage of
students performing at Level 5 or 6 was 9.0% in 2000 and decreased to 8.2% in 2009 (see Table V.2.2). Although the
proportion of students at this level changed only slightly between the assessments, it varies considerably across countries.
• Figure V.2.4 •
Percentage of students below proficiency Level 2 in reading in 2000 and 2009
[Bar chart: percentage of students below proficiency Level 2 in reading (vertical axis, 0 to 90%) in 2009, with markers for 2000. Change between 2000 and 2009 at the 95% confidence level: decreased (-) in Poland, Denmark, Liechtenstein, Switzerland, Hungary, Latvia, Portugal, the OECD average-26, Germany, Chile, Mexico, Brazil, Indonesia, Albania and Peru; increased (+) in Iceland, Ireland, Sweden, Spain, France, the Czech Republic and Thailand; no statistically significant difference (0) in the remaining countries.]
Countries are ranked in ascending order of the percentage of students below proficiency Level 2 in reading in 2009.
Source: OECD, PISA 2009 Database, Table V.2.2.
12 http://dx.doi.org/10.1787/888932359967
Figure V.2.5 shows changes in the shares of top-performing students. For each country, blue bars represent the
percentage of students performing at Level 5 or 6 in 2009, while markers denote the corresponding proportion in
2000. Countries are sorted according to the percentage of students at Level 5 or above in 2009, with countries that
have the largest proportion of top performers on the left.
• Figure V.2.5 •
Percentage of top performers in reading in 2000 and 2009
Note: Countries are ranked in descending order of top performers in reading in 2009.
Source: OECD, PISA 2009 Database, Table V.2.2.
12 http://dx.doi.org/10.1787/888932359967
The proportion of top performers increased in Japan, Korea and the partner economy Hong Kong-China to among the highest levels of the 2009 participants (Table V.2.2). In Japan, this proportion increased from nearly 10% to
above 13%. In Korea, it increased by more than seven percentage points from less than 6% to almost 13%, which
was the highest observed change among participating countries. Because of this improvement, Korea moved from
below to above the OECD average in the percentage of top performers (see also Box V.B). In Hong Kong-China,
this proportion increased by almost three percentage points to slightly more than 12%. Among countries that
have relatively low proportions of top performers, the percentage of students at Level 5 or above increased by
three percentage points in Israel, and by less than one percentage point in Chile and the partner country Brazil.
In several countries that had above-average proportions of top performers in 2000, this percentage decreased. The
most noticeable change was in Ireland, where the proportion of top performers decreased from 14% to 7%, which
is below the OECD-26 average. In Australia, Canada, Finland and New Zealand, the decrease was smaller and all
these countries still have more top performers than the OECD average for the 26 countries that have comparable
results from both assessments. This proportion decreased in Norway and Sweden from a similar level of 11% in
2000 to 9% in Sweden and 8% in Norway. The proportion of top performers decreased from 8% to less than 5% in
Denmark and from 7% to 5% in the Czech Republic. Interestingly, in Denmark, the proportion of students below
Level 2 also decreased. The partner country Romania is the only country where the proportion of top performers
decreased from an already low level, from 2% to less than 1%.
While trends in proficiency levels compare the highest- and the lowest-performing students with an absolute
measure, it is also possible to compare the top and bottom ends of the performance distribution relative to the
average student within a country. This is particularly useful in countries with very low or high overall levels of
student performance, in which international benchmarks for the highest- and the lowest-performing students
may be less relevant. Such within-country comparisons can be facilitated by analysing the percentiles of the
student performance distribution within a country.

[Figure: Change in the percentage of top performers in reading between 2000 and 2009 — for each country, the percentage of students at proficiency Level 5 or above in 2000 and 2009, marked + where the 2009 share is higher than in 2000, - where it is lower, and 0 where there is no statistically significant difference at the 95% confidence level.]

2
TRENDS IN READING
PISA 2009 Results: LEARNING TRENDS – VOLUME V © OECD 2010 45

Percentiles do not indicate what students can do; they provide
quantitative information on the performance of the lowest- or the highest-achieving students relative to other
students in a country.
The 90th percentile indicates the point on the PISA performance scale below which 90% of students in a country
score, and which only 10% of students exceed. Changes in the value of the 90th percentile show whether a country
saw an increase or decrease in the performance level of its highest-performing students. Similarly, the 10th percentile
indicates the point on the PISA performance scale below which only 10% of students in a country score. A change
in the value of the 10th percentile indicates whether a country sees an increase or decrease in the performance level
of its lowest-performing students.
The difference between the 90th and 10th percentiles can be used as a measure of the range of performance in each
country. Trends in this difference show whether the variation in student performance within the country is changing.
Performance at key percentile ranks can change even if a country’s mean performance remains the same.
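These percentile measures can be sketched with a short, self-contained Python example. The scores below are simulated (not PISA data); the example only illustrates how the 10th and 90th percentiles, and the gap between them, summarise the spread of a performance distribution even when the mean is unchanged:

```python
# Toy illustration with simulated scores (not PISA data): the 10th and 90th
# percentiles of a score distribution, and the range between them as a
# measure of within-country variation in performance.
import random
import statistics

random.seed(0)

def p10_p90(scores):
    """Return the 10th and 90th percentiles of a list of scores."""
    deciles = statistics.quantiles(scores, n=10)  # 9 decile cut points
    return deciles[0], deciles[-1]                # 10th and 90th percentiles

# Two hypothetical cohorts with the same mean but different spread.
cohort_2000 = [random.gauss(500, 100) for _ in range(5000)]
cohort_2009 = [random.gauss(500, 80) for _ in range(5000)]

for year, scores in (("2000", cohort_2000), ("2009", cohort_2009)):
    lo, hi = p10_p90(scores)
    print(f"{year}: P10={lo:.0f}  P90={hi:.0f}  P90-P10={hi - lo:.0f}")
```

Here the simulated 2009 cohort has the same mean as the 2000 cohort, but a narrower 90th-10th percentile gap: the lowest achievers improved and the highest achievers declined, while mean performance did not change.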
• Figure V.2.6 •
Performance changes among the lowest- and highest-achieving students in reading
between 2000 and 2009

[Scatter plot: change in the 90th percentile between 2000 and 2009 (vertical axis) against change in the 10th percentile between 2000 and 2009 (horizontal axis); each country falls into one of four quadrants according to whether performance among its lowest- and highest-achieving students increased or decreased.]

Note: Changes for both lowest- and highest-achieving students that are statistically significant are marked in darker tone.
Source: OECD, PISA 2009 Database, Table V.2.3.
12 http://dx.doi.org/10.1787/888932359967

Figure V.2.6 classifies countries into four groups (see also Table V.2.3). Countries in the top-right corner show
improved performance among both their highest- and lowest-achieving students, while countries in the bottom-left
corner show a decline in performance among both groups of students. Countries in the top-left corner show
improvements in performance among their highest-achieving students and a decline in the performance of their
lowest-achieving students. In these countries, variation in performance increased because of the widening gap between
the top and the bottom levels of student performance. Countries in the bottom-right corner show an improvement in
performance among their lowest-achieving students and a decline among their highest-achieving students. In these
countries, the variation in performance diminished. Most of the countries, however, are situated in the top-right or
bottom-left corners, indicating that performance trends among their lowest- and highest-achieving students in these
countries are similar. Countries indicated with blue markers showed statistically significant changes in the performance
of both their highest- and lowest-achieving students. Countries indicated with white markers did not see statistically
significant changes or saw them for either the highest- or the lowest-achieving students, but not for both.
Chile and three partner countries, Indonesia, Albania and Peru, all show marked improvements in reading
performance among both their lowest- and highest-achieving students. These countries are also among those that
show the largest improvement in mean performance and in which the percentage of students performing below
Level 2 decreased. The lowest-achieving students show relatively larger improvements than the highest-achieving
students in Chile and Indonesia, while in Peru and Albania both groups of students show similar levels of
improvement. In short, in these countries, students across the entire performance scale improved.
Six countries – Poland, Portugal, Germany, Switzerland, and the partner countries Latvia and Liechtenstein – saw
improvements in the performance of their lowest-achieving students while maintaining the performance level
among the highest-achieving students.
Korea, Israel, and the partner country Brazil raised the performance of their highest-achieving students while
maintaining the performance level among the lowest-achieving students.
In Denmark, the performance of the lowest-achieving students improved, while the performance of the highest-
achieving students declined. Similarly, in Norway, the performance of the lowest-achieving students improved and
the share of top performers decreased. As a consequence, the performance gap between the lowest- and the highest-
achieving students narrowed markedly in these two countries, while their mean performance did not change.
In Australia and Canada, and the partner country Romania, performance among their highest-achieving students
declined while performance among their lowest-achieving students remained largely unchanged.
In France, the performance of the lowest-achieving students declined while the performance of the highest-achieving
students remained the same.
In Ireland and to some extent in Sweden, the performance of both the lowest- and highest-achieving students
declined. These countries are also among those that show the greatest decrease in mean performance results and
are among those in which the percentage of students at the highest proficiency levels fell while the percentage of
those below Level 2 rose.
For the rest of the countries, performance among the lowest- and the highest-achieving students did not change measurably.
How gender differences in reading have evolved
The gender gap is far wider in reading than it is in either mathematics or science, and this has been true since
the first PISA assessment in 2000. Girls outperform boys in reading in all countries participating in 2009, with an
average advantage of 39 score points across OECD countries (see Table V.2.4). In 2000, the corresponding gender
gap was 32 score points, on average, across OECD countries.
The gender gap widened in some countries, but it did not narrow in any country. It increased by more than 20 score
points in Israel and Korea and the partner country Romania. In all of these countries, the score point difference
between boys and girls at least doubled. In Israel and Korea, the gap widened because of a marked improvement
in girls’ performance that was not matched by a similar trend among boys (see Box V.B, which discusses changes
in girls’ performance in Korea). The performance advantage among girls also increased in Portugal, the partner
economy Hong Kong-China, and the partner countries Indonesia and Brazil, where the overall positive trend was
due in part to a greater improvement among girls than among boys. The gender gap also widened in France
and Sweden, mainly because of a decline in boys’ performance.
None of the countries where the advantage of girls increased is among those with the widest gender gaps. However,
after the changes in the relative performance of boys and girls in Romania and Israel, the gender gap has become
wider in these countries than on average across OECD countries, while it had previously been narrower.
In general, girls’ performance advantage in reading is most pronounced in the percentage of students who perform
below Level 2 (Tables V.2.5 and V.2.6). Across OECD countries, 24% of boys perform below Level 2 compared to
only 12% of girls. Policy makers in many countries are already concerned about the large percentage of boys who
lack basic reading skills. Therefore, any increase in this share is worth noting.
Figure V.2.8 shows changes in the percentages of boys and girls who perform below Level 2 in reading. Countries
are sorted according to the overall trend among lower-performing students, with those where their numbers have
fallen most shown on the left.
Across OECD countries, the percentage of girls performing below Level 2 decreased by two percentage points,
while the share of lower-performing boys did not change.
In nearly all countries where there was a decrease in the percentage of students performing below Level 2, the
trend was more apparent among girls. In Indonesia, the overall decrease in the percentage of students
performing below Level 2 was around 15 percentage points; but while the percentage of girls performing below
Level 2 decreased by 21 percentage points, the percentage of boys performing at that level decreased by only
9 percentage points. Similarly, in Peru and Albania, the share of girls performing below Level 2 decreased by 19 and
17 percentage points, respectively, whereas the corresponding share of boys decreased by 11 and 12 percentage
points, respectively. In Israel and Brazil, the overall decrease in the share of students performing below Level 2
• Figure V.2.7 •
Comparison of gender differences in reading between 2000 and 2009
[Bar chart: gender difference in reading performance (score-point difference, girls - boys) in 2000 and in 2009 for each country. Girls perform better in all countries/economies; countries where the gender gap in 2009 is higher than in 2000 are marked +, and those with no statistically significant change are marked 0 (95% confidence level).]

Notes: All gender differences in PISA 2009 are significant. Gender differences in 2000 that are statistically significant are marked in a darker tone.
Countries are ranked in ascending order of gender differences (girls - boys) in 2009.
Source: OECD, PISA 2009 Database, Table V.2.4.
12 http://dx.doi.org/10.1787/888932359967
was also mainly the result of improvements among girls, with 11 and 9 percentage points fewer girls, respectively,
performing below Level 2. The decrease in the percentage of boys performing below Level 2 in these countries was
more modest, at two and three percentage points, respectively.
In Chile and Poland, the percentage of boys and girls below Level 2 decreased by about the same amount.
In another set of countries, the percentage of students below Level 2 has risen. In Sweden, France and Spain, this
increase has occurred for both boys and girls although it has been greater for boys. In Ireland, the Czech Republic
and Iceland, only the percentage of boys with a reading proficiency below Level 2 has risen. In Thailand, on the
other hand, it has risen slightly for girls but not for boys.
In most countries, changes in the percentage of top-performing students, those at reading proficiency Level 5 or
6, are quite similar among boys and girls, but in a few countries they differ noticeably (Tables V.2.5 and V.2.6). For
example, while in Denmark and Romania the decrease in the percentage of top performers was almost identical
among boys and girls, it differed in magnitude in Finland, Australia, Canada and Ireland. In New Zealand, only the
percentage of top performers among girls decreased significantly, while in the Czech Republic and Germany, only
the percentage of top performers among boys decreased significantly.
Although the percentage of top performers increased in Japan and Korea and the partner economy Hong Kong-
China to similarly high levels, the increase was very different among boys and girls. In Korea, the increase was the
largest when looking at all students, but also when looking separately at boys and girls. Nonetheless, the percentage
of top performers increased among girls by more than nine percentage points and among boys by slightly less than
five percentage points. In Hong Kong-China, the percentage of top performers among girls increased by more than
six percentage points, while it did not change among boys. Similarly, in Japan, this proportion increased by almost
five percentage points among girls, more than among boys. Effectively, the gap in the proportion of top performers
among boys and girls widened in these countries.
• Figure V.2.8 •
Change in the share of boys and girls who are low performers in reading
between 2000 and 2009
[Bar chart: percentage-point change between 2000 and 2009 in the share of boys and in the share of girls below proficiency Level 2 in reading, by country; bars below zero indicate that the share of students below proficiency Level 2 decreased, bars above zero that it increased.]
Note: Changes in the share of students below proficiency Level 2 that are statistically significant are marked in a darker tone.
Countries are ranked in ascending order of change in the percentage of all students below Level 2 on the reading scale between 2000 and 2009.
Source: OECD, PISA 2009 Database, Tables V.2.2, V.2.5 and V.2.6.
12 http://dx.doi.org/10.1787/888932359967
Changes in performance and changes in student populations
The PISA assessments continue to evolve, to capture newly emerging knowledge and skills as the learning goals
and instructional practices of countries change, reflecting methodological advances. At the same time, PISA
implements high technical standards and coherence in methodologies across successive assessments, ensuring
that performance can be monitored reliably over time and that the samples of students are representative of the
same populations.
However, in many countries the demographic and socio-economic context of student populations has changed.
Thus, observed changes in learning outcomes may not only reflect changes in the quality of the educational services
provided for 15-year-olds, but also changes in the composition of the student populations. For example, if migration
into a country has been significant over the past ten years, it might influence learning outcomes. Similarly, if the
student population has become more socio-economically diverse, then this too can influence outcomes.
This section discusses how trends are affected by changes in student populations. It also provides an overall trend
line that summarises information across all PISA assessments. Annex A6 provides details on methods used in this
section. It also discusses any impact that technical changes in the national samples of students may have on the
comparability of student performance over time.
The impact of changes in the socio-economic composition of student
populations on trends in reading performance
In the following section, changes in the age and gender composition of students, the socio-economic background
of student populations, changes in the share of students who always or almost always speak the language of the
assessment at home, and changes in the share of students with foreign-born parents are accounted for when
interpreting changes in student performance. The corresponding demographic data for 2000 and 2009 are presented
in Annex A6 where the adjustment method is also explained in detail. The data on changes in socio-economic
background are provided in Table V.4.2.
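The idea behind such an adjustment can be sketched with a toy reweighting example. The subgroup names, mean scores and shares below are invented for illustration, and this sketch is not the actual procedure described in Annex A6; it only shows how a mean score shifts when fixed subgroup means are combined with a different population composition:

```python
# Hedged, toy illustration (invented numbers, not PISA data): one simple way
# to account for a change in population composition is to reweight subgroup
# mean scores by a fixed set of subgroup shares.

def weighted_mean(subgroup_means, subgroup_shares):
    """Population mean as the share-weighted average of subgroup means."""
    assert abs(sum(subgroup_shares.values()) - 1.0) < 1e-9
    return sum(subgroup_means[g] * subgroup_shares[g] for g in subgroup_means)

# Hypothetical subgroup mean scores in 2000, and subgroup shares in each year.
means_2000 = {"speaks_test_language_at_home": 510, "other_language": 440}
shares_2000 = {"speaks_test_language_at_home": 0.90, "other_language": 0.10}
shares_2009 = {"speaks_test_language_at_home": 0.80, "other_language": 0.20}

observed_2000 = weighted_mean(means_2000, shares_2000)  # ≈ 503
adjusted_2000 = weighted_mean(means_2000, shares_2009)  # ≈ 496: 2000 scores
                                                        # at 2009 composition
print(f"observed 2000 mean: {observed_2000:.0f}")
print(f"2000 mean at 2009 composition: {adjusted_2000:.0f}")
```

Comparing the 2009 mean with the reweighted 2000 baseline, rather than with the observed 2000 mean, isolates the part of the trend that is not attributable to the compositional shift.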
Figure V.2.9 shows both the observed change in student performance and the predicted performance change if the
composition of the student population in 2000 had been similar to the one in 2009, that is, if the student population
in 2000 had the same age and gender composition, the same socio-economic background and the same share of
• Figure V.2.9 •
Changes in reading performance between 2000 and 2009
[Bar chart: score point change in reading performance between 2000 and 2009 for each country, showing both the observed score point change and the score point change adjusted for socio-demographic changes.]

Note: Observed score point changes that are statistically significant are marked in a darker tone.
Countries are ranked in descending order of the observed score point change between 2000 and 2009.
Source: OECD, PISA 2009 Database, Table V.2.7.
12 http://dx.doi.org/10.1787/888932359967
A painted whore at half a crown.
The bright mind fouled, the beauty gay
All eaten out and fallen away,
By drunken days and weary tramps
From pub to pub by city lamps,
Till men despise the game they started
Till health and beauty are departed,
And in a slum the reeking hag
Mumbles a crust with toothy jag,
Or gets the river's help to end
The life too wrecked for man to mend.
We spat and smoked and took our swipe
Till Silas up and tap his pipe,
And begged us all to pay attention
Because he'd several things to mention.
We'd seen the fight (Hear, hear. That's you);
But still one task remained to do;
That task was his, he didn't shun it,
To give the purse to him as won it;
With this remark, from start to out
He'd never seen a brisker bout.
There was the purse. At that he'd leave it.
Let Kane come forward to receive it.
I took the purse and hemmed and bowed,
And called for gin punch for the crowd;
And when the second bowl was done,
I called, 'Let's have another one.'
Si's wife come in and sipped and sipped
(As women will) till she was pipped.
And Si hit Dicky Twot a clouter
Because he put his arm about her;
But after Si got overtasked
She sat and kissed whoever asked.
My Doxy Jane was splashed by this,
I took her on my knee to kiss.
And Tom cried out, 'O damn the gin;
Why can't we all have women in?
Bess Evans, now, or Sister Polly,
Or those two housemaids at the Folly?
Let someone nip to Biddy Price's,
They'd all come in a brace of trices.
Rose Davies, Sue, and Betsy Perks;
One man, one girl, and damn all Turks.'
But, no. 'More gin,' they cried; 'Come on.
We'll have the girls in when it's gone.'
So round the gin went, hot and heady,
Hot Hollands punch on top of deady.
Hot Hollands punch on top of stout
Puts madness in and wisdom out.
From drunken man to drunken man
The drunken madness raged and ran.
'I'm climber Joe who climbed the spire.'
'You're climber Joe the bloody liar.'
'Who says I lie?'
'I do.'
'You lie,
I climbed the spire and had a fly.'
'I'm French Suzanne, the Circus Dancer,
I'm going to dance a bloody Lancer.'
'If I'd my rights I'm Squire's heir.'
'By rights I'd be a millionaire.'
'By rights I'd be the lord of you,
But Farmer Scriggins had his do,
He done me, so I've had to hoove it,
I've got it all wrote down to prove it.
And one of these dark winter nights
He'll learn I mean to have my rights;
I'll bloody him a bloody fix,
I'll bloody burn his bloody ricks.'
From three long hours of gin and smokes,
And two girls' breath and fifteen blokes',
A warmish night, and windows shut,
The room stank like a fox's gut.
The heat and smell and drinking deep
Began to stun the gang to sleep.
Some fell downstairs to sleep on the mat,
Some snored it sodden where they sat.
Dick Twot had lost a tooth and wept,
But all the drunken others slept.
Jane slept beside me in the chair,
And I got up; I wanted air.
I opened window wide and leaned
Out of that pigstye of the fiend
And felt a cool wind go like grace
About the sleeping market-place.
The clock struck three, and sweetly, slowly,
The bells chimed Holy, Holy, Holy;
And in a second's pause there fell
The cold note of the chapel bell,
And then a cock crew, flapping wings,
And summat made me think of things
How long those ticking clocks had gone
From church and chapel, on and on,
Ticking the time out, ticking slow
To men and girls who'd come and go,
And how they ticked in belfry dark
When half the town was bishop's park,
And how they'd rung a chime full tilt
The night after the church was built,
And how that night was Lambert's Feast,
The night I'd fought and been a beast.
And how a change had come. And then
I thought, 'You tick to different men.'
What with the fight and what with drinking
And being awake alone there thinking,
My mind began to carp and tetter,
'If this life's all, the beasts are better.'
And then I thought, 'I wish I'd seen
The many towns this town has been;
I wish I knew if they'd a-got
A kind of summat we've a-not,
If them as built the church so fair
Were half the chaps folk say they were;
For they'd the skill to draw their plan,
And skill's a joy to any man;
And they'd the strength, not skill alone,
To build it beautiful in stone;
And strength and skill together thus...
O, they were happier men than us.
'But if they were, they had to die
The same as every one and I.
And no one lives again, but dies,
And all the bright goes out of eyes,
And all the skill goes out of hands,
And all the wise brain understands,
And all the beauty, all the power
Is cut down like a withered flower.
In all the show from birth to rest
I give the poor dumb cattle best.'
I wondered, then, why life should be,
And what would be the end of me
When youth and health and strength were gone
And cold old age came creeping on?
A keeper's gun? The Union ward?
Or that new quod at Hereford?
And looking round I felt disgust
At all the nights of drink and lust,
And all the looks of all the swine
Who'd said that they were friends of mine;
And yet I knew, when morning came,
The morning would be just the same,
For I'd have drinks and Jane would meet me
And drunken Silas Jones would greet me,
And I'd risk quod and keeper's gun
Till all the silly game was done.
'For parson chaps are mad supposin'
A chap can change the road he's chosen.'
And then the Devil whispered 'Saul,
Why should you want to live at all?
Why fret and sweat and try to mend?
It's all the same thing in the end.
But when it's done,' he said, 'it's ended.
Why stand it, since it can't be mended?'
And in my heart I heard him plain,
'Throw yourself down and end it, Kane.'
'Why not?' said I. 'Why not? But no.
I won't. I've never had my go.
I've not had all the world can give.
Death by and by, but first I'll live.
The world owes me my time of times,
And that time's coming now, by crimes.'
A madness took me then. I felt
I'd like to hit the world a belt.
I felt that I could fly through air,
A screaming star with blazing hair,
A rushing comet, crackling, numbing
The folk with fear of judgment coming,
A 'Lijah in a fiery car
Coming to tell folk what they are.
'That's what I'll do,' I shouted loud,
'I'll tell this sanctimonious crowd,
This town of window-peeping, prying,
Maligning, peering, hinting, lying,
Male and female human blots
Who would, but daren't be, whores and sots,
That they're so steeped in petty vice
That they're less excellent than lice,
That they're so soaked in petty virtue
That touching one of them will dirt you,
Dirt you with the stain of mean
Cheating trade and going between,
Pinching, starving, scraping, hoarding
Spying through the chinks of boarding
To see if Sue the prentice lean
Dares to touch the margarine.
Fawning, cringing, oiling boots,
Raging in the crowd's pursuits,
Flinging stones at all the Stephens,
Standing firm with all the evens,
Making hell for all the odd,
All the lonely ones of God,
Those poor lonely ones who find
Dogs more mild than human kind.
For dogs,' I said, 'are nobles born
To most of you, you cockled corn.
I've known dogs to leave their dinner,
Nosing a kind heart in a sinner.
Poor old Crafty wagged his tail
The day I first came home from jail,
When all my folk, so primly clad,
Glowered black and thought me mad,
And muttered how they'd been respected,
While I was what they'd all expected.
(I've thought of that old dog for years,
And of how near I come to tears.)
'But you, you minds of bread and cheese,
Are less divine than that dog's fleas.
You suck blood from kindly friends,
And kill them when it serves your ends.
Double traitors, double black,
Stabbing only in the back,
Stabbing with the knives you borrow
From the friends you bring to sorrow.
You stab all that's true and strong;
Truth and strength you say are wrong;
Meek and mild, and sweet and creeping,
Repeating, canting, cadging, peeping,
That's the art and that's the life
To win a man his neighbour's wife.
All that's good and all that's true,
You kill that, so I'll kill you.'
At that I tore my clothes in shreds
And hurled them on the window leads;
I flung my boots through both the winders
And knocked the glass to little flinders;
The punch bowl and the tumblers followed,
And then I seized the lamps and holloed.
And down the stairs, and tore back bolts,
As mad as twenty blooded colts;
And out into the street I pass,
As mad as two-year-olds at grass,
A naked madman waving grand
A blazing lamp in either hand.
I yelled like twenty drunken sailors,
'The devil's come among the tailors.'
A blaze of flame behind me streamed,
And then I clashed the lamps and screamed
'I'm Satan, newly come from hell.'
And then I spied the fire-bell.
I've been a ringer, so I know
How best to make a big bell go.
So on to bell-rope swift I swoop,
And stick my one foot in the loop
And heave a down-swig till I groan,
'Awake, you swine, you devil's own.'
I made the fire-bell awake,
I felt the bell-rope throb and shake;
I felt the air mingle and clang
And beat the walls a muffled bang,
And stifle back and boom and bay
Like muffled peals on Boxing Day,
And then surge up and gather shape,
And spread great pinions and escape;
And each great bird of clanging shrieks
O Fire, Fire! from iron beaks.
My shoulders cracked to send around
Those shrieking birds made out of sound
With news of fire in their bills.
(They heard 'em plain beyond Wall Hills.)
Up go the winders, out come heads,
I heard the springs go creak in beds;
But still I heave and sweat and tire,
And still the clang goes 'Fire, Fire!'
'Where is it, then? Who is it, there?
You ringer, stop, and tell us where.'
'Run round and let the Captain know.'
'It must be bad, he's ringing so.'
'It's in the town, I see the flame;
Look there! Look there, how red it came.'
'Where is it, then? O stop the bell.'
I stopped and called: 'It's fire of hell;
And this is Sodom and Gomorrah,
And now I'll burn you up, begorra.'
By this the firemen were mustering,
The half-dressed stable men were flustering,
Backing the horses out of stalls
While this man swears and that man bawls,
'Don't take th'old mare. Back, Toby, back.
Back, Lincoln. Where's the fire, Jack?'
'Damned if I know. Out Preston way.'
'No. It's at Chancey's Pitch, they say.'
'It's sixteen ricks at Pauntley burnt.'
'You back old Darby out, I durn't.'
They ran the big red engine out,
And put 'em to with damn and shout.
And then they start to raise the shire,
'Who brought the news, and where's the fire?'
They'd moonlight, lamps, and gas to light 'em.
I give a screech-owl's screech to fright 'em,
And snatch from underneath their noses
The nozzles of the fire hoses.
'I am the fire. Back, stand back,
Or else I'll fetch your skulls a crack;
D'you see these copper nozzles here?
They weigh ten pounds apiece, my dear;
I'm fire of hell come up this minute
To burn this town, and all that's in it.
To burn you dead and burn you clean,
You cogwheels in a stopped machine,
You hearts of snakes, and brains of pigeons,
You dead devout of dead religions,
You offspring of the hen and ass,
By Pilate ruled, and Caiaphas.
Now your account is totted. Learn
Hell's flames are loose and you shall burn.'
At that I leaped and screamed and ran,
I heard their cries go 'Catch him, man.'
'Who was it?' 'Down him.' 'Out him, Ern.'
'Duck him at pump, we'll see who'll burn.'
A policeman clutched, a fireman clutched,
A dozen others snatched and touched.
'By God, he's stripped down to his buff.'
'By God, we'll make him warm enough.'
'After him.' 'Catch him.' 'Out him.' 'Scrob him.'
'We'll give him hell.' 'By God, we'll mob him.'
'We'll duck him, scrout him, flog him, fratch him.'
'All right,' I said. 'But first you'll catch him.'
The men who don't know to the root
The joy of being swift of foot,
Have never known divine and fresh
The glory of the gift of flesh,
Nor felt the feet exult, nor gone
Along a dim road, on and on,
Knowing again the bursting glows,
The mating hare in April knows,
Who tingles to the pads with mirth
At being the swiftest thing on earth.
O, if you want to know delight,
Run naked in an autumn night,
And laugh, as I laughed then, to find
A running rabble drop behind,
And whang, on every door you pass,
Two copper nozzles, tipped with brass,
And doubly whang at every turning,
And yell, 'All hell's let loose, and burning.'
I beat my brass and shouted fire
At doors of parson, lawyer, squire,
At all three doors I threshed and slammed
And yelled aloud that they were damned.
I clodded squire's glass with turves
Because he spring-gunned his preserves.
Through parson's glass my nozzle swishes
Because he stood for loaves and fishes,
But parson's glass I spared a tittle.
He give me an orange once when little,
And he who gives a child a treat
Makes joy-bells ring in Heaven's street,
And he who gives a child a home
Builds palaces in Kingdom come,
And she who gives a baby birth
Brings Saviour Christ again to Earth,
For life is joy, and mind is fruit,
And body's precious earth and root.
But lawyer's glass--well, never mind,
Th'old Adam's strong in me, I find.
God pardon man, and may God's son
Forgive the evil things I've done.
What more? By Dirty Lane I crept
Back to the Lion, where I slept.
The raging madness hot and floodin'
Boiled itself out and left me sudden,
Left me worn out and sick and cold,
Aching as though I'd all grown old;
So there I lay, and there they found me
On door-mat, with a curtain round me.
Si took my heels and Jane my head
And laughed, and carried me to bed.
And from the neighbouring street they reskied
My boots and trousers, coat and weskit;
They bath-bricked both the nozzles bright
To be mementoes of the night,
And knowing what I should awake with
They flannelled me a quart to slake with,
And sat and shook till half-past two
Expecting Police Inspector Drew.
I woke and drank, and went to meat
In clothes still dirty from the street.
Down in the bar I heard 'em tell
How someone rang the fire-bell,
And how th'inspector's search had thriven,
And how five pounds reward was given.
And Shepherd Boyce, of Marley, glad us
By saying it was blokes from mad'us,
Or two young rips lodged at the Prince
Whom none had seen nor heard of since,
Or that young blade from Worcester Walk
(You know how country people talk).
Young Joe the ostler come in sad,
He said th'old mare had bit his dad.
He said there'd come a blazing screeching
Daft Bible-prophet chap a-preaching,
Had put th'old mare in such a taking
She'd thought the bloody earth was quaking.
And others come and spread a tale
Of cut-throats out of Gloucester jail,
And how we needed extra cops
With all them Welsh come picking hops;
With drunken Welsh in all our sheds
We might be murdered in our beds.
By all accounts, both men and wives
Had had the scare up of their lives.
I ate and drank and gathered strength,
And stretched along the bench full length,
Or crossed to window seat to pat
Black Silas Jones's little cat.
At four I called, 'You devil's own,
The second trumpet shall be blown.
The second trump, the second blast;
Hell's flames are loosed, and judgment's passed.
Too late for mercy now. Take warning
I'm death and hell and Judgment morning.'
I hurled the bench into the settle,
I banged the table on the kettle,
I sent Joe's quart of cider spinning.
'Lo, here begins my second inning.'
Each bottle, mug, and jug and pot
I smashed to crocks in half a tot;
And Joe, and Si, and Nick, and Percy
I rolled together topsy versy.
And as I ran I heard 'em call,
'Now damn to hell, what's gone with Saul?'
Out into street I ran uproarious
The devil dancing in me glorious.
And as I ran I yell and shriek
'Come on, now, turn the other cheek.'
Across the way by almshouse pump
I see old puffing parson stump.
Old parson, red-eyed as a ferret
From nightly wrestlings with the spirit;
I ran across, and barred his path.
His turkey gills went red as wrath
And then he froze, as parsons can.
'The police will deal with you, my man.'
'Not yet,' said I, 'not yet they won't;
And now you'll hear me, like or don't.
The English Church both is and was
A subsidy of Caiaphas.
I don't believe in Prayer nor Bible,
They're lies all through, and you're a libel,
A libel on the Devil's plan
When first he miscreated man.
You mumble through a formal code
To get which martyrs burned and glowed.
I look on martyrs as mistakes,
But still they burned for it at stakes;
Your only fire's the jolly fire
Where you can guzzle port with Squire,
And back and praise his damned opinions
About his temporal dominions.
You let him give the man who digs,
A filthy hut unfit for pigs,
Without a well, without a drain,
With mossy thatch that lets in rain,
Without a 'lotment, 'less he rent it,
And never meat, unless he scent it,
But weekly doles of 'leven shilling
To make a grown man strong and willing,
To do the hardest work on earth
And feed his wife when she gives birth,
And feed his little children's bones.
I tell you, man, the Devil groans.
With all your main and all your might
You back what is against what's right;
You let the Squire do things like these,
You back him in't and give him ease,
You take his hand, and drink his wine,
And he's a hog, but you're a swine.
For you take gold to teach God's ways
And teach man how to sing God's praise.
And now I'll tell you what you teach
In downright honest English speech.
'You teach the ground-down starving man
That Squire's greed's Jehovah's plan.
You get his learning circumvented
Lest it should make him discontented
(Better a brutal, starving nation
Than men with thoughts above their station),
You let him neither read nor think,
You goad his wretched soul to drink
And then to jail, the drunken boor;
O sad intemperance of the poor.
You starve his soul till it's rapscallion,
Then blame his flesh for being stallion.
You send your wife around to paint
The golden glories of "restraint."
How moral exercise bewild'rin'
Would soon result in fewer children.
You work a day in Squire's fields
And see what sweet restraint it yields;
A woman's day at turnip picking,
Your heart's too fat for plough or ricking.
'And you whom luck taught French and Greek
Have purple flaps on either cheek,
A stately house, and time for knowledge,
And gold to send your sons to college,
That pleasant place, where getting learning
Is also key to money earning.
But quite your damn'dest want of grace
Is what you do to save your face;
The way you sit astride the gates
By padding wages out of rates;
Your Christmas gifts of shoddy blankets
That every working soul may thank its
Loving parson, loving squire
Through whom he can't afford a fire.
Your well-packed bench, your prison pen,
To keep them something less than men;
Your friendly clubs to help 'em bury,
Your charities of midwifery.
Your bidding children duck and cap
To them who give them workhouse pap.
O, what you are, and what you preach,
And what you do, and what you teach
Is not God's Word, nor honest schism,
But Devil's cant and pauperism.'
By this time many folk had gathered
To listen to me while I blathered;
I said my piece, and when I'd said it,
I'll do old purple parson credit,
He sunk (as sometimes parsons can)
His coat's excuses in the man.
'You think that Squire and I are kings
Who made the existing state of things,
And made it ill. I answer, No,
States are not made, nor patched; they grow,
Grow slow through centuries of pain
And grow correctly in the main,
But only grow by certain laws
Of certain bits in certain jaws.
You want to doctor that. Let be.
You cannot patch a growing tree.
Put these two words beneath your hat,
These two: securus judicat.
The social states of human kinds
Are made by multitudes of minds.
And after multitudes of years
A little human growth appears
Worth having, even to the soul
Who sees most plain it's not the whole.
This state is dull and evil, both,
I keep it in the path of growth;
You think the Church an outworn fetter;
Kane, keep it, till you've built a better.
And keep the existing social state;
I quite agree it's out of date,
One does too much, another shirks,
Unjust, I grant; but still ... it works.
To get the whole world out of bed
And washed, and dressed, and warmed, and fed,
To work, and back to bed again,
Believe me, Saul, costs worlds of pain.
Then, as to whether true or sham
That book of Christ, Whose priest I am;
The Bible is a lie, say you,
Where do you stand, suppose it true?
Good-bye. But if you've more to say,
My doors are open night and day.
Meanwhile, my friend, 'twould be no sin
To mix more water in your gin.
We're neither saints nor Philip Sidneys,
But mortal men with mortal kidneys.'
He took his snuff, and wheezed a greeting,
And waddled off to mothers' meeting;
I hung my head upon my chest,
I give old purple parson best.
For while the Plough tips round the Pole
The trained mind outs the upright soul,
As Jesus said the trained mind might,
Being wiser than the sons of light,
But trained men's minds are spread so thin
They let all sorts of darkness in;
Whatever light man finds they doubt it,
They love not light, but talk about it.
But parson'd proved to people's eyes
That I was drunk, and he was wise;
And people grinned and women tittered,
And little children mocked and twittered
So blazing mad, I stalked to bar
To show how noble drunkards are,
And guzzled spirits like a beast,
To show contempt for Church and priest,
Until, by six, my wits went round
Like hungry pigs in parish pound.
At half-past six, rememb'ring Jane,
I staggered into street again
With mind made up (or primed with gin)
To bash the cop who'd run me in;
For well I knew I'd have to cock up
My legs that night inside the lock-up,
And it was my most fixed intent
To have a fight before I went.
Our Fates are strange, and no one knows his;
Our lovely Saviour Christ disposes.
Jane wasn't where we'd planned, the jade.
She'd thought me drunk and hadn't stayed.
So I went up the Walk to look for her
And lingered by the little brook for her,
And dowsed my face, and drank at spring,
And watched two wild duck on the wing.
The moon come pale, the wind come cool,
A big pike leapt in Lower Pool,
The peacock screamed, the clouds were straking,
My cut cheek felt the weather breaking;
An orange sunset waned and thinned
Foretelling rain and western wind,
And while I watched I heard distinct
The metals on the railway clinked.
The blood-edged clouds were all in tatters,
The sky and earth seemed mad as hatters;
They had a death look, wild and odd,
Of something dark foretold by God.
And seeing it so, I felt so shaken
I wouldn't keep the road I'd taken,
But wandered back towards the inn
Resolved to brace myself with gin.
And as I walked, I said, 'It's strange,
There's Death let loose to-night, and Change.'
In Cabbage Walk I made a haul
Of two big pears from lawyer's wall,
And, munching one, I took the lane
Back into Market-place again.
Lamp-lighter Dick had passed the turning
And all the Homend lamps were burning,
The windows shone, the shops were busy,
But that strange Heaven made me dizzy.
The sky had all God's warning writ
In bloody marks all over it,
And over all I thought there was
A ghastly light beside the gas.
The Devil's tasks and Devil's rages
Were giving me the Devil's wages.
In Market-place it's always light,
The big shop windows make it bright;
And in the press of people buying
I spied a little fellow crying
Because his mother'd gone inside
And left him there, and so he cried.
And mother'd beat him when she found him,
And mother's whip would curl right round him,
And mother'd say he'd done't to crost her,
Though there being crowds about he'd lost her.
Lord, give to men who are old and rougher
The things that little children suffer,
And let keep bright and undefiled
The young years of the little child.
I pat his head at edge of street
And gi'm my second pear to eat.
Right under lamp, I pat his head,
'I'll stay till mother come,' I said,
And stay I did, and joked and talked,
And shoppers wondered as they walked.
'There's that Saul Kane, the drunken blaggard,
Talking to little Jimmy Jaggard.
The drunken blaggard reeks of drink.'
'Whatever will his mother think?'
'Wherever has his mother gone?
Nip round to Mrs Jaggard's, John,
And say her Jimmy's out again,
In Market-place, with boozer Kane.'
'When he come out to-day he staggered.
O, Jimmy Jaggard, Jimmy Jaggard.'
'His mother's gone inside to bargain,
Run in and tell her, Polly Margin,
And tell her poacher Kane is tipsy
And selling Jimmy to a gipsy.'
'Run in to Mrs Jaggard, Ellen,
Or else, dear knows, there'll be no tellin',
And don't dare leave yer till you've fount her,
You'll find her at the linen counter.'
I told a tale, to Jim's delight,
Of where the tom-cats go by night,
And how when moonlight come they went
Among the chimneys black and bent,
From roof to roof, from house to house,
With little baskets full of mouse
All red and white, both joint and chop
Like meat out of a butcher's shop;
Then all along the wall they creep
And everyone is fast asleep,
And honey-hunting moths go by,
And by the bread-batch crickets cry;
Then on they hurry, never waiting
To lawyer's backyard cellar grating
Where Jaggard's cat, with clever paw,
Unhooks a broke-brick's secret door;
Then down into the cellar black,
Across the wood slug's slimy track,
Into an old cask's quiet hollow,
Where they've got seats for what's to follow;
Then each tom-cat lights little candles,
And O, the stories and the scandals,
And O, the songs and Christmas carols,
And O, the milk from little barrels.
They light a fire fit for roasting
(And how good mouse-meat smells when toasting),
Then down they sit to merry feast
While moon goes west and sun comes east.
Sometimes they make so merry there
Old lawyer come to head of stair
To 'fend with fist and poker took firm
His parchments channelled by the bookworm,
And all his deeds, and all his packs
Of withered ink and sealing wax;
And there he stands, with candle raised,
And listens like a man amazed,
Or like a ghost a man stands dumb at,
He says, 'Hush! Hush! I'm sure there's summat!'
He hears outside the brown owl call,
He hears the death-tick tap the wall,
The gnawing of the wainscot mouse,
The creaking up and down the house,
The unhooked window's hinges ranging,
The sounds that say the wind is changing.
At last he turns, and shakes his head,
'It's nothing, I'll go back to bed.'
And just then Mrs Jaggard came
To view and end her Jimmy's shame.
She made one rush and gi'm a bat
And shook him like a dog a rat.
'I can't turn round but what you're straying.
I'll give you tales and gipsy playing.
I'll give you wand'ring off like this
And listening to whatever 't is,
You'll laugh the little side of the can,
You'll have the whip for this, my man;
And not a bite of meat nor bread
You'll touch before you go to bed.
Some day you'll break your mother's heart,
After God knows she's done her part,
Working her arms off day and night
Trying to keep your collars white.
Look at your face, too, in the street.
What dirty filth 've you found to eat?
Now don't you blubber here, boy, or
I'll give you sum't to blubber for.'
She snatched him off from where we stand
And knocked the pear-core from his hand,
And looked at me, 'You Devil's limb,
How dare you talk to Jaggard's Jim;
You drunken, poaching, boozing brute, you.
If Jaggard was a man he'd shoot you.'
She glared all this, but didn't speak,
She gasped, white hollows in her cheek;
Jimmy was writhing, screaming wild,
The shoppers thought I'd killed the child.
I had to speak, so I begun.
'You'd oughtn't beat your little son;
He did no harm, but seeing him there
I talked to him and gi'm a pear;
I'm sure the poor child meant no wrong,
It's all my fault he stayed so long,
He'd not have stayed, mum, I'll be bound
If I'd not chanced to come around.
It's all my fault he stayed, not his.
I kept him here, that's how it is.'
'Oh! And how dare you, then?' says she,
'How dare you tempt my boy from me?
How dare you do't, you drunken swine,
Is he your child or is he mine?
A drunken sot they've had the beak to,
Has got his dirty whores to speak to,
His dirty mates with whom he drink,
Not little children, one would think.
Look on him, there,' she says, 'look on him
And smell the stinking gin upon him,
The lowest sot, the drunk'nest liar,
The dirtiest dog in all the shire:
Nice friends for any woman's son
After ten years, and all she's done.
'For I've had eight, and buried five,
And only three are left alive.
I've given them all we could afford,
I've taught them all to fear the Lord.
They've had the best we had to give,
The only three the Lord let live.
'For Minnie whom I loved the worst
Died mad in childbed with her first.
And John and Mary died of measles,
And Rob was drownded at the Teasels.
And little Nan, dear little sweet,
A cart run over in the street;
Her little shift was all one stain,
I prayed God put her out of pain.
And all the rest are gone or going
The road to hell, and there's no knowing
For all I've done and all I've made them
I'd better not have overlaid them.
For Susan went the ways of shame
The time the 'till'ry regiment came,
And t'have her child without a father
I think I'd have her buried rather.
And Dicky boozes, God forgimme,
And now't's to be the same with Jimmy.
And all I've done and all I've bore
Has made a drunkard and a whore,
A bastard boy who wasn't meant,
And Jimmy gwine where Dicky went;
For Dick began the self-same way
And my old hairs are going gray,
And my poor man's a withered knee,
And all the burden falls on me.
'I've washed eight little children's limbs,
I've taught eight little souls their hymns,
I've risen sick and lain down pinched
And borne it all and never flinched;
But to see him, the town's disgrace,
With God's commandments broke in's face,
Who never worked, not he, nor earned,
Nor will do till the seas are burned,
Who never did since he was whole
A hand's turn for a human soul,
But poached and stole and gone with women,
And swilled down gin enough to swim in;
To see him only lift one finger
To make my little Jimmy linger.
In spite of all his mother's prayers,
And all her ten long years of cares,
And all her broken spirit's cry
That drunkard's finger puts them by,
And Jimmy turns. And now I see
That just as Dick was, Jim will be,
And all my life will have been vain.
PISA 2009 Results: Learning Trends
CHANGES IN STUDENT PERFORMANCE SINCE 2000 – VOLUME V

www.oecd.org/publishing
ISBN 978-92-64-09149-8

Please cite this publication as:
OECD (2010), PISA 2009 Results: Learning Trends: Changes in Student Performance Since 2000 (Volume V), PISA, OECD Publishing.
http://dx.doi.org/10.1787/9789264091580-en

This work is published on the OECD iLibrary, which gathers all OECD books, periodicals and statistical databases. Visit www.oecd-ilibrary.org, and do not hesitate to contact us for more information.

Are students well prepared to meet the challenges of the future? Can they analyse, reason and communicate their ideas effectively? Have they found the kinds of interests they can pursue throughout their lives as productive members of the economy and society? The OECD Programme for International Student Assessment (PISA) seeks to answer these questions through the most comprehensive and rigorous international assessment of student knowledge and skills. Together, the group of countries and economies participating in PISA represents nearly 90% of the world economy.

PISA 2009 Results presents the findings from the most recent PISA survey, which focused on reading and also assessed mathematics and science performance. The report comprises six volumes:

• Volume I, What Students Know and Can Do: Student Performance in Reading, Mathematics and Science, compares the knowledge and skills of students across countries.
• Volume II, Overcoming Social Background: Equity in Learning Opportunities and Outcomes, looks at how successful education systems moderate the impact of social background and immigrant status on student and school performance.
• Volume III, Learning to Learn: Student Engagement, Strategies and Practices, examines 15-year-olds’ motivation, their engagement with reading and their use of effective learning strategies.
• Volume IV, What Makes a School Successful? Resources, Policies and Practices, examines how human, financial and material resources, and education policies and practices shape learning outcomes.
• Volume V, Learning Trends: Changes in Student Performance Since 2000, looks at the progress countries have made in raising student performance and improving equity in the distribution of learning opportunities.
• Volume VI, Students on Line: Reading and Using Digital Information, explores students’ use of information technologies to learn.

PISA 2009 marks the beginning of the second cycle of surveys, with an assessment in mathematics scheduled for 2012 and one in science for 2015.

THE OECD PROGRAMME FOR INTERNATIONAL STUDENT ASSESSMENT (PISA)

PISA focuses on young people’s ability to use their knowledge and skills to meet real-life challenges. This orientation reflects a change in the goals and objectives of curricula themselves, which are increasingly concerned with what students can do with what they learn at school and not merely with whether they have mastered specific curricular content. PISA’s unique features include its:

– Policy orientation, which highlights differences in performance patterns and identifies features common to high-performing students, schools and education systems by linking data on learning outcomes with data on student characteristics and other key factors that shape learning in and outside of school.
– Innovative concept of “literacy”, which refers both to students’ capacity to apply knowledge and skills in key subject areas and to their ability to analyse, reason and communicate effectively as they pose, interpret and solve problems in a variety of situations.
– Relevance to lifelong learning, which goes beyond assessing students’ competencies in school subjects by asking them to report on their motivation to learn, their beliefs about themselves and their learning strategies.
– Regularity, which enables countries to monitor their progress in meeting key learning objectives.
– Breadth of geographical coverage and collaborative nature, which, in PISA 2009, encompasses the 34 OECD member countries and 41 partner countries and economies.
The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.

Photo credits:
Getty Images © Ariel Skelley
Getty Images © Geostock
Getty Images © Jack Hollingsworth
Stocklib Image Bank © Yuri Arcurs

Corrigenda to OECD publications may be found on line at: www.oecd.org/publishing/corrigenda.

PISA™, OECD/PISA™ and the PISA logo are trademarks of the Organisation for Economic Co-operation and Development (OECD). All use of OECD trademarks is prohibited without written permission from the OECD.

© OECD 2010

You can copy, download or print OECD content for your own use, and you can include excerpts from OECD publications, databases and multimedia products in your own documents, presentations, blogs, websites and teaching materials, provided that suitable acknowledgment of OECD as source and copyright owner is given. All requests for public or commercial use and translation rights should be submitted to [email protected]. Requests for permission to photocopy portions of this material for public or commercial use shall be addressed directly to the Copyright Clearance Center (CCC) at [email protected] or the Centre français d’exploitation du droit de copie (CFC) at [email protected].

This work is published on the responsibility of the Secretary-General of the OECD. The opinions expressed and arguments employed herein do not necessarily reflect the official views of the Organisation or of the governments of its member countries.

Please cite this publication as:
OECD (2010), PISA 2009 Results: Learning Trends: Changes in Student Performance Since 2000 (Volume V)
http://dx.doi.org/10.1787/9789264091580-en
ISBN 978-92-64-09149-8 (print)
ISBN 978-92-64-09158-0 (PDF)
Foreword

PISA 2009 Results: Learning Trends – Volume V, © OECD 2010

One of the ultimate goals of policy makers is to enable citizens to take advantage of a globalised world economy. This is leading them to focus on the improvement of education policies, ensuring the quality of service provision, a more equitable distribution of learning opportunities and stronger incentives for greater efficiency in schooling. Such policies hinge on reliable information on how well education systems prepare students for life.

Most countries monitor students’ learning and the performance of schools. But in a global economy, the yardstick for success is no longer improvement by national standards alone, but how education systems perform internationally. The OECD has taken up that challenge by developing PISA, the Programme for International Student Assessment, which evaluates the quality, equity and efficiency of school systems in some 70 countries that, together, make up nine-tenths of the world economy. PISA represents a commitment by governments to monitor the outcomes of education systems regularly within an internationally agreed framework and it provides a basis for international collaboration in defining and implementing educational policies.

The results from the PISA 2009 assessment reveal wide differences in educational outcomes, both within and across countries. The education systems that have been able to secure strong and equitable learning outcomes, and to mobilise rapid improvements, show others what is possible to achieve. Naturally, GDP per capita influences educational success, but this only explains 6% of the differences in average student performance. The other 94% reflect the potential for public policy to make a difference. The stunning success of Shanghai-China, which tops every league table in this assessment by a clear margin, shows what can be achieved with moderate economic resources in a diverse social context.
In mathematics, more than a quarter of Shanghai-China’s 15-year-olds can conceptualise, generalise, and creatively use information based on their own investigations and modelling of complex problem situations. They can apply insight and understanding and develop new approaches and strategies when addressing novel situations. In the OECD area, just 3% of students reach this level of performance.

While better educational outcomes are a strong predictor of economic growth, wealth and spending on education alone are no guarantee for better educational outcomes. Overall, PISA shows that an image of a world divided neatly into rich and well-educated countries and poor and badly-educated countries is out of date.

This finding represents both a warning and an opportunity. It is a warning to advanced economies that they cannot take for granted that they will forever have “human capital” superior to that in other parts of the world. At a time of intensified global competition, they will need to work hard to maintain a knowledge and skill base that keeps up with changing demands. PISA underlines, in particular, the need for many advanced countries to tackle educational underperformance so that as many members of their future workforces as possible are equipped with at least the baseline competencies that enable them to participate in social and economic development. Otherwise, the high social and economic cost of poor educational performance in advanced economies risks becoming a significant drag on economic development. At the same time, the findings show that poor skills are not an inevitable consequence of low national income – an important outcome for countries that need to achieve more with less. But PISA also shows that there is no reason for despair. Countries from a variety of starting points have shown the potential to raise the quality of educational outcomes substantially.
Korea’s average performance was already high in 2000, but Korean policy makers were concerned that only a narrow elite achieved levels of excellence in PISA. Within less than a decade, Korea was able to double the share of students demonstrating excellence in reading literacy. A major overhaul of Poland’s school system helped to dramatically reduce performance variability among schools, reduce the share of poorly performing students and raise overall performance by the equivalent of more than half a school year. Germany was jolted into action when PISA 2000 revealed a below-average performance and large social disparities in results, and has been able to make progress on both fronts. Israel, Italy and Portugal have moved closer to the OECD average and Brazil, Chile, Mexico and Turkey are among the countries with impressive gains from very low levels of performance.

But the greatest value of PISA lies in inspiring national efforts to help students to learn better, teachers to teach better, and school systems to become more effective. A closer look at high-performing and rapidly improving education systems shows that these systems have many commonalities that transcend differences in their history, culture and economic evolution.

First, while most nations declare their commitment to education, the test comes when these commitments are weighed against others. How do they pay teachers compared to the way they pay other highly-skilled workers? How are education credentials weighed against other qualifications when people are being considered for jobs? Would you want your child to be a teacher? How much attention do the media pay to schools and schooling? Which matters more, a community’s standing in the sports leagues or its standing in the student academic achievement league tables? Are parents more likely to encourage their children to study longer and harder or to spend more time with their friends or in sports activities? In the most successful education systems, the political and social leaders have persuaded their citizens to make the choices needed to show that they value education more than other things.
But placing a high value on education will get a country only so far if its teachers, parents and citizens believe that only some subset of the nation's children can, or need to, achieve world-class standards. This report shows clearly that education systems built around the belief that students have different pre-ordained professional destinies, to be met with different expectations in different school types, tend to be fraught with large social disparities. In contrast, the best-performing education systems embrace the diversity of students' capacities, interests and social backgrounds with individualised approaches to learning.

Second, high-performing education systems stand out for clear and ambitious standards that are shared across the system, focus on the acquisition of complex, higher-order thinking skills, and are aligned with high-stakes gateways and instructional systems. In these education systems, everyone knows what is required to obtain a given qualification, both in terms of the content studied and the level of performance that must be demonstrated to earn it. Students cannot go on to the next stage of their lives, be it work or further education, unless they show that they are qualified to do so. They know what they have to do to realise their dreams, and they put in the work needed to achieve them.

Third, the quality of an education system cannot exceed the quality of its teachers and principals, since student learning is ultimately the product of what goes on in classrooms.
Corporations, professional partnerships and national governments all know that they have to pay attention to how the pool from which they recruit is established; how they recruit; the kind of initial training their recruits receive before they present themselves for employment; how they mentor new recruits and induct them into their service; what kind of continuing training they get; how their compensation is structured; how they reward their best performers and how they improve the performance of those who are struggling; and how they provide opportunities for the best performers to acquire more status and responsibility.

Many of the world's best-performing education systems have moved from bureaucratic "command and control" environments towards school systems in which the people at the frontline have much more control over the way resources are used, people are deployed, the work is organised and the way in which the work gets done. They provide considerable discretion to school heads and school faculties in determining how resources are allocated, a factor which the report shows to be closely related to school performance when combined with effective accountability systems. And they provide an environment in which teachers work together to frame what they believe to be good practice, conduct field-based research to confirm or disprove the approaches they develop, and then assess their colleagues by the degree to which they use practices proven effective in their classrooms.

Last but not least, the most impressive outcome of world-class education systems is perhaps that they deliver high-quality learning consistently across the entire education system, such that every student benefits from excellent learning opportunities.
To achieve this, they invest educational resources where they can make the greatest difference, attract the most talented teachers into the most challenging classrooms, and make spending choices that prioritise the quality of teachers.
These are, of course, not independently conceived and executed policies. They need to be aligned across all aspects of the system, they need to be coherent over sustained periods of time, and they need to be consistently implemented. The path of reform can be fraught with political and practical obstacles. Moving away from administrative and bureaucratic control towards professional norms of control can be counterproductive if a nation does not yet have teachers and schools with the capacity to implement these policies and practices. Pushing authority down to lower levels can be equally problematic if there is no agreement on what students need to know and should be able to do. Recruiting high-quality teachers is not of much use if those who are recruited are so frustrated by what they perceive to be a mindless system of initial teacher education that they will not participate in it and turn to another profession instead.

Thus a country's success in making these transitions depends greatly on the degree to which it succeeds in creating and executing plans that, at any given time, produce the maximum coherence in the system. These are daunting challenges, and devising effective education policies will become ever more difficult as schools need to prepare students to deal with more rapid change than ever before: for jobs that have not yet been created, to use technologies that have not yet been invented, and to solve economic and social challenges that we do not yet know will arise. But those school systems that do well today, as well as those that have shown rapid improvement, demonstrate that it can be done. The world is indifferent to tradition and past reputations, unforgiving of frailty and complacency, and ignorant of custom or practice. Success will go to those individuals and countries that are swift to adapt, slow to complain and open to change.
The task of governments will be to ensure that countries rise to this challenge. The OECD will continue to support their efforts.

***

This report is the product of a collaborative effort between the countries participating in PISA, the experts and institutions working within the framework of the PISA Consortium, and the OECD Secretariat. The report was drafted by Andreas Schleicher, Francesca Borgonovi, Michael Davidson, Miyako Ikeda, Maciej Jakubowski, Guillermo Montt, Sophie Vayssettes and Pablo Zoido of the OECD Directorate for Education, with advice as well as analytical and editorial support from Marilyn Achiron, Simone Bloem, Marika Boiron, Henry Braun, Nihad Bunar, Niccolina Clements, Jude Cosgrove, John Cresswell, Aletta Grisay, Donald Hirsch, David Kaplan, Henry Levin, Juliette Mendelovitz, Christian Monseur, Soojin Park, Pasi Reinikainen, Mebrak Tareke, Elisabeth Villoutreix and Allan Wigfield. Volume II also draws on the analytic work undertaken by Jaap Scheerens and Douglas Willms in the context of PISA 2000. Administrative support was provided by Juliet Evans and Diana Morales.

The PISA assessment instruments and the data underlying the report were prepared by the PISA Consortium, under the direction of Raymond Adams at the Australian Council for Educational Research (ACER) and Henk Moelands of the Dutch National Institute for Educational Measurement (CITO). The expert group that guided the preparation of the reading assessment framework and instruments was chaired by Irwin Kirsch.

The development of the report was steered by the PISA Governing Board, which is chaired by Lorna Bertrand (United Kingdom), with Beno Csapo (Hungary), Daniel McGrath (United States) and Ryo Watanabe (Japan) as vice chairs. Annex C of the volumes lists the members of the various PISA bodies, as well as the individual experts and consultants who have contributed to this report and to PISA in general.

Angel Gurría
OECD Secretary-General
Table of Contents

PISA 2009 Results: Learning Trends – Volume V © OECD 2010

Executive Summary .......... 13
Introduction to PISA .......... 17
  The PISA surveys .......... 17
  The first report from the 2009 assessment .......... 18
  The PISA student population .......... 19
Reader's Guide .......... 23
Chapter 1 Comparing Performance over Time .......... 25
Chapter 2 Trends in Reading .......... 37
  Continuity and change in the reading literacy framework and assessment .......... 38
  How student performance in reading has changed since 2000 .......... 38
  How gender differences in reading have evolved .......... 46
  Changes in performance and changes in student populations .......... 49
  The impact of changes in the socio-economic composition of student populations on trends in reading performance .......... 49
  Establishing an overall estimate of reading performance trends .......... 50
  Country-by-country comparison of reading trends .......... 50
Chapter 3 Trends in Mathematics and Science .......... 59
  Trends in mathematics .......... 60
  • How student performance in mathematics has changed since 2003 .......... 60
  Trends in science .......... 64
  • How student performance in science has changed since 2006 .......... 64
Chapter 4 Trends in Equity .......... 73
  Trends in the variation of student performance .......... 74
  Trends in student background factors and their relation to reading performance .......... 77
  • Socio-economic status .......... 77
  • Immigrant status and home language .......... 80
Chapter 5 Trends in Attitudes and Student-School Relations .......... 87
  Trends in reading engagement .......... 88
  • Changes in whether students read for enjoyment .......... 88
  • Changes in how much students enjoy reading .......... 90
  • Changes in what students read for enjoyment .......... 93
  • Changes in socio-economically disadvantaged students' engagement in reading .......... 96
  • Changes in the reading performance of students who read fiction .......... 97
  Trends in student views on schools and teachers .......... 98
  • Changes in teacher-student relations .......... 98
  • Changes in disciplinary climate .......... 100
Conclusions and Policy Implications .......... 105
  Changing conditions for learning .......... 105
  Progress towards raising performance and levelling the playing field .......... 106
References .......... 109
Annex A Technical background .......... 111
  Annex A1: Construction of reading scales and indices from the student context questionnaires .......... 112
  Annex A2: The PISA target population, the PISA samples and the definition of schools .......... 120
  Annex A3: Standard errors, significance tests and subgroup comparisons .......... 133
  Annex A4: Quality assurance .......... 134
  Annex A5: Participation of countries across PISA assessments .......... 136
  Annex A6: Linear and adjusted trends .......... 138
Annex B Tables of results .......... 145
  Annex B1: Results for countries and economies .......... 146
  Annex B2: Subnational tables .......... 191
Annex C The development and implementation of PISA – a collaborative effort .......... 205

This book has... StatLinks: a service that delivers Excel® files from the printed page! Look for the StatLinks at the bottom left-hand corner of the tables or graphs in this book. To download the matching Excel® spreadsheet, just type the link into your Internet browser, starting with the http://dx.doi.org prefix. If you're reading the PDF e-book edition, and your PC is connected to the Internet, simply click on the link. You'll find StatLinks appearing in more OECD books.
Boxes

Box V.A Key features of PISA 2009 .......... 21
Box V.1.1 Interpreting trends requires some caution .......... 26
Box V.B Korea .......... 31
Box V.C Poland .......... 33
Box V.D Portugal .......... 68
Box V.E Turkey .......... 70
Box V.F Chile .......... 85
Box V.G Brazil .......... 102

Figures

Figure V.A A map of PISA countries and economies .......... 19
Figure V.1.1 A summary of changes in reading performance .......... 27
Figure V.1.2 A summary of annualised performance trends in reading, mathematics and science .......... 29
Figure V.2.1 Change in reading performance between 2000 and 2009 .......... 39
Figure V.2.2 How countries perform in reading and how reading performance has changed since 2000 .......... 40
Figure V.2.3 Multiple comparisons between 2000 and 2009 .......... 41
Figure V.2.4 Percentage of students below proficiency Level 2 in reading in 2000 and 2009 .......... 43
Figure V.2.5 Percentage of top performers in reading in 2000 and 2009 .......... 44
Figure V.2.6 Performance changes among the lowest- and highest-achieving students in reading between 2000 and 2009 .......... 45
Figure V.2.7 Comparison of gender differences in reading between 2000 and 2009 .......... 47
Figure V.2.8 Change in the share of boys and girls who are low performers in reading between 2000 and 2009 .......... 48
Figure V.2.9 Changes in reading performance between 2000 and 2009 .......... 49
Figure V.2.10 Linear trends and performance differences between 2000 and 2009 .......... 51
Figure V.2.11 Trends in reading performance: countries above the OECD average .......... 52
Figure V.2.12 Trends in reading performance: countries at the OECD average .......... 54
Figure V.2.13 Trends in reading performance: countries below the OECD average .......... 55
Figure V.3.1 Change in mathematics performance between 2003 and 2009 .......... 60
Figure V.3.2 How countries perform in mathematics and how mathematics performance has changed since 2003 .......... 61
Figure V.3.3 Percentage of students performing below proficiency Level 2 in mathematics in 2003 and 2009 .......... 62
Figure V.3.4 Percentage of top performers in mathematics in 2003 and 2009 .......... 63
Figure V.3.5 Change in science performance between 2006 and 2009 .......... 64
Figure V.3.6 How countries perform in science and how science performance has changed since 2006 .......... 65
Figure V.3.7 Percentage of students performing below proficiency Level 2 in science in 2006 and 2009 .......... 66
Figure V.3.8 Percentage of top performers in science in 2006 and 2009 .......... 67
Figure V.4.1 Comparison of the variation in student performance in reading between 2000 and 2009 .......... 74
Figure V.4.2 Change in variation and change in reading performance between 2000 and 2009 .......... 76
Figure V.4.3 Variation in reading performance between and within schools in 2000 and 2009 .......... 77
Figure V.4.4 Relationship between students' socio-economic background and their reading performance in 2000 and 2009 .......... 78
Figure V.4.5 Relationship between socio-economic background and reading performance between and within schools in 2000 and 2009 .......... 79
Figure V.4.6 Percentage of students with an immigrant background in 2000 and 2009 .......... 80
Figure V.4.7 Immigrant background and reading performance in 2000 and 2009 .......... 81
Figure V.4.8 Percentage of students who speak a language at home that is different from the language of assessment in 2000 and 2009 .......... 83
Figure V.4.9 Home language and reading performance in 2000 and 2009 .......... 83
Figure V.5.1 Percentage of students who read for enjoyment in 2000 and 2009 .......... 88
Figure V.5.2 Changes in the percentage of boys and girls who read for enjoyment between 2000 and 2009 .......... 89
Figure V.5.3 Percentage of students who read only if they have to and percentage of students who enjoy going to a bookstore or a library in 2000 and 2009 .......... 91
Figure V.5.4 Index of enjoyment of reading in 2000 and 2009 .......... 92
Figure V.5.5 Change in the index of enjoyment of reading for boys and girls between 2000 and 2009 .......... 92
Figure V.5.6 Change in the index of enjoyment of reading and the proportion of students who read for enjoyment between 2000 and 2009 .......... 93
Figure V.5.7 Percentage of students who read fiction in 2000 and 2009 .......... 94
Figure V.5.8 Percentage of students who read comic books in 2000 and 2009 .......... 95
Figure V.5.9 Percentage of students who read for enjoyment in 2000 and 2009, by socio-economic background .......... 96
Figure V.5.10 Change in the percentage of boys and girls who read for enjoyment between 2000 and 2009, by socio-economic background .......... 97
Figure V.5.11 Teacher-student relations in PISA 2000 and 2009 .......... 99
Figure V.5.12 Disciplinary climate in PISA 2000 and 2009 .......... 101
Figure A6.1 Observed score change and score point change adjusted for sampling differences between 2000 and 2009 .......... 140

Tables

Table A1.1 Link error estimates .......... 113
Table A1.2 Levels of parental education converted into years of schooling .......... 116
Table A1.3 A multilevel model to estimate grade effects in reading, accounting for some background variables .......... 117
Table A2.1 PISA target populations and samples .......... 122
Table A2.2 Exclusions .......... 124
Table A2.3 Response rates .......... 126
Table A2.4a Percentage of students at each grade level .......... 129
Table A2.4b Percentage of students at each grade level, by gender .......... 130
Table A2.5 Percentage of students and mean scores in reading, mathematics and science, according to whether students are in or out of the regular education system in Argentina .......... 132
Table A5.1 Participation of countries in different PISA assessments .......... 137
Table A6.1 Student background characteristics in PISA 2000 and 2009 .......... 141
Table A6.2 Trends adjusted for sampling differences .......... 144
Table V.2.1 Mean reading performance in PISA 2000, 2003, 2006 and 2009 .......... 146
Table V.2.2 Percentage of students below Level 2 and at Level 5 or above on the reading scale in PISA 2000 and 2009 .......... 147
Table V.2.3 Percentiles on the reading scale in PISA 2000 and 2009 .......... 148
Table V.2.4 Gender differences in reading performance in PISA 2000 and 2009 .......... 150
Table V.2.5 Percentage of boys below Level 2 and at Level 5 or above on the reading scale in PISA 2000 and 2009 .......... 151
Table V.2.6 Percentage of girls below Level 2 and at Level 5 or above on the reading scale in PISA 2000 and 2009 .......... 152
Table V.2.7 Trends in reading performance adjusted for demographic changes .......... 153
Table V.2.8 Linear trends and annual changes in reading performance across all PISA assessments .......... 154
Table V.2.9 Mean reading score change between 2003 and 2009 and between 2006 and 2009 .......... 155
Table V.3.1 Mean mathematics performance in PISA 2003, 2006 and 2009 .......... 156
Table V.3.2 Percentage of students below Level 2 and at Level 5 or above on the mathematics scale in PISA 2003 and 2009 .......... 157
Table V.3.3 Annualised changes in mathematics since 2003 .......... 158
Table V.3.4 Mean science performance in PISA 2006 and 2009 .......... 159
Table V.3.5 Percentage of students below Level 2 and at Level 5 or above on the science scale in PISA 2006 and 2009 .......... 160
Table of Contents
PISA 2009 Results: LEARNING TRENDS – VOLUME V © OECD 2010

Table V.4.1 Between- and within-school variance in reading performance in PISA 2000 and 2009
Table V.4.2 Socio-economic background of students in PISA 2000 and 2009
Table V.4.3 Relationship between reading performance and the PISA index of economic, social and cultural status (ESCS) in PISA 2000 and 2009
Table V.4.4 Percentage of students and reading performance by immigrant status in PISA 2000 and 2009
Table V.4.5 Language spoken at home and reading performance in PISA 2000 and 2009
Table V.5.1 Percentage of students reading for enjoyment in PISA 2000 and 2009, by gender
Table V.5.2 Index of enjoyment of reading in PISA 2000 and 2009, by gender
Table V.5.3 Percentage of students for several items in the index of enjoyment of reading in PISA 2000 and 2009
Table V.5.4 Percentage of students reading for enjoyment in PISA 2000 and 2009, by socio-economic background and gender
Table V.5.5 Index of enjoyment of reading in PISA 2000 and 2009, by socio-economic background and gender
Table V.5.6 Percentage of students who read diverse materials in PISA 2000 and 2009
Table V.5.7 Percentage of students who read diverse materials in PISA 2000 and 2009, by gender
Table V.5.8 Reading performance of students who read fiction in PISA 2000 and 2009
Table V.5.9 Performance of students who read fiction in PISA 2000 and 2009, by gender
Table V.5.10 Diversity of reading materials in PISA 2000 and 2009, by gender
Table V.5.11 Teacher-student relations in PISA 2000 and 2009
Table V.5.12 Disciplinary climate in PISA 2000 and 2009
Table S.V.a Mean reading performance in PISA 2000, 2003, 2006 and 2009
Table S.V.b Percentage of students below Level 2 and at Level 5 and above on the reading scale in PISA 2000 and 2009
Table S.V.c Percentiles on the reading scale in PISA 2000 and 2009
Table S.V.d Percentage of girls below Level 2 and at Level 5 and above on the reading scale in PISA 2000 and 2009
Table S.V.e Gender differences in reading performance in PISA 2000 and 2009
Table S.V.f Percentage of boys below Level 2 and at Level 5 and above on the reading scale in PISA 2000 and 2009
Table S.V.g Mean mathematics performance in PISA 2003, 2006 and 2009
Table S.V.h Percentage of students below Level 2 and at Level 5 and above on the mathematics scale in PISA 2000 and 2009
Table S.V.i Mean science performance in PISA 2006 and 2009
Table S.V.j Mean mathematics performance in PISA 2003, 2006 and 2009
Table S.V.k Between- and within-school variance in reading performance in PISA 2000 and 2009
Table S.V.l Socio-economic background of students in PISA 2000 and 2009
Table S.V.m Relationship between reading performance and the PISA index of economic, social and cultural status (ESCS) in PISA 2000 and 2009
Table S.V.n Percentage of students and reading performance by immigrant status in PISA 2000 and 2009
Table S.V.o Language spoken at home and reading performance in PISA 2000 and 2009
Table S.V.p Between- and within-school variance in reading performance in PISA 2000 and 2009
Table S.V.q Index of enjoyment of reading in PISA 2000 and 2009, by gender (results based on students’ self-reports)
Table S.V.r Percentage of students who read diverse materials in PISA 2000 and 2009
Table S.V.s Relationship between reading performance and the PISA index of economic, social and cultural status (ESCS) in PISA 2000 and 2009
Table S.V.t Teacher-student relations in PISA 2000 and 2009
Table S.V.u Disciplinary climate in PISA 2000 and 2009
Executive Summary

The design of PISA does not just allow for a comparison of the relative standing of countries in terms of their learning outcomes; it also enables each country to monitor changes in those outcomes over time. Such changes indicate how successful education systems have been in developing the knowledge and skills of 15-year-olds. Indeed, some countries have seen impressive improvements in performance over the past decade, sometimes exceeding the equivalent of an average school year’s progress for the entire 15-year-old student population. Some of these countries have been catching up from comparatively low performance levels while others have been advancing further from already high levels. All countries seeking to improve their results can draw encouragement – and learn lessons – from those that have succeeded in doing so in a relatively short period of time. Changes in student performance over time show that a country’s performance in reading is not set in stone. In both absolute and relative terms, educational results can improve, and they cannot be regarded either as part of fixed “cultural” differences between countries or as inevitably linked to each country’s state of economic development. Since both PISA 2000 and PISA 2009 focused on reading, it is possible to track how student performance in reading changed over that period. Among the 26 OECD countries with comparable results in both assessments, Chile, Israel, Poland, Portugal, Korea, Hungary and Germany, as well as the partner countries Peru, Albania, Indonesia, Latvia, Liechtenstein and Brazil, all improved their reading performance between 2000 and 2009, while performance declined in Ireland, Sweden, the Czech Republic and Australia. Between 2000 and 2009, the percentage of low performers in Chile dropped by more than 17 percentage points, while the share of top performers in Korea grew by more than 7 percentage points.
In many countries, improvements in results were largely driven by improvements at the bottom end of the performance distribution, signalling progress towards greater equity in learning outcomes. Among OECD countries, variation in student performance fell by 3%. On average across the 26 OECD countries with comparable data for both assessments, 18% of students performed below the baseline reading proficiency Level 2 in 2009, while 19% did so in 2000. Among countries where between 40% and 60% of students performed below Level 2 in 2000, Chile reduced that proportion by the largest amount, and Mexico and the partner country Brazil also showed important decreases in their share of low performers. Among countries where the proportion of students performing below Level 2 was smaller than 40% but still above the OECD average of 19%, the partner country Latvia reduced the proportion by 13 percentage points, while Portugal, Poland, Hungary, Germany, Switzerland and the partner country Liechtenstein reduced the share by smaller amounts. In Denmark, the percentage of students below Level 2 fell from an already below-average level. The share of top performers – those students who attain proficiency Level 5 or 6 in reading – increased in Japan, Korea and the partner economy Hong Kong-China such that these countries now have the largest proportions of high-achieving students among the countries participating in the 2009 assessment. Several countries that had above-average proportions of top performers in 2000 saw those proportions decrease in 2009. Notable among them was Ireland, where the proportion of top performers fell from 14% to 7%, which is below the OECD average.
Between 2000 and 2009, Poland, Portugal, Germany, Switzerland and the partner countries Latvia and Liechtenstein raised the performance of their lowest-achieving students while maintaining the performance level among their highest-achieving students. Korea, Israel and the partner country Brazil raised the performance of their highest-achieving students while maintaining the performance level among their lowest-achieving students. Chile and the partner countries Indonesia, Albania and Peru showed improvements in reading performance among students at all proficiency levels. On average, OECD countries narrowed the gap in scores between their highest- and lowest-performing students between 2000 and 2009; some also improved overall performance. In Chile, Germany, Hungary, Poland, Portugal, and the partner countries Indonesia, Latvia and Liechtenstein, overall performance improved while the variation in performance decreased. In many cases, this was the result of improvements among low-achieving students. The gender gap in reading performance did not narrow in any country between 2000 and 2009; indeed, it widened in Israel, Korea, Portugal, France and Sweden, and in the partner countries and economies Romania, Hong Kong-China, Indonesia and Brazil. The fact that girls outperform boys in reading is most evident in the proportion of girls and boys who perform below baseline proficiency Level 2. Across OECD countries, 24% of boys perform below Level 2 compared to only 12% of girls. The proportion of girls performing below this level decreased by two percentage points between 2000 and 2009, while the share of low-achieving boys did not change during the period. Across the OECD area, the percentage of students with an immigrant background increased by an average of two percentage points between 2000 and 2009.
The performance gap between students with and without an immigrant background remained broadly similar over the period. However, some countries saw large reductions in the performance advantage of students without an immigrant background. In Belgium, Switzerland and Germany, the gap narrowed by between 28 and 38 score points due to improvements in reading proficiency among students with an immigrant background. Even so, the gap is still relatively wide in these countries. Across OECD countries, overall performance in mathematics remained unchanged between 2003 and 2009, as did performance in science between 2006 and 2009. Students in Mexico, Turkey, Greece, Portugal, Italy, Germany and the partner countries Brazil and Tunisia improved their mathematics scores considerably, while students in the Czech Republic, Ireland, Sweden, France, Belgium, the Netherlands, Denmark, Australia and Iceland saw declines in their performance. On average across the 28 OECD countries with comparable results in the 2003 and 2009 assessments, the share of students below mathematics proficiency Level 2 remained broadly similar over the period, with a minor decrease from 21.6% to 20.8%. Among the OECD countries in which more than half of students performed below mathematics proficiency Level 2 in 2003, Mexico shrank this proportion by 15 percentage points, from 66% to 51%, by 2009, while Turkey reduced it from 52% to 42% during the same period. Meanwhile, the percentage of top performers in mathematics in those 28 OECD countries decreased slightly, from 14.7% in 2003 to 13.4% in 2009. Portugal showed the largest increase – four percentage points – in top performers in mathematics. In science, 11 of the 56 countries that participated in both the 2006 and 2009 assessments show improvements in student performance. Turkey, for example, saw a 30 score-point increase, nearly half a proficiency level, in just three years.
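The report repeatedly translates score-point changes into fractions of a proficiency level, as with Turkey's 30-point science gain being "nearly half a proficiency level". A minimal sketch of that conversion, assuming an illustrative level width of roughly 75 score points (the exact width varies by scale and assessment and is not given in this summary):

```python
# Assumed, not from the report: an approximate proficiency-level width in
# score points. A round value of 75 is consistent with the text's reading
# of a 30-point gain as "nearly half a proficiency level".
ASSUMED_LEVEL_WIDTH = 75.0

def score_change_in_levels(score_points: float) -> float:
    """Express a score-point change as a fraction of a proficiency level."""
    return score_points / ASSUMED_LEVEL_WIDTH
```

Under this assumption, a 30-point gain corresponds to 0.4 of a proficiency level.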
Turkey also reduced the percentage of students below science proficiency Level 2 by almost 17 percentage points, from 47% to 30%. Portugal, Chile, the United States, Norway, Korea and Italy all reduced the share of lowest performers in science by around five percentage points or more, as did the partner countries Qatar, Tunisia, Brazil and Colombia. Performance in science declined considerably in five countries. On average across OECD countries, the percentage of students who report reading for enjoyment daily dropped by five percentage points. Enjoyment of reading has tended to deteriorate, especially among boys, signalling the challenge for schools to engage students in reading activities that 15-year-olds find relevant and interesting. On average across OECD countries, the percentage of students who said they read for enjoyment every day fell from 69% in 2000 to 64% in 2009. On the other hand, changes in student-teacher relations and classroom climate have generally been favourable or, at least, they have not deteriorated as many would have expected. Generally, students have become more confident that they can get help from their teachers. Across the 26 OECD countries that participated in both assessments, 74% of students in 2000 agreed or strongly agreed with the statements, “If I need extra help, I will
receive it from my teachers” or “Most of my teachers treat me fairly”, while in 2009, 79% of students agreed or strongly agreed with those statements. Overall, aspects of classroom discipline have also improved. Thus there is no evidence to justify the notion that students are becoming progressively more disengaged from school.
Introduction to PISA

The PISA surveys

Are students well prepared to meet the challenges of the future? Can they analyse, reason and communicate their ideas effectively? Have they found the kinds of interests they can pursue throughout their lives as productive members of the economy and society? The OECD Programme for International Student Assessment (PISA) seeks to answer these questions through its triennial surveys of key competencies of 15-year-old students in OECD member countries and partner countries/economies. Together, the group of countries participating in PISA represents nearly 90% of the world economy.1 PISA assesses the extent to which students near the end of compulsory education have acquired some of the knowledge and skills that are essential for full participation in modern societies, with a focus on reading, mathematics and science. PISA has now completed its fourth round of surveys. Following the detailed assessment of each of PISA’s three main subjects – reading, mathematics and science – in 2000, 2003 and 2006, the 2009 survey marks the beginning of a new round with a return to a focus on reading, but in ways that reflect the extent to which reading has changed since 2000, including the prevalence of digital texts. PISA 2009 offers the most comprehensive and rigorous international measurement of student reading skills to date. It assesses not only reading knowledge and skills, but also students’ attitudes and their learning strategies in reading. PISA 2009 updates the assessment of student performance in mathematics and science as well. The assessment focuses on young people’s ability to use their knowledge and skills to meet real-life challenges.
This orientation reflects a change in the goals and objectives of curricula themselves, which are increasingly concerned with what students can do with what they learn at school and not merely with whether they have mastered specific curricular content. PISA’s unique features include its:
• Policy orientation, which connects data on student learning outcomes with data on students’ characteristics and on key factors shaping their learning in and out of school in order to draw attention to differences in performance patterns and identify the characteristics of students, schools and education systems that have high performance standards.
• Innovative concept of “literacy”, which refers to the capacity of students to apply knowledge and skills in key subject areas and to analyse, reason and communicate effectively as they pose, interpret and solve problems in a variety of situations.
• Relevance to lifelong learning, which does not limit PISA to assessing students’ competencies in school subjects, but also asks them to report on their own motivations to learn, their beliefs about themselves and their learning strategies.
• Regularity, which enables countries to monitor their progress in meeting key learning objectives.
• Breadth of geographical coverage and collaborative nature, which, in PISA 2009, encompasses the 34 OECD member countries and 41 partner countries and economies.2
The relevance of the knowledge and skills measured by PISA is confirmed by studies tracking young people in the years after they have been assessed by PISA. Longitudinal studies in Australia, Canada and Switzerland show a strong relationship between performance in reading on the PISA 2000 assessment at age 15 and future educational attainment and success in the labour market (see Volume I, Chapter 2).3 The frameworks for assessing reading, mathematics and science in 2009 are described in detail in PISA 2009 Assessment Framework: Key Competencies in Reading, Mathematics and Science (OECD, 2009). Decisions about the scope and nature of the PISA assessments and the background information to be collected are made by leading experts in participating countries. Governments guide these decisions based on shared, policy-driven interests. Considerable efforts and resources are devoted to achieving cultural and linguistic breadth and balance in the assessment materials. Stringent quality-assurance mechanisms are applied in designing the test, in translation, sampling and data collection. As a result, PISA findings are valid and highly reliable. Policy makers around the world use PISA findings to gauge the knowledge and skills of students in their own country in comparison with those in other countries. PISA reveals what is possible in education by showing what students in the highest-performing countries can do in reading, mathematics and science. PISA is also used to gauge the pace of educational progress, by allowing policy makers to assess to what extent performance changes observed nationally are in line with performance changes observed elsewhere.
In a growing number of countries, PISA is also used to set policy targets against measurable goals achieved by other systems, and to initiate research and peer-learning designed to identify policy levers and reform trajectories for improving education. While PISA cannot identify cause-and-effect relationships between inputs, processes and educational outcomes, it can highlight the key features in which education systems are similar and different, sharing those findings with educators, policy makers and the general public.

The first report from the 2009 assessment

This volume is the fifth of six volumes that provide the first international report on results from the PISA 2009 assessment. It provides an overview of trends in student performance in reading, mathematics and science from PISA 2000 to PISA 2009. It shows educational outcomes over time and tracks changes in factors related to student and school performance, such as student background and school characteristics and practices. The other volumes cover the following issues:
• Volume I, What Students Know and Can Do: Student Performance in Reading, Mathematics and Science, summarises the performance of students in PISA 2009, starting with a focus on reading, and then reporting on mathematics and science performance. It provides the results in the context of how performance is defined, measured and reported, and then examines what students are able to do in reading. After a summary of reading performance, it examines the ways in which this performance varies on subscales representing three aspects of reading. It then breaks down results by different formats of reading texts and considers gender differences in reading, both generally and for different reading aspects and text formats. Any comparison of the outcomes of education systems needs to take into consideration countries’ social and economic circumstances and the resources they devote to education.
To address this, the volume also interprets the results within countries’ economic and social contexts. The chapter concludes with a description of student results in mathematics and science.
• Volume II, Overcoming Social Background: Equity in Learning Opportunities and Outcomes, starts by closely examining the performance variation shown in Volume I, particularly the extent to which the overall variation in student performance relates to differences in results achieved by different schools. The volume then looks at how factors such as socio-economic background and immigrant status affect student and school performance, and the role that education policy can play in moderating the impact of these factors.
• Volume III, Learning to Learn: Student Engagement, Strategies and Practices, explores the information gathered on students’ levels of engagement in reading activities and attitudes towards reading and learning. It describes 15-year-olds’ motivations, engagement and strategies to learn.
• Volume IV, What Makes a School Successful? Resources, Policies and Practices, explores the relationships between student-, school- and system-level characteristics, and educational quality and equity. It explores what schools and school policies can do to raise overall student performance and, at the same time, moderate the impact of
socio-economic background on student performance, with the aim of promoting a more equitable distribution of learning opportunities.
• Volume VI, Students On Line: Reading and Using Digital Information (OECD, forthcoming), explains how PISA measures and reports student performance in digital reading and analyses what students in the 20 countries participating in this assessment are able to do.
All data tables referred to in the analysis are included at the end of the respective volume. A Reader’s Guide is also provided in each volume to aid in interpreting the tables and figures accompanying the report. Technical annexes that describe the construction of the questionnaire indices, sampling issues, quality-assurance procedures and the process followed for developing the assessment instruments, and information about reliability of coding are posted on the OECD PISA website (www.pisa.oecd.org). Many of the issues covered in the technical annexes will be elaborated in greater detail in the PISA 2009 Technical Report (OECD, forthcoming).

The PISA student population

In order to ensure the comparability of the results across countries, PISA devoted a great deal of attention to assessing comparable target populations. Differences between countries in the nature and extent of pre-primary education and care, in the age of entry to formal schooling, and in the structure of the education system do not allow school grade levels to be defined in internationally comparable ways. Valid international comparisons of educational performance, therefore, need to define their populations with reference to a target age.
PISA covers students who are aged between 15 years 3 months and 16 years 2 months at the time of the assessment and who have completed at least 6 years of formal schooling, regardless of the type of institution in which they are enrolled,

• Figure V.A • A map of PISA countries and economies

OECD countries: Australia, Austria, Belgium, Canada, Chile, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Israel, Italy, Japan, Korea, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States.

Partner countries and economies in PISA 2009: Albania, Argentina, Azerbaijan, Brazil, Bulgaria, Chinese Taipei, Colombia, Costa Rica*, Croatia, Georgia*, Himachal Pradesh-India*, Hong Kong-China, Indonesia, Jordan, Kazakhstan, Kyrgyzstan, Latvia, Liechtenstein, Lithuania, Macao-China, Malaysia*, Malta*, Mauritius*, Miranda-Venezuela*, Montenegro, Netherlands-Antilles*, Panama, Peru, Qatar, Romania, Russian Federation, Serbia, Shanghai-China, Singapore, Tamil Nadu-India*, Thailand, Trinidad and Tobago, Tunisia, United Arab Emirates*, Uruguay, Viet Nam*.

Partner countries in previous PISA surveys: Dominican Republic, Macedonia, Moldova.

* These partner countries and economies carried out the assessment in 2010 instead of 2009.
whether they are in full-time or part-time education, whether they attend academic or vocational programmes, and whether they attend public or private schools or foreign schools within the country. (For an operational definition of this target population, see the PISA 2009 Technical Report [OECD, forthcoming].) The use of this age in PISA, across countries and over time, allows the performance of students to be compared in a consistent manner before they complete compulsory education. As a result, this report can make statements about the knowledge and skills of individuals born in the same year who are still at school at 15 years of age, despite having had different educational experiences, both in and outside school. Stringent technical standards were established to define the national target populations and to identify permissible exclusions from this definition (for more information, see the PISA website www.pisa.oecd.org). The overall exclusion rate within a country was required to be below 5% to ensure that, under reasonable assumptions, any distortions in national mean scores would remain within plus or minus 5 score points, i.e. typically within the order of magnitude of two standard errors of sampling (see Annex A2). Exclusion could take place either through the schools that participated or the students who participated within schools. There are several reasons why a school or a student could be excluded from PISA. Schools might be excluded because they are situated in remote regions and are inaccessible, because they are very small, or because of organisational or operational factors that precluded participation. Students might be excluded because of intellectual disability or limited proficiency in the language of the test.
In 29 out of the 65 countries participating in PISA 2009, the percentage of school-level exclusions amounted to less than 1%; it was less than 5% in all countries. When the exclusion of students who met the internationally established exclusion criteria is also taken into account, the exclusion rates increase slightly. However, the overall exclusion rate remains below 2% in 32 participating countries, below 5% in 60 participating countries, and below 7% in all countries except Luxembourg (7.2%) and Denmark (8.6%). In 15 out of 34 OECD countries, the percentage of school-level exclusions amounted to less than 1% and was less than 5% in all countries. When student exclusions within schools are also taken into account, there were 9 OECD countries below 2% and 25 countries below 5%. Restrictions on the level of exclusions in PISA 2009 are described in Annex A2. The specific sample design and size for each country aimed to maximise sampling efficiency for student-level estimates. In OECD countries, sample sizes ranged from 4 410 students in Iceland to 38 250 students in Mexico. Countries with large samples have often implemented PISA both at national and regional/state levels (e.g. Australia, Belgium, Canada, Italy, Mexico, Spain, Switzerland and the United Kingdom). This selection of samples was monitored internationally and adhered to rigorous standards for the participation rate, both among schools selected by the international contractor and among students within these schools, to ensure that the PISA results reflect the skills of the 15-year-old students in participating countries. Countries were also required to administer the test to students in identical ways to ensure that students receive the same information prior to and during the test (for details, see Annex A4).
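The text above notes that overall exclusion rates rise only slightly once within-school exclusions are added to school-level exclusions. A minimal sketch of how the two rates combine, as an illustrative simplification rather than the official computation (which is defined in the PISA 2009 Technical Report): students can only be excluded within schools that were not themselves excluded, so the rates combine multiplicatively rather than by simple addition.

```python
def overall_exclusion_rate(school_level: float, within_school: float) -> float:
    """Combine a school-level exclusion rate with a within-school student
    exclusion rate (both expressed as fractions of the target population).

    Illustrative simplification: within-school exclusions apply only to
    students in schools that were not excluded at the school level.
    """
    return school_level + (1.0 - school_level) * within_school
```

For example, a 1% school-level rate combined with a 3% within-school rate yields an overall rate just under 4%, consistent with the pattern of small increases described above.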
Box V.A Key features of PISA 2009

Content
• The main focus of PISA 2009 was reading. The survey also updated performance assessments in mathematics and science. PISA considers students’ knowledge in these areas not in isolation, but in relation to their ability to reflect on their knowledge and experience and to apply them to real-world issues. The emphasis is on mastering processes, understanding concepts and functioning in various contexts within each assessment area.
• For the first time, the PISA 2009 survey also assessed 15-year-old students’ ability to read, understand and apply digital texts.

Methods
• Around 470 000 students completed the assessment in 2009, representing about 26 million 15-year-olds in the schools of the 65 participating countries and economies. Some 50 000 students took part in a second round of this assessment in 2010, representing about 2 million 15-year-olds from 10 additional partner countries and economies.
• Each participating student spent two hours carrying out pencil-and-paper tasks in reading, mathematics and science. In 20 countries, students were given additional questions via computer to assess their capacity to read digital texts.
• The assessment included tasks requiring students to construct their own answers as well as multiple-choice questions. The latter were typically organised in units based on a written passage or graphic, much like the kind of texts or figures that students might encounter in real life.
• Students also answered a questionnaire that took about 30 minutes to complete. This questionnaire focused on their personal background, their learning habits, their attitudes towards reading, and their engagement and motivation.
• School principals completed a questionnaire about their school that included demographic characteristics and an assessment of the quality of the learning environment at school.
Outcomes
PISA 2009 results provide:
• a profile of knowledge and skills among 15-year-olds in 2009, consisting of a detailed profile for reading and an update for mathematics and science;
• contextual indicators relating performance results to student and school characteristics;
• an assessment of students’ engagement in reading activities, and their knowledge and use of different learning strategies;
• a knowledge base for policy research and analysis; and
• trend data on changes in student knowledge and skills in reading, mathematics and science, on changes in student attitudes and socio-economic indicators, and in the impact of some indicators on performance results.

Future assessments
• The PISA 2012 survey will return to mathematics as the major assessment area; PISA 2015 will focus on science. Thereafter, PISA will turn to another cycle, beginning with reading again.
• Future tests will place greater emphasis on assessing students’ capacity to read and understand digital texts and solve problems presented in a digital format, reflecting the importance of information and computer technologies in modern societies.
Notes
1. The GDP of the countries that participated in PISA 2009 represents 87% of the 2007 world GDP. Some of the entities represented in this report are referred to as partner economies because they are not strictly national entities.
2. Thirty-one partner countries and economies originally participated in the PISA 2009 assessment and ten additional partner countries and economies took part in a second round of the assessment.
3. Marks, G.N. (2007); Bertschy, K., M.A. Cattaneo and S.C. Wolter (2009); OECD (2010a).
Reader’s Guide

Data underlying the figures
The data referred to in this volume are presented in Annex B and, in greater detail, on the PISA website (www.pisa.oecd.org). Five symbols are used to denote missing data:
a The category does not apply in the country concerned. Data are therefore missing.
c There are too few observations, or none, to provide reliable estimates (i.e. there are fewer than 30 students or fewer than five schools with valid data).
m Data are not available. These data were not submitted by the country or were collected but subsequently removed from the publication for technical reasons.
w Data have been withdrawn, or have not been collected, at the request of the country concerned.
x Data are included in another category or column of the table.

Country coverage
This publication features data on 65 countries and economies, including all 34 OECD countries and 31 partner countries and economies (see Figure V.A). The data from another 10 partner countries were collected one year later and will be published in 2011.
The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.

Calculating international averages
An OECD average was calculated for most indicators presented in this report. The OECD average corresponds to the arithmetic mean of the respective country estimates. Readers should therefore keep in mind that the term “OECD average” refers to the OECD countries included in the respective comparisons.

Rounding figures
Because of rounding, some figures in tables may not add up exactly to the totals. Totals, differences and averages are always calculated on the basis of exact numbers and are rounded only after calculation.
All standard errors in this publication have been rounded to one or two decimal places. Where the value 0.00 is shown, this does not imply that the standard error is zero, but that it is smaller than 0.005.

Reporting student data
The report uses “15-year-olds” as shorthand for the PISA target population. PISA covers students who are aged between 15 years 3 months and 16 years 2 months at the time of the assessment and who have completed at least 6 years of formal schooling, regardless of the type of institution in which they are enrolled; whether they are in full-time or part-time education; whether they attend academic or vocational programmes; and whether they attend public, private or foreign schools within the country.

Reporting school data
The principals of the schools in which students were assessed provided information on their schools’ characteristics by completing a school questionnaire. Where responses from school principals are presented in this publication, they are weighted so that they are proportionate to the number of 15-year-olds enrolled in the school.
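The averaging and rounding conventions described above can be sketched in a few lines. This is an illustrative sketch only, not OECD code; the country names and values below are invented for the example.

```python
# Illustrative sketch (not OECD code) of conventions from the Reader's Guide:
# the OECD average as an arithmetic mean of country estimates, rounding only
# after calculation, and the display rule for tiny standard errors.
# All country values below are hypothetical.

country_means = {"Country A": 493.4, "Country B": 501.7, "Country C": 486.9}

# "OECD average": unweighted arithmetic mean of the country estimates.
oecd_average = sum(country_means.values()) / len(country_means)

# Totals are computed on exact numbers and rounded afterwards, which is
# why rounded figures in tables "may not add up exactly to the totals":
parts = [1.4, 1.4]
total_rounded_after = round(sum(parts))              # round(2.8) -> 3
sum_of_rounded_parts = sum(round(p) for p in parts)  # 1 + 1 = 2

# A standard error displayed as "0.00" means it is below 0.005, not zero.
se = 0.004
displayed_se = f"{se:.2f}"  # "0.00"
```

The discrepancy between `total_rounded_after` and `sum_of_rounded_parts` is exactly the rounding artefact the Reader’s Guide warns about.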
Focusing on statistically significant differences
This volume discusses only statistically significant differences or changes. These are denoted in darker colours in figures and in bold font in tables. See Annex A3 for further information.

Abbreviations used in this report
ESCS PISA index of economic, social and cultural status
GDP Gross domestic product
ISCED International Standard Classification of Education
PPP Purchasing power parity
S.D. Standard deviation
S.E. Standard error

Further documentation
For further information on the PISA assessment instruments and the methods used in PISA, see the PISA 2009 Technical Report (OECD, forthcoming) and the PISA website (www.pisa.oecd.org).
This report uses the OECD’s StatLinks service. Below each table and chart is a URL leading to a corresponding Excel workbook containing the underlying data. These URLs are stable and will remain unchanged over time. In addition, readers of the e-books will be able to click directly on these links, and the workbook will open in a separate window if their Internet browser is open and running.
1. Comparing Performance over Time

This chapter describes how PISA has measured trends in reading performance between the first PISA assessment in 2000 and the latest in 2009. Since reading was the focus of both assessments, it is possible to obtain detailed comparisons of how student performance in reading changed between 2000 and 2009. The chapter also discusses the methods used for tracking trends in student performance in mathematics and science.
PISA 2009 is the fourth full assessment of reading since PISA was launched in 2000, the third assessment of mathematics since PISA 2003, when the first full assessment of mathematics took place, and the second assessment of science since PISA 2006, when the first full assessment of science took place. Both PISA 2000 and PISA 2009 focused on reading, so it is possible to obtain detailed comparisons of how student performance in reading changed over the 2000-2009 period. Comparisons over time in the areas of mathematics and science are more limited, since there have not yet been two full assessments of either area in nine years of PISA testing.

Box V.1.1 Interpreting trends requires some caution
• The methodologies underlying the establishment of performance trends in international studies of education are complex (Gebhardt and Adams, 2007). In order to ensure that the measurement of reading performance in different surveys is comparable, a number of common assessment items are used in each survey. However, the limited number of such items increases the risk of measurement errors. Therefore, the confidence band for comparisons over time is wider than for single-year data, and only changes that are indicated as statistically significant in this volume should be considered robust.1
• Some countries have not been included in comparisons between 2000, 2003, 2006 and 2009 for methodological reasons. The PISA 2000 sample for the Netherlands did not meet the PISA response-rate standards, and mean scores for the Netherlands were therefore not reported for 2000.
In Luxembourg, the assessment conditions were changed in substantial ways between the 2000 and 2003 PISA surveys; results are therefore only comparable among 2003, 2006 and 2009.2 The PISA 2000 and PISA 2003 samples for the United Kingdom did not meet the PISA response-rate standards, so data from the United Kingdom for those years are not comparable with those of other countries.3 For the United States, no reading results are available for 2006.4 The sampling weights for the PISA 2000 assessment in Austria have been adjusted to allow for comparisons with subsequent PISA assessments.5 For the PISA 2009 assessment, a dispute between teachers’ unions and the education minister led to a boycott of PISA, which was withdrawn only after the first week of testing. The boycott required the OECD to remove identifiable cases from the dataset. Although the Austrian dataset met the PISA 2009 technical standards after the removal of these cases, the negative atmosphere surrounding educational assessment may have affected the conditions under which the assessment was administered and could have adversely affected student motivation to respond to the PISA tasks. The comparability of the 2009 data with data from earlier PISA assessments can therefore not be ensured, and data for Austria have been excluded from trend comparisons.

Some countries did not participate in all PISA assessments. When comparing trends in reading, this volume looks at the 38 countries with valid results from the 2000 and 2009 assessments.6 When comparing trends in mathematics, it considers 39 countries with valid results from the 2003 and 2009 assessments. PISA 2000 results in mathematics are not considered, since the first full assessment of mathematics took place in 2003. Similarly, science performance in 2009 cannot be compared with that of PISA 2000 or PISA 2003, since the first full science assessment took place in 2006.
Thus, when comparing trends in science, the 56 countries with valid results from the 2006 and 2009 assessments are included. Annex A5 provides a list of the countries considered in this trends analysis. Among OECD countries, the Slovak Republic and Turkey joined PISA in 2003, Chile and Israel did not participate in the PISA 2003 assessment, and Estonia and Slovenia participated only in 2006 and 2009. The different number of OECD countries participating in successive PISA assessments is reflected in separate OECD averages that provide reference points for trend comparisons. For reading, the main reference point is the OECD average for the 26 OECD countries that participated in both PISA 2000 and PISA 2009; for comparisons involving all four assessments, the average for the 23 OECD countries that participated in all of them is also provided. For mathematics, trends can be calculated for the OECD average of the 28 OECD countries that have valid results for both PISA 2003 and PISA 2009. Thirty-three OECD countries have valid results for the 2006 and 2009 assessments in science. Annex A5 gives more details on how the OECD average was calculated for the different trend comparisons presented in this volume.
Figure V.1.1 summarises trends in reading performance. The first column indicates whether reading performance in PISA 2009 was above (blue), at (no colour) or below (grey) the average for OECD countries. Countries are sorted by the magnitude of change in reading performance from PISA 2000 to PISA 2009, which is reported in the second column. Increases in performance are indicated in blue; decreases are indicated in grey. No colour means that there was no statistically significant change in performance. In addition, the chart highlights changes in reading performance separately for boys and girls, changes in the proportion of lowest performers (below proficiency Level 2) and in the proportion of top performers (students at proficiency Level 5 or 6). The last column shows changes in the relationship between the socio-economic background of students and student performance, which indicates whether equity in the distribution of educational opportunities has increased (when the relationship has weakened) or decreased (when the relationship has strengthened).7 In all cases, blue indicates positive change, grey indicates negative change, and no colour means that there has been no statistically significant change.

• Figure V.1.1 • A summary of changes in reading performance

Legend:
Blue: mean score in reading 2009 is statistically significantly above the OECD average; changes in reading and in the share of students at proficiency Level 5 or above are statistically significantly positive; changes in the share of students below proficiency Level 2 and in the association of socio-economic background with reading are statistically significantly negative.
No colour: mean score in reading 2009 is not statistically significantly different from the OECD average; changes in reading, in the share of students at proficiency Level 5 or above, in the share of students below proficiency Level 2 and in the association of socio-economic background with reading are not statistically significant.
Grey: mean score in reading 2009 is statistically significantly below the OECD average; changes in reading and in the share of students at proficiency Level 5 or above are statistically significantly negative; changes in the share of students below proficiency Level 2 and in the association of socio-economic background with reading are statistically significantly positive.

Columns: mean score in reading 2009; change in reading performance between 2000 and 2009 for all students, boys and girls; change in the share of students below proficiency Level 2; change in the share of students at proficiency Level 5 or above; change in the association of socio-economic background with reading performance.

Peru 370 43 35 50 -14.8 0.4 0.1
Chile 449 40 42 40 -17.6 0.8 -7.6
Albania 385 36 35 39 -13.7 0.1 -9.9
Indonesia 402 31 23 39 -15.2 0.0 -6.9
Latvia 484 26 28 23 -12.5 -1.2 -11.0
Israel 474 22 9 35 -6.7 3.3 -8.4
Poland 500 21 14 28 -8.2 1.3 -1.5
Portugal 489 19 12 26 -8.6 0.6 -4.7
Liechtenstein 499 17 16 17 -6.4 -0.4 -13.3
Brazil 412 16 9 21 -6.2 0.8 -0.6
Korea 539 15 4 25 0.0 7.2 8.5
Hungary 494 14 11 17 -5.1 1.0 -4.2
Germany 497 13 10 15 -4.2 -1.2 -7.7
Greece 483 9 3 13 -3.1 0.6 2.0
Hong Kong-China 533 8 0 17 -0.8 2.9 -8.6
Switzerland 501 6 1 10 -3.6 -1.1 -2.3
Mexico 425 3 1 6 -4.0 -0.5 -7.3
Belgium 506 -1 0 -5 -1.2 -0.8 0.7
Bulgaria 429 -1 -8 6 0.7 0.6 -4.5
Italy 486 -1 -5 2 2.1 0.5 3.2
Denmark 495 -2 -5 -1 -2.7 -3.4 -3.2
Norway 503 -2 -5 -1 -2.5 -2.8 0.4
Russian Federation 459 -2 -6 1 -0.1 -0.0 1.4
Japan 520 -2 -6 3 3.5 3.6 c
Romania 424 -3 -18 11 -0.9 -1.5 10.7
United States 500 -5 -2 -6 -0.3 -2.4 -9.2
Iceland 500 -7 -10 -6 2.3 -0.5 5.4
New Zealand 521 -8 -8 -8 0.6 -3.0 4.9
France 496 -9 -15 -4 4.6 1.1 7.0
Thailand 421 -9 -6 -10 5.8 -0.2 -0.7
Canada 524 -10 -12 -10 0.7 -4.0 -6.4
Finland 536 -11 -12 -8 1.2 -4.0 5.8
Spain 481 -12 -14 -10 3.3 -0.9 1.5
Australia 515 -13 -17 -13 1.8 -4.9 -1.4
Czech Republic 478 -13 -17 -6 5.6 -1.9 -11.4
Sweden 497 -19 -24 -15 4.9 -2.2 7.7
Argentina 398 -20 -15 -22 7.7 -0.7 -1.7
Ireland 496 -31 -37 -26 6.2 -7.3 5.8

Countries are ranked in descending order of the change in reading performance between 2000 and 2009 for all students.
Source: OECD, PISA 2009 Database, Tables V.2.1, V.2.2, V.2.4 and V.4.3. http://dx.doi.org/10.1787/888932359948
A corrigendum has been issued for this page. See: http://www.oecd.org/dataoecd/43/61/49198566.pdf
In several countries, student achievement has improved markedly across successive PISA assessments since 2000 (Table V.2.1). Each of these countries offers an example of an education system that succeeded in improving its outcomes (see Chapter 2). This volume includes brief descriptions of some of the education systems that have seen marked improvements in the performance of their students in PISA. Notes on Korea (Box V.B) and Poland (Box V.C) appear between Chapters 1 and 2, notes on Portugal (Box V.D) and Turkey (Box V.E) appear between Chapters 3 and 4, a note on Chile (Box V.F) appears between Chapters 4 and 5, and a note on Brazil (Box V.G) appears after Chapter 5.

School systems differ in many ways, including their overall performance level, the socio-economic background of students and schools, the learning environment at school and how school systems are organised. It is therefore important to interpret changes in learning outcomes in the context of the underlying characteristics of education systems. In some of the education systems whose performance improved or declined, part of the change can be attributed to changes in the demographic profile of students. For example, in some countries, student populations have become more socio-economically diverse in recent years, which, as Volume II, Overcoming Social Background, shows, can be associated with performance disadvantages. A decline in performance may therefore reflect a more challenging socio-economic context rather than a decline in the quality of the educational services provided. To account for such changes, observed changes in reading performance are discussed together with trend estimates that have been adjusted for changes in the demographic and socio-economic profile of students and schools.
More detailed descriptions of trends in equity in learning opportunities and outcomes (see Chapter 4) and trends in the learning environment (see Chapter 5) that have been observed since 2000 are also presented in this volume. Annex A1 provides details on how performance scales were equated and how trends were computed. Annex A6 provides details on how performance scales were adjusted for demographic and socio-economic context. Overall, the evidence suggests that the performance trends reported in this volume are not affected by methodological choices and that, in most countries, they are not driven by changes in the demographic and socio-economic composition of the student population.

This volume also discusses trends in mathematics and science, although comparisons over time are much more limited (see Chapter 3). Figure V.1.2 below summarises trends for all three assessment areas. Countries are sorted by their reading performance in 2009. Since the trends for reading are calculated over a nine-year period for most of the countries, and over a six-year or three-year period for some of them, the trends have been annualised to make them comparable across the three subject areas.8 Trends for mathematics and science were also annualised, as they are calculated over a six-year or three-year period for mathematics and over a three-year period for science. Although the annualised figures ensure that the magnitude of changes is comparable across subject areas, greater variability in reading trends is expected, as the longer reporting period for reading provides more opportunities to reflect changes in education systems; this has indeed been observed. Results are reported for all countries that participated in at least two assessments. The number of years for which reading performance trends were calculated is given after the mean reading performance.
Trends in mathematics were calculated over six years if a country participated from at least 2003, or over three years if it participated only in the last two assessments. All trends in science were calculated over the three years between 2006 and 2009. Among countries that scored at or above the OECD average, Portugal improved in all assessment areas, Korea and Poland improved in both reading and science, Germany improved in reading and mathematics, Hungary and Liechtenstein improved in reading, and Norway and the United States improved in science.
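The "association of socio-economic background with reading performance" shown in Figure V.1.1 is, per note 7, the slope of a regression of reading score on the ESCS index. The sketch below is illustrative only: the five (ESCS, score) pairs are invented, and the actual PISA estimation additionally involves sampling weights and plausible values.

```python
# Hypothetical sketch of the "association of socio-economic background with
# reading performance": the slope coefficient of a simple least-squares
# regression of reading score on the ESCS index (see note 7 to this chapter).
# The data points are invented for illustration.

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var = sum((xi - mean_x) ** 2 for xi in x)
    return cov / var

escs = [-1.2, -0.5, 0.0, 0.4, 1.1]             # ESCS index values (made up)
reading = [430.0, 470.0, 495.0, 510.0, 545.0]  # reading scores (made up)

slope = ols_slope(escs, reading)
# A larger slope means performance is more strongly tied to socio-economic
# background; a weakening slope over time is read as increasing equity.
```

The change in this slope between 2000 and 2009 is what the last column of Figure V.1.1 reports.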
• Figure V.1.2 • A summary of annualised performance trends in reading, mathematics and science

Legend:
Blue: mean score in reading 2009 is statistically significantly above the OECD average; annualised score-point changes in reading, mathematics and science are statistically significantly positive.
No colour: mean score in reading 2009 is not statistically significantly different from the OECD average; annualised score-point changes in reading, mathematics and science are not statistically significantly different from zero.
Grey: mean score in reading 2009 is statistically significantly below the OECD average; annualised score-point changes in reading, mathematics and science are statistically significantly negative.

Columns: mean score in reading 2009; number of years for which PISA results are available; annualised change in reading; in mathematics; in science.

Korea 539 9 1.6 0.7 5.3
Finland 536 9 -1.2 -0.6 -3.1
Hong Kong-China 533 8 1.0 0.7 2.3
Canada 524 9 -1.1 -0.9 -1.9
New Zealand 521 9 -0.9 -0.7 0.5
Japan 520 9 -0.3 -0.9 2.7
Australia 515 9 -1.5 -1.7 0.1
Netherlands 508 6 -0.8 -2.0 -0.9
Belgium 506 9 -0.1 -2.3 -1.3
Norway 503 9 -0.2 0.5 4.4
Estonia 501 3 0.1 -0.8 -1.2
Switzerland 501 9 0.7 1.2 1.7
Poland 500 9 2.4 0.8 3.4
Iceland 500 9 -0.7 -1.4 1.6
United States 500 9 -0.5 0.8 4.4
Liechtenstein 499 9 1.9 0.0 -0.7
Sweden 497 9 -2.1 -2.5 -2.7
Germany 497 9 1.5 1.6 1.6
Ireland 496 9 -3.4 -2.6 -0.1
France 496 9 -1.0 -2.3 1.0
Chinese Taipei 495 3 -0.3 -2.1 -4.0
Denmark 495 9 -0.2 -1.8 1.1
United Kingdom 494 3 -0.3 -1.0 -0.4
Hungary 494 9 1.6 0.0 -0.4
Portugal 489 9 2.1 3.5 6.2
Macao-China 487 6 -1.8 -0.3 0.1
Italy 486 9 -0.2 2.9 4.5
Latvia 484 9 2.9 -0.2 1.4
Slovenia 483 3 -3.8 -1.0 -2.4
Greece 483 9 1.0 3.5 -1.1
Spain 481 9 -1.3 -0.3 -0.1
Czech Republic 478 9 -1.5 -3.9 -4.1
Slovak Republic 477 6 1.4 -0.3 0.6
Croatia 476 3 -0.5 -2.4 -2.3
Israel 474 8 2.7 1.7 0.3
Luxembourg 472 6 -1.2 -0.7 -0.8
Lithuania 468 3 -0.5 -3.3 1.2
Turkey 464 6 3.9 3.7 10.0
Russian Federation 459 9 -0.3 -0.1 -0.4
Chile 449 8 5.0 3.2 3.1
Serbia 442 6 5.0 0.9 2.4
Bulgaria 429 8 -0.2 4.9 1.7
Uruguay 426 6 -1.4 0.8 -0.3
Mexico 425 9 0.4 5.5 2.1
Romania 424 7 -0.5 4.1 3.3
Thailand 421 8 -1.2 0.3 1.4
Colombia 413 3 9.3 3.6 4.6
Brazil 412 9 1.7 5.0 5.0
Montenegro 408 3 5.2 1.1 -3.5
Jordan 405 3 1.5 0.9 -2.2
Tunisia 404 6 4.8 2.1 5.1
Indonesia 402 8 3.9 1.9 -3.6
Argentina 398 8 -2.5 2.3 3.2
Albania 385 8 4.5 m m
Qatar 372 3 19.8 16.7 10.0
Peru 370 8 5.3 m m
Azerbaijan 362 3 2.9 -15.0 -3.1
Kyrgyzstan 314 3 9.8 6.9 2.5

Countries are ranked in descending order of the mean score in reading in 2009.
Source: OECD, PISA 2009 Database. http://dx.doi.org/10.1787/888932359948
A corrigendum has been issued for this page. See: http://www.oecd.org/dataoecd/43/61/49198566.pdf
Notes
1. Normally, when making comparisons between two concurrent means, significance is indicated by calculating the ratio of the difference of the means to the standard error of that difference. If the absolute value of this ratio is greater than 1.96, a true difference is indicated with 95% confidence. When comparing two means taken at different times, as in the different PISA surveys, an extra error term, known as the linking error, is introduced, and the resulting statement of significant difference is more conservative.
2. For Luxembourg, changes were made in the organisational and linguistic aspects of the assessment conditions between PISA 2000 and PISA 2003 in order to improve compliance with OECD standards and to better reflect the national characteristics of the school system. In PISA 2000, students in Luxembourg had been given one assessment booklet, with the language of assessment chosen by each student one week before the assessment. In practice, however, a lack of familiarity with the language of assessment was a significant barrier for a large proportion of students in Luxembourg in PISA 2000. In PISA 2003 and PISA 2006, each student was given two assessment booklets – one in each of the two languages of instruction – and the student could choose his or her preferred language immediately prior to the assessment. This provided assessment conditions that were more comparable with those in countries that have only one language of instruction and resulted in a fairer assessment of the performance of students in mathematics, science, reading and problem solving. As a result of this change in procedures, the assessment conditions, and hence the assessment results, for Luxembourg cannot be compared between PISA 2000 and PISA 2003. Assessment conditions between PISA 2003 and PISA 2006 were not changed, and those results can therefore be compared.
3. In PISA 2000, the initial response rate for the United Kingdom fell 3.7% short of the minimum requirement. At that time, the United Kingdom provided evidence to the PISA Consortium that allowed for an assessment of the expected performance of the non-participating schools. On the basis of that evidence, the PISA Consortium concluded that the response bias was likely negligible, and the results were included in the international report. In PISA 2003, the United Kingdom’s response rate was such that sampling standards had not been met, and a further investigation by the PISA Consortium did not confirm that the resulting response bias was negligible. Therefore, these data were not deemed internationally comparable and were not included in most types of comparisons. For PISA 2006 and PISA 2009, more stringent standards were applied, and PISA 2000 and PISA 2003 data for the United Kingdom are therefore not included in comparisons.
4. In the United States, because of an error in printing the test booklets, some of the reading items had incorrect instructions; as a result, mean performance in reading cannot be accurately estimated. The impact of the error on the estimates of student performance is likely to exceed one standard error of sampling. This was not the case for science and mathematics items. For details, see Annex A3.
5. As noted in the PISA 2000 Technical Report (OECD, 2002a), the Austrian sample for the PISA 2000 assessment did not cover students enrolled in combined school- and work-based vocational programmes, as required by the technical standards for PISA. The published PISA 2000 estimates for Austria were therefore biased (OECD, 2001). This non-conformity was corrected in the PISA 2003 assessment. To enable reliable comparisons, adjustments and modified student weights were developed to make the PISA 2000 estimates comparable to those obtained in PISA 2003 (Neuwirth, 2006, available at http://www.oecd-ilibrary.org/education/oecd-education-working-papers_19939019).
6. Albania, Argentina, Bulgaria, Chile, Hong Kong-China, Indonesia, Israel, Peru and Thailand delayed the PISA 2000 assessment to 2001, while Romania delayed it to 2002. Thus, for these countries, the period of time between the PISA 2000 and PISA 2009 assessments is shorter.
7. The relationship between student socio-economic background and performance is captured by the slope coefficient of the PISA index of economic, social and cultural status (ESCS) in a regression explaining student reading performance (see Chapter 4).
8. The annualised trends reported here were calculated by dividing the change in performance by the number of years between two assessments. For example, the change in reading performance between 2000 and 2009 was divided by nine for countries that participated in the first and the most recent assessments. For countries that participated in PISA 2003 and PISA 2009 but not in PISA 2000, the change in reading performance between 2003 and 2009 was divided by six. Similarly, for participants in PISA 2006 and PISA 2009, the change in performance was divided by three. Although annualised trends were calculated for mathematics, PISA 2000 results were not considered. For science, the change in performance between 2006 and 2009 was divided by three.
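The two calculations described in notes 1 and 8 can be sketched in a few lines. This is an illustrative sketch only; all input numbers, including the linking-error value, are hypothetical rather than taken from the PISA database.

```python
import math

# Sketch of the calculations in notes 1 and 8. All numbers below,
# including the linking-error value, are hypothetical.

def change_is_significant(mean_new, se_new, mean_old, se_old, linking_error):
    """Note 1: a change between two PISA cycles is significant at the 95%
    level if |difference / SE of the difference| > 1.96, where the linking
    error inflates the standard error of the difference, making the test
    more conservative than a comparison of two concurrent means."""
    diff = mean_new - mean_old
    se_diff = math.sqrt(se_new**2 + se_old**2 + linking_error**2)
    return abs(diff / se_diff) > 1.96

def annualised_trend(score_new, score_old, years_between):
    """Note 8: divide the score change by the number of years between the
    two assessments (9 for 2000-2009, 6 for 2003-2009, 3 for 2006-2009)."""
    return (score_new - score_old) / years_between

# A hypothetical 21-point gain over nine years annualises to about
# 2.3 score points per year.
gain_per_year = annualised_trend(500.0, 479.0, 9)
significant = change_is_significant(500.0, 2.0, 479.0, 2.2, 3.0)
```

With a non-zero linking error, some differences that would pass a concurrent-means test fall short of the 1.96 threshold, which is why the volume flags only the robust changes.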
Country Boxes

Box V.B Korea

In 2000, with PISA reading performance at 525 score points, Korea was already performing above the OECD average. At that time, several countries had similar or even higher performance levels, including Australia, Canada, Ireland, Japan, New Zealand and Finland, the highest-performing country that year. Nine years later, Finland has retained its top performance level, but Korea now outperforms all of the other abovementioned countries. Korea’s experience demonstrates that even at the highest performance levels further improvements are possible.

Despite the country’s strong performance in PISA 2000, Korean policy makers considered that students’ skills needed further improvement to meet the changing demands of an internationally competitive labour market. One approach was to shift the focus of the Korean Language Arts Curriculum from proficiency in grammar and literature to the skills and strategies needed for creative and critical understanding and representation, along the lines of the approach underlying PISA. Diverse teaching methods and materials that reflected those changes were developed, including investments in related digital and Internet infrastructure.

Recognising reading as a key competence in the 21st century, the government also developed and implemented reading-related policies. Schools were requested to spend a fixed share of their budgets on reading education. Training programmes for reading teachers were developed and distributed. Parents were encouraged to participate more in school activities and were given information on how to support their children’s schoolwork. In addition, socio-economically disadvantaged students were given support through various after-school reading, writing and mathematics courses that had been put in place at the end of the 1990s.
The new “National Human Resources Development Strategies for Korea” defined policy objectives and implementation strategies. As part of this, and following experiences with PISA and other instruments, the government established the National Diagnostic Assessment of Basic Competency (NDABC) and strengthened the National Assessment of Educational Achievement (NAEA) as measurement tools for monitoring the quality of educational achievement. These instruments were implemented to ensure that all students had attained basic competencies. The NDABC was introduced as a diagnostic tool in 2002 to measure basic competency in reading, writing and mathematics for third-grade students. These measurement tools are now used locally to diagnose the progress of elementary and middle-school students across different subjects. The NAEA programme was introduced in 1998. Following changes in educational policy in 2002, the programme has expanded its subject and grade coverage. NAEA assesses educational achievement and trends for 6th-, 9th- and 10th-grade students in Korean Language Arts, social studies, mathematics, science and English. With the help of NAEA, the government monitors individual student performance levels and the accountability of public education.

Since 2000, Korea has seen significant improvements in both reading and science performance (see Figure V.1.2 and Tables V.2.1 and V.3.4). The proportion of top performers in reading increased in Korea by more than seven percentage points, from 5.7% to 12.9%, between 2000 and 2009 (see Figure V.2.5 and Table V.2.2) – the largest observed change among countries participating in PISA. Korea also improved its scores in science from an already high level in 2006 (see Figure V.3.5 and Table V.3.4). Moreover, in 2006, 11% of its students scored below Level 2 in science, whereas by 2009 this proportion had been reduced to 6% – nearly the lowest among the OECD countries (see Figure V.3.7 and Table V.3.5).
On the other hand, Korea is among the countries that have seen the highest increase in variation of reading performance (see Figure V.4.1 and Table V.4.1). A closer look reveals that the increase was driven by improvements among high-achieving students that were not shared by low-achieving students (see Figure V.2.11 and Table V.2.3). The 2009 results from Korea also show a modest increase in the association of socio-economic background with PISA performance.

One factor that may have contributed to the increase in the number of top performers in reading is the introduction of higher standards and the demand for language literacy. Korean Language Arts has been strengthened as a screening subject in the College Scholastic Ability Test (CSAT), which students must take to get into university, especially top-ranking institutions. Depending on what subjects they intend to take at university and on their future careers, students generally select five to seven subjects on the assessment. However, almost all top-ranking universities focus on Korean Language Arts, mathematics and English. The reading domain of Korean Language Arts, in particular, is the largest and most important part of this assessment, while the NAEA and NDABC tend to evaluate the six domains of the Korean Language Arts Curriculum – listening, speaking, reading, writing, literature and grammar – equally. This provides additional incentives for high-achieving students in Korea to spend more time studying the language arts, as well as mathematics and science.

Korea is also one of the countries with the highest number of students participating in after-school lessons. More than two-thirds of students participate in such lessons for remedial purposes, while nearly half of the students participate in after-school lessons for enrichment purposes in at least one of the following three subjects: science, mathematics and reading (see Volume IV, What Makes a School Successful? Resources, Policies and Practices, Table IV.3.17a). While private lessons are very popular in Korea among those who can afford them, after-school group classes are often subsidised, so even disadvantaged students enrol frequently. For example, as of June 2007, 99.8% of all primary and secondary schools were operating after-school programmes and about 50% of all primary and secondary students participated in after-school activities (see MEHRD, 2007). Many observers suspect that high participation rates in after-school classes in Korea may be due to cultural factors and an intense focus on preparation for university entrance examinations. PISA 2006 data show that Korean students attending schools with socio-economically advantaged students are more likely to attend after-school lessons with private teachers than students in other countries.
On the other hand, disadvantaged students in Korea are more likely to attend after-school group lessons than students in other countries. In both cases, attending such extra lessons after school is associated with higher performance on PISA (OECD, 2010d). The gender gap increased by 20 score points in Korea, mainly because of a marked improvement in girls’ performance that was not matched by a similar trend among boys (see Figure V.2.7 and Table V.2.4). The percentage of top performers increased among girls by more than nine percentage points, while among boys it rose by slightly less than five percentage points (see Tables V.2.5 and V.2.6). Overall, average reading performance improved only among girls, while it remained at similar levels among boys. The remarkable improvement in girls’ performance was observed not only in reading, but also in other assessment areas covered by PISA and in other international and national studies. The gender gap in mathematics and science has been narrowing for a number of years, while PISA 2009 results show that the reading advantage of girls has become even greater. National assessments demonstrate that the number of girls performing at the highest levels has been gradually increasing since 2002. Several changes could be associated with the more positive trend among girls. Since 2000, a more female-friendly science and mathematics curriculum has been gradually introduced. For instance, women who were scientists or engineers were promoted and thus became good role models for girls, more gender-neutral language was used in textbooks, and learning materials that were considered to be more interesting for girls were introduced in science teaching. At the same time, national assessments such as the NAEA were re-developed to better monitor how girls and boys acquire skills differently and to use formats that girls prefer, including, for example, the constructed-response item format.
On the other hand, the trend may also be explained partly by changes in society. Over the past few years, the family structure in Korea has changed as the number of children per household has rapidly decreased and the number of single-child families has increased. While traditionally girls from larger families were unlikely to get a good education, sociologists note that parents in Korea today tend to value educating their children a great deal, regardless of gender. Smaller families, along with new opportunities and incentives for learning, may also explain this trend. Korean students’ lower performance in the PISA 2006 science assessment compared with the 2003 assessment prompted policy makers to integrate modern science into school programmes. Although the number of Korean students who performed below Level 2 in both mathematics and science was very small compared to other countries, Korean officials considered the overall level of science performance too low. In 2007, the Korean government decided to merge the Ministry of Science and Technology and the Ministry of Education into one ministry and to improve and strengthen science education in order to enhance creativity and problem-solving skills. The measures undertaken cut across different activities, including providing new mathematics and science textbooks that are more comprehensible and more interesting for students, as well as using teaching methods that encourage experimentation and inquiry-oriented science education. Recent
improvements in science, especially among top-performing students, could be associated with these latest policy changes. Nevertheless, larger improvements are expected at all performance levels once the new policy is fully implemented.

Box V.C Poland

In 2000, Poland’s 15-year-old students averaged 479 score points on the PISA reading assessment, well below the OECD average of 500. More troubling for policy makers in Poland was the fact that over 23% of students had not reached the baseline Level 2 in reading. The PISA results also showed large disparities in reading performance between students attending various types of secondary schools. The mean score among students enrolled in basic vocational schools – who, at that time, constituted more than one-fifth of all students – was 358 score points, while the mean score among students enrolled in general academic schools was 543 score points and that of students in secondary vocational schools was around 480 score points. Even prior to the release of the PISA results in 2000, plans were already under way in Poland to improve student learning outcomes. In 1998, the Polish Ministry of Education presented the outline of a reform agenda to: i) raise the level of education in Poland by increasing the number of people with secondary and higher-education qualifications; ii) ensure equal educational opportunities; and iii) support improvements in the quality of education. The reform was also part of a larger set of changes, including devolving more responsibilities for education to local authorities, health reforms and pension-system reforms.
The education reform envisaged changes in the structure of the education system, reorganising the school network and transportation; changes in administration and supervision methods; changes in the curriculum; a new central examination system with independent student assessments; reorganising school finances through local government subsidies; and new teacher incentives, such as alternative promotion paths and a revised remuneration system. Although not all of the proposed changes were ultimately implemented as envisaged, the reform clearly changed the way schools in Poland were managed, financed and evaluated, while also affecting teaching content and methods. The structural changes resulted in a new type of school: the lower secondary “gymnasium”, with the same general education programme for all students, which became a symbol of the reform. The previous structure, comprising eight years of primary school followed by four or five years of secondary school or a three-year basic vocational school, was replaced by a system described as 6+3+3. This meant that education at primary school was reduced from eight to six years. After completing primary school, a pupil would then continue his or her education in a comprehensive three-year lower secondary school. Thus, the period of general education, based on the same curriculum and standards for all students, was extended by one year. Only after completing three years of lower secondary education would he or she move on to a three- or four-year upper secondary school that provided access to higher education or to a two- or three-year basic vocational school. In the new system, each stage of education ends with a standardised national examination, which provides students, parents and teachers with feedback. Policy makers can also use the results of the examination to monitor the school system at the local or central level.
The reformers assumed that the lower secondary gymnasia would allow Poland to raise the level of education, particularly in rural areas where schools were small. The new lower secondary schools would be larger; they would also be well-equipped, with qualified teachers. Since the number of pupils in each school varies depending on the catchment area, establishing the lower secondary gymnasia involved reorganising the school network. Thus, since 2000, a number of small primary schools have been closed, with many more students travelling to larger lower secondary schools.
The reform postponed the choice between an upper secondary general or vocational curriculum by one year, giving all students one more year of a general lower secondary programme. The reform did not involve pre-primary education, nor did it result in lowering the age at which compulsory schooling begins (seven years); rather, it focused on primary and lower secondary schools. In the meantime, enrolment in higher education increased from roughly half a million students before 1993 to nearly two million 15 years later (see GUS, 2009). This also changed the environment in which the newly established schools operated, with more parents keen to provide their children with the best education and more students choosing schools carefully, taking into consideration future career prospects. Education became highly valued in Poland as the economic returns of a good education grew (see OECD, 2006a). The reformers had two main arguments for proposing such changes. First, dividing education into stages would allow teaching methods and curricula to better meet the specific needs of pupils of various ages. Second, changing the structure of the education system would require teachers to adapt the curriculum and their teaching methods, encouraging teachers to change not only what they taught but also how they taught. After years of complaints about overloaded curricula and disputes about the way forward, the concept of a core curriculum was adopted. This gave schools extensive autonomy to create their own curricula within a pre-determined general framework, balancing the three goals of education: imparting knowledge, developing skills and shaping attitudes. The curricular reform was designed not only to change the content of school education and to encourage innovative teaching methods, but also to change the teaching philosophy and culture of schools.
Instead of passively following the instructions of the educational authorities, teachers were expected to develop their own teaching styles, tailored to the needs of their pupils. Introducing curricular reform based on decentralisation required implementing, at the same time, a system for collecting information and monitoring the education system. The reformers thus decided to organise compulsory assessments to evaluate student achievement at the end of the primary and lower secondary cycles. The results of the primary school assessments would not affect the students’ school career, as completing the cycle would not depend on the results of those assessments. However, for admission to upper secondary schools, the score earned on the lower secondary gymnasium final examination would be considered together with the pupil’s final marks. Both of those examinations were first administered in 2002. Schooling would culminate with the matura examination, first administered in 2005, which would be taken at the end of upper secondary education. All of these examinations would be organised, set and corrected by the central examination board and regional examination boards – new institutions that had been set up as part of the reform. Introducing the national examination system not only provided an opportunity to monitor the outcomes of schools centrally in a partly decentralised system, it also changed incentives for students and teachers. It sent a clear signal to students that their success depended directly on their externally evaluated outcomes. It also made it possible to assess teachers and schools on a comparable scale across the whole country. Finally, it provided local governments with information on the outcomes of schools that were now under their organisational and financial responsibility. After the reform, local governments became an even more important part of the Polish school system.
Although by 1996 almost all primary schools were already under the responsibility of local governments, changes in the financing scheme had been introduced together with the reform. The need to reorganise the school network in order to create space for lower secondary schools provided additional incentives for local governments to increase the efficiency and the quality of their local schools. After 2000, school funds were transferred to local governments using a per-pupil formula. Those funds now constitute a large share of their budgets. After 2002, some local governments also started using results from national examinations to evaluate their schools and to shape pre-primary and upper secondary education in their local area. The reform also introduced a new system of teacher development and evaluation. Initially, many teachers upgraded their levels of education and professional skills to meet those new requirements. But the changes only partly affected the remuneration system, which gives local governments and school principals little discretion. This, together with high employment security and other benefits contained in the so-called Teacher Charter, limited the impact of changes on the teaching profession (see OECD, 2006a).
The age cohorts covered by PISA in 2000, 2003 and 2006 were affected by the reform in different ways. The first group, those assessed in 2000, was not affected by the reform. The group of 15-year-olds covered by the second PISA assessment, in 2003, started primary school in the former system but attended the new lower secondary gymnasia. Those students all followed the same educational curricula and were not divided into different school types. The group covered by PISA 2006 had been part of the reformed educational system for most of its school career, while those assessed in 2009 had been part of that system for their entire school career. While it is not possible to establish a causal relationship between the reform and the outcomes measured by PISA, reading performance in Poland has improved by 21 score points since 2000 (see Figure V.2.1 and Table V.2.1). The largest improvement was observed in PISA 2003, right after the reform. The PISA 2009 results suggest that the lowest-performing students appear to have benefited most from the reform. The share of students below proficiency Level 2 decreased by eight percentage points and the performance of the lowest-achieving students increased by 40 score points, while remaining at similar levels for the highest-achieving students (see Figure V.2.4 and Tables V.2.2 and V.2.3). Additional analyses suggest that students from former vocational schools benefited most from these reforms (see Jakubowski, Patrinos, Porta and Wisniewski, 2010). Lower secondary school students assessed in 2006 with the same background as students who were in basic vocational schools in 2000 performed roughly one standard deviation higher on the PISA reading scale.
Smaller improvements were also apparent among 2006 lower secondary school students who had a similar background to those in secondary vocational schools in 2000, although the benefits to those who were similar to students in general upper secondary schools in 2000 were negligible. This suggests that the reform improved outcomes for students who, under the old system, would have ended up in the former basic vocational schools and who were given a chance to acquire more general skills in the newly created lower secondary schools. Poland reduced its total variation in reading performance by 20% (see Figure V.4.1 and Table V.4.1). This was mainly achieved by reducing the differences in performance between schools and improving performance among the lowest achievers. From a relatively high level in 2000, between-school variation decreased by three-fourths, to a level well below the OECD average. Moreover, by 2009, the association between a school’s socio-economic background and its mean performance was three times weaker than in 2000, although the overall impact of socio-economic background on performance remained unchanged (see Figure V.4.3 and Table V.4.3). This suggests that the school reform in Poland had the effect of distributing students from different backgrounds more evenly across schools. Taken together, the overall improvement in performance, the larger improvements among the lowest-achieving students and the decrease in the total variation of student performance suggest that Poland improved markedly, both in its mean performance and in the level of equity in the distribution of learning opportunities.
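The between-school and within-school components of variation discussed above follow the standard decomposition of variance: the total variance of student scores equals the variance of school mean scores (between schools) plus the average variance within each school. The sketch below illustrates the arithmetic only, on made-up scores; it uses unweighted population variances, whereas the actual PISA estimates apply student sampling weights.

```python
from statistics import fmean, pvariance

def variance_decomposition(schools):
    """Split total score variance into between- and within-school parts.

    `schools` is a list of lists of student scores, one inner list per
    school. Illustrative only: population variances, no sampling weights.
    """
    all_scores = [s for school in schools for s in school]
    total = pvariance(all_scores)
    grand_mean = fmean(all_scores)
    n = len(all_scores)
    # Between: size-weighted variance of the school mean scores
    between = sum(len(sch) * (fmean(sch) - grand_mean) ** 2 for sch in schools) / n
    # Within: size-weighted average of each school's internal variance
    within = sum(len(sch) * pvariance(sch) for sch in schools) / n
    return total, between, within

# Hypothetical scores for three schools of three students each
schools = [[420, 450, 480], [500, 520, 540], [560, 580, 600]]
total, between, within = variance_decomposition(schools)
# By the law of total variance, total == between + within
```

In this toy example most of the variation sits between schools; a reform that mixes students across schools, as in Poland, shifts variation from the between-school to the within-school component without necessarily changing the total.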
Trends in Reading

This chapter highlights trends in reading performance between 2000 and 2009. It includes changes in performance among the lowest- and highest-achieving students, boys and girls, students with an immigrant background, socio-economically advantaged and disadvantaged students, and among countries.
Continuity and change in the reading literacy framework and assessment

Reading literacy includes a broad set of cognitive competencies, from basic decoding, through knowledge of words, grammar and linguistic and textual structures and features, to knowledge about the world. It also includes metacognitive competencies: the awareness of and ability to use a variety of appropriate strategies when processing texts. Specifically, PISA defines reading literacy as understanding, using and reflecting on written texts in order to achieve one’s goals, acquire knowledge, develop one’s potential and participate in society (OECD, 2006b). A more detailed description of the conceptual framework underlying the PISA reading assessment is provided in Volume I of this report, What Students Know and Can Do.

The framework and instruments for measuring reading literacy were developed for the PISA 2000 assessment. The PISA 2000 mean score in reading for the 28 OECD countries was set at 500 and the standard deviation at 100, establishing the scale against which reading performance in PISA 2009 is compared. Two countries that participated in PISA 2000 have joined the OECD since then, while results for four OECD countries were excluded from comparisons over time. Thus, reading performance trends are discussed for the 26 OECD countries that participated in, and had comparable results from, both the 2000 and 2009 assessments. The PISA 2000 average for these 26 OECD countries is now 496, while the reading performance scale remained unchanged.1 In PISA 2003 and PISA 2006, when the focus shifted first to mathematics and then to science, reading was allocated less assessment time than in 2000, allowing only for an update on overall performance rather than the kind of in-depth analysis of knowledge and skills that was possible for the PISA 2000 and 2009 assessments.
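The scale construction described above amounts to a linear transformation of the underlying proficiency estimates, fixed so that the reference sample has a mean of 500 and a standard deviation of 100. The sketch below shows only that arithmetic, on hypothetical values; the actual PISA scaling rests on item response theory with plausible values and student weights, and the transformation fixed on the 2000 reference sample is then reapplied unchanged in later cycles, which is what makes scores comparable over time.

```python
import statistics

def to_pisa_scale(proficiencies, target_mean=500.0, target_sd=100.0):
    """Linearly rescale proficiency estimates so that this reference
    sample has the target mean and standard deviation.

    Illustrative only: unweighted, and the input values are hypothetical.
    """
    mean = statistics.fmean(proficiencies)
    sd = statistics.pstdev(proficiencies)  # population SD of the reference sample
    return [target_mean + target_sd * (x - mean) / sd for x in proficiencies]

# Hypothetical logit-scale proficiencies for a small reference sample
raw = [-1.2, -0.4, 0.0, 0.3, 0.9, 1.6]
scaled = to_pisa_scale(raw)
# The rescaled values have mean 500 and standard deviation 100,
# and the ordering of students is preserved.
```

Because the transformation is linear, differences between students and between cycles keep their meaning: one score point on the reported scale always corresponds to the same amount of underlying proficiency.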
To ensure comparability across successive reading assessments, 41 of the 130 PISA reading items used in the 2009 assessment were taken from the PISA 2000 assessment. These items were selected to reflect the overall balance of the framework, so that the proportion of items for each type of task was similar. Of the 41 items assessed in both 2000 and 2009, 28 reading items were also used in PISA 2003 and 2006 to ensure the comparability of results across these assessments. Details of the equating methodology for reading performance trends are provided in Annex A1. The scale on which student performance is reported is thus the same as the one used in 2000, and it can be compared across all four cycles. Consequently, the proficiency levels are also the same, although in 2009 the reading scale was extended with new proficiency levels at both the top and bottom ends of the performance distribution, reflecting the capacity of PISA 2009 to provide more detailed information about low- and high-performing students.

How student performance in reading has changed since 2000

The OECD’s average reading performance has remained similar since 2000 for the 26 OECD countries that had comparable results in both the 2000 and 2009 assessments. This, in itself, is noteworthy because, in recent years, most countries have increased their investment in education substantially. Between 1995 and 2007, expenditure per primary and secondary student increased by 43% in real terms, on average, across OECD countries (OECD, 2010b, Table B1.5). In the shorter period between 2000, when the first PISA assessment was undertaken, and 2007, increases in expenditure on education averaged around 25%; eight OECD countries increased their expenditure by between 35% and 71%.
While not all of this expenditure was devoted to raising the performance of the students assessed in PISA, it is intriguing that in many countries such major financial efforts have not yet translated into improvements in performance. However, some countries have seen marked improvements in learning outcomes. Among the 38 countries that can be compared between 2000 and 2009, 13 have seen improvements in reading performance since 2000 (Figure V.2.1, see also Table V.2.1). Of the 26 OECD countries with comparable results for both assessments, seven have seen improvements: Chile, Israel and Poland all improved their reading performance by more than 20 score points, and Portugal, Korea, Hungary and Germany by between 10 and 20 score points. Similarly, among the partner countries, Peru, Albania, Indonesia and Latvia improved their performance by more than 20 score points, and Liechtenstein and Brazil by between 10 and 20 score points. Four countries saw a decline in their reading performance between 2000 and 2009. Among those, student performance in Ireland decreased by 31 score points, in Sweden by 19 score points, and in Australia and the Czech Republic by 13 score points. PISA considers results statistically significant, and marks them as such, only when the uncertainty in measuring changes in performance implies that an increase or decrease would be identified in fewer than five out
of 100 replications of PISA when, in fact, there is no change. It is possible to calculate the exact percentage of replications in which a change would be reported when there is no real change. This so-called “p-value” is reported in Figure V.2.1 (see also the last column in Table V.2.1). The smaller this percentage, the more confidence one can have that the observed changes are real. The p-value allows readers to assess the reliability of observed performance differences that are not identified as statistically significant by PISA, using the stringent criteria described above. For example, the observed increase in performance is nine score points in Greece and eight score points in Hong Kong-China. These are sizeable magnitudes, but the p-values for these estimates suggest that, in 28 out of 100 replications in the case of Greece and in 21 out of 100 replications in the case of Hong Kong-China, PISA could have identified such a change even if there were, in fact, no change. Because of the magnitude of the potential error, PISA does not identify these changes as statistically significant. However, readers who are satisfied with a lower level of confidence can still take these values into consideration.
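The significance rule described above is, in essence, a two-sided test on the score-point change. The sketch below assumes the change estimate is approximately normal with a known standard error; the numbers are purely illustrative and are not taken from the PISA database, where the reported standard errors also include a linking error across cycles.

```python
import math

def p_value(score_change: float, standard_error: float) -> float:
    """Two-sided p-value for an observed score-point change.

    Assumes the change estimate is approximately normally distributed
    with the given standard error (illustrative values only).
    """
    z = abs(score_change) / standard_error
    # Normal CDF via the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return 2.0 * (1.0 - phi)

# Illustrative: a 20-point change with a 5-point standard error is
# significant at the 5% level used in the report; an 8-point change
# with a 6-point standard error is not.
significant = p_value(20, 5) < 0.05   # True
```

Read this way, a reported p-value of 28% for Greece simply means that a change of the observed size would be seen in about 28 out of 100 replications even if there were no real change, which is why it falls short of the 5% threshold.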
• Figure V.2.1 • Change in reading performance between 2000 and 2009
[Figure: bar chart of the score point change in reading performance between 2000 and 2009 (-35 to 50 score points), with the p-value in % for each country: Peru 0, Chile 0, Albania 0, Indonesia 0, Latvia 0, Israel 4, Poland 0, Portugal 1, Liechtenstein 2, Brazil 1, Korea 3, Hungary 4, Germany 3, Greece 28, Hong Kong-China 21, Switzerland 38, Mexico 60, OECD average – 26 90, Belgium 86, Bulgaria 89, Italy 81, Denmark 74, Norway 74, Russian Federation 74, Japan 77, Romania 63, United States 62, Iceland 21, New Zealand 20, France 17, Thailand 15, Canada 6, Finland 8, Spain 5, Australia 4, Czech Republic 3, Sweden 0, Argentina 9, Ireland 0.]
Note: Statistically significant score point changes are marked in a darker tone. Countries are ranked in descending order of the score point change in reading performance between 2000 and 2009.
Source: OECD, PISA 2009 Database, Table V.2.1. http://dx.doi.org/10.1787/888932359967

Countries differ in their absolute performance levels, so even with improvements in reading performance, some countries still perform far below the OECD average, while some countries with a decline in reading performance may still outperform many other countries. It is thus useful to examine both where countries stand and how their performance has changed. Countries towards the right of Figure V.2.2 improved their performance between 2000 and 2009, while those towards the left saw a decrease in student scores. Countries towards the top performed above the OECD average in 2009, while those towards the bottom performed below the OECD average. Countries that improved their performance between 2000 and 2009 can be classified into three groups, depending on their performance level in 2009. The first group includes countries that improved their performance but still performed below the OECD average. These countries are represented in the bottom-right corner of Figure V.2.2.
The second group includes countries that improved their performance so that they now perform close to the OECD average. These countries are represented in the middle-right of Figure V.2.2. The third group contains countries that had already outperformed most of the PISA participants but still improved their performance. These countries are on the top-right part of Figure V.2.2. For countries with a white marker the changes were not statistically significant. Among countries that scored above the OECD average in 2009, three countries improved their performance. Korea improved its performance by 15 score points from an already high level in 2000. Poland improved its performance by 21 score points and, from a country that performed below the OECD average in 2000, became a country that
scored above the OECD average in 2009. The partner country Liechtenstein improved its performance by 17 score points. More detailed discussions of the school systems in Korea and Poland are provided in Boxes V.B and V.C, respectively, which appear between Chapters 1 and 2. Among average-performing countries in 2009, reading performance improved in Portugal, Hungary and Germany. Box V.D, which appears between Chapters 3 and 4, provides more details on reforms in Portugal. Several countries with below-average performance in 2009 saw marked improvements. Among OECD countries, student performance in Chile increased by 40 score points and is now close to 450 score points, while student performance in Israel increased by 22 score points to 474 score points. Chile’s school system is briefly discussed in Box V.F, which appears after Chapter 4. The partner country Peru saw the largest improvement, with an increase of 43 score points, although its overall performance is still below 400 score points. Albania and Indonesia increased their performance by 30 to 40 score points, although both countries still perform at or below 400 score points. Brazil increased its performance by 16 score points and now performs above 400 score points (see Box V.G, which appears after Chapter 5). Latvia increased its performance by 26 score points and now performs at 484 score points. A number of countries performing above the average saw a decrease in reading scores. Australia’s performance declined by 13 score points, but the country still ranks among the top performers in reading. Performance in Ireland and Sweden declined by 31 and 19 score points, respectively, and both countries now perform around the OECD average. The Czech Republic also saw a decline in performance and now scores below the OECD average.
• Figure V.2.2 • How countries perform in reading and how reading performance has changed since 2000
[Figure: scatter plot of each country’s mean reading score in 2009 (vertical axis, 350 to 600) against its score point change in reading between 2000 and 2009 (horizontal axis, -40 to 50). The OECD average divides the plot into four quadrants: PISA 2009 performance above the OECD average with performance declined; above the average with performance improved; below the average with performance improved; and below the average with performance declined.]
Note: Score point changes in reading performance between 2000 and 2009 that are statistically significant are marked in a darker tone.
Source: OECD, PISA 2009 Database, Table V.2.1. http://dx.doi.org/10.1787/888932359967
• Figure V.2.3 • Multiple comparisons between 2000 and 2009
[Figure: for each country, the table reports the mean reading scores in 2000 and 2009 and groups the remaining countries into five categories: countries with lower performance in 2000 and similar performance in 2009; countries with lower or similar performance in 2000 and higher performance in 2009; countries with similar performance in 2000 and 2009; countries with similar or higher performance in 2000 and lower performance in 2009; and countries with higher performance in 2000 and similar performance in 2009. The country-by-country comparison lists appear in the original figure. The mean reading scores (2000 → 2009) are:
Korea 525 → 539; Finland 546 → 536; Hong Kong-China 525 → 533; Canada 534 → 524; New Zealand 529 → 521; Japan 522 → 520; Australia 528 → 515; Belgium 507 → 506; Norway 505 → 503; Switzerland 494 → 501; Poland 479 → 500; Iceland 507 → 500; United States 504 → 500; Liechtenstein 483 → 499; Sweden 516 → 497; Germany 484 → 497; Ireland 527 → 496; France 505 → 496; Denmark 497 → 495; Hungary 480 → 494; Portugal 470 → 489; Italy 487 → 486; Latvia 458 → 484; Greece 474 → 483; Spain 493 → 481; Czech Republic 492 → 478; Israel 452 → 474; Russian Federation 462 → 459; Chile 410 → 449; Bulgaria 430 → 429; Mexico 422 → 425; Romania 428 → 424; Thailand 431 → 421; Brazil 396 → 412; Indonesia 371 → 402; Argentina 418 → 398; Albania 349 → 385; Peru 327 → 370.]
Source: OECD, PISA 2009 Database. http://dx.doi.org/10.1787/888932359967
Figure V.2.3 provides multiple comparisons of changes in the relative standing of countries in reading performance between 2000 and 2009. Countries are sorted by their performance in 2009. For each country, the figure identifies other countries or economies with similar performance. The first group includes countries that had lower scores in 2000 but similar performance in 2009 compared with the country shown in the first column. The second group includes countries with lower or similar scores in 2000 that show higher scores in 2009. The third group includes countries whose performance was similar in 2000 and 2009. The fourth group includes countries with similar or higher scores in 2000 and lower scores in 2009. The fifth group includes countries with higher scores in 2000 and similar scores in 2009. The figure includes all 38 countries that have comparable results from the 2000 and 2009 assessments. The chart can be used to see how a country’s position changed relative to other countries with similar performance.
Mean performance summarises overall student performance in PISA. While it gives a general idea of how countries perform in comparison to others, mean performance can mask important variations in student performance. For policy makers, information about the variability of student performance is important. For example, readers interested in policies and practices relating to the most talented students might be interested in those countries in which the highest-achieving students improved their performance, or in which the share of high-achieving students grew. Similarly, readers interested in policies and practices relating to lower-performing students might examine more closely those countries that have seen improvements among the lowest-achieving students, or where the share of low-achieving students decreased.
Performance trends among low- and high-achieving students can be examined by considering changes in the percentage of students at each of the PISA proficiency levels. As explained in Volume I, What Students Know and Can Do, reading scores in 2009 are reported according to different levels of proficiency that correspond to tasks of varying difficulty. Establishing proficiency levels in reading makes it possible not only to rank students’ performance but also to describe what students at different levels of the reading scale can do. As explained in Volume I, reading proficiency Level 2 can be considered a baseline level of proficiency, at which students have learned to read and begin to demonstrate the kind of competencies that are required to use reading for learning. Students below this level may still be capable of locating pieces of explicitly stated information that are prominent in the text, recognising a main idea in a text about a familiar topic, or recognising the connection between information in the text and their everyday experience. However, they have not acquired the level of literacy that is required to participate effectively and productively in life. On average across the 26 OECD countries with comparable results for both assessments, 18.1% of students performed below Level 2 in 2009, while the corresponding percentage in 2000 was 19.3% (Table V.2.2). Although this percentage changed only slightly between the two assessments, it varied noticeably among countries. Reducing the percentage of poorly performing students is considered one of the most important tasks for school systems in many countries, given the large economic and social costs associated with poor performance in school.
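The proficiency-share trends discussed here come down to a simple computation: the fraction of students scoring below the Level 2 boundary in each cycle, and the difference between cycles in percentage points. A minimal unweighted sketch (the cut score below is an illustrative assumption; official PISA analyses use the published Level 2 boundary together with student weights and plausible values):

```python
import numpy as np

# Illustrative lower boundary of reading proficiency Level 2
# (assumption for this sketch; PISA publishes the official cut score).
LEVEL_2_CUT = 407.5

def share_below_level2(scores):
    """Fraction of students scoring below the Level 2 boundary."""
    return float(np.mean(np.asarray(scores, dtype=float) < LEVEL_2_CUT))

def percentage_point_change(scores_2000, scores_2009):
    """Change between cycles in the share below Level 2, in percentage points."""
    return 100.0 * (share_below_level2(scores_2009)
                    - share_below_level2(scores_2000))

# Toy cohorts: a country whose weakest readers improved between cycles.
rng = np.random.default_rng(0)
change = percentage_point_change(rng.normal(450, 90, 5000),
                                 rng.normal(470, 90, 5000))
# change < 0: fewer students below Level 2 in 2009 than in 2000
```

A negative result corresponds to the countries on the left of Figure V.2.4, where the share of low performers fell between 2000 and 2009.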
Following up on students who were assessed in PISA 2000, the Canadian Youth in Transition Survey shows that students scoring below Level 2 face a disproportionately higher risk of poor participation in post-secondary education or low labour-market outcomes at age 19, and even worse outcomes at age 21, the latest age for which these data are available (OECD, 2010a). Figure V.2.4 shows changes in the share of students below Level 2. For each country, bars represent the percentage of students performing below Level 2 in 2009, while markers denote that share in 2000. Countries are sorted according to the percentage of students below Level 2 in 2009, with those that show fewer students at this low proficiency level on the left. To make comparisons of changes in the percentage of students at different proficiency levels more meaningful, countries can be grouped according to how many students in those countries performed at each level in 2000. In 2000, more than 60% of students in Peru, Albania and Indonesia performed below Level 2 (Table V.2.2). All three countries have seen a reduction in this share of more than 10 percentage points. The proportion of lower-performing students remains relatively high in these countries, but this trend shows that real progress has been made in the PISA countries where the very highest percentages of 15-year-olds have limited reading skills.
Among countries where between 40% and 60% of students performed below Level 2 in 2000, Chile reduced that proportion by 18 percentage points (see Box V.F), while the proportion decreased by smaller amounts in Mexico and the partner country Brazil (see Box V.G). Among countries where the proportion of students performing below Level 2 was smaller than 40% but still above the OECD average of 19%, the partner country Latvia reduced the proportion by 13 percentage points, while Portugal, Poland, Hungary, Germany, Switzerland, and the partner country Liechtenstein reduced it by smaller amounts (see Box V.D for Portugal and Box V.C for Poland for examples of policies that might be associated with these trends). In the partner country Thailand, the proportion of students performing below Level 2 increased by six percentage points from a relatively high level of 37%. In countries where the proportion of students performing below Level 2 was already below average in 2000, Denmark further reduced the proportion by three percentage points and now shows 15% of students below Level 2. The proportion of students below Level 2 increased in Ireland, the Czech Republic, Sweden, France, Spain and Iceland. While this proportion is still below the OECD average in Iceland, Ireland and Sweden, it is now above average in France, Spain and the Czech Republic.
Students performing at Level 5 or 6 are frequently referred to as “top performers” in this report. These students can handle texts that are unfamiliar in either form or content. They can find information in such texts, demonstrate detailed understanding, and infer which information is relevant to the task. Using such texts, they are also able to evaluate critically and to build hypotheses, draw on specialised knowledge and accommodate concepts that may be contrary to expectations.
A comparison of the kinds of tasks that students at Level 5 or above are capable of performing suggests that those who reach this level can be regarded as potential “world class” knowledge workers of tomorrow. Thus, the proportion of a country’s students reaching this level is a good indicator of its future economic competitiveness. On average across the 26 OECD countries with comparable results for both assessments, the combined percentage of students performing at Level 5 or 6 was 9.0% in 2000 and decreased to 8.2% in 2009 (see Table V.2.2). Although the proportion of students at this level changed only slightly between the assessments, it varies considerably across countries.
• Figure V.2.4 •
Percentage of students below proficiency Level 2 in reading in 2000 and 2009
Change in the percentage of students below proficiency Level 2 in reading between 2000 and 2009 (+: 2009 higher than 2000; –: 2009 lower than 2000; 0: no statistically significant difference, at the 95% confidence level):
Korea 0, Finland 0, Hong Kong-China 0, Canada 0, Japan 0, Australia 0, New Zealand 0, Norway 0, Poland –, Denmark –, Liechtenstein –, Switzerland –, Iceland +, Ireland +, Sweden +, Hungary –, Latvia –, United States 0, Portugal –, Belgium 0, OECD average – 26 –, Germany –, Spain +, France +, Italy 0, Greece 0, Czech Republic +, Israel 0, Russian Federation 0, Chile –, Mexico –, Romania 0, Bulgaria 0, Thailand +, Brazil –, Argentina 0, Indonesia –, Albania –, Peru –
Countries are ranked in ascending order of the percentage of students below proficiency Level 2 in reading in 2009.
Source: OECD, PISA 2009 Database, Table V.2.2.
12 http://dx.doi.org/10.1787/888932359967
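The +, – and 0 flags attached to each country record whether a 2000-to-2009 change is statistically significant at the 95% confidence level. A simplified sketch of such a test follows; the official computation uses survey weights, plausible values and a specific linking error for the 2000–2009 reading comparison (documented in the PISA technical report), so the function name and default values here are illustrative assumptions:

```python
import math

def change_flag(stat_2000, se_2000, stat_2009, se_2009,
                linking_error=0.0, z_crit=1.96):
    """Return '+', '-' or '0' for a 2000->2009 change at the 95% level.

    The standard error of the difference combines both sampling errors
    and, for trend comparisons, a linking error (0.0 here is a
    placeholder; PISA publishes the actual value for each link).
    """
    diff = stat_2009 - stat_2000
    se_diff = math.sqrt(se_2000**2 + se_2009**2 + linking_error**2)
    if abs(diff) > z_crit * se_diff:
        return '+' if diff > 0 else '-'
    return '0'
```

Note that a change that looks sizeable can still be flagged 0 once the linking error is included, which is why some visible score movements in the figures carry a 0.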
Figure V.2.5 shows changes in the shares of top-performing students. For each country, blue bars represent the percentage of students performing at Level 5 or 6 in 2009, while markers denote the corresponding proportion in 2000. Countries are sorted according to the percentage of students at Level 5 or above in 2009, with countries that have the largest proportion of top performers on the left.
• Figure V.2.5 •
Percentage of top performers in reading in 2000 and 2009
Note: Countries are ranked in descending order of top performers in reading in 2009.
Source: OECD, PISA 2009 Database, Table V.2.2.
12 http://dx.doi.org/10.1787/888932359967
The proportion of top performers increased in Japan and Korea and the partner economy Hong Kong-China to among the highest levels of all 2009 participants (Table V.2.2). In Japan, this proportion increased from nearly 10% to above 13%. In Korea, it increased by more than seven percentage points, from less than 6% to almost 13%, the largest change observed among participating countries. Because of this improvement, Korea moved from below to above the OECD average in the percentage of top performers (see also Box V.B). In Hong Kong-China, this proportion increased by almost three percentage points to slightly more than 12%. Among countries that have relatively low proportions of top performers, the percentage of students at Level 5 or above increased by three percentage points in Israel, and by less than one percentage point in Chile and the partner country Brazil. In several countries that had above-average proportions of top performers in 2000, this percentage decreased. The most noticeable change was in Ireland, where the proportion of top performers decreased from 14% to 7%, which is below the OECD-26 average.
In Australia, Canada, Finland and New Zealand, the decrease was smaller, and all these countries still have more top performers than the average for the 26 OECD countries with comparable results from both assessments. This proportion decreased in Norway and Sweden from a similar level of 11% in 2000 to 9% in Sweden and 8% in Norway. The proportion of top performers decreased from 8% to less than 5% in Denmark and from 7% to 5% in the Czech Republic. Interestingly, in Denmark, the proportion of students below Level 2 also decreased. The partner country Romania is the only country where the proportion of top performers decreased from an already low level, from 2% to less than 1%.
Data for Figure V.2.5: change in the percentage of top performers in reading between 2000 and 2009 (+: 2009 higher than 2000; –: 2009 lower than 2000; 0: no statistically significant difference, at the 95% confidence level):
New Zealand –, Finland –, Japan +, Korea +, Australia –, Canada –, Hong Kong-China +, Belgium 0, United States 0, France 0, Sweden –, Iceland 0, Norway –, Switzerland 0, Germany 0, Israel +, Poland 0, Ireland –, Hungary 0, Italy 0, Greece 0, Czech Republic –, Portugal 0, Denmark –, Liechtenstein 0, Spain 0, Russian Federation 0, Latvia 0, Bulgaria 0, Brazil +, Chile +, Argentina 0, Romania –, Peru 0, Mexico 0, Thailand 0, Albania 0, Indonesia 0
While trends in proficiency levels compare the highest- and the lowest-performing students with an absolute measure, it is also possible to compare the top and bottom ends of the performance distribution relative to the average student within a country. This is particularly useful in countries with very low or high overall levels of student performance, in which international benchmarks for the highest- and the lowest-performing students may be less relevant. Such within-country comparisons can be facilitated by analysing the percentiles of the
student performance distribution within a country. Percentiles do not indicate what students can do; they provide quantitative information on the performance of the lowest- or the highest-achieving students relative to other students in a country. The 90th percentile indicates the point on the PISA performance scale below which 90% of students in a country score or which only 10% of students exceed. Changes in the value of the 90th percentile show whether a country saw an increase or decrease in the performance level of its highest-performing students. Similarly, the 10th percentile indicates the point on the PISA performance scale below which only 10% of students in a country score. A change in the value of the 10th percentile indicates whether a country sees an increase or decrease in the performance level of its lowest-performing students. The difference between the 90th and 10th percentiles can be used as a measure of the range of performance in each country. Trends in this difference show whether the variation in student performance within the country is changing. Performance at key percentile ranks can change even if a country’s mean performance remains the same. Figure V.2.6 classifies countries into four groups (see also Table V.2.3).
• Figure V.2.6 •
Performance changes among the lowest- and highest-achieving students in reading between 2000 and 2009
Axes: change in the 10th percentile between 2000 and 2009 (horizontal) and change in the 90th percentile between 2000 and 2009 (vertical). The four quadrants combine an increase or decrease among the lowest-achieving students with an increase or decrease among the highest-achieving students.
Note: Changes for both lowest- and highest-achieving students that are statistically significant are marked in darker tone.
Source: OECD, PISA 2009 Database, Table V.2.3.
12 http://dx.doi.org/10.1787/888932359967
Countries in the top-right corner show improved performance among both their highest- and lowest-achieving students, while countries in the bottom-
left corner show a decline in performance among both groups of students. Countries in the top-left corner show improvements in performance among their highest-achieving students and a decline in the performance of their lowest-achieving students. In these countries, variation in performance increased because of the widening gap between the top and the bottom levels of student performance. Countries in the bottom-right corner show an improvement in performance among their lowest-achieving students and a decline among their highest-achieving students. In these countries, the variation in performance diminished. Most of the countries, however, are situated in the top-right or bottom-left corners, indicating that performance trends among their lowest- and highest-achieving students in these countries are similar. Countries indicated with blue markers showed statistically significant changes in the performance of both their highest- and lowest-achieving students. Countries indicated with white markers did not see statistically significant changes or saw them for either the highest- or the lowest-achieving students, but not for both. Chile and three partner countries, Indonesia, Albania and Peru, all show marked improvements in reading performance among both their lowest- and highest-achieving students. These countries are also among those that show the largest improvement in mean performance and in which the percentage of students performing below Level 2 decreased. The lowest-achieving students show relatively larger improvements than the highest-achieving students in Chile and Indonesia, while in Peru and Albania both groups of students show similar levels of improvement. In short, in these countries, students across the entire performance scale improved.
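The percentile measures behind these quadrant classifications are straightforward to compute from a score distribution. An unweighted sketch (PISA's own estimates use student weights and plausible values, so the figures here are illustrative):

```python
import numpy as np

def performance_spread(scores):
    """10th percentile, 90th percentile and their gap for a score distribution.

    The 10th percentile is the point below which only 10% of students
    score; the 90th percentile is the point only 10% exceed.  Their
    difference measures the spread of performance within a country.
    """
    p10, p90 = np.percentile(np.asarray(scores, dtype=float), [10, 90])
    return float(p10), float(p90), float(p90 - p10)

def spread_trend(scores_2000, scores_2009):
    """Changes in the 10th/90th percentiles and in the 90-10 gap between cycles."""
    p10_a, p90_a, gap_a = performance_spread(scores_2000)
    p10_b, p90_b, gap_b = performance_spread(scores_2009)
    return p10_b - p10_a, p90_b - p90_a, gap_b - gap_a
```

A positive first component with a negative second reproduces the bottom-right quadrant of Figure V.2.6: the weakest students gained while the strongest slipped, so the performance gap narrowed.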
Six countries – Poland, Portugal, Germany, Switzerland, and the partner countries Latvia and Liechtenstein – saw improvements in the performance of their lowest-achieving students while maintaining the performance level among the highest-achieving students. Korea, Israel, and the partner country Brazil raised the performance of their highest-achieving students while maintaining the performance level among the lowest-achieving students. In Denmark, the performance of the lowest-achieving students improved, while the performance of the highest-achieving students declined. Similarly, in Norway, the performance of the lowest-achieving students improved and the share of top performers decreased. As a consequence, the performance gap between the lowest- and the highest-achieving students narrowed markedly in these two countries, while their mean performance did not change. In Australia and Canada, and the partner country Romania, performance among their highest-achieving students declined while performance among their lowest-achieving students remained largely unchanged. In France, the performance of the lowest-achieving students declined while the performance of the highest-achieving students remained the same. In Ireland and to some extent in Sweden, the performance of both the lowest- and highest-achieving students declined. These countries are also among those that show the greatest decrease in mean performance results and are among those in which the percentage of students at the highest proficiency levels fell while the percentage of those below Level 2 rose. For the rest of the countries, performance among the lowest- and the highest-achieving students did not change measurably.
How gender differences in reading have evolved
The gender gap is far wider in reading than it is in either mathematics or science, and this has been true since the first PISA assessment in 2000.
Girls outperform boys in reading in all countries participating in 2009, with an average advantage of 39 score points across OECD countries (see Table V.2.4). In 2000, the corresponding gender gap was 32 score points, on average, across OECD countries. The gender gap widened in some countries, but it did not narrow in any country. It increased by more than 20 score points in Israel and Korea and the partner country Romania. In all of these countries, the score point difference between boys and girls at least doubled. In Israel and Korea, the gap widened because of a marked improvement in girls’ performance that was not matched by a similar trend among boys (see Box V.B, which discusses changes in girls’ performance in Korea). The performance advantage among girls also increased in Portugal, the partner economy Hong Kong-China, and the partner countries Indonesia and Brazil, where the overall positive trend was due, in part, to a greater improvement among girls in comparison with boys. The gender gap also widened in France and Sweden, mainly because of a decline in boys’ performance.
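The discussion above attributes each widening gender gap either to girls improving faster or to boys declining. That decomposition is simple arithmetic on the group means; a minimal unweighted sketch (the function name is a hypothetical helper, not part of the OECD's toolkit):

```python
def gap_decomposition(girls_2000, boys_2000, girls_2009, boys_2009):
    """Split the change in the reading gender gap (girls minus boys, in
    score points) into girls' score change and boys' score change."""
    mean = lambda xs: sum(xs) / len(xs)
    girls_change = mean(girls_2009) - mean(girls_2000)
    boys_change = mean(boys_2009) - mean(boys_2000)
    # The gap widens when girls gain more (or lose less) than boys.
    return girls_change, boys_change, girls_change - boys_change
```

A girls' change near zero combined with a negative boys' change reproduces the pattern described for France and Sweden: a gap that widened mainly because of a decline in boys' performance.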
None of the countries where the advantage of girls increased is among those with the widest gender gaps. However, after the changes in the relative performance of boys and girls in Romania and Israel, the gender gap has become wider in these countries than on average across OECD countries, while it had previously been narrower. In general, girls’ performance advantage in reading is most pronounced in the percentage of students who perform below Level 2 (Tables V.2.5 and V.2.6). Across OECD countries, 24% of boys perform below Level 2 compared to only 12% of girls. Policy makers in many countries are already concerned about the large percentage of boys who lack basic reading skills. Therefore, any increase in this share is worth noting. Figure V.2.8 shows changes in the percentages of boys and girls who perform below Level 2 in reading. Countries are sorted according to the overall trend among lower-performing students, with those where their numbers have fallen most shown on the left. Across OECD countries, the percentage of girls performing below Level 2 decreased by two percentage points, while the share of lower-performing boys did not change. In countries where the percentage of students performing below Level 2 decreased, the trend was usually more apparent among girls. In Indonesia, the overall decrease in the percentage of students performing below Level 2 was around 15 percentage points; but while the percentage of girls performing below Level 2 decreased by 21 percentage points, the percentage of boys performing at that level decreased by only 9 percentage points. Similarly, in Peru and Albania, the share of girls performing below Level 2 decreased by 19 and 17 percentage points, respectively, whereas the corresponding share of boys decreased by 11 and 12 percentage points, respectively.
• Figure V.2.7 •
Comparison of gender differences in reading between 2000 and 2009
Gender difference in performance (score points, girls – boys) in 2000 and in 2009; girls perform better in all countries/economies. Change in the performance gap between boys and girls in reading between 2000 and 2009 (+: 2009 higher than 2000; 0: no statistically significant difference, at the 95% confidence level):
Chile 0, Peru 0, United States 0, Mexico 0, Belgium 0, Brazil +, Denmark 0, Spain 0, Liechtenstein 0, Hong Kong-China +, Canada 0, Korea +, Indonesia +, Argentina 0, Australia 0, Thailand 0, Hungary 0, Portugal +, Switzerland 0, Japan 0, Ireland 0, Germany 0, France +, Israel +, Romania +, Iceland 0, Russian Federation 0, Sweden +, New Zealand 0, Italy 0, Greece 0, Norway 0, Latvia 0, Czech Republic 0, Poland 0, Finland 0, Bulgaria 0, Albania 0
Notes: All gender differences in PISA 2009 are significant. Gender differences in 2000 that are statistically significant are marked in a darker tone. Countries are ranked in ascending order of gender differences (girls – boys) in 2009.
Source: OECD, PISA 2009 Database, Table V.2.4.
12 http://dx.doi.org/10.1787/888932359967
In Israel and Brazil, the overall decrease in the share of students performing below Level 2
was also mainly the result of improvements among girls, with 11 and 9 percentage points fewer girls, respectively, performing below Level 2. The decrease in the percentage of boys performing below Level 2 in these countries was more modest, at two and three percentage points, respectively. In Chile and Poland, the percentage of boys and girls below Level 2 decreased by about the same amount. In another set of countries, the percentage of students below Level 2 has risen. In Sweden, France and Spain, this increase has occurred among both boys and girls, although it has been greater for boys. In Ireland, the Czech Republic and Iceland, only the percentage of boys with reading proficiency below Level 2 has risen. In Thailand, on the other hand, it has risen slightly for girls but not for boys. In most countries, changes in the percentage of top-performing students, those at reading proficiency Level 5 or 6, are quite similar among boys and girls, but in a few countries they differ noticeably (Tables V.2.5 and V.2.6). For example, while in Denmark and Romania the decrease in the percentage of top performers was almost identical among boys and girls, it differed in magnitude in Finland, Australia, Canada and Ireland. In New Zealand, only the percentage of top performers among girls decreased significantly, while in the Czech Republic and Germany, only the percentage of top performers among boys decreased significantly. Although the percentage of top performers increased in Japan and Korea and the partner economy Hong Kong-China to similarly high levels, the increase was very different among boys and girls. In Korea, the increase was the largest among participating countries, both for all students and for boys and girls separately; still, it was uneven: the percentage of top performers increased by more than nine percentage points among girls and by slightly less than five percentage points among boys.
In Hong Kong-China, the percentage of top performers among girls increased by more than six percentage points, while it did not change among boys. Similarly, in Japan, this proportion increased by almost five percentage points among girls, more than among boys. Effectively, the gap in the proportion of top performers between boys and girls widened in these countries.
• Figure V.2.8 •
Change in the share of boys and girls who are low performers in reading between 2000 and 2009
The figure shows, for each country, the change between 2000 and 2009 in the percentage of boys and in the percentage of girls below proficiency Level 2 (negative values: the share of students below proficiency Level 2 decreased; positive values: it increased). Countries are ranked in ascending order of change in the percentage of all students below Level 2 on the reading scale between 2000 and 2009: Chile, Indonesia, Peru, Albania, Latvia, Portugal, Poland, Israel, Liechtenstein, Brazil, Hungary, Germany, Mexico, Switzerland, Greece, Denmark, Norway, Belgium, OECD average – 26, Romania, Hong Kong-China, United States, Russian Federation, Korea, New Zealand, Bulgaria, Canada, Finland, Australia, Italy, Iceland, Spain, Japan, France, Sweden, Czech Republic, Thailand, Ireland, Argentina.
Note: Changes in the share of students below proficiency Level 2 that are statistically significant are marked in a darker tone.
Source: OECD, PISA 2009 Database, Tables V.2.2, V.2.5 and V.2.6.
12 http://dx.doi.org/10.1787/888932359967
Changes in performance and changes in student populations
The PISA assessments continue to evolve to capture newly emerging knowledge and skills as countries’ learning goals and instructional practices change and as assessment methodologies advance. At the same time, PISA maintains high technical standards and coherent methodologies across successive assessments, ensuring that performance can be monitored reliably over time and that the samples of students represent the same populations. However, in many countries the demographic and socio-economic context of student populations has changed. Thus, observed changes in learning outcomes may reflect not only changes in the quality of the educational services provided for 15-year-olds, but also changes in the composition of the student populations. For example, if migration into a country has been significant over the past ten years, it might influence learning outcomes. Similarly, if the student population has become more socio-economically diverse, then this too can influence outcomes. This section discusses how trends are affected by changes in student populations. It also provides an overall trend line that summarises information across all PISA assessments. Annex A6 provides details on the methods used in this section. It also discusses any impact that technical changes in the national samples of students may have on the comparability of student performance over time.
The impact of changes in the socio-economic composition of student populations on trends in reading performance
In the following section, changes in the age and gender composition of student populations, in their socio-economic background, in the share of students who always or almost always speak the language of the assessment at home, and in the share of students with foreign-born parents are accounted for when interpreting changes in student performance. The corresponding demographic data for 2000 and 2009 are presented in Annex A6, where the adjustment method is also explained in detail. The data on changes in socio-economic background are provided in Table V.4.2. Figure V.2.9 shows both the observed change in student performance and the predicted performance change if the composition of the student population in 2000 had been similar to the one in 2009, that is, if the student population in 2000 had had the same age and gender composition, the same socio-economic background, and the same share of students speaking the language of the assessment at home and of students with foreign-born parents.
• Figure V.2.9 •
Changes in reading performance between 2000 and 2009
The figure shows, for each country, the observed score point change in reading performance between 2000 and 2009 and the score point change adjusted for socio-demographic changes. Countries are ranked in descending order of the observed score point change between 2000 and 2009: Peru, Chile, Albania, Indonesia, Latvia, Israel, Poland, Portugal, Liechtenstein, Brazil, Korea, Hungary, Germany, Greece, Hong Kong-China, Switzerland, Mexico, OECD average – 26, Belgium, Bulgaria, Italy, Denmark, Norway, Russian Federation, Japan, Romania, United States, Iceland, New Zealand, France, Thailand, Canada, Finland, Spain, Australia, Czech Republic, Sweden, Argentina, Austria, Ireland.
Note: Observed score point changes that are statistically significant are marked in a darker tone.
Source: OECD, PISA 2009 Database, Table V.2.7.
12 http://dx.doi.org/10.1787/888932359967
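Figure V.2.9 contrasts each observed score change with a change adjusted for shifts in the student population. The report's exact procedure is described in Annex A6; one standard way to build such an adjustment, sketched here under simplifying assumptions (ordinary least squares, no survey weights or plausible values, made-up covariates), is to re-predict the 2000 cohort's mean score under the 2009 demographic mix:

```python
import numpy as np

def observed_and_adjusted_change(scores_2000, demog_2000,
                                 scores_2009, demog_2009):
    """Observed vs. demographically adjusted 2000->2009 mean score change.

    demog_* are (n_students, n_covariates) arrays, e.g. columns for age,
    gender, socio-economic index, language spoken at home.  Sketch only:
    an OLS fit on the 2000 cohort predicts what the 2000 mean would have
    been under the 2009 covariate distribution; the official adjustment
    in Annex A6 of the report differs in detail.
    """
    y00 = np.asarray(scores_2000, dtype=float)
    X00 = np.column_stack([np.ones(len(y00)), np.asarray(demog_2000, float)])
    beta, *_ = np.linalg.lstsq(X00, y00, rcond=None)

    X09 = np.column_stack([np.ones(len(scores_2009)),
                           np.asarray(demog_2009, float)])
    counterfactual_2000_mean = float(np.mean(X09 @ beta))

    observed = float(np.mean(scores_2009) - np.mean(y00))
    adjusted = float(np.mean(scores_2009) - counterfactual_2000_mean)
    return observed, adjusted
```

If the 2009 cohort is, say, more socio-economically disadvantaged than the 2000 cohort, the adjusted change will exceed the observed one, as the figure shows for several countries.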
  • 60.
    And then Ithought, 'I wish I'd seen The many towns this town has been; I wish I knew if they'd a-got A kind of summat we've a-not, If them as built the church so fair Were half the chaps folk say they were; For they'd the skill to draw their plan, And skill's a joy to any man; And they'd the strength, not skill alone, To build it beautiful in stone; And strength and skill together thus... O, they were happier men than us. 'But if they were, they had to die The same as every one and I. And no one lives again, but dies, And all the bright goes out of eyes, And all the skill goes out of hands, And all the wise brain understands, And all the beauty, all the power Is cut down like a withered flower. In all the show from birth to rest I give the poor dumb cattle best.' I wondered, then, why life should be, And what would be the end of me When youth and health and strength were gone And cold old age came creeping on?
  • 61.
    A keeper's gun?The Union ward? Or that new quod at Hereford? And looking round I felt disgust At all the nights of drink and lust, And all the looks of all the swine Who'd said that they were friends of mine; And yet I knew, when morning came, The morning would be just the same, For I'd have drinks and Jane would meet me And drunken Silas Jones would greet me, And I'd risk quod and keeper's gun Till all the silly game was done. 'For parson chaps are mad supposin' A chap can change the road he's chosen.' And then the Devil whispered 'Saul, Why should you want to live at all? Why fret and sweat and try to mend? It's all the same thing in the end. But when it's done,' he said, 'it's ended. Why stand it, since it can't be mended?' And in my heart I heard him plain, 'Throw yourself down and end it, Kane.' 'Why not?' said I. 'Why not? But no. I won't. I've never had my go. I've not had all the world can give. Death by and by, but first I'll live. The world owes me my time of times,
  • 62.
    And that time'scoming now, by crimes.' A madness took me then. I felt I'd like to hit the world a belt. I felt that I could fly through air, A screaming star with blazing hair, A rushing comet, crackling, numbing The folk with fear of judgment coming, A 'Lijah in a fiery car Coming to tell folk what they are. 'That's what I'll do,' I shouted loud, 'I'll tell this sanctimonious crowd, This town of window-peeping, prying, Maligning, peering, hinting, lying, Male and female human blots Who would, but daren't be, whores and sots, That they're so steeped in petty vice That they're less excellent than lice, That they're so soaked in petty virtue That touching one of them will dirt you, Dirt you with the stain of mean Cheating trade and going between, Pinching, starving, scraping, hoarding Spying through the chinks of boarding To see if Sue the prentice lean Dares to touch the margarine. Fawning, cringing, oiling boots,
  • 63.
    Raging in thecrowd's pursuits, Flinging stones at all the Stephens, Standing firm with all the evens, Making hell for all the odd, All the lonely ones of God, Those poor lonely ones who find Dogs more mild than human kind. For dogs,' I said, 'are nobles born To most of you, you cockled corn. I've known dogs to leave their dinner, Nosing a kind heart in a sinner. Poor old Crafty wagged his tail The day I first came home from jail, When all my folk, so primly clad, Glowered black and thought me mad, And muttered how they'd been respected, While I was what they'd all expected. (I've thought of that old dog for years, And of how near I come to tears.) 'But you, you minds of bread and cheese, Are less divine than that dog's fleas. You suck blood from kindly friends, And kill them when it serves your ends. Double traitors, double black, Stabbing only in the back, Stabbing with the knives you borrow From the friends you bring to sorrow.
  • 64.
    You stab allthat's true and strong; Truth and strength you say are wrong; Meek and mild, and sweet and creeping, Repeating, canting, cadging, peeping, That's the art and that's the life To win a man his neighbour's wife. All that's good and all that's true, You kill that, so I'll kill you.' At that I tore my clothes in shreds And hurled them on the window leads; I flung my boots through both the winders And knocked the glass to little flinders; The punch bowl and the tumblers followed, And then I seized the lamps and holloed. And down the stairs, and tore back bolts, As mad as twenty blooded colts; And out into the street I pass, As mad as two-year-olds at grass, A naked madman waving grand A blazing lamp in either hand. I yelled like twenty drunken sailors, 'The devil's come among the tailors.' A blaze of flame behind me streamed, And then I clashed the lamps and screamed 'I'm Satan, newly come from hell.' And then I spied the fire-bell.
  • 65.
    I've been aringer, so I know How best to make a big bell go. So on to bell-rope swift I swoop, And stick my one foot in the loop And heave a down-swig till I groan, 'Awake, you swine, you devil's own.' I made the fire-bell awake, I felt the bell-rope throb and shake; I felt the air mingle and clang And beat the walls a muffled bang, And stifle back and boom and bay Like muffled peals on Boxing Day, And then surge up and gather shape, And spread great pinions and escape; And each great bird of clanging shrieks O Fire, Fire! from iron beaks. My shoulders cracked to send around Those shrieking birds made out of sound With news of fire in their bills. (They heard 'em plain beyond Wall Hills.) Up go the winders, out come heads, I heard the springs go creak in beds; But still I heave and sweat and tire, And still the clang goes 'Fire, Fire!' 'Where is it, then? Who is it, there? You ringer, stop, and tell us where.'
  • 66.
    'Run round andlet the Captain know.' 'It must be bad, he's ringing so.' 'It's in the town, I see the flame; Look there! Look there, how red it came.' 'Where is it, then 'O stop the bell.' I stopped and called: 'It's fire of hell; And this is Sodom and Gomorrah, And now I'll burn you up, begorra.' By this the firemen were mustering, The half-dressed stable men were flustering, Backing the horses out of stalls While this man swears and that man bawls, 'Don't take th'old mare. Back, Toby, back. Back, Lincoln. Where's the fire, Jack?' 'Damned if I know. Out Preston way.' 'No. It's at Chancey's Pitch, they say.' 'It's sixteen ricks at Pauntley burnt.' 'You back old Darby out, I durn't.' They ran the big red engine out, And put 'em to with damn and shout. And then they start to raise the shire, 'Who brought the news, and where's the fire?' They'd moonlight, lamps, and gas to light 'em. I give a screech-owl's screech to fright 'em, And snatch from underneath their noses The nozzles of the fire hoses.
  • 67.
    'I am thefire. Back, stand back, Or else I'll fetch your skulls a crack; D'you see these copper nozzles here? They weigh ten pounds apiece, my dear; I'm fire of hell come up this minute To burn this town, and all that's in it. To burn you dead and burn you clean, You cogwheels in a stopped machine, You hearts of snakes, and brains of pigeons, You dead devout of dead religions, You offspring of the hen and ass, By Pilate ruled, and Caiaphas. Now your account is totted. Learn Hell's flames are loose and you shall burn.' At that I leaped and screamed and ran, I heard their cries go 'Catch him, man.' 'Who was it?' 'Down him.' 'Out him, Ern. 'Duck him at pump, we'll see who'll burn.' A policeman clutched, a fireman clutched, A dozen others snatched and touched. 'By God, he's stripped down to his buff.' 'By God, we'll make him warm enough.' 'After him.' 'Catch him,' 'Out him,' 'Scrob him. 'We'll give him hell.' 'By God, we'll mob him.' 'We'll duck him, scrout him, flog him, fratch him. 'All right,' I said. 'But first you'll catch him.'
  • 68.
    The men whodon't know to the root The joy of being swift of foot, Have never known divine and fresh The glory of the gift of flesh, Nor felt the feet exult, nor gone Along a dim road, on and on, Knowing again the bursting glows, The mating hare in April knows, Who tingles to the pads with mirth At being the swiftest thing on earth. O, if you want to know delight, Run naked in an autumn night, And laugh, as I laughed then, to find A running rabble drop behind, And whang, on every door you pass, Two copper nozzles, tipped with brass, And doubly whang at every turning, And yell, 'All hell's let loose, and burning.' I beat my brass and shouted fire At doors of parson, lawyer, squire, At all three doors I threshed and slammed And yelled aloud that they were damned. I clodded squire's glass with turves Because he spring-gunned his preserves. Through parson's glass my nozzle swishes Because he stood for loaves and fishes, But parson's glass I spared a tittle.
  • 69.
    He give mean orange once when little, And he who gives a child a treat Makes joy-bells ring in Heaven's street, And he who gives a child a home Builds palaces in Kingdom come, And she who gives a baby birth Brings Saviour Christ again to Earth, For life is joy, and mind is fruit, And body's precious earth and root. But lawyer's glass--well, never mind, Th'old Adam's strong in me, I find. God pardon man, and may God's son Forgive the evil things I've done. What more? By Dirty Lane I crept Back to the Lion, where I slept. The raging madness hot and floodin' Boiled itself out and left me sudden, Left me worn out and sick and cold, Aching as though I'd all grown old; So there I lay, and there they found me On door-mat, with a curtain round me. Si took my heels and Jane my head And laughed, and carried me to bed. And from the neighbouring street they reskied My boots and trousers, coat and weskit; They bath-bricked both the nozzles bright To be mementoes of the night,
  • 70.
    And knowing whatI should awake with They flannelled me a quart to slake with, And sat and shook till half-past two Expecting Police Inspector Drew. I woke and drank, and went to meat In clothes still dirty from the street. Down in the bar I heard 'em tell How someone rang the fire-bell, And how th'inspector's search had thriven, And how five pounds reward was given. And Shepherd Boyce, of Marley, glad us By saying it was blokes from mad'us, Or two young rips lodged at the Prince Whom none had seen nor heard of since, Or that young blade from Worcester Walk (You know how country people talk). Young Joe the ostler come in sad, He said th'old mare had bit his dad. He said there'd come a blazing screeching Daft Bible-prophet chap a-preaching, Had put th'old mare in such a taking She'd thought the bloody earth was quaking. And others come and spread a tale Of cut-throats out of Gloucester jail, And how we needed extra cops With all them Welsh come picking hops;
  • 71.
    With drunken Welshin all our sheds We might be murdered in our beds. By all accounts, both men and wives Had had the scare up of their lives. I ate and drank and gathered strength, And stretched along the bench full length, Or crossed to window seat to pat Black Silas Jones's little cat. At four I called, 'You devil's own, The second trumpet shall be blown. The second trump, the second blast; Hell's flames are loosed, and judgment's passed. Too late for mercy now. Take warning I'm death and hell and Judgment morning.' I hurled the bench into the settle, I banged the table on the kettle, I sent Joe's quart of cider spinning. 'Lo, here begins my second inning.' Each bottle, mug, and jug and pot I smashed to crocks in half a tot; And Joe, and Si, and Nick, and Percy I rolled together topsy versy. And as I ran I heard 'em call, 'Now damn to hell, what's gone with Saul?' Out into street I ran uproarious The devil dancing in me glorious.
  • 72.
    And as Iran I yell and shriek 'Come on, now, turn the other cheek.' Across the way by almshouse pump I see old puffing parson stump. Old parson, red-eyed as a ferret From nightly wrestlings with the spirit; I ran across, and barred his path. His turkey gills went red as wrath And then he froze, as parsons can. 'The police will deal with you, my man.' 'Not yet,' said I, 'not yet they won't; And now you'll hear me, like or don't. The English Church both is and was A subsidy of Caiaphas. I don't believe in Prayer nor Bible, They're lies all through, and you're a libel, A libel on the Devil's plan When first he miscreated man. You mumble through a formal code To get which martyrs burned and glowed. I look on martyrs as mistakes, But still they burned for it at stakes; Your only fire's the jolly fire Where you can guzzle port with Squire, And back and praise his damned opinions About his temporal dominions. You let him give the man who digs, A filthy hut unfit for pigs,
  • 73.
    Without a well,without a drain, With mossy thatch that lets in rain, Without a 'lotment, 'less he rent it, And never meat, unless he scent it, But weekly doles of 'leven shilling To make a grown man strong and willing, To do the hardest work on earth And feed his wife when she gives birth, And feed his little children's bones. I tell you, man, the Devil groans. With all your main and all your might You back what is against what's right; You let the Squire do things like these, You back him in't and give him ease, You take his hand, and drink his wine, And he's a hog, but you're a swine. For you take gold to teach God's ways And teach man how to sing God's praise. And now I'll tell you what you teach In downright honest English speech. 'You teach the ground-down starving man That Squire's greed's Jehovah's plan. You get his learning circumvented Lest it should make him discontented (Better a brutal, starving nation Than men with thoughts above their station), You let him neither read nor think,
  • 74.
    You goad hiswretched soul to drink And then to jail, the drunken boor; O sad intemperance of the poor. You starve his soul till it's rapscallion, Then blame his flesh for being stallion. You send your wife around to paint The golden glories of "restraint." How moral exercise bewild'rin' Would soon result in fewer children. You work a day in Squire's fields And see what sweet restraint it yields; A woman's day at turnip picking, Your heart's too fat for plough or ricking. 'And you whom luck taught French and Greek Have purple flaps on either cheek, A stately house, and time for knowledge, And gold to send your sons to college, That pleasant place, where getting learning Is also key to money earning. But quite your damn'dest want of grace Is what you do to save your face; The way you sit astride the gates By padding wages out of rates; Your Christmas gifts of shoddy blankets That every working soul may thank its Loving parson, loving squire Through whom he can't afford a fire.
  • 75.
    Your well-packed bench,your prison pen, To keep them something less than men; Your friendly clubs to help 'em bury, Your charities of midwifery. Your bidding children duck and cap To them who give them workhouse pap. O, what you are, and what you preach, And what you do, and what you teach Is not God's Word, nor honest schism, But Devil's cant and pauperism.' By this time many folk had gathered To listen to me while I blathered; I said my piece, and when I'd said it, I'll do old purple parson credit, He sunk (as sometimes parsons can) His coat's excuses in the man. 'You think that Squire and I are kings Who made the existing state of things, And made it ill. I answer, No, States are not made, nor patched; they grow, Grow slow through centuries of pain And grow correctly in the main, But only grow by certain laws Of certain bits in certain jaws. You want to doctor that. Let be. You cannot patch a growing tree. Put these two words beneath your hat,
  • 76.
    These two: securusjudicat. The social states of human kinds Are made by multitudes of minds. And after multitudes of years A little human growth appears Worth having, even to the soul Who sees most plain it's not the whole. This state is dull and evil, both, I keep it in the path of growth; You think the Church an outworn fetter; Kane, keep it, till you've built a better. And keep the existing social state; I quite agree it's out of date, One does too much, another shirks, Unjust, I grant; but still ... it works. To get the whole world out of bed And washed, and dressed, and warmed, and fed, To work, and back to bed again, Believe me, Saul, costs worlds of pain. Then, as to whether true or sham That book of Christ, Whose priest I am; The Bible is a lie, say you, Where do you stand, suppose it true? Good-bye. But if you've more to say, My doors are open night and day. Meanwhile, my friend, 'twould be no sin
  • 77.
    To mix morewater in your gin. We're neither saints nor Philip Sidneys, But mortal men with mortal kidneys.' He took his snuff, and wheezed a greeting, And waddled off to mothers' meeting; I hung my head upon my chest, I give old purple parson best. For while the Plough tips round the Pole The trained mind outs the upright soul, As Jesus said the trained mind might, Being wiser than the sons of light, But trained men's minds are spread so thin They let all sorts of darkness in; Whatever light man finds they doubt it, They love not light, but talk about it. But parson'd proved to people's eyes That I was drunk, and he was wise; And people grinned and women tittered, And little children mocked and twittered So blazing mad, I stalked to bar To show how noble drunkards are, And guzzled spirits like a beast, To show contempt for Church and priest, Until, by six, my wits went round Like hungry pigs in parish pound. At half-past six, rememb'ring Jane, I staggered into street again
  • 78.
    With mind madeup (or primed with gin) To bash the cop who'd run me in; For well I knew I'd have to cock up My legs that night inside the lock-up, And it was my most fixed intent To have a fight before I went. Our Fates are strange, and no one knows his; Our lovely Saviour Christ disposes. Jane wasn't where we'd planned, the jade. She'd thought me drunk and hadn't stayed. So I went up the Walk to look for her And lingered by the little brook for her, And dowsed my face, and drank at spring, And watched two wild duck on the wing. The moon come pale, the wind come cool, A big pike leapt in Lower Pool, The peacock screamed, the clouds were straking, My cut cheek felt the weather breaking; An orange sunset waned and thinned Foretelling rain and western wind, And while I watched I heard distinct The metals on the railway clinked. The blood-edged clouds were all in tatters, The sky and earth seemed mad as hatters; They had a death look, wild and odd, Of something dark foretold by God.
  • 79.
    And seeing itso, I felt so shaken I wouldn't keep the road I'd taken, But wandered back towards the inn Resolved to brace myself with gin. And as I walked, I said, 'It's strange, There's Death let loose to-night, and Change.' In Cabbage Walk I made a haul Of two big pears from lawyer's wall, And, munching one, I took the lane Back into Market-place again. Lamp-lighter Dick had passed the turning And all the Homend lamps were burning, The windows shone, the shops were busy, But that strange Heaven made me dizzy. The sky had all God's warning writ In bloody marks all over it, And over all I thought there was A ghastly light beside the gas. The Devil's tasks and Devil's rages Were giving me the Devil's wages. In Market-place it's always light, The big shop windows make it bright; And in the press of people buying I spied a little fellow crying Because his mother'd gone inside
  • 80.
    And left himthere, and so he cried. And mother'd beat him when she found him, And mother's whip would curl right round him, And mother'd say he'd done't to crost her, Though there being crowds about he'd lost her. Lord, give to men who are old and rougher The things that little children suffer, And let keep bright and undefiled The young years of the little child. I pat his head at edge of street And gi'm my second pear to eat. Right under lamp, I pat his head, 'I'll stay till mother come,' I said, And stay I did, and joked and talked, And shoppers wondered as they walked. 'There's that Saul Kane, the drunken blaggard, Talking to little Jimmy Jaggard. The drunken blaggard reeks of drink.' 'Whatever will his mother think?' 'Wherever has his mother gone? Nip round to Mrs Jaggard's, John, And say her Jimmy's out again, In Market-place, with boozer Kane.' 'When he come out to-day he staggered. O, Jimmy Jaggard, Jimmy Jaggard.' 'His mother's gone inside to bargain, Run in and tell her, Polly Margin,
  • 81.
    And tell herpoacher Kane is tipsy And selling Jimmy to a gipsy.' 'Run in to Mrs Jaggard, Ellen, Or else, dear knows, there'll be no tellin', And don't dare leave yer till you've fount her, You'll find her at the linen counter.' I told a tale, to Jim's delight, Of where the tom-cats go by night, And how when moonlight come they went Among the chimneys black and bent, From roof to roof, from house to house, With little baskets full of mouse All red and white, both joint and chop Like meat out of a butcher's shop; Then all along the wall they creep And everyone is fast asleep, And honey-hunting moths go by, And by the bread-batch crickets cry; Then on they hurry, never waiting To lawyer's backyard cellar grating Where Jaggard's cat, with clever paw, Unhooks a broke-brick's secret door; Then down into the cellar black, Across the wood slug's slimy track, Into an old cask's quiet hollow, Where they've got seats for what's to follow;
  • 82.
    Then each tom-catlights little candles, And O, the stories and the scandals, And O, the songs and Christmas carols, And O, the milk from little barrels. They light a fire fit for roasting (And how good mouse-meat smells when toasting), Then down they sit to merry feast While moon goes west and sun comes east. Sometimes they make so merry there Old lawyer come to head of stair To 'fend with fist and poker took firm His parchments channelled by the bookworm, And all his deeds, and all his packs Of withered ink and sealing wax; And there he stands, with candle raised, And listens like a man amazed, Or like a ghost a man stands dumb at, He says, 'Hush! Hush! I'm sure there's summat!' He hears outside the brown owl call, He hears the death-tick tap the wall, The gnawing of the wainscot mouse, The creaking up and down the house, The unhooked window's hinges ranging, The sounds that say the wind is changing. At last he turns, and shakes his head, 'It's nothing, I'll go back to bed.'
  • 83.
    And just thenMrs Jaggard came To view and end her Jimmy's shame. She made one rush and gi'm a bat And shook him like a dog a rat. 'I can't turn round but what you're straying. I'll give you tales and gipsy playing. I'll give you wand'ring off like this And listening to whatever 't is, You'll laugh the little side of the can, You'll have the whip for this, my man; And not a bite of meat nor bread You'll touch before you go to bed. Some day you'll break your mother's heart, After God knows she's done her part, Working her arms off day and night Trying to keep your collars white. Look at your face, too, in the street. What dirty filth 've you found to eat? Now don't you blubber here, boy, or I'll give you sum't to blubber for.' She snatched him off from where we stand And knocked the pear-core from his hand, And looked at me, 'You Devil's limb, How dare you talk to Jaggard's Jim; You drunken, poaching, boozing brute, you. If Jaggard was a man he'd shoot you.' She glared all this, but didn't speak,
  • 84.
    She gasped, whitehollows in her cheek; Jimmy was writhing, screaming wild, The shoppers thought I'd killed the child. I had to speak, so I begun. 'You'd oughtn't beat your little son; He did no harm, but seeing him there I talked to him and gi'm a pear; I'm sure the poor child meant no wrong, It's all my fault he stayed so long, He'd not have stayed, mum, I'll be bound If I'd not chanced to come around. It's all my fault he stayed, not his. I kept him here, that's how it is.' 'Oh! And how dare you, then?' says she, 'How dare you tempt my boy from me? How dare you do't, you drunken swine, Is he your child or is he mine? A drunken sot they've had the beak to, Has got his dirty whores to speak to, His dirty mates with whom he drink, Not little children, one would think. Look on him, there,' she says, 'look on him And smell the stinking gin upon him, The lowest sot, the drunk'nest liar, The dirtiest dog in all the shire: Nice friends for any woman's son After ten years, and all she's done.
  • 85.
    'For I've hadeight, and buried five, And only three are left alive. I've given them all we could afford, I've taught them all to fear the Lord. They've had the best we had to give, The only three the Lord let live. 'For Minnie whom I loved the worst Died mad in childbed with her first. And John and Mary died of measles, And Rob was drownded at the Teasels. And little Nan, dear little sweet, A cart run over in the street; Her little shift was all one stain, I prayed God put her out of pain. And all the rest are gone or going The road to hell, and there's no knowing For all I've done and all I've made them I'd better not have overlaid them. For Susan went the ways of shame The time the 'till'ry regiment came, And t'have her child without a father I think I'd have her buried rather. And Dicky boozes, God forgimme, And now't's to be the same with Jimmy. And all I've done and all I've bore Has made a drunkard and a whore, A bastard boy who wasn't meant,
  • 86.
    And Jimmy gwinewhere Dicky went; For Dick began the self-same way And my old hairs are going gray, And my poor man's a withered knee, And all the burden falls on me. 'I've washed eight little children's limbs, I've taught eight little souls their hymns, I've risen sick and lain down pinched And borne it all and never flinched; But to see him, the town's disgrace, With God's commandments broke in's face, Who never worked, not he, nor earned, Nor will do till the seas are burned, Who never did since he was whole A hand's turn for a human soul, But poached and stole and gone with women, And swilled down gin enough to swim in; To see him only lift one finger To make my little Jimmy linger. In spite of all his mother's prayers, And all her ten long years of cares, And all her broken spirit's cry That drunkard's finger puts them by, And Jimmy turns. And now I see That just as Dick was, Jim will be, And all my life will have been vain.
  • 87.