Programme for International Student Assessment

Scholastic performance study by the OECD
"PISA" redirects here. For other uses, seePisa (disambiguation).
Programme for International Student Assessment
Abbreviation: PISA
Formation: 1997
Purpose: Comparison of education attainment across the world
Headquarters: OECD Headquarters, 2 rue André Pascal, 75775 Paris Cedex 16
Region served: World
Membership: 79 government education departments
Official languages: English and French
Head of the Early Childhood and Schools Division: Yuri Belfali
Main organ: PISA Governing Body (Chair – Michele Bruniges)
Parent organization: OECD
Website: www.oecd.org/pisa/
PISA average scores (2022): coloured world maps of average Mathematics, Science, and Reading scores
  Score is higher than 549
  Score equal to or between 500 and 549
  Score equal to or between 450 and 499
  Score equal to or between 400 and 449
  Score equal to or between 350 and 399
  Score is less than 350
  No data

The Programme for International Student Assessment (PISA) is a worldwide study by the Organisation for Economic Co-operation and Development (OECD) in member and non-member nations intended to evaluate educational systems by measuring 15-year-old school pupils' scholastic performance in mathematics, science, and reading.[1] It was first performed in 2000 and then repeated every three years. Its aim is to provide comparable data to enable countries to improve their education policies and outcomes. It measures problem solving and cognition.[2]

The results of the 2022 data collection were released in December 2023.[3]

Influence and impact


PISA and similar international standardised assessments of educational attainment are increasingly used in the process of education policymaking at both national and international levels.[4]

PISA was conceived to set in a wider context the information provided by national monitoring of education system performance through regular assessments within a common, internationally agreed framework; by investigating relationships between student learning and other factors they can "offer insights into sources of variation in performances within and between countries".[5]

Until the 1990s, few European countries used national tests. In the 1990s, ten countries / regions introduced standardised assessment, and since the early 2000s, ten more followed suit. By 2009, only five European education systems had no national student assessments.[4]

The impact of these international standardised assessments in the field of educational policy has been significant, in terms of the creation of new knowledge, changes in assessment policy, and external influence over national educational policy more broadly.[6][7][8]

Creation of new knowledge


Data from international standardised assessments can be useful in research on causal factors within or across education systems.[4] Mons notes that the databases generated by large-scale international assessments have made it possible to carry out inventories and comparisons of education systems on an unprecedented scale[note 1] on themes ranging from the conditions for learning mathematics and reading, to institutional autonomy and admissions policies.[9] They allow typologies to be developed that can be used for comparative statistical analyses of education performance indicators, thereby identifying the consequences of different policy choices. They have generated new knowledge about education: PISA findings have challenged deeply embedded educational practices, such as the early tracking of students into vocational or academic pathways.[10]

Barroso and de Carvalho find that PISA provides a common reference connecting academic research in education and the political realm of public policy, operating as a mediator between different strands of knowledge from the realm of education and public policy.[11] However, although the key findings from comparative assessments are widely shared in the research community,[4] the knowledge they create does not necessarily fit with government reform agendas; this leads to some inappropriate uses of assessment data.

Changes in national assessment policy


Emerging research suggests that international standardised assessments are having an impact on national assessment policy and practice. PISA is being integrated into national policies and practices on assessment, evaluation, curriculum standards and performance targets; its assessment frameworks and instruments are being used as best-practice models for improving national assessments; many countries have explicitly incorporated and emphasise PISA-like competencies in revised national standards and curricula; others use PISA data to complement national data and validate national results against an international benchmark.[10]

External influence over national educational policy


PISA may influence national education policy choices in a variety of ways. Participation in international assessments like PISA has been linked to significant education policy changes and outcomes, such as higher student enrollments and education reforms.[6] However, critics have argued that participation could lead to undesirable outcomes, such as higher repetition rates and narrowing of curricula.[7] The impact of PISA may also vary according to the specific country context.[12]

Policy-makers in most participating countries see PISA as an important indicator of system performance; PISA reports can define policy problems and set the agenda for national policy debate; policymakers seem to accept PISA as a valid and reliable instrument for internationally benchmarking system performance and changes over time; most countries—irrespective of whether they performed above, at, or below the average PISA score—have begun policy reforms in response to PISA reports.[10]

Against this, impact on national education systems varies markedly. For example, in Germany, the results of the first PISA assessment caused the so-called 'PISA shock': a questioning of previously accepted educational policies; in a state marked by jealously guarded regional policy differences, it led ultimately to an agreement by all Länder to introduce common national standards and even an institutionalised structure to ensure that they were observed.[13] In Hungary, by comparison, which shared similar conditions to Germany, PISA results have not led to significant changes in educational policy.[14]

Because many countries have set national performance targets based on their relative rank or absolute PISA score, PISA assessments have increased the influence of their (non-elected) commissioning body, the OECD, as an international education monitor and policy actor, which implies an important degree of 'policy transfer' from the international to the national level; PISA in particular is having "an influential normative effect on the direction of national education policies".[10] Thus, it is argued that the use of international standardised assessments has led to a shift towards international, external accountability for national system performance; Rey contends that PISA surveys, portrayed as objective, third-party diagnoses of education systems, actually serve to promote specific orientations on educational issues.[4]

National policy actors refer to high-performing PISA countries to "help legitimise and justify their intended reform agenda within contested national policy debates".[15] PISA data can be "used to fuel long-standing debates around pre-existing conflicts or rivalries between different policy options, such as in the French Community of Belgium".[16] In such instances, PISA assessment data are used selectively: in public discourse governments often only use superficial features of PISA surveys such as country rankings and not the more detailed analyses. Rey (2010:145, citing Greger, 2008) notes that often the real results of PISA assessments are ignored as policymakers selectively refer to data in order to legitimise policies introduced for other reasons.[17]

In addition, PISA's international comparisons can be used to justify reforms with which the data themselves have no connection; in Portugal, for example, PISA data were used to justify new arrangements for teacher assessment (based on inferences that were not justified by the assessments and data themselves); they also fed the government's discourse about the issue of pupils repeating a year (which, according to research, fails to improve student results).[18] In Finland, the country's PISA results (regarded in other countries as excellent) were used by ministers to promote new policies for 'gifted' students.[19] Such uses and interpretations often assume causal relationships that cannot legitimately be based on PISA data alone and would normally require fuller investigation through qualitative in-depth studies and longitudinal surveys based on mixed quantitative and qualitative methods,[20] which politicians are often reluctant to fund.

Recent decades have witnessed an expansion in the uses of PISA and similar assessments, from assessing students' learning, to connecting "the educational realm (their traditional remit) with the political realm".[21] This raises the question of whether PISA data are sufficiently robust to bear the weight of the major policy decisions that are being based upon them, for, according to Breakspear, PISA data have "come to increasingly shape, define and evaluate the key goals of the national / federal education system".[10] This implies that those who set the PISA tests – e.g. in choosing the content to be assessed and not assessed – are in a position of considerable power to set the terms of the education debate, and to orient educational reform in many countries around the globe.[10]

Framework


PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).

PISA aims to test the literacy of students in three competence fields (reading, mathematics, and science) on an indefinite scale.[22]

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and lifelong learning (workforce knowledge).

In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling." Instead, they should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts."[23]

PISA also assesses students in innovative domains. In 2012 and 2015, in addition to reading, mathematics, and science, students were tested in collaborative problem solving. In 2018 the additional innovative domain was global competence.

Implementation


PISA is sponsored, governed, and coordinated by the OECD, but paid for by participating countries.[citation needed]

Method of testing


Sampling


The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students. This made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.
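As an illustration of the age window described above, the following sketch checks whether a student falls in the eligible range of 15 years 3 months to 16 years 2 months at the start of the assessment period. The dates and helper names are hypothetical and not part of the official PISA sampling procedure.

```python
from datetime import date

def age_in_completed_months(birth: date, reference: date) -> int:
    """Whole months of age at the reference date."""
    months = (reference.year - birth.year) * 12 + (reference.month - birth.month)
    if reference.day < birth.day:
        months -= 1
    return months

def is_pisa_eligible(birth: date, assessment_start: date) -> bool:
    """True if the student is between 15 years 3 months and 16 years 2 months old
    at the beginning of the assessment period (the PISA age window)."""
    months = age_in_completed_months(birth, assessment_start)
    return 15 * 12 + 3 <= months <= 16 * 12 + 2

# Hypothetical assessment period starting 1 April 2022.
print(is_pisa_eligible(date(2006, 9, 15), date(2022, 4, 1)))  # about 15y 6m -> True
print(is_pisa_eligible(date(2007, 3, 20), date(2022, 4, 1)))  # about 15y 0m -> False
```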

Test

PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

Each student takes a two-hour computer-based test. Part of the test is multiple-choice and part involves fuller answers. There are six and a half hours of assessment material, but each student is not tested on all the parts. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background, including learning habits, motivation, and family. School directors fill in a questionnaire describing school demographics, funding, etc. In 2012 the participants were, for the first time in the history of large-scale testing and assessments, offered a new type of problem, i.e. interactive (complex) problems requiring exploration of a novel virtual device.[24][25]

In selected countries, PISA started experimentation with computer adaptive testing.

National add-ons


Countries are allowed to combine PISA with complementary national tests.

Germany does this in a very extensive way: on the day following the international test, students take a national test called PISA-E [de] (E = Ergänzung = complement). Test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in the international and the national test, another 45,000 take the national test only. This large sample is needed to allow an analysis by federal states. Following a clash about the interpretation of 2006 results, the OECD warned Germany that it might withdraw the right to use the "PISA" label for national tests.[26]

Data scaling


From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be 'scaled' to allow meaningful comparisons. Scores were scaled so that the OECD average in each domain (mathematics, reading and science) was 500 and the standard deviation was 100.[27] This holds only for the initial PISA cycle, when the scale was first introduced; subsequent cycles are linked to the previous cycles through IRT scale-linking methods.[28]

This generation of proficiency estimates is done using a latent regression extension of the Rasch model, a model of item response theory (IRT), also known as the conditioning model or population model. The proficiency estimates are provided in the form of so-called plausible values, which allow unbiased estimates of differences between groups. The latent regression, together with the use of a Gaussian prior probability distribution of student competencies, allows estimation of the proficiency distributions of groups of participating students.[29] The scaling and conditioning procedures are described in nearly identical terms in the Technical Reports of PISA 2000, 2003, and 2006. NAEP and TIMSS use similar scaling methods.
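A rough sketch of the two ideas above, the Rasch response model and the linear transformation onto the PISA reporting scale, is shown below. It is illustrative only: the item difficulties, ability distribution, and reference population are invented, and the sketch omits the latent regression, conditioning on background variables, and the plausible-value draws used in the actual operational analysis.

```python
import numpy as np

def rasch_prob(theta: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Rasch model: P(correct) for abilities theta (on the logit scale)
    and item difficulties b; returns a (students x items) matrix."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

def to_pisa_scale(theta: np.ndarray, ref_mean: float, ref_sd: float) -> np.ndarray:
    """Linear transformation so that the reference (OECD) distribution has
    mean 500 and standard deviation 100, as in the first PISA cycle."""
    return 500.0 + 100.0 * (theta - ref_mean) / ref_sd

rng = np.random.default_rng(0)
theta = rng.normal(0.0, 1.2, size=10_000)        # simulated latent proficiencies
b = np.array([-1.0, 0.0, 1.5])                   # three illustrative item difficulties
p_correct = rasch_prob(theta, b)                 # expected response probabilities
scaled = to_pisa_scale(theta, theta.mean(), theta.std())
print(round(scaled.mean()), round(scaled.std()))  # ~500, ~100
```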

Ranking results


All PISA results are tabulated by country; recent PISA cycles have separate provincial or regional results for some countries. Most public attention concentrates on just one outcome: the mean scores of countries and their ranking against one another. In the official reports, however, country-by-country rankings are given not as simple league tables but as cross tables indicating for each pair of countries whether or not mean score differences are statistically significant (unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.[citation needed]
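As a simplified illustration of such a pairwise comparison (PISA's own significance tests are based on replicate weights and also account for linking error), the sketch below treats two country means as independent estimates with known standard errors and applies a two-sided z-test. The standard errors of about 3 points and the 0.05 level are assumptions for the example, not figures from a PISA report.

```python
from math import erf, sqrt

def mean_difference_significant(mean_a: float, se_a: float,
                                mean_b: float, se_b: float,
                                alpha: float = 0.05) -> bool:
    """Two-sided z-test for the difference between two independent country means
    whose standard errors are assumed known."""
    z = abs(mean_a - mean_b) / sqrt(se_a ** 2 + se_b ** 2)
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0))))
    return p < alpha

# With assumed standard errors of about 3 points per country,
# a 9-point gap is significant while a 5-point gap is not.
print(mean_difference_significant(505.0, 3.0, 496.0, 3.0))  # True
print(mean_difference_significant(500.0, 3.0, 495.0, 3.0))  # False
```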

PISA never combines mathematics, science and reading domain scores into an overall score. However, commentators have sometimes combined test results from all three domains into an overall country ranking. Such meta-analysis is not endorsed by the OECD, although official summaries sometimes use scores from a testing cycle's principal domain as a proxy for overall student ability.

PISA 2022 ranking summary


The results of PISA 2022 were presented on 5 December 2023, covering around 700,000 participating students in 81 countries and economies, with Singapore emerging as the top performer in all categories.[30]

Both Lebanon and the Chinese provinces/municipalities of Beijing, Shanghai, Jiangsu and Zhejiang participated in this edition, but their results were not published because COVID-19 restrictions prevented them from fully collecting data.[31]

Because of the Russian full-scale invasion of Ukraine, only 18 of 27 Ukrainian regions had their data collected; the results are therefore not representative of the following regions: Dnipropetrovsk Oblast, Donetsk Oblast, Kharkiv Oblast, Luhansk Oblast, Zaporizhzhia Oblast, Kherson Oblast, Mykolaiv Oblast, the Autonomous Republic of Crimea and the city of Sevastopol.[32]

Mathematics[30]
1 Singapore575
2 Macau552
3 Taiwan547
4 Hong Kong540
5 Japan536
6 South Korea527
7 Estonia510
8  Switzerland508
9 Canada497
10 Netherlands493
11 Ireland492
12 Belgium489
13 Denmark489
14 United Kingdom489
15 Poland489
16 Australia487
17 Austria487
18 Czech Republic487
19 Slovenia485
20 Finland484
21 Latvia483
22 Sweden482
23 New Zealand479
24 Germany475
25 Lithuania475
26 France474
27 Spain473
28 Hungary473
29 Portugal472
International Average (OECD)472
30 Italy471
31 Vietnam469
32 Norway468
33 Malta466
34 United States465
35 Slovakia464
36 Croatia463
37 Iceland459
38 Israel458
39 Turkey453
40 Brunei442
41 Ukraine441
42 Serbia440
43 United Arab Emirates431
44 Greece430
45 Romania428
46 Kazakhstan425
47 Mongolia425
48 Cyprus418
49 Bulgaria417
50 Moldova417
51 Qatar414
52 Chile412
53 Uruguay409
54 Malaysia409
55 Montenegro406
56 Azerbaijan397
57 Mexico395
58 Thailand394
59 Peru391
60 Georgia390
61 North Macedonia389
62 Saudi Arabia389
63 Costa Rica385
64 Colombia383
65 Brazil379
66 Argentina378
67 Jamaica377
68 Albania368
69 Indonesia366
70 Palestinian Authority366
71 Morocco365
72 Uzbekistan364
73 Jordan361
74 Panama357
75 Kosovo355
76 Philippines355
77 Guatemala344
78 El Salvador343
79 Dominican Republic339
80 Paraguay338
81 Cambodia336
Science[30]
1 Singapore561
2 Japan547
3 Macau543
4 Taiwan537
5 South Korea528
6 Estonia526
7 Hong Kong520
8 Canada515
9 Finland511
10 Australia507
11 Ireland504
12 New Zealand504
13  Switzerland503
14 Slovenia500
15 United Kingdom500
16 United States499
17 Poland499
18 Czech Republic498
19 Denmark494
20 Latvia494
21 Sweden494
22 Germany492
23 Austria491
24 Belgium491
25 Netherlands488
26 France487
27 Hungary486
28 Spain485
29International Average (OECD)485
 Lithuania484
30 Portugal484
31 Croatia483
32 Norway478
33 Italy477
34 Turkey476
35 Vietnam472
36 Malta466
37 Israel465
38 Slovakia462
39 Ukraine450
40 Iceland447
41 Serbia447
42 Brunei446
43 Chile444
44 Greece441
45 Uruguay435
46 United Arab Emirates432
47 Qatar432
48 Romania428
49 Kazakhstan423
50 Bulgaria421
51 Moldova417
52 Malaysia416
53 Mongolia412
54 Cyprus411
55 Colombia411
56 Costa Rica411
57 Mexico410
58 Thailand409
59 Peru408
60 Argentina406
61 Brazil403
62 Jamaica403
63 Montenegro403
64 Saudi Arabia390
65 Panama388
66 Georgia384
67 Indonesia383
68 Azerbaijan380
69 North Macedonia380
70 Albania376
71 Jordan375
72 El Salvador374
73 Guatemala373
74 Palestinian Authority369
75 Paraguay368
76 Morocco365
77 Dominican Republic360
78 Kosovo357
79 Philippines356
80 Uzbekistan355
81 Cambodia347
Reading[30]
1 Singapore543
2 Ireland516
3 Japan516
4 South Korea515
5 Taiwan515
6 Estonia511
7 Macau510
8 Canada507
9 United States504
10 New Zealand501
11 Hong Kong500
12 Australia498
13 United Kingdom494
14 Finland490
15 Denmark489
16 Poland489
17 Czech Republic489
18 Sweden487
19  Switzerland483
20 Italy482
21 Germany480
22 Austria480
23 Belgium479
24 Norway477
25 Portugal477
26International Average (OECD)476
27 Croatia475
28 Latvia475
29 Spain474
 France474
30 Israel474
31 Hungary473
32 Lithuania472
33 Slovenia469
34 Vietnam462
35 Netherlands459
36 Turkey456
37 Chile448
38 Slovakia447
39 Malta445
40 Serbia440
41 Greece438
42 Iceland436
43 Uruguay430
44 Brunei429
45 Romania428
46 Ukraine428
47 Qatar419
48 United Arab Emirates417
49 Costa Rica415
50 Mexico415
51 Moldova411
52 Brazil410
53 Jamaica410
54 Colombia409
55 Peru408
56 Montenegro405
57 Bulgaria404
58 Argentina401
59 Panama392
60 Malaysia388
61 Kazakhstan386
62 Saudi Arabia383
63 Cyprus381
64 Thailand379
65 Mongolia378
66 Georgia374
67 Guatemala374
68 Paraguay373
69 Azerbaijan365
70 El Salvador365
71 Indonesia359
72 North Macedonia359
73 Albania358
74 Dominican Republic351
75 Palestinian Authority349
76 Philippines347
77 Jordan342
78 Kosovo342
79 Morocco339
80 Uzbekistan336
81 Cambodia329

Rankings comparison 2000–2022

Mathematics
Country | 2022[33] Score Rank | 2018[34] Score Rank | 2015 Score Rank | 2012 Score Rank | 2009 Score Rank | 2006 Score Rank | 2003 Score Rank | 2000 Score Rank
International Average (OECD)472489490494495494499492
 Albania368684374841357394543775338133
 Algeria36072
 Argentina37866379714095838830
 Australia487174912949425504175141352012524105336
 Austria4871649923497205061649622505175061850312
 China B-S-J-G[a]5316
 China B-S-J-Z[b]5911
 AzerbaijanBaku3975642056
 Belarus47238
 Belgium48912508155071551513515125201152975208
 Bosnia and Herzegovina40662
 Brazil3796538470377683895538651370503563933435
 Brunei4424043051
 Bulgaria41749436494414743943428414134343028
 Argentina CABA[c]4564341849
 Cambodia33681
 Canada49795121251610518115278527753265336
 Chile41252417594235042347421444114438432
 Taiwan547353155424560354345491
 Colombia383643916939064376583815237049
 Costa Rica38563402634006240753
 Croatia463364644046441471384603846734
 Cyprus418484514543748
 Czech Republic4871849922492284992249325510155161249814
 Denmark4891350913511125002050317513145141451410
 Dominican Republic339793257832873
 El Salvador34378
 Estonia51075238520952195121551513
 Finland484205071651113519105415548254425365
 France474264952549326495234972049622511155179
 Georgia390603986640460
 Germany4752550020506165141451314504195031949016
 Greece4304445144454444534046637459374453244724
 Guatemala34477
 Hong Kong54045514548256125552547355015601
 Hungary4732848136477374773749027491264902548817
 Iceland4593749526488314932550716506165151351410
 Indonesia3667037972386663756037155391473603736734
 Ireland4921150021504185011848730501215032050312
 Israel45838463414703946639447394423843326
 Italy4713048731490304853048333462364663145722
 Jamaica37767
 Japan53655276532553665297523953455572
 Jordan361734006538067386573875038448
 Kazakhstan4254642354460424324540548
 South Korea52765267524755445463547454235473
 Kosovo355753667536271
 Latvia4832149624482344912648234486304832746321
 Lebanon3936839663
 Lithuania475244813547836479354773548629
 Luxembourg48333486334902748928490274932344625
 Macau55225583544353855251052585278
 Malaysia40954440474464542148
 Malta466334723947935
 Mexico3955740961408594135041946406453853638731
 Moldova414504215542052
 Mongolia42547
 Montenegro406554305341854410514034939946
 Morocco3657136874
 Netherlands493105199512115238526953155384
 New Zealand479234942749521500215191152210523115374
 Macedonia38962394673716938133
 Norway4683250119502194892849819490284952249913
 Palestinian Authority36669
 Panama3577435376
 Paraguay33880
 Peru391594006438765368613655729236
 Philippines3557635377
 Poland4891551610504175181249523495244902447020
 Portugal4722949228492294872948731466354663045423
 Qatar414514146040261376593685631852
 Romania42845430524444644542427424154242629
 Russia48830494234823246836476324682947818
 Serbia4404244846
 Saudi Arabia3896137373
 Singapore57515692564157315621
 Slovakia46435486324753848233497214922549821
 Slovenia485195091451014501195011850418
 Spain4732748134486324843148332480314852647619
 Sweden4822250217494244783649424502205091651011
  Switzerland508851511521853175346530652795297
 Thailand3945841957415564274641945417414173543227
 Trinidad and Tobago4175541447
 Tunisia3677038856371543655135938
 Turkey45339454424205144841445404244042333
 Ukraine[d]4414145343
 United Arab Emirates43143435504274943444
 United Kingdom489145021849227494244922649523508175297
 United States4653447837470404813448729474334832849315
 Uruguay40953418584185340952427434273942234
 Uzbekistan36472
 Vietnam469314952251115
Science
Country | 2022[33] Score Rank | 2018[34] Score Rank | 2015 Score Rank | 2012 Score Rank | 2009 Score Rank | 2006 Score Rank
International Average (OECD)485489493501501498
 Albania3767041759427543975839154
 Algeria37672
 Argentina406604046543252
 Australia5071050317510145211452795278
 Austria491234902849526506214942851117
 China B-S-J-G[a]51810
 China B-S-J-Z[b]5901
 AzerbaijanBaku3806839868
 Belarus47137
 Belgium491244992050220505225071951018
 Bosnia and Herzegovina39867
 Brazil403624046640166402554054939049
 Brunei4464243150
 Bulgaria421504245644646446434394243440
 Argentina CABA[c]4753842549
 Cambodia34781
 Canada515851885287525952975343
 Chile444434444544745445444474143839
 Taiwan537451610532452311520115324
 Colombia411544136241660399564025038850
 Costa Rica41155416604205842947
 Croatia483314723647537491324863549325
 Cyprus411564394743351
 Czech Republic498184972149329508205002251314
 Denmark494204932550221498254992449623
 Dominican Republic360773367833273
 El Salvador37372
 Estonia526653045343541552885315
 Finland511952265315545455415631
 France487264932449527499244982549524
 Georgia384663837341163
 Germany492225031650916524105201251612
 Greece441444524445544467404703847337
 Guatemala37373
 Hong Kong520751795239555154925422
 Hungary486274813247735494305032050420
 Iceland447414753547339478374962649126
 Indonesia383673967040365382603835539348
 Ireland504124962250319522135081850819
 Israel465374624246740470394553945438
 Italy477334684048134494314893347535
 Jamaica40363
 Japan547252955382547353945316
 Jordan375714295140964409544154742243
 Kazakhstan4234939769456434254840053
 South Korea52855197516115386538552210
 Kosovo357783657537871
 Latvia494194872949031502234942949027
 Lebanon3847238668
 Lithuania484294823147536496284913148831
 Luxembourg4773448333491334843648633
 Macau543354435296521155111651116
 Malaysia41652438484434742050
 Malta466364574346541
 Mexico410574195741661415524164641047
 Moldova417514285242853
 Mongolia41253
 Montenegro403614156141162410534015141246
 Morocco3657637774
 Netherlands48825503155091752212522105259
 New Zealand5041150812513125161653265307
 Macedonia380694136338470
 Norway478324902749824495295002348732
 Palestinian Authority36974
 Panama3886536576
 Paraguay36875
 Peru4085940464397673736136957
 Philippines3567935777
 Poland49917511115012252685081749822
 Portugal484304922650123489344933047436
 Qatar432464195841859384593795634952
 Romania428484265543550439464284341845
 Russia4783348732486354783747934
 Serbia4474044046
 Saudi Arabia3906438671
 Singapore56115512556155125423
 Slovakia462384644146142471384903248829
 Slovenia500145071351313514185121551911
 Spain485284833049330496274883448830
 Sweden494214991949328485364952750321
  Switzerland503134952350618515175171351215
 Thailand409584265342157444454254542144
 Trinidad and Tobago4255641048
 Tunisia38669398574015238651
 Turkey476344683942555463414544042442
 Ukraine[d]4503946938
 United Arab Emirates43247434494374844842
 United Kingdom500155051450915514195141451513
 United States499165021849625497265022148928
 Uruguay435454265443549416514274442841
 Uzbekistan35580
 Vietnam4723552585287
Reading
Country | 2022[33] Score Rank | 2018[34] Score Rank | 2015 Score Rank | 2012 Score Rank | 2009 Score Rank | 2006 Score Rank | 2003 Score Rank | 2000 Score Rank
International Average (OECD)476487493496493489494493
 Albania358734056140563394583855534939
 Algeria35071
 Argentina401584026342556
 Australia498125031650316512125158513752545284
 Austria4802148427485334902647037490214912249219
 China B-S-J-G[a]49427
 China B-S-J-Z[b]5551
 AzerbaijanBaku3656938968
 Belarus47436
 Belgium4792349322499205091650610501115071150711
 Bosnia and Herzegovina40362
 Brazil4105241357407624075241249393474033639636
 Brunei4294440859
 Bulgaria40457420544324943647429424024343032
 Argentina CABA[c]4753842948
 Cambodia32981
 Canada50785206527352375245527452835342
 Chile44837452434594244143449414423741035
 Taiwan5155503174972352384952149615
 Colombia409544125842557403544134838549
 Costa Rica41550426494275244145
 Croatia475264792948731485334763447729
 Cyprus381634245044345
 Czech Republic4891749025487304932447832483254892449220
 Denmark4891550118500184962349522494184921949716
 Dominican Republic351743427635869
 El Salvador36570
 Estonia511652355196516105011250112
 Finland490145207526452455362547254315461
 France4742949323499195051949620488224961750514
 Georgia374673807040165
 Germany4802249820509115081849718495174912148422
 Greece4384145742467414773848330460354723047425
 Guatemala37466
 Hong Kong500115244527254515333536351095256
 Hungary4733147633470404882849424482264822548023
 Iceland4364247435482354833550015484234922050712
 Indonesia3597137172397673965740253393463823837138
 Ireland516251885215523649619517651565275
 Israel47430470374793748632474354393945229
 Italy4822047632485344902548627469324762948721
 Jamaica41053
 Japan51635041551685383520749814498145229
 Jordan342784195540861399554055140144
 Kazakhstan3866138769427543935939054
 South Korea51545149517753645391556153425257
 Kosovo342773537534772
 Latvia4752747930488294892748428479274912345828
 Lebanon3537434773
 Lithuania472324763447239477374683847031
 Luxembourg47038481364883047236479284792744130
 Macau510752535091250915487264922049815
 Malaysia38860415564315039856
 Malta445394484444744
 Mexico4154942053423584244942544410424003742234
 Moldova411514245141659
 Mongolia37865
 Montenegro405564215242755422504085039248
 Morocco3397935973
 Netherlands459354852650315511135089507105138
 New Zealand501105061250910512115216521552255293
 Macedonia35972393673527037337
 Norway477254991951395042050311484245001250513
 Palestinian Authority34975
 Panama3925937771
 Paraguay37368
 Peru408554016439866384613705732740
 Philippines3477634077
 Poland48916512105061351895001450884971647924
 Portugal4772449224498214883148925472304782847026
 Qatar419474076040264388603725631251
 Romania42845428474344743846424453964542833
 Russia47931495264754045940440384423246227
 Serbia4404043945
 Saudi Arabia3836239965
 Singapore54315492535154225264
 Slovakia44738458414534346341477334663346931
 Slovenia469334952150514481364832949419
 Spain47428496254882948131461344812649318
 Sweden48718506115001748334497175079514751610
  Switzerland4831948428492285091450113499134991349417
 Thailand3796439366409604414442146417404203543131
 Trinidad and Tobago4275341647
 Tunisia3616840453404523805037539
 Turkey45636466404285147539464394473644133
 Ukraine[d]4284646639
 United Arab Emirates41748432464344844242
 United Kingdom494135041449822499214942349516507105238
 United States5049505134972449822500164951850415
 Uruguay43043427484374641151426434134143434
 Uzbekistan33680
 Vietnam462344873250817
  1. ^ Beijing, Shanghai, Jiangsu, Guangdong
  2. ^ Beijing, Shanghai, Jiangsu, Zhejiang
  3. ^ Ciudad Autónoma de Buenos Aires
  4. ^ Except for regions occupied by Russia

Previous years

Main article: Programme for International Student Assessment (2000 to 2012)
Period | Focus | OECD countries | Partner countries | Participating students | Notes
2000 | Reading | 28 | 4 + 11 | 265,000 | The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 | Mathematics | 30 | 11 | 275,000 | UK disqualified from data analysis due to its low response rate.[35] Also included a test in problem solving.
2006 | Science | 30 | 27 | 400,000 | Reading scores for the US disqualified from analysis due to a misprint in testing materials.[36]
2009[37] | Reading | 34 | 41 + 10 | 470,000 | 10 additional non-OECD countries took the test in 2010.[38][39]
2012[40] | Mathematics | 35 | 37 | 510,000 |
2015[41] | Science | 34 | 31 | 509,000 |
2018[42] | Reading | 37 | 42 | 600,000 |
2022 | Mathematics | 37 | 44 | 690,000 |

Reception

Further information: Programme for International Student Assessment (2000 to 2012)

China

[edit]

China's participation in the 2012 test was limited to Shanghai, Hong Kong, and Macau as separate entities. In 2012, Shanghai participated for the second time, again topping the rankings in all three subjects, as well as improving scores in the subjects compared to the 2009 tests. Shanghai's score of 613 in mathematics was 113 points above the average score, putting the performance of Shanghai pupils about 3 school years ahead of pupils in average countries. Educational experts debated to what degree this result reflected the quality of the general educational system in China, pointing out that Shanghai has greater wealth and better-paid teachers than the rest of China.[43] Hong Kong placed second in reading and science and third in maths.

Andreas Schleicher, PISA division head and co-ordinator, stated that PISA tests administered in rural China have produced some results approaching the OECD average. Citing further as-yet-unpublished OECD research, he said, "We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average."[44] Schleicher believes that China has also expanded school access and has moved away from learning by rote,[45] performing well in both rote-based and broader assessments.[44]

In 2018 the Chinese provinces that participated were Beijing, Shanghai, Jiangsu and Zhejiang. In 2015, the participating provinces were Jiangsu, Guangdong, Beijing, and Shanghai.[46] The 2015 Beijing-Shanghai-Jiangsu-Guangdong cohort scored a median 518 in science, while the 2012 Shanghai cohort scored a median 580.

Critics of PISA counter that in Shanghai and other Chinese cities, most children of migrant workers can only attend city schools up to the ninth grade and must return to their parents' hometowns for high school due to hukou restrictions, thus skewing the composition of the city's high school students in favor of wealthier local families. A population chart of Shanghai reproduced in The New York Times shows a steep drop-off in the number of 15-year-olds residing there.[47] According to Schleicher, 27% of Shanghai's 15-year-olds are excluded from its school system (and hence from testing). As a result, the percentage of Shanghai's 15-year-olds tested by PISA was 73%, lower than the 89% tested in the US.[48] Following the 2015 testing, the OECD published in-depth studies on the education systems of a selected few countries, including China.[49]

In 2014, Liz Truss, the British Parliamentary Under-Secretary of State at the Department for Education, led a fact-finding visit to schools and teacher-training centres in Shanghai.[50] Britain increased exchanges with Chinese teachers and schools to find out how to improve quality. In 2014, 60 teachers from Shanghai were invited to the UK to help share their teaching methods, support pupils who are struggling, and help to train other teachers.[51] In 2016, Britain invited 120 Chinese teachers, planning to adopt Chinese styles of teaching in 8,000 aided schools.[52] By 2019, approximately 5,000 of Britain's 16,000 primary schools had adopted Shanghai's teaching methods.[53] The performance of British schools in PISA improved after adopting China's teaching styles.[54][55]

Finland

[edit]

Finland, which received several top positions in the first tests, fell in all three subjects in 2012, but remained the best-performing country overall in Europe, achieving its best result in science with 545 points (5th) and its worst in mathematics with 519 (12th), in which it was outperformed by four other European countries. The drop in mathematics was 25 points since 2003, the last time mathematics was the focus of the tests. For the first time, Finnish girls narrowly outperformed boys in mathematics. It was also the first time pupils in Finnish-speaking schools did not perform better than pupils in Swedish-speaking schools. Former Minister of Education and Science Krista Kiuru expressed concern for the overall drop, as well as the fact that the share of low performers had increased from 7% to 12%.[56]

India

[edit]

India participated in the 2009 round of testing but pulled out of the 2012 PISA testing, with the Indian government attributing its action to the unfairness of PISA testing to Indian students.[57] India had ranked 72nd out of 73 countries tested in 2009.[58] The Indian Express reported, "The ministry (of education) has concluded that there was a socio-cultural disconnect between the questions and Indian students. The ministry will write to the OECD and drive home the need to factor in India's "socio-cultural milieu". India's participation in the next PISA cycle will hinge on this".[59] The Indian Express also noted that "Considering that over 70 nations participate in PISA, it is uncertain whether an exception would be made for India".

India did not participate in the 2012, 2015 and 2018 PISA rounds.[60]

A Kendriya Vidyalaya Sangathan (KVS) committee, as well as a group of secretaries on education constituted by the Prime Minister of India Narendra Modi, recommended that India should participate in PISA. Accordingly, in February 2017, the Ministry of Human Resource Development under Prakash Javadekar decided to end the boycott and participate in PISA from 2020. To address the socio-cultural disconnect between the test questions and students, it was reported that the OECD would update some questions; for example, the word avocado in a question might be replaced with a more popular Indian fruit such as mango.[61]

India did not participate in the 2022 PISA round, citing disruption caused by the COVID-19 pandemic.[62]

Malaysia

[edit]

In 2015, the results from Malaysia were found by the OECD not to have met the minimum required response rate.[63] Opposition politician Ong Kian Ming said the education ministry had tried to oversample high-performing students in rich schools.[64][65]

Sweden

[edit]

Sweden's results dropped in all three subjects in the 2012 test, continuing a trend from 2006 and 2009. It saw the sharpest fall in mathematics performance, with its score dropping from 509 in 2003 to 478 in 2012. The score in reading showed a drop from 516 in 2000 to 483 in 2012. The country performed below the OECD average in all three subjects.[66] The leader of the opposition, Social Democrat Stefan Löfven, described the situation as a national crisis.[67] Along with the party's spokesperson on education, Ibrahim Baylan, he pointed to the downward trend in reading as the most severe.[67]

In 2020, the Swedish newspaper Expressen revealed that Sweden had inflated its score in PISA 2018 by not conforming to OECD standards. According to professor Magnus Henrekson, a large number of foreign-born students had not been tested.[68]

United Kingdom


In the 2012 test, as in 2009, the result was slightly above average for the United Kingdom, with the science ranking being highest (20th).[69] England, Wales, Scotland and Northern Ireland also participated as separate entities, with the worst result for Wales, which in mathematics ranked 43rd of the 65 countries and economies. The Minister of Education in Wales, Huw Lewis, expressed disappointment in the results, said that there were no "quick fixes", but hoped that several educational reforms implemented in the preceding years would give better results in the next round of tests.[70] The United Kingdom had a greater gap between high- and low-scoring students than the average. There was little difference between public and private schools when adjusted for the socio-economic background of students. The gender difference in favour of girls was smaller than in most other countries, as was the difference between natives and immigrants.[69]

Writing in the Daily Telegraph, Ambrose Evans-Pritchard warned against putting too much emphasis on the UK's international ranking, arguing that an overfocus on scholastic performance in East Asia might have contributed to the region's low birth rate, which he argued could harm future economic performance more than a good PISA score could offset.[71]

In 2013, the Times Educational Supplement (TES) published an article, "Is PISA Fundamentally Flawed?" by William Stewart, detailing serious critiques of PISA's conceptual foundations and methods advanced by statisticians at major universities.[72]

In the article, Professor Harvey Goldstein of the University of Bristol was quoted as saying that when the OECD tries to rule out questions suspected of bias, it can have the effect of "smoothing out" key differences between countries. "That is leaving out many of the important things," he warned. "They simply don't get commented on. What you are looking at is something that happens to be common. But (is it) worth looking at? PISA results are taken at face value as providing some sort of common standard across countries. But as soon as you begin to unpick it, I think that all falls apart."

Queen's University Belfast mathematician Dr. Hugh Morrison stated that he found the statistical model underlying PISA to contain a fundamental, insoluble mathematical error that renders Pisa rankings "valueless".[73] Goldstein remarked that Dr. Morrison's objection highlights "an important technical issue" if not a "profound conceptual error". However, Goldstein cautioned that PISA has been "used inappropriately", contending that some of the blame for this "lies with PISA itself. I think it tends to say too much for what it can do and it tends not to publicise the negative or the weaker aspects." Professors Morrison and Goldstein expressed dismay at the OECD's response to criticism. Morrison said that when he first published his criticisms of PISA in 2004 and also personally queried several of the OECD's "senior people" about them, his points were met with "absolute silence" and have yet to be addressed. "I was amazed at how unforthcoming they were," he told TES. "That makes me suspicious." "Pisa steadfastly ignored many of these issues," he says. "I am still concerned."[72]

Professor Svend Kreiner, of the University of Copenhagen, agreed: "One of the problems that everybody has with PISA is that they don't want to discuss things with people criticising or asking questions concerning the results. They didn't want to talk to me at all. I am sure it is because they can't defend themselves."[72]

United States


Since 2012 a few states have participated in the PISA tests as separate entities. Only the 2012 and 2015 results are available on a state basis. Puerto Rico participated in 2015 as a separate US entity as well.

2012 US State results
Mathematics: Massachusetts 514, Connecticut 506, US Average 481, Florida 467
Science: Massachusetts 527, Connecticut 521, US Average 497, Florida 485
Reading: Massachusetts 527, Connecticut 521, US Average 498, Florida 492
2015 US State results
Mathematics: Massachusetts 500, North Carolina 471, US Average 470, Puerto Rico 378
Science: Massachusetts 529, North Carolina 502, US Average 496, Puerto Rico 403
Reading: Massachusetts 527, North Carolina 500, US Average 497, Puerto Rico 410

PISA results for the United States by race and ethnicity

Mathematics
Race: 2022[74], 2018[75], 2015, 2012, 2009, 2006, 2003[76]
Asian: 543, 539, 498, 549, 524, 494, 506
White: 498, 503, 499, 506, 515, 502, 512
US Average: 465, 478, 470, 481, 487, 474, 483
More than one race: 476, 474, 475, 492, 487, 482, 502
Hispanic: 439, 452, 446, 455, 453, 436, 443
Other: 423, 436, 460, 446, 446
Black: 412, 419, 419, 421, 423, 404, 417
Science
Race: 2022[77], 2018[75], 2015[78], 2012, 2009, 2006
Asian: 578, 551, 525, 546, 536, 499
White: 537, 529, 531, 528, 532, 523
US Average: 499, 502, 496, 497, 502, 489
More than one race: 513, 502, 503, 511, 503, 501
Hispanic: 471, 478, 470, 462, 464, 439
Other: 462, 439, 465, 453
Black: 445, 440, 433, 439, 435, 409
Reading
Race: 2022[79], 2018[75], 2015, 2012, 2009, 2006, 2003, 2000
Asian: 579, 556, 527, 550, 541, 513, 546
White: 537, 531, 526, 519, 525, 525, 538
US Average: 504, 505, 497, 498, 500, 495, 504
More than one race: 512, 501, 498, 517, 502, 515
Hispanic: 481, 481, 478, 478, 466, 453, 449
Black: 459, 448, 443, 443, 441, 430, 445
Other: 440, 438, 462, 456, 455

Research on possible causes of PISA disparities in different countries


Although PISA and TIMSS officials and researchers themselves generally refrain from hypothesizing about the large and stable differences in student achievement between countries, since 2000, literature on the differences in PISA and TIMSS results and their possible causes has emerged.[80] Data from PISA have furnished several researchers, notably Eric Hanushek, Ludger Wößmann, Heiner Rindermann, and Stephen J. Ceci, with material for books and articles about the relationship between student achievement and economic development,[81] democratization, and health;[82] as well as the roles of such single educational factors as high-stakes exams,[83] the presence or absence of private schools, and the effects and timing of ability tracking.[84]

These data should nevertheless be interpreted with care. PISA reports have attributed low educational achievement to school poverty or to a high number of immigrant students, but some studies have argued that this is a superficial and misleading reading of the data, and that educational achievement is largely determined by the actions implemented in schools.[85]

Critics and comments on accuracy


David Spiegelhalter of Cambridge wrote: "Pisa does present the uncertainty in the scores and ranks - for example the United Kingdom rank in the 65 countries is said to be between 23 and 31. It's unwise for countries to base education policy on their Pisa results, as Germany, Norway and Denmark did after doing badly in 2001."[86]

According to an opinion article in Forbes, an American media outlet, some countries, such as China, Hong Kong, Macau, and Argentina, select PISA samples from only their best-educated areas or their top-performing students, slanting the results.[87]

In an open letter to Andreas Schleicher, director of PISA, various academics and educators argued that "OECD and Pisa tests are damaging education worldwide".[88]

According to O Estado de São Paulo, Brazil shows a great disparity when the results are split between public and private schools: its public schools would rank worse than Peru, while its private schools would rank better than Finland.[89]

A 2023 book argues that PISA is failing in its mission, suggesting that flatlined student outcomes and policy shortcomings have much to do with PISA's implicit ideological biases, structural impediments such as union advocacy, and conflicts of interest.[90]


Explanatory notes

  1. ^ 40 countries participated at the time, and 81 countries and economies participated in the 2022 data collection.

References

  1. ^"About PISA".OECD PISA. Retrieved8 February 2018.
  2. ^Berger, Kathleen (3 March 2014).Invitation to The Life Span (second ed.). worth.ISBN 978-1-4641-7205-2.
  3. ^"PISA 2022 Results".OECD. December 2023.Archived from the original on 5 December 2023. Retrieved15 December 2023.
  4. ^abcdeRey, O (2010)."The use of external assessments and the impact on education systems".CIDREE Yearbook. Archived fromthe original on 3 February 2017. Retrieved22 November 2019.
  5. ^McGaw, B (2008). "The role of the OECD in international comparative studies of achievement".Assessment in Education: Principles, Policy & Practice.15 (3):223–243.doi:10.1080/09695940802417384.
  6. ^abKijima, Rie; Lipscy, Phillip Y. (1 January 2024)."The politics of international testing".The Review of International Organizations.19 (1):1–31.doi:10.1007/s11558-023-09494-4.ISSN 1559-744X.
  7. ^abRamirez, Francisco O.; Schofer, Evan; Meyer, John W. (2018)."International Tests, National Assessments, and Educational Development (1970–2012)".Comparative Education Review.62 (3):344–364.doi:10.1086/698326.ISSN 0010-4086.
  8. ^Kamens, David H.; McNeely, Connie L. (2010)."Globalization and the Growth of International Educational Testing and National Assessment".Comparative Education Review.54 (1):5–25.doi:10.1086/648471.ISSN 0010-4086.
  9. ^Mons, N (2008). "Évaluation des politiques éducatives et comparaisons internationales".Revue française de pédagogie (in French).164 (July–August-September 2008):5–13.doi:10.4000/rfp.1985.
  10. ^abcdefBreakspear, S. (2012)."The Policy Impact of PISA: An Exploration of the Normative Effects of International Benchmarking in School System Performance".OECD Education Working Paper. OECD Education Working Papers.71.doi:10.1787/5k9fdfqffr28-en.
  11. ^Barroso, J.; de Carvalho, L.M. (2008). "Pisa: Un instrument de régulation pour relier des mondes".Revue française de pédagogie (in French).164 (164):77–80.doi:10.4000/rfp.2133.
  12. ^Martens, Kerstin; Niemann, Dennis (2013)."When Do Numbers Count? The Differential Impact of the PISA Rating and Ranking on Education Policy in Germany and the US".German Politics.22 (3):314–332.doi:10.1080/09644008.2013.794455.ISSN 0964-4008.
  13. ^Ertl, H. (2006). "Educational standards and the changing discourse on education: the reception and consequences of the PISA study in Germany".Oxford Review of Education.32 (5):619–634.doi:10.1080/03054980600976320.S2CID 144656964.
  14. ^Bajomi, I.; Berényi, E.; Neumann, E.; Vida, J. (2009)."The Reception of PISA in Hungary' accessed January 2017"(PDF).Knowledge and Policy. Archived fromthe original(PDF) on 2 February 2017.
  15. ^Steiner-Khamsi (2003), cited byBreakspear, S. (2012)."The Policy Impact of PISA: An Exploration of the Normative Effects of International Benchmarking in School System Performance".OECD Education Working Paper. OECD Education Working Papers.71.doi:10.1787/5k9fdfqffr28-en.
  16. ^Mangez, Eric; Cattonar, Branka (September–December 2009)."The status of PISA in the relationship between civil society and the educational sector in French-speaking Belgium".Sísifo: Educational Sciences Journal (10). Educational Sciences R&D Unit of the University of Lisbon:15–26.ISSN 1646-6500. Retrieved26 December 2017.
  17. ^D., Greger (2008). "Lorsque PISA importe peu. Le cas de la République Tchèque et de l'Allemagne".Revue française de pédagogie (in French).164 (164):91–98.doi:10.4000/rfp.2138. cited inRey 2010, p. 145
  18. ^Afonso, Natércio; Costa, Estela (September–December 2009)."The influence of the Programme for International Student Assessment (PISA) on policy decision in Portugal: the education policies of the 17th Portuguese Constitutional Government"(PDF).Sísifo: Educational Sciences Journal (10). Educational Sciences R&D Unit of the University of Lisbon:53–64.ISSN 1646-6500. Retrieved26 December 2017.
  19. ^Rautalin, M.; Alasuutari (2009). "The uses of the national PISA results by Finnish officials in central government".Journal of Education Policy.24 (5):539–556.doi:10.1080/02680930903131267.S2CID 154584726.
  20. ^Egelund, N. (2008). "The value of international comparative studies of achievement – a Danish perspective".Assessment in Education: Principles, Policy & Practice.15 (3):245–251.doi:10.1080/09695940802417400.
  21. ^Behrens, M. (2006). "Préface". In Mons, N.; Pons, X. (eds.).Les standards en éducation dans le monde francophone (in French). Neuchâtel: IRDP. cited inRey 2010, p. 142
  22. ^Hefling, Kimberly (3 December 2013)."Asian nations dominate international test". Yahoo!. Archived fromthe original on 5 December 2013.
  23. ^"Chapter 2 of the publication 'PISA 2003 Assessment Framework'"(PDF). Pisa.oecd.org. Archived fromthe original(PDF) on 17 December 2005.
  24. ^Keeley, B. (April 2014)."PISA, we have a problem…".OECD Insights. Archived fromthe original on 17 June 2021.
  25. ^Poddiakov, Alexander."Complex Problem Solving at PISA 2012 and PISA 2015: Interaction with Complex Reality".SSRN. // Translated from Russian. Reference to the original Russian text:Poddiakov, A. (2012). "Reshenie kompleksnykh problem v PISA-2012 i PISA-2015: vzaimodeistvie so slozhnoi real'nost'yu".Obrazovatel'naya Politika (in Russian).6:34–53.
  26. ^C., Füller (5 December 2007)."Pisa hat einen kleinen, fröhlichen Bruder".taz (in German).
  27. ^Stanat, P; Artelt, C; Baumert, J; Klieme, E; Neubrand, M; Prenzel, M; Schiefele, U; Schneider, W (2002),PISA 2000: Overview of the study—Design, method and results, Berlin: Max Planck Institute for Human Development
  28. ^Mazzeo, John; von Davier, Matthias (2013),Linking Scales in International Large-Scale Assessments, chapter 10 in Rutkowski, L. von Davier, M. & Rutkowski, D. (eds.) Handbook of International Large-Scale Assessment: Background, Technical Issues, and Methods of Data Analysis., New York: Chapman and Hall/CRC.
  29. ^von Davier, Matthias; Sinharay, Sandip (2013),Analytics in International Large-Scale Assessments: Item Response Theory and Population Models, chapter 7 in Rutkowski, L. von Davier, M. & Rutkowski, D. (eds.) Handbook of International Large-Scale Assessment: Background, Technical Issues, and Methods of Data Analysis., New York: Chapman and Hall/CRC.
  30. ^abcd"Learning Data | QEdu Países".paises.qedu.org.br. Retrieved21 May 2024.
  31. ^"PISA 2022 Participants".OECD – PISA. Retrieved21 May 2024.
  32. ^OECD (2023).PISA 2022 Results (Volume I): The State of Learning and Equity in Education. Paris: Organisation for Economic Co-operation and Development.doi:10.1787/53f23881-en.ISBN 978-92-64-99796-7.
  33. ^abc"PISA 2022 Results". OECD. 2023.
  34. ^abc"PISA 2018 Results" (Document). OECD. 2019.doi:10.1787/888934028140.
  35. ^Jerrim, John (2021)."PISA 2018 in England, Northern Ireland, Scotland and Wales: Is the data really representative of all four corners of the UK?".Review of Education.9 (3).doi:10.1002/rev3.3270.ISSN 2049-6613.
  36. ^Baldi, Stéphane; Jin, Ying; Skemer, Melanie; Green, Patricia J; Herget, Deborah; Xie, Holly (10 December 2007),Highlights From PISA 2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in an International Context(PDF),NCES, retrieved14 December 2013,PISA 2006 reading literacy results are not reported for the United States because of an error in printing the test booklets. Furthermore, as a result of the printing error, the mean performance in mathematics and science may be misestimated by approximately 1 score point. The impact is below one standard error.
  37. ^PISA 2009 Results: Executive Summary(PDF),OECD, 7 December 2010, archived fromthe original(PDF) on 24 August 2012
  38. ^ACER releases results of PISA 2009+ participant economies,ACER, 16 December 2011, archived fromthe original on 14 December 2013
  39. ^Walker, Maurice (2011),PISA 2009 Plus Results(PDF),OECD, archived fromthe original(PDF) on 22 December 2011, retrieved28 June 2012
  40. ^PISA 2012 Results in Focus(PDF),OECD, 3 December 2013, archived fromthe original(PDF) on 3 December 2013, retrieved4 December 2013
  41. ^PISA 2015 Results(PDF),OECD, 2016, retrieved7 January 2025
  42. ^PISA 2018 Results(PDF),OECD, 2019, retrieved7 January 2025
  43. ^Phillips, Tom (3 December 2013)."OECD education report: Shanghai's formula is world-beating".The Telegraph. Retrieved8 December 2013.
  44. ^abCook, Chris (7 December 2010),"Shanghai tops global state school rankings",Financial Times, retrieved28 June 2012
  45. ^Mance, Henry (7 December 2010),"Why are Chinese schoolkids so good?",Financial Times, archived fromthe original on 8 December 2010, retrieved28 June 2012
  46. ^Coughlan, Sean (26 August 2014)."Pisa tests to include many more Chinese pupils".BBC News.
  47. ^Gao, Helen (23 January 2014)."Shanghai Test Scores and the Mystery of the Missing Children".New York Times. For Schleicher's initial response to these criticisms see his post,"Are the Chinese Cheating in PISA Or Are We Cheating Ourselves?".OECD's website blog. Education Today. 10 December 2013. Archived fromthe original on 17 February 2014.
  48. ^Stewart, William (6 March 2014)."More than a quarter of Shanghai pupils missed by international Pisa rankings".Times Educational Supplement. Archived fromthe original on 15 March 2014. Retrieved7 March 2014.
  49. ^"Education in China. A Snapshot"(PDF).OECD. 2016. Archived fromthe original(PDF) on 6 December 2016.
  50. ^Howse, Patrick (18 February 2014)."Shanghai visit for minister to learn maths lessons".BBC News. Retrieved19 July 2014.
  51. ^Coughlan, Sean (12 March 2014)."Shanghai teachers flown in for maths".BBC News. Retrieved11 August 2020.
  52. ^"Britain invites 120 Chinese Maths teachers for aided schools".India Today. 20 July 2016. Retrieved12 August 2020.
  53. ^"Scores bolster case for Shanghai math in British schools | The Star".www.thestar.com.my. 10 December 2019. Retrieved11 August 2020.
  54. ^Turner, Camilla (3 December 2019)."Britain jumps up international maths rankings following Chinese-style teaching".The Telegraph.ISSN 0307-1235. Retrieved11 August 2020.
  55. ^Starkey, Hannah (5 December 2019)."UK Boost International Maths Ranking After Adopting Chinese-Style Teaching".True Education Partnerships. Archived fromthe original on 3 August 2020. Retrieved11 August 2020.
  56. ^"PISA 2012: Proficiency of Finnish youth declining".University of Jyväskylä. Archived fromthe original on 13 December 2013. Retrieved9 December 2013.
  57. ^Hemali Chhapia, TNN (3 August 2012)."India backs out of global education test for 15-year-olds".The Times of India. Archived fromthe original on 29 April 2013.
  58. ^"PISA (Program for International Student Assessment): OECD".Drishti. 1 September 2021.
  59. ^"Poor PISA score: Govt blames 'disconnect' with India".The Indian Express. 3 September 2012.
  60. ^"India chickens out of international students assessment programme again".The Times of India. 1 June 2013.
  61. ^"PISA Tests: India to take part in global teen learning test in 2021".The Indian Express. 22 February 2017. Retrieved19 May 2018.
  62. ^"India opts out of PISA 2022: Prudence or Cowardice?".EducationWorld. 10 January 2024. Retrieved27 July 2024.
  63. ^"Ong: Did ministry try to rig results for Pisa 2015 report?". 8 December 2016.
  64. ^"Who's telling the truth about M'sia's Pisa 2015 scores?". 9 December 2016.
  65. ^"Malaysian PISA results under scrutiny for lack of evidence".School Advisor. 8 December 2016.
  66. ^Näslund, Lars (3 December 2013)."Svenska skolan rasar i stor jämförelse".Expressen (in Swedish). Retrieved4 December 2013.
  67. ^abKärrman, Jens (3 December 2013)."Löfven om Pisa: Nationell kris".Dagens Nyheter (in Swedish). Retrieved8 December 2013.
  68. ^"Sveriges PISA-framgång bygger på falska siffror" (in Swedish). 2 June 2020.
  69. ^abAdams, Richard (3 December 2013),"UK students stuck in educational doldrums, OECD study finds",The Guardian, retrieved4 December 2013
  70. ^"Pisa ranks Wales' education the worst in the UK".BBC. 3 December 2013. Retrieved4 December 2013.
  71. ^Evans-Pritchard, Ambrose (3 December 2013)."OECD educational report: Pisa fever is causing east Asia's demographic collapse".Telegraph.co.uk. Archived fromthe original on 3 December 2013. Retrieved4 December 2013.
  72. ^abcStewart, William (26 July 2013)."Is Pisa fundamentally flawed?".Times Educational Supplement. Archived fromthe original on 23 August 2013. Retrieved26 July 2013.
  73. ^Morrison, Hugh (2013)."A fundamental conundrum in psychology's standard model of measurement and its consequences for PISA global rankings"(PDF). Archived fromthe original(PDF) on 5 June 2013. Retrieved13 July 2017.
  74. ^"Average scores of U.S. 15-year-old students on the PISA mathematics literacy scale by race/ethnicity: 2022".
  75. ^abc"Highlights of U.S. PISA 2018 Results Web Report"(PDF).
  76. ^"Average scores among 15-year-olds on the Program for International Student Assessment (PISA) mathematics literacy assessment, by participating country and race/ethnicity in the United States: 2003".
  77. ^"Average scores of U.S. 15-year-old students on the PISA science literacy scale by race/ethnicity: 2022".
  78. ^"Average scores of U.S. 15-year-old students on the PISA science literacy scale, by race/ethnicity: 2015".
  79. ^"Average scores of U.S. 15-year-old students on the PISA reading literacy scale by race/ethnicity: 2022".
  80. ^Hanushek, Eric A.; Woessmann, Ludger (2011). "The economics of international differences in educational achievement". In Hanushek, Eric A.; Machin, Stephen; Woessmann, Ludger (eds.).Handbook of the Economics of Education. Vol. 3. Amsterdam: North Holland. pp. 89–200.
  81. ^Hanushek, Eric; Woessmann, Ludger (2008),"The role of cognitive skills in economic development"(PDF),Journal of Economic Literature,46 (3):607–668,doi:10.1257/jel.46.3.607
  82. ^Rindermann, Heiner; Ceci, Stephen J (2009), "Educational policy and country outcomes in international cognitive competence studies",Perspectives on Psychological Science,4 (6):551–577,doi:10.1111/j.1745-6924.2009.01165.x,PMID 26161733,S2CID 9251473
  83. ^Bishop, John H (1997). "The effect of national standards and curriculum-based exams on achievement".American Economic Review. Papers and Proceedings.87 (2):260–264.JSTOR 2950928.
  84. ^Hanushek, Eric; Woessmann, Ludger (2006),"Does educational tracking affect performance and inequality? Differences-in-differences evidence across countries"(PDF),Economic Journal,116 (510):C63 –C76,doi:10.1111/j.1468-0297.2006.01076.x
  85. ^Garcia Yeste, C., Morlà-Folch, T., Lopez de Aguileta, G., Natividad Sancho, L., Lopez de Aguileta, A., & Munté-Pascual, A. (2025). Achieving the same educational opportunities for all: overcoming hoax interpretations of the PISA results. International Journal of Adolescence and Youth, 30(1). https://doi.org/10.1080/02673843.2025.2459330
  86. ^Alexander, Ruth (10 December 2013)."How accurate is the Pisa test?".BBC News. Retrieved22 November 2019.
  87. ^Flows, Capital."Are The PISA Education Results Rigged?".Forbes. Retrieved22 November 2019.
  88. ^Guardian Staff (6 May 2014)."OECD and Pisa tests are damaging education worldwide – academics".The Guardian. Retrieved22 November 2019 – via www.theguardian.com.
  89. ^Cafardo, Rafael (4 December 2019)."Escolas privadas de elite do Brasil superam Finlândia no Pisa, rede pública vai pior do que o Peru". Retrieved4 December 2019 – via www.estadao.com.br.
  90. ^Montserrat Gomendio; José Ignacio Wert (2023).Dire Straits: Education Reforms, Ideology, Vested Interests and Evidence.doi:10.11647/OBP.0332.ISBN 978-1-80064-930-9.S2CID 256890161.
