Failing our children? Part 2

compiled & edited by Daniel Hagadorn

Even though we are failing every measurable national and international test, at least we are leading the world in self-esteem.

According to the National Center for Education Statistics (NCES), “the National Assessment of Educational Progress (NAEP) is the only nationally representative and continuing assessment of what America’s students know and can do in various subject areas. Assessments are conducted periodically in mathematics, reading, science, writing, the arts, civics, economics, geography, and U.S. history.”

“Since NAEP assessments are administered uniformly using the same sets of test booklets across the nation, NAEP results serve as a common metric for all states and selected urban districts. The assessment stays essentially the same from year to year, with only carefully documented changes. This permits NAEP to provide a clear picture of student academic progress over time.”

Although the real-world value of standardized test scores remains dubious at best, they serve as a consistent baseline from which to evaluate the results of public school education using the system’s own preferred method of measurement.

According to the Long-Term Trend Reading Assessment administered by the National Assessment of Educational Progress (NAEP): [1]

  • From 1971 to 2008, the average national NAEP Reading Assessment score for U.S. 4th-graders was 212 out of 500, or 42.5%, or an “F-”.
  • From 1971 to 2008, the average national NAEP Reading Assessment score for U.S. 8th-graders was 258 out of 500, or 51.6%, or an “F”.
  • From 1984 to 2008, the average national NAEP Reading Assessment score for U.S. 12th-graders was 270 out of 500, or 53.9%, or an “F”.

According to the Long-Term Trend Mathematics Assessment administered by the National Assessment of Educational Progress (NAEP): [2]

  • From 1978 to 2008, the average national NAEP Mathematics Assessment score for U.S. 4th-graders was 231 out of 500, or 46.1%, or an “F-”.
  • From 1978 to 2008, the average national NAEP Mathematics Assessment score for U.S. 8th-graders was 274 out of 500, or 54.7%, or an “F”.
  • From 1973 to 2008, the average national NAEP Mathematics Assessment score for U.S. 12th-graders was 287 out of 500, or 57.4%, or an “F+”.

According to the Science Assessment administered by the National Assessment of Educational Progress (NAEP): [3]

  • From 1996 to 2005, the average national NAEP Science Assessment score for U.S. 4th-graders was 148 out of 300, or 49.3%, or an “F-”.
  • From 1996 to 2005, the average national NAEP Science Assessment score for U.S. 8th-graders was 149 out of 300, or 49.6%, or an “F-”.
  • From 1996 to 2005, the average national NAEP Science Assessment score for U.S. 12th-graders was 148 out of 300, or 49.3%, or an “F-”.

According to the U.S. History Assessment administered by the National Assessment of Educational Progress (NAEP): [4]

  • From 1994 to 2006, the average national NAEP U.S. History Assessment score for U.S. 4th-graders was 208 out of 500, or 41.6%, or an “F-”.
  • From 1994 to 2006, the average national NAEP U.S. History Assessment score for U.S. 8th-graders was 261 out of 500, or 52.2%, or an “F”.
  • From 1994 to 2006, the average national NAEP U.S. History Assessment score for U.S. 12th-graders was 288 out of 500, or 57.6%, or an “F+”.

According to the Civics Assessment administered by the National Assessment of Educational Progress (NAEP): [5]

  • From 1998 to 2006, the average national NAEP Civics Assessment score for U.S. 4th-graders was 152 out of 300, or 50.6%, or an “F-”.
  • From 1998 to 2006, the average national NAEP Civics Assessment score for U.S. 8th-graders was 150 out of 300, or 50.0%, or an “F-”.
  • From 1998 to 2006, the average national NAEP Civics Assessment score for U.S. 12th-graders was 151 out of 300, or 50.3%, or an “F-”.

According to the Economics Assessment administered by the National Assessment of Educational Progress (NAEP): [6]

  • In 2006, the average national NAEP Economics Assessment score for U.S. 12th-graders was 150 out of 300, or 50%, or an “F”.

In the subjects of reading, mathematics, U.S. history, science, civics, and economics, U.S. 4th-, 8th-, and 12th-graders are failing… miserably. Moreover, this failure is prohibitively expensive: it drains our national economy and wastes our students’ time.
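
For readers who want to check the arithmetic, the letter grades above follow from a simple conversion: the average scale score divided by the scale maximum, expressed as a percentage and mapped onto a conventional grading scale. The sketch below is a minimal illustration of that conversion, not the article’s exact method; the finer “F-”/“F”/“F+” bands used above are not defined in the source, so only the coarse letter is computed, and the 90/80/70/60 cutoffs are assumptions.

    # A minimal sketch (not the article's exact method) of the score-to-grade
    # conversion used in the lists above: average scale score -> percentage of
    # the scale maximum -> letter grade on an assumed 90/80/70/60 scale.
    # The article's finer F-/F/F+ bands are not defined in the source.

    def score_to_percent(score: float, scale_max: float) -> float:
        """Express an average scale score as a percentage of the scale maximum."""
        return 100.0 * score / scale_max

    def percent_to_letter(pct: float) -> str:
        """Map a percentage onto an assumed conventional ten-point letter scale."""
        for cutoff, letter in ((90, "A"), (80, "B"), (70, "C"), (60, "D")):
            if pct >= cutoff:
                return letter
        return "F"

    if __name__ == "__main__":
        # 8th-grade long-term-trend reading average, 1971-2008 (from the list above).
        pct = score_to_percent(258, 500)
        print(f"{pct:.1f}% -> {percent_to_letter(pct)}")  # 51.6% -> F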

Okay, sure, the U.S. scores are dismal, but that is only because our domestic academic standards are so rigorous. Certainly we compare more favorably with the rest of the world. Right?

Internationally, U.S. literacy scores in reading, mathematics, and science are, to put it charitably, dismal. Since the National Center for Education Statistics (NCES), which administers NAEP, also coordinates U.S. participation in the following highly regarded international assessments, the results are especially revealing.

  • Program for International Student Assessment (PISA), sponsored by the Organization for Economic Cooperation and Development (OECD) and first conducted in 2000, assesses the reading, mathematics, and science literacy of 15-year-olds every three years.
  • Progress in International Reading Literacy Study (PIRLS), sponsored by the International Association for the Evaluation of Educational Achievement (IEA) and first conducted in 2001, assesses the reading literacy of 4th-graders every five years.
  • Trends in International Mathematics and Science Study (TIMSS), sponsored by the International Association for the Evaluation of Educational Achievement (IEA) and first conducted in 1995, assesses the mathematics and science literacy of both 4th- and 8th-graders every four years.

READING LITERACY

According to the 2000, 2003, and 2006 Program for International Student Assessment (PISA)… [7] [8]

  • PISA (2000) reported the average U.S. 15-year-old’s reading literacy score of 504 out of 1000 was HIGHER than the international average of 500, but LOWER than their peers in 14 of the 43 participating countries: Finland (547), Canada (534), New Zealand (529), Australia (528), Ireland (527), Republic of Korea (525), United Kingdom (523), Japan (522), Sweden (516), Austria (507), Belgium (507), Iceland (507), France (505), and Norway (505).
  • PISA (2003) reported the average U.S. 15-year-old’s reading literacy score of 495 out of 1000 was HIGHER than the international average of 494, but LOWER than their peers in 18 of the 41 participating countries: Finland (544), Republic of Korea (534), Canada (528), Australia (525), Liechtenstein (525), New Zealand (522), Ireland (516), Sweden (514), Netherlands (513), Hong Kong-China (510), Belgium (508), United Kingdom (507), Norway (500), Switzerland (499), Japan (498), Macao-China (498), Poland (497), and France (496).
  • PISA (2006) [the U.S. did not participate] reported the [projected] average U.S. 15-year-old’s reading literacy score of 486 out of 1000 was LOWER than the international average of 492 and LOWER than their peers in 23 of the 57 participating countries: Republic of Korea (556), Finland (547), Hong Kong-China (536), Canada (527), New Zealand (521), Ireland (517), Australia (513), Liechtenstein (510), Poland (508), Netherlands (507), Sweden (507), Estonia (501), Belgium (501), Switzerland (499), Japan (498), Taipei-China (496), Germany (495), United Kingdom (495), Denmark (494), Slovenia (494), Macao-China (492), Austria (490), and France (488).
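
Each bullet above boils down to a simple tally: count how many participating jurisdictions posted a higher average than the United States. Below is a minimal sketch of that tally, using only the PISA 2000 reading averages quoted in the first bullet (not the complete results table); the dictionary name is mine.

    # A minimal sketch of the tally behind the bullets above: count the
    # jurisdictions whose average exceeded the U.S. score. The table holds
    # only the PISA 2000 reading averages quoted in the first bullet, not
    # the full set of 43 participants.

    pisa_2000_reading = {
        "Finland": 547, "Canada": 534, "New Zealand": 529, "Australia": 528,
        "Ireland": 527, "Republic of Korea": 525, "United Kingdom": 523,
        "Japan": 522, "Sweden": 516, "Austria": 507, "Belgium": 507,
        "Iceland": 507, "France": 505, "Norway": 505, "United States": 504,
    }

    us_score = pisa_2000_reading["United States"]
    ahead = [country for country, score in pisa_2000_reading.items() if score > us_score]
    print(f"{len(ahead)} of the listed jurisdictions outscored the U.S. ({us_score})")
    # -> 14 of the listed jurisdictions outscored the U.S. (504)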

According to the 2001 and 2006 Progress in International Reading Literacy Study (PIRLS)… [9] [10]

  • PIRLS (2001) reported the average U.S. 4th-grader’s reading literacy score of 542 out of 1000 was HIGHER than the international average of 500, but LOWER than their peers in 8 of the 35 participating countries: Sweden (561), Netherlands (554), England (553), Bulgaria (550), Ontario-Canada (548), Latvia (545), Hungary (543), and Lithuania (543).
  • PIRLS (2006) reported the average U.S. 4th-grader’s reading literacy score of 540 out of 1000 was HIGHER than the international average of 500, but LOWER than their peers in 17 of the 45 participating countries: Russian Federation (565), Hong Kong-China (564), Alberta-Canada (560), British Columbia-Canada (558), Singapore (558), Luxembourg (557), Ontario-Canada (555), Hungary (551), Italy (551), Sweden (549), Germany (548), Belgium (547), Bulgaria (547), Netherlands (547), Denmark (546), Nova Scotia-Canada (542), and Latvia (541).

MATHEMATICS LITERACY

According to the 2003 and 2006 Program for International Student Assessment (PISA)… [11] [12]

  • PISA (2003) reported the average U.S. 15-year-old’s mathematics literacy score of 483 out of 1000 was LOWER than the international average of 500 and LOWER than their peers in 26 of the 41 participating countries: Hong Kong-China (550), Finland (544), Republic of Korea (542), Netherlands (538), Liechtenstein (536), Japan (534), Canada (533), Belgium (529), Macao-China (527), Switzerland (527), New Zealand (524), Australia (524), Czech Republic (517), Iceland (515), Denmark (514), France (511), Sweden (509), United Kingdom (508), Austria (506), Ireland (503), Germany (503), Norway (495), Luxembourg (493), Hungary (490), Poland (490), and Spain (485).
  • PISA (2006) reported the average U.S. 15-year-old’s mathematics literacy score of 474 out of 1000 was LOWER than the international average of 498 and LOWER than their peers in 34 of the 57 participating countries: Taipei-China (549), Finland (548), Hong Kong-China (547), Republic of Korea (547), Netherlands (531), Switzerland (530), Canada (527), Macao-China (525), Liechtenstein (525), Japan (523), New Zealand (522), Belgium (520), Australia (520), Estonia (515), Denmark (513), Czech Republic (510), Iceland (506), Austria (505), Germany (504), Slovenia (504), Sweden (502), Ireland (501), France (496), United Kingdom (495), Poland (495), Slovak Republic (492), Hungary (491), Luxembourg (490), Norway (490), Lithuania (486), Latvia (486), Spain (480), Azerbaijan (476), and the Russian Federation (476).

According to the 2003 and 2007 Trends in International Mathematics and Science Study (TIMSS)… [13] [14]

  • TIMSS (2003) reported the average U.S. 4th-grader’s mathematics literacy score of 518 out of 1000 was HIGHER than the international average of 495, but LOWER than their peers in 11 of the 25 participating countries: Singapore (594), Hong Kong-China (575), Japan (565), Taipei-China (564), Belgium (551), Netherlands (540), Latvia (536), Lithuania (534), Russian Federation (532), England (531), and Hungary (529).
  • TIMSS (2007) reported the average U.S. 4th-grader’s mathematics literacy score of 529 out of 1000 was HIGHER than the international average of 500, but LOWER than their peers in 10 of the 41 participating countries: Hong Kong-China (607), Singapore (599), Taipei-China (576), Japan (568), Kazakhstan (549), Russian Federation (544), England (541), Latvia (537), Netherlands (535), and Lithuania (530).
  • TIMSS (2003) reported the average U.S. 8th-grader’s mathematics literacy score of 504 out of 1000 was HIGHER than the international average of 466, but LOWER than their peers in 14 of the 45 participating countries: Singapore (605), Republic of Korea (589), Hong Kong-China (586), Taipei-China (585), Japan (570), Belgium (537), Netherlands (536), Estonia (531), Hungary (529), Malaysia (508), Latvia (508), Russian Federation (508), Slovak Republic (508), and Australia (505).
  • TIMSS (2007) reported the average U.S. 8th-grader’s mathematics literacy score of 508 out of 1000 was HIGHER than the international average of 500, but LOWER than their peers in 11 of the 54 participating countries: Taipei-China (598), Republic of Korea (597), Singapore (593), Hong Kong-China (572), Japan (570), Quebec-Canada (528), Hungary (517), Ontario-Canada (517), England (513), Russian Federation (512), and British Columbia-Canada (509).

SCIENCE LITERACY

According to the 2000, 2003, and 2006 Program for International Student Assessment (PISA)… [15] [16] [17]

  • PISA (2000) reported the average U.S. 15-year-old student’s science literacy score of 500 out of 1000 MATCHED the international average of 500, but was LOWER than their peers in 13 of the 30 participating countries: Republic of Korea (552), Japan (550), Finland (538), United Kingdom (532), Canada (529), Australia (528), New Zealand (528), Austria (519), Ireland (513), Sweden (512), Czech Republic (511), France (501), and Norway (500).
  • PISA (2003) reported the average U.S. 15-year-old student’s science literacy score of 491 out of 1000 was LOWER than the international average of 500 and LOWER than their peers in 23 of the 41 participating countries: Finland (548), Japan (548), Hong Kong-China (540), Republic of Korea (538), Australia (525), Liechtenstein (525), Macao-China (525), Netherlands (524), Czech Republic (523), New Zealand (521), Canada (519), United Kingdom (518), Switzerland (513), France (511), Belgium (509), Sweden (506), Ireland (505), Hungary (503), Germany (502), Poland (499), Iceland (495), the Slovak Republic (495), and Austria (491).
  • PISA (2006) reported the average U.S. 15-year-old students’ science literacy score of 489 out of 1000 was LOWER than the international average of 500 and LOWER than their peers in 22 of the 57 participating countries: Finland (563), Hong Kong-China (542), Canada (534), Taipei-China (532), Japan (531), Estonia (531), New Zealand (530), Australia (527), Netherlands (525), Republic of Korea (522), Liechtenstein (522), Slovenia (519), Germany (516), United Kingdom (515), Czech Republic (513), Switzerland (512), Austria (511), Macao-China (511), Belgium (510), Ireland (508), Hungary (504), and Sweden (503).

According to the 2003 and 2007 Trends in International Mathematics and Science Study (TIMSS)… [18]

  • TIMSS (2003) reported the average U.S. 4th-grader’s science literacy score of 536 out of 1000 was HIGHER than the international average of 489, but LOWER than their peers in 5 of the 25 participating countries: Singapore (565), Taipei-China (551), Japan (543), Hong Kong-China (542), and England (540).
  • TIMSS (2007) reported the average U.S. 4th-grader’s science literacy score of 539 out of 1000 was HIGHER than the international average of 500, but LOWER than their peers in 8 of the 42 participating countries: Singapore (587), Taipei-China (557), Hong Kong-China (554), Japan (548), Russian Federation (546), Alberta-Canada (543), Latvia (542), and England (542).
  • TIMSS (2003) reported the average U.S. 8th-grader’s science literacy score of 527 out of 1000 was HIGHER than the international average of 473, but LOWER than their peers in 8 of the 45 participating countries: Singapore (578), Taipei-China (571), Republic of Korea (558), Hong Kong-China (556), Estonia (552), Japan (552), Hungary (543), and Netherlands (536).
  • TIMSS (2007) reported the average U.S. 8th-grader’s science literacy score of 520 out of 1000 was HIGHER than the international average of 500, but LOWER than their peers in 12 of the 54 participating countries: Singapore (567), Taipei-China (561), Japan (554), Republic of Korea (553), England (542), Hungary (539), Czech Republic (539), Slovenia (538), Hong Kong-China (530), Russian Federation (530), Ontario-Canada (526), and British Columbia-Canada (526).

Self-appointed education “experts” who stubbornly cling to the myth of public school greatness will clearly have to rely on evidence other than standardized test scores, which objectively demonstrate their gross ineptitude. The failure of these educrats becomes all the more difficult to tolerate when considering the billions of taxpayer dollars they have “invested” in the public school system.

Aside from the abysmal test scores—both domestic and international—high school seniors looking ahead to college are less prepared than ever, which does not bode well for an already bleak academic future.

The SAT (formerly the Scholastic Aptitude Test and later the Scholastic Assessment Test) is a standardized college admissions test in the United States, owned and published by the College Board and developed and administered on its behalf by the Educational Testing Service (ETS). According to the College Board’s website, “The SAT is not designed as an indicator of student achievement, but rather as an aid for predicting how well students will do in college.” NOTE: The score scale was “re-centered” in 1995, and the test itself was redesigned in 2005, to achieve greater “academic accessibility” [read: dumbed down].

According to the SAT exam administered by the Educational Testing Service (ETS): [19]

  • From 1990-1991 to 2008-2009, SAT-Critical Reading scores averaged 463 out of 800, or 57.8%, or an “F+”.
  • From 1990-1991 to 2008-2009, SAT-Mathematics scores averaged 514 out of 800, or 64.2%, or a “D”.

According to “What Professors and Teachers Think” (2006), a survey conducted by the Chronicle of Higher Education [20], high school teachers consistently assess their graduating seniors FAR more favorably than college professors assess those same students as incoming freshmen. Researchers polled 746 high school teachers and 1,098 college professors specifically about the college readiness of their students. The results were appalling, to say the least.

In response to the question, “How well prepared are your students for college-level work?”

  • 36% of high school teachers described their students as “very well prepared.”
  • Only 15% of college professors described their students as “very well prepared.”
  • 24% of college professors described their students as “poorly prepared.”
  • Only 12% of high school teachers described their students as “poorly prepared.”

The responses also varied significantly according to subject area…

In mathematics…

  • 37% of high school mathematics teachers described their students as “very well prepared.”
  • Only 4% of college mathematics professors described their students as “very well prepared.”

In science…

  • 38% of high school science teachers described their students as “very well prepared.”
  • Only 5% of college science professors described their students as “very well prepared.”

In writing…

  • Only 10% of high school teachers described their students as “not well prepared.”
  • 44% of college professors described their students as “not well prepared.”

Overall…

  • 65% of high school teachers described their students as “unprepared” or “somewhat prepared” for college.
  • 84% of college professors described their students as “unprepared” or “somewhat prepared” for college.

Given the comprehensiveness of our national failure in education, the political indifference it attracts is as baffling as it is infuriating. Until you follow the money, that is. Considering the incestuous relationship between politicians and special interests, it is unlikely that meaningful change will engender more than token congressional enthusiasm and support. [21] [22]

  • From 1990 to 2010, the education sector—which includes teachers, professors, and administrators at primary schools, high schools, colleges, graduate schools, and vocational and technical institutes, but excludes teachers unions—contributed $179,302,595 to politicians and/or political campaigns.
  • From 1990 to 2010, the 3.2 million-member National Education Association (NEA) contributed $30,097,067 to politicians and/or political campaigns.
  • From 1990 to 2010, the 856,000-member American Federation of Teachers (AFT) contributed $26,282,491 to politicians and/or political campaigns.
  • Put another way, from 1990 to 2010, these educational special interest groups “invested” a combined $235,682,153 in politicians and/or political campaigns (a total tallied in the sketch below).
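
The combined figure in the last bullet is simply the sum of the three amounts listed above. A quick sketch of that tally, using the Center for Responsive Politics figures as quoted (the dictionary name and labels are mine):

    # A quick check of the combined total: the three 1990-2010 contribution
    # figures quoted above, summed.

    contributions = {
        "Education sector (excluding teachers unions)": 179_302_595,
        "National Education Association (NEA)": 30_097_067,
        "American Federation of Teachers (AFT)": 26_282_491,
    }

    total = sum(contributions.values())
    print(f"Combined 1990-2010 contributions: ${total:,}")  # $235,682,153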

The collective generosity of these particular special interest groups has managed to purchase a seemingly impervious congressional shield against overwhelming evidence of pedagogical incompetence.

See “Failing our children”, Part 3.


[1] U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), various years, 1971-2008 Long-Term Trend Reading Assessments. http://nationsreportcard.gov/ltt_2008/ltt0003.asp.

[2] U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), various years, 1973-2008 Long-Term Trend Mathematics Assessments. http://nationsreportcard.gov/ltt_2008/ltt0002.asp.

[3] U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1996, 2000, and 2005 Science Assessments.

[4] U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1994, 2001, and 2006 U.S. History Assessments.

[5] U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2006 Civics Assessments.

[6] U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2006 Economics Assessment.

[7] Organization for Economic Cooperation and Development (OECD), Program for International Student Assessment (PISA), 2000 and 2003.

[8] Organization for Economic Cooperation and Development (OECD), Program for International Student Assessment (PISA), 2006.

[9] International Association for the Evaluation of Educational Achievement (IEA), Progress in International Reading Literacy Study (PIRLS), 2001 and 2006.

[10] J. R. Campbell, D. L. Kelly, I. V. S. Mullis, M. O. Martin & M. Sainsbury, “Framework and Specifications for PIRLS Assessment 2001,” 2nd ed. (Chestnut Hill, MA: TIMSS and PIRLS International Study Center, Lynch School of Education, Boston College, 2001).

[11] S. Baldi, Y. Jin, M. Skemer, P. J. Green & D. Herget, “Highlights From PISA 2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in an International Context,” (NCES 2008–016), National Center for Education Statistics, Institute of Education Sciences (Washington, DC: U.S. Department of Education, 2007).

[12] M. Lemke, A. Sen, E. Pahlke, L. Partelow, D. Miller, T. Williams, D. Kastberg & L. Jocelyn, “International Outcomes of Learning in Mathematics Literacy and Problem Solving: PISA 2003 Results From the U.S. Perspective,” (NCES 2005-003R), National Center for Education Statistics, (Washington, DC: U.S. Department of Education, 2004).

[13] I. V. S. Mullis, M. O. Martin, P. Foy, J. F. Olson, C. Preuschoff, E. Erberber, A. Arora, J. Galia, “TIMSS 2007 International Mathematics Report: Findings from IEA’s Trends in International Mathematics and Science Study at the Fourth and Eighth Grades”, (Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College, 2008). http://timss.bc.edu/TIMSS2007/PDF/T07_M_IR_Chapter1.pdf.

[14] International Association for the Evaluation of Educational Achievement (IEA), Trends in International Mathematics and Science Study (TIMSS), 2003. http://nces.ed.gov/timss/timss03tables.asp?Quest=1&Figure=1.

[15] Organization for Economic Cooperation and Development (OECD), Program for International Student Assessment (PISA), 2000 and 2003.

[16] Organization for Economic Cooperation and Development (OECD), Program for International Student Assessment (PISA), 2006.

[17] S. Baldi, Y. Jin, M. Skemer, P. J. Green & D. Herget, “Highlights From PISA 2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in an International Context,” (NCES 2008–016), National Center for Education Statistics, Institute of Education Sciences (Washington, DC: U.S. Department of Education, 2007).

[18] M. O. Martin, I. V. S. Mullis, P. Foy, J. F. Olson, E. Erberber, C. Preuschoff & J. Galia, “TIMSS 2007 International Science Report: Findings from IEA’s Trends in International Mathematics and Science Study at the Fourth and Eighth Grades”, (Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College, 2008). http://timss.bc.edu/TIMSS2007/PDF/T07_S_IR_Chapter1.pdf.

[19] U.S. Department of Education, National Center for Education Statistics, Digest of Education Statistics, 2009 (NCES 2010-013), Chapter 2, 2010. http://nces.ed.gov/fastfacts/display.asp?id=171.

[20] Alvin P. Sanoff, “A Perception Gap Over Students’ Preparation”, Chronicle of Higher Education (10 March 2006).

[21] Center for Responsive Politics. http://www.opensecrets.org/industries/indus.php?ind=L1300. Accessed 10 January 2010.

[22] The membership of the NEA and the AFT makes them the largest and second-largest education unions in the United States, respectively. Additionally, the NEA is one of Education International’s (EI-IE) 401 member organizations in 172 countries and territories, representing over 30 million education personnel from pre-school to university.
