TIMSS 1999

Media Contact:
Patricia Delaney, Director of Media Relations
Boston College
617-552-3352

TIMSS Project Contact:
Michael O. Martin
Ina V.S. Mullis
Co-Directors, TIMSS International Study
617-552-1600

MEDIA NOTE: The full TIMSS 1999 reports are available on-line at the International Study Center's web site on the Publications page or by calling 617-552-1600. To arrange interviews with the TIMSS International Study Co-Directors Michael O. Martin or Ina V.S. Mullis, or to obtain camera-ready color charts, please call the Boston College Office of Public Affairs at 617-552-3352.

Questions and Answers

What Is TIMSS 1999?

TIMSS 1999, also known as TIMSS-Repeat or TIMSS-R, measures progress in eighth-grade mathematics and science achievement around the world. Sponsored by the International Association for the Evaluation of Educational Achievement (IEA), TIMSS 1999 assessed the mathematics and science performance of more than 180,000 eighth-grade students in over 6,000 schools in 38 countries in 34 languages.

TIMSS 1999 builds on the success of the original TIMSS conducted in 1995, which was the largest and most comprehensive international study of mathematics and science achievement ever undertaken. TIMSS 1995 compared the mathematics and science achievement of students in 41 countries at five grade levels. TIMSS 1999 is a replication of TIMSS 1995 at the eighth grade.

As in the 1995 study, TIMSS 1999 also investigated the contexts for learning mathematics and science in the participating countries through background questionnaires completed by students, teachers, school principals, and national representatives. Information was collected about educational systems, curriculum, instructional practices, and characteristics of students, teachers, and schools, providing a rich source of insight into the teaching and learning of mathematics and science. The 1995 results have stirred debate, spurred reform efforts, and provided important information to decision makers, researchers, and practitioners the world over. The 1999 results are likely to have a similar or greater impact.

Why an International Trend Study?

To meet the challenge of preparing children around the world for a technologically oriented 21st century, policy makers and educators need information about students' understanding of mathematics and science to improve learning and instruction. International comparisons of student achievement and related factors allow policy makers, educators, and researchers to view the performance of their respective educational systems in relation to other nations. This can provide a powerful base for better understanding and advancing the teaching and learning of mathematics and science.

TIMSS 1999 was the second phase of a long-term study designed to measure progress in mathematics and science achievement, with the next assessment planned for 2003 at grades 4 and 8. The study design allows countries that participated in TIMSS 1995 at the eighth grade to compare the performance of eighth-graders in 1995 with the performance of eighth-graders in 1999. It also allows countries that participated in 1995 at the fourth grade to compare the performance of fourth-graders in that year with their performance as eighth-graders four years later. Such a study can inform policy and practice by investigating how changes in curriculum, teacher preparation, instructional practices, school environment, and home and school resources affect achievement.

Which Countries and Students Participated?

Thirty-eight countries participated in TIMSS 1999. Of these, 26 also participated in TIMSS 1995 at the eighth grade, and 17 participated in 1995 at the fourth grade.

The target population for TIMSS 1999 was defined as "the upper of the two adjacent grades with the most 13-year-olds," which in most countries was the eighth grade. In each country, approximately 3,500 students were assessed in about 150 schools, both public and private. A two-stage sampling procedure was used to ensure a nationally representative sample of students from each country. In the first stage, schools were randomly selected; in the second stage, one or two mathematics classrooms within each school were randomly selected.
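
As a rough illustration of the two-stage design, the hypothetical Python sketch below first draws a random sample of schools and then draws one or two intact classrooms within each selected school. The school frame, class counts, and random seed are invented for illustration only; the operational TIMSS sampling used more elaborate probability-based selection than the simple random draws shown here.

    import random

    random.seed(42)  # illustrative seed only

    # Hypothetical national frame: 2,000 schools, each with a few eighth-grade
    # mathematics classes (counts are invented for this sketch)
    schools = {f"school_{i}": [f"class_{i}_{j}" for j in range(random.randint(2, 6))]
               for i in range(2000)}

    # Stage 1: randomly select about 150 schools
    sampled_schools = random.sample(list(schools), 150)

    # Stage 2: within each sampled school, randomly select one or two classrooms;
    # every student in a selected classroom is assessed
    sampled_classes = []
    for school in sampled_schools:
        k = min(len(schools[school]), random.choice([1, 2]))
        sampled_classes.extend(random.sample(schools[school], k))

    print(f"{len(sampled_schools)} schools, {len(sampled_classes)} classrooms sampled")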

What Was the Nature of the Test?

The mathematics and science tests were based on the TIMSS curriculum frameworks, which were developed by groups of educators with input from the TIMSS National Research Coordinators (NRCs). The curriculum frameworks contain three dimensions or aspects. The content aspect represents the subject matter content. The performance expectations aspect describes the many kinds of performances or behaviors that might be expected of students. The perspectives aspect focuses on the development of students' attitudes, interest, and motivation.

The tests were developed through a consensus process by international experts in mathematics, science, and educational measurement, and were endorsed by all participating countries. Working within the frameworks, test specifications were developed that included items representing a wide range of mathematics and science topics and eliciting a range of skills from the students. The mathematics test covered five content areas: fractions and number sense; measurement; data representation, analysis, and probability; geometry; and algebra. The science test covered six content areas: earth science; life science; physics; chemistry; environmental and resource issues; and scientific inquiry and the nature of science.

The tests included both multiple-choice questions, which made up about three-fourths of the items, and open-response items requiring students to solve problems and explain their answers. To achieve broad content coverage, a matrix sampling technique was used in which the 308 test items (162 mathematics and 146 science) were systematically distributed across eight test booklets, and the booklets were randomly distributed to students. Each student in the sampled classrooms responded to one test booklet containing about 80 mathematics and science questions and requiring 90 minutes to complete. The items used in 1995 that were released for public use were replaced with items of similar content, format, and difficulty that were piloted in an extensive field test. About half the items used in 1999 will be released for public use.
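
The matrix-sampling idea can be sketched with a short, hypothetical Python example. The cluster rotation below is a simplification invented for illustration rather than the actual TIMSS booklet design: it places each cluster of items into two of the eight booklets, so every booklet carries roughly 80 items and every item is answered by a subsample of students, while no student faces the full 308-item pool.

    import random

    random.seed(0)  # illustrative only

    # 162 mathematics and 146 science items, 308 in total (figures from the text)
    items = [f"math_{i}" for i in range(162)] + [f"sci_{i}" for i in range(146)]
    random.shuffle(items)

    # Group the pool into 16 clusters and place each cluster in two of the eight
    # booklets, so each booklet holds roughly 308 * 2 / 8, or about 77, items.
    clusters = [items[i::16] for i in range(16)]
    booklets = {b: [] for b in range(8)}
    for c, cluster in enumerate(clusters):
        booklets[c % 8].extend(cluster)
        booklets[(c + 1) % 8].extend(cluster)

    # Each sampled student is randomly assigned one booklet of about 80 questions
    student_booklet = {f"student_{i}": random.randrange(8) for i in range(3500)}

    print(sorted(len(v) for v in booklets.values()))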

What Is the Comparability of the Results?

Procedures used throughout the study ensure that the results are comparable across countries. To ensure comparability in testing, rigorous procedures were developed for translating the tests into each language, and numerous training sessions were held for data collection and scoring activities. Quality control monitors observed testing sessions and reported back to the International Study Center at Boston College. The samples of students selected for testing were scrutinized according to rigorous standards designed to prevent bias and ensure comparability. Prior to analysis, the data from each country were subjected to exhaustive checks for accuracy and consistency.

To measure trends from 1995 to 1999, both tests were based on the framework developed for the original TIMSS in 1995. About one-third of the items in 1999 were identical to those used in 1995, and the rest were quite similar in content, format, and difficulty. Furthermore, the two assessments used the same sampling definitions and procedures, the same test administration procedures monitored by quality control observers, and the same data analysis and scaling methods.

How Are the Results Reported?

The results for the 38 countries that participated in TIMSS 1999 are presented in two companion reports, one for mathematics and one for science. The reports contain international rankings and country-by-country comparisons of mathematics and science achievement overall and for each content area; comparisons of country performance against international benchmarks; and gender differences in performance. Trend data are reported for the 26 countries that also participated in TIMSS 1995. The achievement data are accompanied by extensive questionnaire data about the home, classroom, school, and national contexts within which mathematics and science learning take place.

In addition to reporting achievement as scale scores, the reports also describe student performance in terms of international benchmarks. To describe what performance on the achievement scale means in terms of the mathematics and science that students know and can do, TIMSS identified four points on the scale (Top 10%, Upper Quarter, Median, and Lower Quarter, corresponding to the 90th, 75th, 50th, and 25th percentiles) for use as international benchmarks, and conducted an ambitious scale-anchoring exercise to describe performance at these benchmarks. These descriptions are accompanied by example test items illustrating student performance at each benchmark. The percentage of students in each country that reached each benchmark is reported, as well as the change between 1995 and 1999.
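
The arithmetic behind the benchmark reporting can be illustrated with simulated data. In the hypothetical Python sketch below, the cut scores are taken as fixed percentiles of a notional combined international distribution, and the share of one country's students reaching each cut is then computed; all means, spreads, and sample sizes are invented and do not reflect actual TIMSS scale scores or results.

    import numpy as np

    rng = np.random.default_rng(1)  # simulated scores, not TIMSS data

    # Notional combined international distribution and one country's sample
    international = rng.normal(loc=500, scale=100, size=100_000)
    country = rng.normal(loc=520, scale=95, size=3_500)

    # Benchmarks are fixed percentiles of the international distribution
    benchmarks = {
        "Top 10% (90th percentile)": np.percentile(international, 90),
        "Upper Quarter (75th)": np.percentile(international, 75),
        "Median (50th)": np.percentile(international, 50),
        "Lower Quarter (25th)": np.percentile(international, 25),
    }

    # Percentage of the country's students reaching each benchmark
    for name, cut in benchmarks.items():
        pct = 100 * np.mean(country >= cut)
        print(f"{name}: cut score {cut:.0f}, {pct:.1f}% of students at or above")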

What Publications and Resources Are Available?

Released December 5, 2000:

  • TIMSS 1999 International Mathematics Report
  • TIMSS 1999 International Science Report
  • TIMSS 1999 Technical Report

For future release:

  • TIMSS 1999 Mathematics Released Item Set
  • TIMSS 1999 Science Released Item Set
  • TIMSS 1999 International Database and User Guide

Who Conducted TIMSS 1999?

TIMSS 1999 was conducted by the International Association for the Evaluation of Educational Achievement (IEA). With a permanent secretariat based in Amsterdam, the Netherlands, the IEA is an independent international cooperative of national research institutions and governmental research agencies. Its primary purpose is to conduct large-scale comparative studies of educational achievement to gain a deeper understanding of the effects of policies and practices within and across systems of education. Since its inception in 1959, the IEA has conducted more than 15 studies of cross-national achievement.

The IEA delegated responsibility for the overall direction and management of the project to the International Study Center (ISC) in the Lynch School of Education at Boston College. The ISC worked closely with participating countries to develop consensus on all aspects of the study and to ensure that it was implemented in all countries according to international standards. In coordinating the study, the ISC worked closely with other organizations that carried out important activities of the project, including the IEA Secretariat in Amsterdam, the IEA Data Processing Center in Hamburg, Statistics Canada in Ottawa, and Educational Testing Service in Princeton, New Jersey.

Funding for TIMSS 1999 was provided by the United States, the World Bank, and the participating countries. Within the United States, funding agencies included the National Center for Education Statistics of the U.S. Department of Education, the National Science Foundation, and the Department of Education's Office of Educational Research and Improvement.

