Media Contact: Patricia Delaney, Director of Media Relations
TIMSS Project Contact:
MEDIA NOTE: The full TIMSS 1999 reports are available on-line at the International Study Center's web site on the Publications page or by calling 617-552-1600. To arrange interviews with the TIMSS International Study Co-Directors Michael O. Martin or Ina V.S. Mullis, or to obtain camera-ready color charts, please call the Boston College Office of Public Affairs at 617-552-3352.
Questions and Answers
TIMSS 1999, also known as TIMSS-Repeat or TIMSS-R, measures progress in eighth-grade mathematics and science achievement around the world. Sponsored by the International Association for the Evaluation of Educational Achievement (IEA), TIMSS 1999 assessed the mathematics and science performance of more than 180,000 eighth-grade students in over 6,000 schools in 38 countries in 34 languages.
TIMSS 1999 builds on the success of the original TIMSS conducted in 1995, which was the largest and most comprehensive international study of mathematics and science achievement ever undertaken. TIMSS 1995 compared the mathematics and science achievement of students in 41 countries at five grade levels. TIMSS 1999 is a replication of TIMSS 1995 at the eighth grade.
As in the 1995 study, TIMSS 1999 also investigated the contexts for learning mathematics and science in the participating countries through background questionnaires completed by students, teachers, school principals, and national representatives. Information was collected about educational systems, curriculum, instructional practices, and characteristics of students, teachers, and schools, providing an extremely rich source of valuable insights into the teaching and learning of mathematics and science. The 1995 results have stirred debate, spurred reform efforts, and provided important information to decision makers, researchers, and practitioners the world over. The 1999 results are likely to have similar or greater impact.
Why an International Trend Study?
To meet the challenge of preparing children around the world for a technologically oriented 21st century, policy makers and educators need information about students' understanding of mathematics and science to improve learning and instruction. International comparisons of student achievement and related factors allow policy makers, educators, and researchers to view the performance of their respective educational systems in relation to other nations. This can provide a powerful base for better understanding and advancing the teaching and learning of mathematics and science.
Which Countries and Students Participated?
Thirty-eight countries participated in TIMSS 1999. Of these, 26 also participated in TIMSS 1995 at the eighth grade, and 17 participated in 1995 at the fourth grade.
The target population for TIMSS 1999 was defined as "the upper of the two adjacent grades with the most 13-year-olds," which in most countries was the eighth grade. In each country, approximately 3,500 students were assessed in about 150 schools, both public and private. A two-stage sampling procedure was used to ensure a nationally representative sample of students from each country. In the first stage, schools were randomly selected; in the second stage, one or two mathematics classrooms within each school were randomly selected.
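The two-stage sampling design described above can be illustrated with a short sketch. All school and class names, counts, and sizes below are invented for illustration; the actual TIMSS sampling used probability-proportional-to-size selection and country-specific frames.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical national school frame: 600 schools, each offering 2-4
# eighth-grade mathematics classes (names and counts invented).
schools = {
    f"school_{i:03d}": [f"class_{i:03d}_{c}" for c in range(random.randint(2, 4))]
    for i in range(600)
}

# Stage 1: randomly select about 150 schools from the national frame.
sampled_schools = random.sample(sorted(schools), 150)

# Stage 2: within each sampled school, randomly select one or two
# mathematics classrooms; all students in a selected class are assessed.
sampled_classes = []
for school in sampled_schools:
    k = min(len(schools[school]), random.choice([1, 2]))
    sampled_classes.extend(random.sample(schools[school], k))

print(len(sampled_schools))  # 150 schools
print(len(sampled_classes))  # between 150 and 300 classrooms
```

Sampling intact classrooms rather than individual students keeps test administration practical and lets classroom-level teacher questionnaires be linked directly to the students assessed.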
What Was the Nature of the Test?
The mathematics and science tests were based on the TIMSS curriculum frameworks, which were developed by groups of educators with input from the TIMSS National Research Coordinators (NRCs). The curriculum frameworks contain three dimensions or aspects. The content aspect represents the subject matter content. The performance expectations aspect describes the many kinds of performances or behaviors that might be expected of students. The perspectives aspect focuses on the development of students' attitudes, interest, and motivation.
The tests were developed through a consensus process by international experts in mathematics, science, and educational measurement, and were endorsed by all participating countries. Working within the frameworks, the test developers drew up specifications that included items representing a wide range of mathematics and science topics and eliciting a range of skills from the students. The mathematics test covered five content areas: fractions and number sense; measurement; data representation, analysis, and probability; geometry; and algebra. The science test covered six content areas: earth science; life science; physics; chemistry; environmental and resource issues; and scientific inquiry and the nature of science.
What Is the Comparability of the Results?
Procedures used throughout the study were designed to make the results comparable across countries. To ensure comparability in testing, rigorous procedures were followed for translating the tests, and numerous training sessions were held for data collection and scoring activities. Quality control monitors observed testing sessions and reported back to the International Study Center at Boston College. The samples of students selected for testing were scrutinized according to rigorous standards designed to prevent bias and ensure comparability. Prior to analysis, the data from each country were subjected to exhaustive checks for accuracy and consistency.
How Are the Results Reported?
The results for the 38 countries that participated in TIMSS 1999 are presented in two companion reports, one for mathematics and one for science. The reports contain international rankings and country-by-country comparisons of mathematics and science achievement overall and for each content area; comparisons of country performance against international benchmarks; and gender differences in performance. Trend data are reported for the 26 countries that also participated in TIMSS 1995. The achievement data are accompanied by extensive questionnaire data about the home, classroom, school, and national contexts within which mathematics and science learning take place.
In addition to the reporting of achievement as scale scores, student performance is also described in terms of international benchmarks of performance. In order to provide meaningful descriptions of what performance on the achievement scale could mean in terms of the mathematics and science that students know and can do, TIMSS identified four points on the scale for use as international benchmarks: Top 10%, Upper Quarter, Median, and Lower Quarter (the 90th, 75th, 50th, and 25th percentiles, respectively). An ambitious scale-anchoring exercise was then conducted to describe performance at these benchmarks. These descriptions are accompanied by example test items illustrating student performance at each benchmark. The percentage of students in each country that reached each benchmark is reported, as well as the change between 1995 and 1999.
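As a rough sketch of how percentile benchmarks relate to a score distribution, the toy example below sets benchmark cut-points at the 90th, 75th, 50th, and 25th percentiles and counts the share of students reaching each one. The scores are invented, and the nearest-rank percentile used here is a simplification of the actual TIMSS scaling methodology.

```python
# Hypothetical scale scores (invented; TIMSS reports on a scale
# with an international average near 500).
scores = [420, 455, 470, 490, 500, 510, 525, 540, 560, 575, 590, 610]

def percentile(data, p):
    """Nearest-rank percentile: smallest value with at least p% of data at or below it."""
    s = sorted(data)
    k = -(-len(s) * p // 100) - 1  # ceil(n * p / 100) - 1
    return s[int(k)]

# The four international benchmarks correspond to these percentiles
# (computed here from the toy sample rather than the real international pool).
benchmarks = {name: percentile(scores, p)
              for name, p in [("Lower Quarter", 25), ("Median", 50),
                              ("Upper Quarter", 75), ("Top 10%", 90)]}

# Percentage of students reaching (scoring at or above) each benchmark.
reached = {name: 100 * sum(s >= cut for s in scores) / len(scores)
           for name, cut in benchmarks.items()}
```

By construction, roughly 50% of students in the combined international distribution reach the Median benchmark and roughly 10% reach the Top 10% benchmark; the interesting result in the reports is how far each individual country departs from those international proportions.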
What Publications and Resources Are Available?
Released December 5, 2000:
For future release:
Who Conducted TIMSS 1999?
TIMSS 1999 was conducted by the International Association for the Evaluation of Educational Achievement (IEA). With a permanent secretariat based in Amsterdam, the Netherlands, the IEA is an independent international cooperative of national research institutions and governmental research agencies. Its primary purpose is to conduct large-scale comparative studies of educational achievement to gain a deeper understanding of the effects of policies and practices within and across systems of education. Since its inception in 1959, the IEA has conducted more than 15 studies of cross-national achievement.
The IEA delegated responsibility for the overall direction and management of the project to the International Study Center (ISC) in the Lynch School of Education at Boston College. The ISC worked closely with participating countries to develop consensus on all aspects of the study and to ensure that it was implemented in all countries according to international standards. In coordinating the study, the ISC worked closely with other organizations that carried out important activities of the project, including the IEA Secretariat in Amsterdam, the IEA Data Processing Center in Hamburg, Statistics Canada in Ottawa, and Educational Testing Service in Princeton, New Jersey.
Funding for TIMSS 1999 was provided by the United States, the World Bank, and the participating countries. Within the United States, funding agencies included the National Center for Education Statistics of the U.S. Department of Education, the National Science Foundation, and the Department of Education's Office of Educational Research and Improvement.