Progress in International Reading Literacy Study 2016 (IGLU/PIRLS 2016)

 

Table of contents

Project description

Blank data sets

Documentation

Further information

Notes on the use of the data

Literature

 

> Link to application form (Scientific Use Files)

Data Set Published on 09.07.2020
Version v3
Current Version Available Since 01.06.2023
Survey Period 2016
Sample Students in grade 4 (N=3,959); Classes (N=208); Schools (N=208)
Survey Unit Others, Parents, Principals, Students, Teachers
Measured Competencies German - Reading Comprehension
Region Germany, Baden-Wuerttemberg, Bavaria, Berlin, Brandenburg, Bremen, Hamburg, Hesse, Mecklenburg-Western Pomerania, Lower Saxony, North Rhine-Westphalia, Rhineland-Palatinate, Saarland, Saxony, Saxony-Anhalt, Schleswig-Holstein, Thuringia
Principal Investigators Bos, Prof. Dr. Wilfried; McElvany, Prof. Dr. Nele
Data Producers Hußmann, Dr. Anke
Funded by Federal Ministry of Education and Research, Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany
Link https://ifs.ep.tu-dortmund.de/forschung/projekte-am-ifs/abgeschlossene-projekte/iglu/pirls-2016/
Related Studies IGLU 2001 (DOI: 10.5159/IQB_IGLU_2001_v1), IGLU 2006 (DOI: 10.5159/IQB_IGLU_2006_v1), IGLU 2011 (DOI: 10.5159/IQB_IGLU_2011_v1)
Suggested Citation Hußmann, A., Wendt, H., Bos, W., Bremerich-Vos, A., Kasper, D., Lankes, E.-M., McElvany, N., Stubbe, T. C., & Valtin, R. (2020). Progress in International Reading Literacy Study 2016 (PIRLS 2016) (Version 3) [Data set]. Berlin: IQB – Institut zur Qualitätsentwicklung im Bildungswesen. http://doi.org/10.5159/IQB_IGLU_2016_v3
Restriction Notice No federal state identifier variable is available for these records. Cognitive abilities must not be used as a dependent variable in the analyses.

 

Project description

The project IGLU 2016 (internationally: Progress in International Reading Literacy Study, PIRLS) uses representative data to investigate the reading literacy of students at the end of the fourth grade in primary schools in Germany. Because of the project's international orientation, student achievement in Germany can be compared with that in other countries and regions of the world. IGLU is a central element of educational monitoring in Germany and took place for the fourth time in 2016; the survey therefore allows statements about trends in the school system. The IGLU study focuses on reading literacy and additionally looks at key characteristics of the students, their classes, their schools and their families. A special feature of IGLU 2016 is the extension study PIRLS Literacy: in the past, it had been shown that in countries where the reading skills of fourth-graders were significantly below the international level, the skills of the weaker children could not be adequately measured with the regular IGLU instruments. Germany did not participate in PIRLS Literacy. (IQB)

back to overview

Blank data sets

For a first overview of the data set and its variables, dummy data sets containing the variables used and their value labels are provided for download here.

back to overview

Documentation

Here, you can find the results report for IGLU 2016 (PDF, in German).

Here, you can find the scale manual for IGLU 2016 (in German).

back to overview

Further information

Further information on IGLU/PIRLS 2016 can be found on the websites of the IEA and the TIMSS & PIRLS International Study Center.

back to overview

Notes on the use of the data

Are the competence estimates from the PISA, IGLU and IQB studies comparable with each other?

In principle, the achievement tests used in German large-scale assessment studies (PISA, IGLU and the IQB studies) correlate highly, but the underlying competence models differ. The IQB tests are based on the educational standards of the Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany (Kultusministerkonferenz, KMK) and are therefore more closely aligned with the German school curriculum than the PISA tests.

Comparability can be tested using IRT methods on the basis of studies in which both PISA and IQB items were administered. Examples of studies suitable for such a comparison are:

The extent of comparability must be considered separately for reading and mathematical literacy and for primary and secondary education. Although it can be assumed that differences between the federal states are mapped well by both measures, it is unfortunately not possible to analyse trends on a common metric.

How can I link the partial data sets?

A linked teacher-student data set already exists, and there is also a data set containing only the teacher data. However, the teacher data set does not allow statements that are representative at the national level.

There are 79 duplicate cases in the teacher-student data set. These are automatically taken into account when weighted analyses are performed. Linking the teacher-student data set to the tracking data set by student ID is only possible after the duplicate cases have been treated (e.g. exclusion, random selection or combination).
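As an illustration, the following Python/pandas sketch shows one way to treat the duplicate cases before linking, here by keeping one randomly selected record per student. The file names and the student ID variable ("idstud") are hypothetical placeholders; the actual file and variable names are documented in the scale manual.

```python
import pandas as pd

# Hypothetical file names and ID variable; the actual names are documented
# in the scale manual. Reading SPSS files with pandas requires pyreadstat.
lsfb = pd.read_spss("IGLU2016_LSFB.sav")          # linked teacher-student data
tracking = pd.read_spss("IGLU2016_Tracking.sav")  # tracking data

id_col = "idstud"  # hypothetical student ID variable

# Treat the 79 duplicate cases before linking, e.g. by keeping one randomly
# selected record per student (exclusion or combining the records are
# alternatives, depending on the research question).
lsfb_dedup = (
    lsfb.sample(frac=1, random_state=42)          # shuffle so "first" is random
        .drop_duplicates(subset=id_col, keep="first")
)

# Link the de-duplicated teacher-student data to the tracking data by student ID.
merged = lsfb_dedup.merge(tracking, on=id_col, how="inner", validate="one_to_one")
```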

Why do the numbers of students differ between the student-parent data set and the tracking data set?

The tracking data set contains all students who were included in the sample; the student-parent data set, on the other hand, contains only those who actually took part in the test. Non-participation can have various reasons, which are summarised in the tracking data set in the variable "TR_EXCLUSION_FDZ". The sample size of n = 3,959 from the student-parent data set is also used in the results report, so it is recommended to work with this sample size.
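A quick way to inspect this difference is to tabulate the exclusion variable in the tracking data and compare the case numbers, as sketched below; the file names are hypothetical placeholders, while "TR_EXCLUSION_FDZ" is the variable named above.

```python
import pandas as pd

# Hypothetical file names; the tracking data set contains all sampled students,
# the student-parent data set (SEFB) only those who actually took the test.
tracking = pd.read_spss("IGLU2016_Tracking.sav")
sefb = pd.read_spss("IGLU2016_SEFB.sav")

# Reasons for non-participation, as coded in the tracking data set.
print(tracking["TR_EXCLUSION_FDZ"].value_counts(dropna=False))

# The student-parent data set should contain the n = 3,959 participants
# reported in the results report.
print("sampled:", len(tracking), "tested:", len(sefb))
```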

How was the age variable calculated?

The reference date for the age calculation (variable "ASDAGE") is the test date, which has been removed from the data for reasons of data protection. If you have questions regarding the age at a different point in time, please contact us; we will be happy to advise you on how to use our data in accordance with data protection regulations.

Scales on classroom management

What is provided in the data and reports?

The student-parent data (SEFB) and the teacher-student data (LSFB) each contain four assessment scales on classroom management in the broader sense: classroom management/discipline (CM), cognitive activation (CA), social climate/support from the teacher (SC), and structuring (STR). The constituent items were asked of the students and teachers, respectively, and are documented in the respective sections of the scale manual.

A sum score of the recoded items was formed for each scale. The coding instructions are described in chapter 9 of the results report (Stahns et al., 2017, p. 264). The (four-level) items were first dichotomised by assigning the new value 1 to the responses 3 and 4 in most cases; the other two responses were assigned the value 0. If the response to an item was missing, either because it was omitted (missing by omission) or answered invalidly (missing invalid response), the value 0 was also assigned. A sum score was then calculated over the items of each scale.

Only people who had not been presented with the questions on classroom management were given missing values on the individual items and the sum scores. This means that there is a valid sum score for every person who had the opportunity to answer the questions. However, it also means that people without valid answers to the individual items still have a valid sum score of 0; this applies to between 80 and 200 cases per scale.
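The coding described above can be reproduced along the following lines. This is a minimal Python/pandas sketch under assumptions: the item names are hypothetical placeholders, the rule "3/4 becomes 1" applies to most but not all items, and the actual items, their scale assignments and any deviating coding rules are documented in the scale manual and in the table linked below.

```python
import pandas as pd

# Hypothetical item names for one scale (e.g. classroom management/discipline);
# the actual item names and scale assignments are documented in the scale manual.
cm_items = ["cm_item1", "cm_item2", "cm_item3", "cm_item4"]

def sum_score(df: pd.DataFrame, items: list[str]) -> pd.Series:
    """Dichotomise four-level items (3/4 -> 1, 1/2 -> 0, missing -> 0) and sum."""
    # isin() returns False for missing values, so omitted and invalid responses
    # are recoded to 0, in line with the coding rules of the report chapter.
    recoded = df[items].isin([3, 4]).astype(int)
    return recoded.sum(axis=1)

# Example (sefb is the student-parent data set). Cases that were never presented
# with the questions would additionally have to be set to missing, as in the
# sum scores provided in the data:
# sefb["cm_sum"] = sum_score(sefb, cm_items)
```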

What does this mean for data users?

The sum scores calculated in this way are included in the data and were the basis for the analyses in the IGLU 2016 report. To reproduce these results, the provided sum scores have to be used. Data users who wish to conduct secondary analyses and code the classroom-management constructs differently can use the unchanged individual items, which are also included in the data, to develop their own coding rules. To facilitate this, we provide an overview of the items, their assignment to the scales and the coding rules of the report chapter: Table (in German)

Reference

Stahns, R., Rieser, S., & Lankes, E.-M. (2017). Unterrichtsführung, Sozialklima und kognitive Aktivierung im Deutschunterricht in vierten Klassen. In A. Hußmann, H. Wendt, W. Bos, A. Bremerich-Vos, D. Kasper, E.-M. Lankes, et al. (Eds.), IGLU 2016: Lesekompetenzen von Grundschulkindern in Deutschland im internationalen Vergleich (pp. 251–277). Münster, New York: Waxmann. Available at https://www.waxmann.com/index.php?eID=download&buchnr=3700

What else must be taken into account during data analysis?

You will find more detailed methodological information on working with data from large-scale assessment surveys in this tutorial (in English).
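As a minimal example of such an analysis, the Python sketch below computes a weighted mean of a reading score. Both variable names are hypothetical placeholders; full analyses would typically also combine all plausible values and use the appropriate replicate weights, as described in the tutorial.

```python
import numpy as np
import pandas as pd

# Hypothetical variable names: the actual names of the student weight and of
# the reading score / plausible values are documented in the scale manual.
sefb = pd.read_spss("IGLU2016_SEFB.sav")
weight_var = "student_weight"   # hypothetical total student weight
score_var = "reading_pv1"       # hypothetical first plausible value for reading

valid = sefb[[score_var, weight_var]].dropna()
weighted_mean = np.average(valid[score_var], weights=valid[weight_var])
print(f"Weighted mean reading score: {weighted_mean:.1f}")
```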

back to overview

Literature

Selected literature is listed here (PDF; as of March 2023).

2022

Bütje, L. (2022). Exam Timing and the Socioeconomic Achievement Gap [Unpublished master's thesis]. Universität Konstanz, Konstanz.

2020

Hußmann, A., Wendt, H., Bos, W., Bremerich-Vos, A., Kasper, D., Lankes, E.-M. et al. (2020). Internationale Grundschul-Lese-Untersuchung 2016 (IGLU 2016) (Version 3) [Data set]. Berlin: IQB – Institut zur Qualitätsentwicklung im Bildungswesen. https://doi.org/10.5159/IQB_IGLU_2016_v3

Hußmann, A., Wendt, H., Bos, W. & Rieser, S. (2020). IGLU 2016: Skalenhandbuch zur Dokumentation der Erhebungsinstrumente und Arbeit mit den Datensätzen. Münster: Waxmann. Available at https://books.google.de/books?id=jhsDEAAAQBAJ

2019

Hußmann, A. (2019). Unter der Norm – Kompetenz und Diagnostik in IGLU 2016 [Below the norm – competence and diagnosis in PIRLS Germany 2016]. Lengerich: Pabst Science Publishers.

2017

Hußmann, A., Wendt, H., Bos, W., Bremerich-Vos, A., Kasper, D., Lankes, E.-M. et al. (Eds.). (2017). IGLU 2016: Lesekompetenzen von Grundschulkindern in Deutschland im internationalen Vergleich. Münster: Waxmann. Available at https://www.waxmann.com/index.php?eID=download&buchnr=3700

back to overview