Programme for International Student Assessment - International Plus 2003, 2004 (PISA-I-Plus 2003, 2004)
Table of contents
> Link to application form (Scientific Use Files)
Data Set Published on | 01.12.2013 |
Current Version Available Since | 01.12.2013 |
Survey Period | 2004 |
Sample | Students in grade 9 and 10 pursuing a Mittlerer Schulabschluss (general education school leaving certificate obtained on completion of grade 10) (N=6,020); Teachers (N=1,939); Schools (N=216) |
Survey Unit | Students |
Measured Competencies | Mathematics, Natural Sciences |
Region | Germany, Baden-Wuerttemberg, Bavaria, Berlin, Brandenburg, Bremen, Hamburg, Hesse, Mecklenburg-Western Pomerania, Lower Saxony, North Rhine-Westphalia, Rhineland-Palatinate, Saarland, Saxony, Saxony-Anhalt, Schleswig-Holstein, Thuringia |
Principal Investigators | Prenzel, Prof. Dr. Manfred |
Data Producers | Konferenz der Kultusminister (KMK) |
Funded by | Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany, Organization for Economic Cooperation and Development (OECD) |
Link | https://www.forschungsdaten-bildung.de/en/studies/382-pisa-programme-for-international-student-assessment-international-plus-2003-2004 |
Related Studies | PISA 2000 (DOI: 10.5159/IQB_PISA_2000_v1), PISA 2003 (DOI: 10.5159/IQB_PISA_2003_v1), PISA 2006 (DOI: 10.5159/IQB_PISA_2006_v1), PISA 2009 (DOI: 10.5159/IQB_PISA_2009_v1), PISA 2012 (DOI: 10.5159/IQB_PISA_2012_v5), PISA 2015 (DOI: 10.5159/IQB_PISA_2015_v3), PISA 2018 (DOI: 10.5159/IQB_PISA_2018_v1), PISA Plus 2012-13 (DOI: 10.5159/IQB_PISA_Plus_2012-13_v2) |
Suggested Citation | Prenzel, M., Baumert, J., Blum, W., Lehmann, R., Leutner, D., Neubrand, M., Pekrun, R., Rost, J., & Schiefele, U. (2013). Programme for International Student Assessment - International Plus 2003, 2004 (PISA-I-Plus 2003, 2004) (Version 1) [Data set]. Berlin: IQB – Institut zur Qualitätsentwicklung im Bildungswesen. http://doi.org/10.5159/IQB_PISA_I_Plus_v1 |
Restriction Notice | Cognitive abilities must not be used as a dependent variable in the analyses. |
Project description
PISA-I-Plus is a national assessment scheme extending the cross-sectional design of the international PISA 2003 study by an additional time of measurement, one year after the initial assessment. The sample of PISA-I-Plus was derived from the German sample of the international PISA 2003 assessment of 9th graders. In 2004, a subsample of these students was tested again at the end of grade 10. The study aims to describe students' development in the fields of mathematics and science over the course of one school year. Additionally, personal characteristics, family background and the learning environment were considered as factors influencing learning progress during the school year. In contrast to PISA 2003, lower secondary schools (Hauptschule) were excluded from the sample because students enrolled in these schools typically obtain a school leaving certificate after 9th grade. PISA-I-Plus was commissioned by the Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany (Kultusministerkonferenz, KMK) and carried out by the national PISA consortium. (IQB)
Blank data sets
For a first overview of the data set and its variables, dummy data sets containing the variables used and the value labels relating to them are provided for download here.
Documentation
PISA-I-Plus 2003, 2004 is based on the scales and questionnaires of PISA 2003. A separate codebook is not available. For further information, please consult our study description of PISA 2003 or the documentation (German only) provided by the IPN-Leibniz Institute for Science and Mathematics Education.
Courtesy of the Waxmann publishing company, extracts from the book listed below (German only) can be viewed via the link:
Notes on the use of the data
Further background information on teachers is available in the PISA 2003-I data set. Should you require this information for your research project, we recommend you apply for access to PISA 2003 as well.
Are the competence estimators of the PISA, IGLU and IQB studies comparable with each other?
In principle, the tests from PISA and the IQB studies correlate highly, but the underlying competency models differ. The IQB tests are based on the educational standards of the Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany and are thus more closely aligned with the schools' curricula than the PISA tests. Comparability can be tested using IRT methods based on studies in which both PISA and IQB items were used. Examples of such comparison studies include:
- van den Ham, Ann-Katrin, Ehmke, Timo, Hahn, Inga, Wagner, Helene & Schöps, Katrin (2016). Mathematische und naturwissenschaftliche Kompetenz in PISA, im IQB-Ländervergleich und in der National Educational Panel Study (NEPS) – Vergleich der Rahmenkonzepte und der dimensionalen Struktur der Testinstrumente. In: Bundesministerium für Bildung und Forschung [Hrsg.]: Forschungsvorhaben in Ankopplung an Large-Scale-Assessments. Berlin: Bundesministerium für Bildung und Forschung, S. 140-160.
- Jude, Nina & Klieme, Eckhard [Hrsg.] (2013). PISA 2009 – Impulse für die Schul- und Unterrichtsforschung. Weinheim u. a.: Beltz. (Zeitschrift für Pädagogik, Beiheft 59)
- Hartig, Johannes & Frey, Andreas (2012). Validität des Tests zur Überprüfung des Erreichens der Bildungsstandards in Mathematik: Zusammenhänge mit den bei PISA gemessenen Kompetenzen und Varianz zwischen Schulen und Schulformen. Diagnostica, 58, S. 3-14.
The extent of comparability must be considered separately for reading and mathematical literacy and for secondary and primary education. Although it can be assumed that federal state differences can be well mapped using both measures, it is unfortunately not possible to analyse absolute trends on a common metric.
The data sets of the PISA 2012 and IQB National Assessment Study 2012 studies can be linked with each other using the ID variable [idstud_FDZ]. This allows you, for example, to examine correlations between the scaled test scores of the two studies.
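The linkage described above can be sketched as follows. This is a minimal illustration in plain Python, assuming the records have already been exported as dictionaries; apart from [idstud_FDZ], the variable names (pisa_math, iqb_math) and all values are invented for illustration.

```python
# Hedged sketch: inner join of PISA 2012 and IQB National Assessment Study
# 2012 records on the recoded student ID [idstud_FDZ]. Only students who
# appear in both data sources are kept.

def link_by_id(pisa_rows, iqb_rows, id_var="idstud_FDZ"):
    """Return merged records for all IDs present in both data sets."""
    iqb_by_id = {row[id_var]: row for row in iqb_rows}
    linked = []
    for row in pisa_rows:
        match = iqb_by_id.get(row[id_var])
        if match is not None:
            # IQB fields complement the PISA fields in one merged record.
            linked.append({**row, **match})
    return linked

# Invented example records:
pisa = [{"idstud_FDZ": 1, "pisa_math": 510.0},
        {"idstud_FDZ": 2, "pisa_math": 487.0}]
iqb = [{"idstud_FDZ": 2, "iqb_math": 495.0},
       {"idstud_FDZ": 3, "iqb_math": 530.0}]

linked = link_by_id(pisa, iqb)
# Only students present in both data sets remain (here: idstud_FDZ == 2).
```

In practice, such a merge would typically be done with statistical software (e.g. a merge/join on [idstud_FDZ]); the sketch only shows the matching logic.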
Please note additionally:
1.) In contrast to the PISA surveys, the IQB studies assess reading and mathematical literacy together only at primary school level: reading literacy was assessed in the IQB National Assessment Study 2009 (lower secondary level) and in the IQB National Assessment Study 2011 (primary school) as well as in the IQB Trends in Student Achievement 2015 (lower secondary level) and in the IQB Trends in Student Achievement 2016 (primary school). Mathematics competencies can be found in the IQB National Assessment Study 2012 (secondary level) and the IQB National Assessment Study 2011 (primary level) as well as in the IQB Trends in Student Achievement 2016 (primary level) and the IQB Trends in Student Achievement 2018 (secondary level).
2.) We do not have a federal state identifier for the PISA surveys.
3.) If you wish to conduct analyses that include unpublished, novel comparisons between single federal states, our Rules of Procedure state that an extended application procedure with a review process applies.
Is it possible to record the age of students (to the day) in the IQB National Assessment Studies/IQB Trends in Student Achievement Studies and in PISA?
Information on students' year and month of birth is collected as standard in the IQB National Assessment Studies and the PISA studies and is available for re- and secondary analyses of the data. For reasons of data protection, however, the exact date of birth was not recorded and is not available in the data sets. The exact test date is also not included in most data sets (in PISA 2009 this information is available). Frequently, however, the data sets contain an age variable that was formed from the year and month of birth in relation to the test date (e.g. in the IQB National Assessment Studies 2011 and 2012 and in PISA 2006, 2009 and 2012).
Which PISA data can be linked to which IQB National Assessment Studies/IQB Trends in Student Achievement Studies?
The PISA 2012 data sets can be combined with the IQB National Assessment Study 2012. The student IDs in the data sets available at the FDZ at IQB have already been recoded in such a way that a linkage of both data sources is possible. Unfortunately, it is not possible to link the other PISA waves with the data from the IQB National Assessment Studies/IQB Trends in Student Achievement Studies because the ID variables cannot be recoded uniformly.
At what levels was the German PISA data collected?
In the German PISA studies, information is only available at the federal state level. Please note that special conditions of use must be observed when analysing data from the federal states. You can read them here:
Rules of Procedure as of January 2019
Rules of Procedure - innovative state comparisons as of January 2019
What is the number of classes drawn per school in PISA surveys?
Information on sampling in the studies can be found in the results reports or scale manuals.
Here is a brief summary of sampling in PISA:
PISA 2000:
random selection of 28 15-year-olds and 10 non-15-year-old ninth graders per school; thus, full classes were not drawn, and analyses are only possible at the school level
PISA 2003:
random selection of 15-year-olds per school; in addition, two complete 9th-grade classes were drawn per school for the national expansion; in the PISA-E data, however, no class-based sampling was realised
PISA 2006:
school-based sampling, then random selection of 15-year-olds per school; at the schools in the international sample (PISA_I), students from two complete 9th grades were additionally tested
PISA 2009:
school-based sampling, additionally students from two complete 9th grades per school were tested
PISA 2012:
school-based sampling, additionally students from two complete 9th grades per school were tested
PISA 2015:
school-based sampling, additionally random selection of 15 ninth graders per school
PISA 2018:
school-based sampling, additionally random selection of 15 ninth graders per school
How many students in special and vocational schools are included in the PISA data?
Students at special needs schools and vocational schools were covered separately in the PISA surveys listed above. The sample sizes for these subgroups are given below. They are based on the data in the German PISA extended samples (PISA-E) in the student and school management data sets. In some cases, there may be slight deviations from the sample sizes reported in the results reports.
PISA 2000 E:
- 9th grade: n= 11 students in vocational schools, n= 22 students in special schools out of a total of n= 34,754 students
- 15-year-olds: n= 241 students at vocational schools, n= 799 students at special schools out of a total of n= 35,584 students
- School data set: n= 18 vocational schools, n= 4 special schools out of a total of n= 1,342 schools
PISA 2003 E (here no differentiation between data sets for 9th grade & 15-year-olds possible):
- 9th grade: n= 654 students at vocational schools, n= 1,712 students at special schools out of a total of n= 46,185 students
- School data set: n= 43 vocational schools, no special schools out of a total of n= 1,411 schools
PISA 2006 E:
- 9th grade: no students at vocational schools or special schools in the data set
- 15-year-olds: n= 625 students at vocational schools, n= 2,560 students at special schools out of a total of n= 39,573 students
- School data set: n= 42 vocational schools, no special schools out of a total of n= 1,496 schools
PISA 2009 E:
- 9th grade: no students at vocational schools or special schools in the data set out of a total of n= 9,461 students
- School data set: n= 9 vocational schools, n= 13 special schools out of a total of n= 226 schools
PISA 2012 E:
- 9th grade: no students at vocational schools, n= 153 at special schools out of a total of n= 9,998 students
- 15-year-olds: n= 99 students at vocational schools, n= 139 at special schools out of a total of n= 5,001 students
- School data set: n= 7 vocational schools, n= 12 special schools out of a total of n= 230 schools
PISA 2015 E:
- 9th grade: no students at vocational schools, n= 165 at special schools out of a total of n= 4,149 students
- 15-year-olds: n= 160 students at vocational schools, n= 134 at special schools out of a total of n= 6,504 students
- School data set: n= 8 vocational schools, n= 12 special schools out of a total of n= 205 schools
PISA 2018 E:
- 9th grade: no students at vocational schools, n= 115 at special schools out of a total of n= 3,567 students
- 15-year-olds: n= 184 students at vocational schools, n= 98 at special schools out of a total of n= 5,451 students
- School data set: n= 10 vocational schools, n= 7 special schools out of a total of n= 191 schools
Can teacher and student data be linked in PISA?
Unfortunately, linking is only possible for the partial data sets of 9th graders (the data sets of 15-year-olds comprise samples drawn across classes). In most PISA waves, two 9th-grade classes were drawn per school, but the partial data sets often lack a unique class ID.
Here is an overview in bullet points of the individual PISA waves:
- PISA 2000: no teacher questionnaire was used here.
- PISA 2003: partial data set "PISA-I-9th grade": teacher questionnaires contain questions at school level, not at class level; a link via the variable [idclass_FDZ] is possible, but in the teacher data set there is a high proportion of missing values on this variable (presumably because many teachers were surveyed per school); partial data set "PISA-E": no teacher questionnaires available
- PISA Plus 2003-2004: a linkage is possible in principle, but teacher data would have to be imported from PISA 2003 data and are only available at the first measurement point.
- PISA 2006: partial data set "PISA-E": no teacher data set for 9th grades available, linkage only possible at school level; partial data set "PISA-I": no clear linkage possible, as the teacher data set does not contain a class ID.
- PISA 2009: also no class ID in the teacher data set, but linking via idsch and variable [LF39M01] (German taught in PISA class: yes vs. no) partially possible; however, two 9th grades were drawn from each school.
- PISA 2012: Linking is possible in principle via the class name variables (teacher data set: class_FDZ; student data set: ClassName_FDZ) but difficult to achieve in practice, as the metric of the school ID does not correspond between the two partial data sets and there is a high proportion of missing values on the class name variables (according to reports from PISA staff, linking fails in the majority of cases).
- PISA 2015: Linkage is not directly possible, as all teachers in the drawn schools were surveyed.
- PISA 2018: A link between teachers and students via the variable "TEACHCLASS_ID" is not possible until the end of 2022 due to a blocking notice. However, this variable only contains the information whether or not the teacher taught a ninth grade, because almost all teachers in the sampled schools were surveyed. Alternatively, the variable "TEACHERID" can be used, but it likewise does not allow a clear assignment of students to the corresponding teacher.
For which PISA data is a repeated measures data set available?
A repeated measures data set is available for PISA 2003 (PISA-I-Plus 2003, 2004) and PISA 2012 (PISA-Plus 2012, 2013).
How were the science literacy tests developed in PISA?
In contrast to the IQB National Assessment Studies, the science tests in PISA are not curricularly anchored or subject-specifically designed. Therefore, there are no subtests for biology, physics and chemistry in PISA. Instead, PISA tests scientific literacy (see e.g. OECD, 2006). This involves skills that are significant in situations in which one is confronted with science and technology. These situations relate to physical systems, living systems, earth and space systems and technological systems. Specifically, the following competencies are tested:
a) recognise scientific issues
b) describe, explain and predict scientific phenomena
c) use scientific evidence to make decisions.
More information on the concept and the test (including sample items) can be found here:
- OECD 2015 Assessment / Framework
- PISA Scientific Literacy
- OECD: How PISA measures science literacy
- in this results report (for Germany) and here
- Prenzel, M., Artelt, C., Baumert, J., Blum, W., Hammann, M., Klieme, E., Pekrun, R. (Hrsg.) (2008). PISA 2006 in Deutschland: Die Kompetenzen der Jugendlichen im dritten Ländervergleich. Münster: Waxmann.
When did the tests for the first measurement point of PISA 2003-4 take place?
The tests for the first measurement point of the PISA-I-Plus 2003-4 study took place in the period from 20.04. to 31.05.2003 (reference). Unfortunately, no information is available on which school was tested on which day.
Are there standard errors for the proficiency estimators in the PISA I-Plus 2003-4 data set?
For the PISA-I-Plus 2003-4 study, the following standard errors are available in the results report (Prenzel & the German PISA Consortium, 2006; Chapter 4, p. 112):
- Mathematics: t1: SE = 2.2; t2: SE = 2.1; increase t1-t2: SE = 1.1
- Science: t1: SE = 2.3; t2: SE = 2.5; increase t1-t2: SE = 1.7
- Individual standard errors for the WLE estimators are unfortunately not included in the data set.
Are there variables that include the subjects taught?
Unfortunately, teacher variables are not included in the PISA-I-Plus 2003-4 data set. However, the corresponding data for measurement time point (MZP) 1 can be taken from the teacher data set of the PISA 2003 study (data set name: PISA2003-I_Datensatz_Lehrkraft_9Kl) and transferred to the PISA-I-Plus 2003-4 data set via the variable [idclass_FDZ]. Note, however, that the PISA 2003 teacher data set only contains information on whether the subjects taught included mathematics ([fama_all]), German ([fadeu_al]), physics ([faphy_al]), biology ([fabio_al]) or chemistry ([fache_al]) (1 = yes).
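The transfer described above amounts to a class-level left join. The sketch below illustrates this in plain Python; apart from [idclass_FDZ] and [fama_all], all names and values are invented, and in practice the merge would be done in statistical software on the actual data sets.

```python
# Hedged sketch: attaching a teacher-level subject indicator (e.g.
# [fama_all], 1 = teaches mathematics) to student records via the shared
# class ID [idclass_FDZ]. Students without a matching teacher record keep
# a missing value (None).

def attach_teacher_info(students, teachers, id_var="idclass_FDZ"):
    """Left join: every student row is kept; teacher fields are added
    where a class-level match exists."""
    teacher_by_class = {t[id_var]: t for t in teachers}
    merged_rows = []
    for s in students:
        t = teacher_by_class.get(s[id_var], {})
        merged = dict(s)
        merged["fama_all"] = t.get("fama_all")  # None if no match
        merged_rows.append(merged)
    return merged_rows

# Invented example records:
teachers = [{"idclass_FDZ": 101, "fama_all": 1}]
students = [{"idstud": 1, "idclass_FDZ": 101},
            {"idstud": 2, "idclass_FDZ": 202}]

students_with_teacher = attach_teacher_info(students, teachers)
```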
Why do fewer students, but more schools and one more class remain in the data set if I only consider those students who belong to the core sample (kernstpr=1)?
The filter variable [kla_l] would be suitable for replicating the sample sizes on p. 49. However, this results in a total sample size (students) of n = 4,550. Discrepancies could be explained by the fact that students may have been excluded for the report (e.g. on the basis of the variables [class], [partbook] or [test1_2]). Unfortunately, this cannot be reconstructed on the basis of the data available to the FDZ at IQB, which did not receive the data set underlying Chapter 2. It is also not possible, for example, to replicate the reported competency levels using the data sets available at the FDZ at IQB. The total sample size on p. 49, however, can be replicated.
At what time did the survey take place in grade 10?
The survey for the second measurement point (MZP2) took place in spring 2004; unfortunately, the results report (Prenzel et al., 2006) does not give an exact date of testing.
What do variables with the ending [ _s] and [ _w] stand for and how are they scaled?
In the data set, the suffix "_s" can stand both for "student statement" and for "standard error". For scale scores, the suffix "_w" denotes the WLE person ability estimator and the suffix "_s" denotes the standard error of that ability estimator. WLE estimators can be interpreted in much the same way as z-standardised values (except that the standard deviation is not fixed at 1): the value 0 corresponds to an average trait level (the sample mean is usually fixed at 0), values below 0 indicate lower levels of the trait and values above 0 indicate higher levels. More information on WLE estimators can be found in the paper by Warm (1989).
Can I recognise variables of the 2nd measurement point in the student data set by the ending [ _2]?
Yes, in the data set the ending "_1" in the variable name as well as the addition "(t1)" in the variable label indicates the first time of measurement and the ending "_2" in the variable name as well as the addition "(t2)" in the variable label indicates the second time of measurement. In the PISA I-Plus 2003-4 data set, no teacher data are available for MZP2.
Do the variables [selblehr] and [selbklas] contain information on whether the class and/or mathematics teacher were changed? And what is the scaling (0/1)?
The variable [selblehr] indicates whether students were taught by the same teacher at both measurement points (0 = no; 1 = yes). The variable [selbklas] indicates whether students belonged to the same class group at both measurement points (0 = no; 1 = yes).
What does the variable [schform2] stand for?
The variable [schform2] stands for the type of school attended at the second time point.
Does the variable [tr_ma_04_r] contain the mathematics grade in grade 10? And does it refer to the last report mark and thus the half-year mark?
Yes, these are school grades from the first half of 2004 for all students who were examined in 2004 (insofar as complete information was available from the school coordination). This information is also documented in the variable label ["mathematics grade_recoded (at T2, 1st half-year 2004)"].
What do the endings [ _cs] and [ _gt] stand for?
The endings [ _cs] and [ _gt] stand for curricular and basic educational competence respectively.
What do [t1] and [t2] stand for in the variable names for mathematical competence?
[t1] and [t2] stand for the 1st and 2nd measurement point, respectively.
If [wle] variables contain the person measurement values, how do missing or negative values occur (e.g.: in the case of [wle_m1t1_cs])?
Missing values on the person ability estimator variables occur because these students did not take the competency test or completed too few items. Negative values can occur and indicate very low abilities; however, they could also reflect estimation problems for these students (on the variable [wle_m1t1_cs], only four students have negative values).
In terms of social background, the data set includes variables on ISEI and EGP classes. Is there a difference between the variables [isei_m] and [m_isei] (mother's origin) and [isei_v] and [isei_f] (father's origin)?
The variable [isei_m] indicates the mother's ISEI; in the variable [m_isei], missing values of the mother's ISEI were filled with the father's ISEI values.
The variable [isei_v] contains the father's ISEI; the variable [isei_f] contains the highest ISEI in the family. This information is also stored in the variable labels.
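The naming logic can be illustrated with two small helper functions. This is only a sketch of the derivation rules described above, not the original syntax used to build the data set; missing values are represented here as None.

```python
# Hedged sketch of the ISEI variable derivations described above.

def m_isei(isei_m, isei_v):
    """[m_isei]-style variable: mother's ISEI, with missing values
    filled with the father's ISEI."""
    return isei_m if isei_m is not None else isei_v

def isei_f(isei_m, isei_v):
    """[isei_f]-style variable: highest ISEI in the family; missing
    only if both parents' values are missing."""
    values = [v for v in (isei_m, isei_v) if v is not None]
    return max(values) if values else None

# Invented example: mother's ISEI missing, father's ISEI = 45.
fallback = m_isei(None, 45)   # father's value fills the gap
highest = isei_f(51, 45)      # highest value in the family
```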
What is the difference between the variables [egp_6], [m_egp_6], [egp_f6], [egp_v6] and [egp_m6]?
The same naming rules were applied here as for the ISEI variables.
- egp_6: EGP class of the father, missing values filled with information from the mother, 6-level (t1)
- m_egp_6: EGP class of the mother, missing values filled with information of the father, 6-level (t1)
- egp_f6: Highest EGP class in the family, 6-level (t1)
- egp_v6: EGP class of father, 6-level (t1)
- egp_m6: EGP class mother, 6-level (t1)
- egp_m: EGP-class mother - corrected data (t1)
- egp_v: EGP-class father - corrected data (t1)
What do the variables [sefic_t2] and [intma_t2] indicate?
These variables indicate the trait level at measurement time point (MZP) 2 in the form of WLEs (interpretable much like z-values): negative values stand for lower, positive values for higher trait levels. To determine an increase or decrease, you also need the corresponding information for MZP 1 ([sefic_w] and [intma_w]).
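A simple individual change score between the two measurement points can be sketched as follows; the WLE values used here are invented for illustration, with t1 stored in variables such as [intma_w] and t2 in [intma_t2].

```python
# Minimal sketch: change score between the two measurement points for a
# WLE-scaled construct (e.g. interest in mathematics).

def change_score(wle_t1, wle_t2):
    """Positive values indicate an increase from MZP 1 to MZP 2."""
    return wle_t2 - wle_t1

# Invented example: a student slightly below average at t1 and slightly
# above average at t2.
delta = change_score(-0.20, 0.35)
```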
Is there a variable for mathematical self-concept at the 2nd measurement point?
Yes, the variable with the WLE estimator is called [msk_t2], the items for the scale are called msk1_2 - msk2_2.
Is there a variable for the global scale "interest in mathematics"?
Yes, the variables are called [intma_w] (MZP1) and [intma_t2] (MZP2).
Which test items were used for the 2nd measurement point in PISA-I-Plus 2003-4?
The tests used for the second measurement point in 2004 include new items both in the general test (n = 23 new items) and in the curriculum-specific test (n = 23 new items).
For more information see chapters 2 (p. 52) and 12 (p. 311ff.) in Prenzel, M. und Deutsches PISA-Konsortium (2006). PISA 2003: Untersuchungen zur Kompetenzentwicklung im Verlauf eines Schuljahres. Münster u. a.: Waxmann.
How high were the reliabilities of the competency tests in PISA-I-Plus 2003-4?
Reliabilities (WLE) for the 2004 achievement tests (p. 314 in Prenzel, M. und Deutsches PISA-Konsortium (2006). PISA 2003: Untersuchungen zur Kompetenzentwicklung im Verlauf eines Schuljahres. Münster u. a.: Waxmann) are:
- Natural sciences:
- 2003: .77 (63 items)
- 2004: .61 (28 items)
- Mathematics (general test):
- 2003: .77 (77 items)
- 2004: .85 (98 items)
- Mathematics (curriculum sensitive test):
- 2003: .64 (46 items)
- 2004: .79 (68 items)
Discrepancies between these parameters and those reported in the PISA 2003 scale manual could be related to (a) differences in the student sample (international sample vs. national 9th-grade expansion), (b) differences in the item pool (international vs. national items), or (c) differences in the estimation method. The reliability estimates given above are based on: Andrich, D. (1982). An index of person separation in latent trait theory, the traditional KR-20 index, and the Guttman scale response pattern. Education Research and Perspectives, 9(1), 95-104.
Where can I get more information on the variable of intended completion in PISA-I-Plus 2003-4?
The documentation of the survey instruments for the study is published in the following volume (unfortunately, for copyright reasons, we cannot make this document available to you electronically):
- Ramm, G. C., Adamsen, C., Neubrand, M., & Deutsches PISA-Konsortium (2006). PISA 2003: Dokumentation der Erhebungsinstrumente. Münster u. a.: Waxmann.
In addition, you can research the questionnaires for the study on the OECD page and the Research Data Centre of the DIPF.
Is there a scale manual for the PISA I-Plus 2003-4 study?
Unfortunately, there is no separate scale manual for this study. The international documentation and reports on the survey instruments may be helpful.
Does a [d] in variable names (e.g. st15q01d) stand for the national questionnaire?
The PISA 2003 scale manual (Ramm et al., 2006, p. 130) describes the rules for variable naming in more detail. A "d" at the end of a variable name indicates additional national questions added to the international instrument. The scale manual is also suitable in many respects for working with the PISA-I-Plus 2003-4 data.
What are the reliabilities of the competence estimators in the PISA-I-Plus 2003-4 study?
The WLE reliabilities of the competence estimators in the PISA-I-Plus 2003-4 study are presented in the results report "PISA 2003 - Untersuchungen zur Kompetenzentwicklung im Verlauf eines Schuljahres" (Prenzel & das deutsche PISA-Konsortium, 2006) in Chapter 12 (p. 314):
- Mathematics
a) overall test: t1 (2003): Rel. = .77; t2 (2004): Rel. = .85
b) curriculum-oriented test: t1 (2003): Rel. = .64; t2 (2004): Rel. = .79
- Nawi: t1 (2003): Rel. = .77; t2 (2004): Rel. = .61
Why are test item variables not included in the PISA I-Plus 2003-4 data set?
Variables for the individual test items were not provided to us because students' responses to these items have already been combined into ability estimates. If you are interested in the variable [mverst1t], we recommend using the following person ability estimators for your analyses:
- Plausible Values (PVs) of mathematical ability at the first and second measurement time points: pv1m1t1_cs, pv2m1t1_cs, pv3m1t1_cs, pv4m1t1_cs, pv5m1t1_cs, pv1mzws_cs, pv2mzws_cs, pv3mzws_cs, pv4mzws_cs, pv5mzws_cs, pv1m2t1_cs, pv2m2t1_cs, pv3m2t1_cs, pv4m2t1_cs, pv5m2t1_cs, pv1m2t2_cs, pv2m2t2_cs, pv3m2t2_cs, pv4m2t2_cs, pv5m2t2_cs, pv1m1t1_gt, pv2m1t1_gt, pv3m1t1_gt, pv4m1t1_gt, pv5m1t1_gt, pv1mzws_gt, pv2mzws_gt, pv3mzws_gt, pv4mzws_gt, pv5mzws_gt, pv1m2t1_gt, pv2m2t1_gt, pv3m2t1_gt, pv4m2t1_gt, pv5m2t1_gt, pv1m2t2_gt, pv2m2t2_gt, pv3m2t2_gt, pv4m2t2_gt, pv5m2t2_gt
- Weighted Likelihood Estimates on students' mathematical ability: wle_m1t1_cs, wle_zws_cs, wle_m2t1_cs, wle_m2t2_cs, wle_m1t1_gt, wle_zws_gt, wle_m2t1_gt, wle_m2t2_gt.
- Plausible Values (PVs) of the scientific abilities at the first and second measurement time point: pv1n1t1, pv2n1t1, pv3n1t1, pv4n1t1, pv5n1t1, pv1nzws, pv2nzws, pv3nzws, pv4nzws, pv5nzws, pv1n2t1, pv2n2t1, pv3n2t1, pv4n2t1, pv5n2t1, pv1n2t2, pv2n2t2, pv3n2t2, pv4n2t2, pv5n2t2
- Weighted Likelihood Estimates of students' science ability: wle_n1t1, wle_nzws, wle_n2t1, wle_n2t2
More information on the analysis of these ability estimators can be found in the Results Report on PISA-I-Plus 2003-4 and in the following manuscripts:
- Prenzel, M. und Deutsches PISA-Konsortium (2006). PISA 2003: Untersuchungen zur Kompetenzentwicklung im Verlauf eines Schuljahres. Münster u. a.: Waxmann.
- Von Davier, M., Gonzalez, E., & Mislevy, R. J. (2009). What are plausible values and why are they useful? IERI Monograph Series, 2, 9–36.
- Warm, T. A. (1989). Weighted likelihood estimation of ability in item response theory. Psychometrika, 54, 427–450.
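Analyses with the five plausible values per domain (e.g. pv1m1t1_cs to pv5m1t1_cs) should be run once per PV and then pooled, as described by von Davier et al. (2009). The following is a minimal sketch of Rubin's combination rules in plain Python; the per-PV estimates and sampling variances are invented, and the sketch deliberately ignores the replicate-weight part of the variance estimation used in operational PISA analyses.

```python
# Hedged sketch: pooling M per-PV point estimates and their sampling
# variances via Rubin's combination rules.

from statistics import mean

def pool_plausible_values(estimates, sampling_variances):
    """Return the pooled point estimate and its standard error."""
    m = len(estimates)
    point = mean(estimates)                      # pooled point estimate
    within = mean(sampling_variances)            # average sampling variance
    between = sum((e - point) ** 2 for e in estimates) / (m - 1)
    total_variance = within + (1 + 1 / m) * between
    return point, total_variance ** 0.5          # estimate and its SE

# Invented example: five per-PV mean estimates with sampling variances.
pv_means = [502.1, 498.7, 501.3, 499.9, 500.5]
pv_vars = [4.0, 4.2, 3.9, 4.1, 4.0]
point, se = pool_plausible_values(pv_means, pv_vars)
```

The same pooling applies to regression coefficients or any other statistic computed once per plausible value.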
Literature
Selected literature is listed here (as of March 2023).
2021
Becker, M., Kocaj, A., Jansen, M., Dumont, H. & Lüdtke, O. (2021). Class-average achievement and individual achievement development: Testing achievement composition and peer spillover effects using five German longitudinal studies. Journal of Educational Psychology. https://doi.org/10.1037/edu0000519
Ostermann, C. & Neugebauer, M. (2021). Macht Ähnlichkeit den Unterschied? Wenn sozioökonomisch benachteiligte Schülerinnen und Schüler von sozial ähnlichen Lehrkräften unterrichtet werden. KZfSS Kölner Zeitschrift für Soziologie und Sozialpsychologie, (73), 259–283. https://doi.org/10.1007/s11577-021-00779-3
Stallasch, S. E., Lüdtke, O., Artelt, C. & Brunner, M. (2021). Multilevel Design Parameters to Plan Cluster-Randomized Intervention Studies on Student Achievement in Elementary and Secondary School. Journal of Research on Educational Effectiveness, 22(10), 1–35. https://doi.org/10.1080/19345747.2020.1823539
2020
Stallasch, S. E., Lüdtke, O., Artelt, C. & Brunner, M. (2020, March 3). Multilevel Design Parameters to Plan Cluster-Randomized Intervention Studies on Student Achievement in Elementary and Secondary School. https://doi.org/10.35542/osf.io/f3p7q
2019
Atlay, C. (2019). Teaching quality and educational inequalities: An interdisciplinary inquiry of the relationship between student background and teaching quality - Dissertation. Eberhard-Karls-Universität Tübingen, Tübingen. https://doi.org/10.15496/publikation-30780
Atlay, C., Tieben, N., Hillmert, S. & Fauth, B. (2019). Instructional quality and achievement inequality: How effective is teaching in closing the social achievement gap? Learning and Instruction, 63. https://doi.org/10.1016/j.learninstruc.2019.05.008
Klein, O., Neugebauer, M. & Jacob, M. (2019, July 17). Migrant teachers in the classroom: A key to reduce ethnic disadvantages in school? https://doi.org/10.31235/osf.io/2s8n6
Lazarides, R. & Buchholz, J. (2019). Student-perceived teaching quality: How is it related to different achievement emotions in mathematics classrooms? Learning and Instruction, 61, 45–59. https://doi.org/10.1016/j.learninstruc.2019.01.001
Lazarides, R., Dietrich, J. & Taskinen, P. H. (2019). Stability and change in students' motivational profiles in mathematics classrooms: The role of perceived teaching. Teaching and Teacher Education, 79, 164–175. https://doi.org/10.1016/j.tate.2018.12.016
2018
Göllner, R., Wagner, W., Eccles, J. S. & Trautwein, U. (2018). Students’ idiosyncratic perceptions of teaching quality in mathematics: A result of rater tendency alone or an expression of dyadic effects between students and teachers? Journal of Educational Psychology, 110(5), 709–725. https://doi.org/10.1037/edu0000236
2016
Autorengruppe Bildungsberichterstattung. (2016). Bildung in Deutschland 2016. Ein indikatorengestützter Bericht mit einer Analyse zu Bildung und Migration. Bielefeld: Bertelsmann. https://doi.org/10.3278/6001820ew
Kriegbaum, K. & Spinath, B. (2016). Explaining Social Disparities in Mathematical Achievement: The Role of Motivation. European Journal of Personality, 30(1), 45–63. https://doi.org/10.1002/per.2042
Lorenz, C.-V. (2016). A tree must be bent while it is young? The effect of age at school entrance on school performance in Germany - Unveröffentlichte Bachelorarbeit. Universität Mannheim, Mannheim.
2015
Kriegbaum, K., Jansen, M. & Spinath, B. (2015). Motivation: A predictor of PISA's mathematical competence beyond intelligence and prior test achievement. Learning and Individual Differences, 43, 140–148. https://doi.org/10.1016/j.lindif.2015.08.026
2014
Kriegbaum, K. (2014). Zur Wichtigkeit der Motivation als Prädiktor für die mathematische Kompetenz bei PISA 2003 und 2004 - Unveröffentlichte Masterarbeit. Ruprecht-Karls-Universität Heidelberg, Heidelberg.
2013
Prenzel, M., Baumert, J., Blum, W., Lehmann, R., Leutner, D., Neubrand, M., Pekrun, R., Rost, J. & Schiefele, U. (2013). Programme for International Student Assessment - International Plus 2003, 2004 (PISA-I-Plus 2003, 2004) (Version 1) [Datensatz]. Berlin: IQB - Institut zur Qualitätsentwicklung im Bildungswesen. https://doi.org/10.5159/IQB_PISA_I_Plus_v1
2006
Prenzel, M. & Deutsches PISA-Konsortium (Hrsg.). (2006). PISA 2003: Untersuchungen zur Kompetenzentwicklung im Verlauf eines Schuljahres. Münster u. a.: Waxmann.