VERA – An overview

What is VERA?

VERA is a shortened form of Vergleichsarbeiten, which is the name for written comparison tests that students take in the 3rd and 8th grades (VERA-3 and VERA-8) in Germany. The tests are held nationwide and are designed to investigate which competencies students have achieved by certain points in their school career. In this context, “nationwide” means that VERA tests must be carried out in all 3rd-grade and 8th-grade classes at all schools in the general education system to establish where students are in their learning. In some of Germany’s states, VERA tests are not called comparison tests: they are Lernstandserhebungen (learning assessments) in Hesse and North Rhine-Westphalia, KERMIT – Kompetenzen ermitteln (competency assessments) in Hamburg, and Kompetenztests (competency tests) in Saxony and Thuringia. South Tyrol and the German-speaking Community in Belgium began participating in VERA-3 in 2010.

Which subjects or domains does VERA test?

Germany’s states have agreed that VERA-3 must test either German or mathematics. If German is chosen one year, the test must address reading competency as a minimum. Other content areas that can be tested in German are listening, spelling, and language and language use. There are also plans to include writing as an optional area for future VERA-3 tests. VERA-3 mathematics tests focus on two out of five content areas (e.g. numbers and operations, and data, frequency and probability). The states and the IQB select the two areas prior to each round of tests.

All the states participating in VERA must also test 8th-grade students in at least one subject. If the chosen subject is German, students must be tested on reading competency as a minimum. There is also the option to test listening, spelling, or language and language use. VERA-8 is also set to include a test for writing in the future. The VERA-8 mathematics tests include all five content areas (core competencies). With regard to the first foreign language (English or French), students should be tested in the domains of reading and listening as a minimum.

The states are free to make schools test more than one subject or content area at both VERA-3 and VERA-8. Before each round of tests, teachers develop and trial new items and use them to produce new test booklets.

The proficiency level models are available from the IQB (in German).

What does VERA aim to achieve?

Carrying out comparison tests in all 16 states of the Federal Republic of Germany is part of the Standing Conference’s comprehensive strategy for educational monitoring, which it adopted in 2006. Educational monitoring aims to make the education system focus more attention on student competencies. Rather than asking what content a subject should teach, the system should be asking which competencies students should have achieved in that subject by a particular point in their school career. The hope is that this shift in focus will mean that, instead of teaching inert knowledge that students can only draw on to answer extremely narrow and familiar test items, lessons will help students develop networked knowledge that they can use to solve a wide variety of problems. This is an extremely demanding subject-didactic and pedagogical task, and tests and performance feedback can only ever play a supporting role for the teachers who have to fulfil it.

There are a number of important differences between international student assessments (e.g. PISA, PIRLS/IGLU and TIMSS) and Germany’s national assessments for monitoring the achievement of educational standards on the one hand, and the VERA comparison tests on the other. They mainly concern the respective goals (what purpose should the results serve?) and the level of evaluation (who should be evaluated?). The following table summarises a few of the differences.

| | International student assessments | National student assessments | Comparison tests/learning assessments |
|---|---|---|---|
| | PISA, PIRLS/IGLU, TIMSS | Standing Conference’s national assessment studies | VERA-3 and VERA-8 |
| Design | Sample survey | Sample survey | All students in the respective grade |
| Frequency | Every 3–5 years | Every 5 years (primary school), every 3 years (secondary level) | Annually |
| Main aim | System monitoring | System monitoring | Teaching/school development |
| Evaluation level | Countries | States of the Federal Republic of Germany | Schools, learning groups or classes |
| Implementation | External test administrators | External test administrators | Teachers (usually) |
| Analysis | Central | Central | Local: by teachers and state institutes |
| Results feedback | In approx. 3 years | In approx. 1 year | Immediate feedback after data entry; differentiated feedback with multiple comparative values in a few weeks |

Differences between sample-based student assessments and comparison tests/learning assessments

In its March 2012 agreement on the further development of VERA, the Standing Conference stressed that the main function of the comparison tests was to drive teaching and school development, and that they should also help facilitate the implementation of the subject-specific and subject-didactic concepts contained in the educational standards. The agreement emphasises that VERA is not suitable for issuing grades and should not be used as a predictor of success at subsequent schools. Rather than quizzing students on the teaching material or curriculum content they have just learned in class, the comparison tests focus on competencies that are acquired over the longer term and may be unrelated to that material or content. The agreement also states that VERA results from individual schools will not be published as league tables. In addition, giving school supervisors and inspectors access to VERA results must follow strict rules that are aligned with VERA’s main task of driving school and teaching development.

At the end of 2010, the Standing Conference published a concept for using the educational standards for teaching development. The text stresses that performance feedback from comparison tests should be a key component of a data-based development cycle in schools, and must be rooted in a feedback culture that acts as the interface between data reporting and data use (p. 17).

The pedagogical potential that VERA offers teachers and schools is reflected in the following:

  • the test items and performance feedback consistently focus on student competencies;
  • the tests give teachers an “outsider’s view”, i.e. multiple reference values for comparing learning progress in their own class;
  • the tests allow teachers to improve their diagnostic skills;
  • teachers can use the results to justify and plan pedagogical interventions and support measures;
  • the performance feedback can be used for improving instruction in subject-specific departments in schools.

Two key resources for making the most of this potential are the example items and the didactic commentaries that are developed and provided by the IQB (see: From development to testing).

How are the responsibilities for VERA tests divided between the IQB and the states?

The responsibilities for VERA are clearly divided between the IQB on the one hand and the state institutes, quality agencies and relevant ministerial departments on the other (see table below).

The states are responsible for conducting their VERA tests. They each use their own regulations to organise the way they prepare their tests, run them, analyse them and give feedback on the results. The scope of the tests and the choice of test items can also be adapted to fit the specific needs and circumstances of individual states.

Producing the test items, however, happens at the national level under the leadership of the IQB. Teachers develop the items, university experts in subject didactics review and evaluate them, and then, before they are rolled out for use across the country, academic testing experts from the IQB put them into empirical trials with several hundred students to assess their suitability and difficulty. Actual VERA tests only include the items that make it through the whole testing process. Once the items have been didactically and statistically verified, the IQB compiles them into test booklets.

Allocation of VERA responsibilities

The IQB:

  • Developing items
  • Trialling items (piloting)
  • Determining the difficulty of items (scaling)
  • Compiling test booklets
  • Developing accompanying instructional materials
  • Feedback design

The states:

  • Printing and distributing test booklets
  • Conducting tests
  • Correcting tests, data entry
  • Statistical analysis
  • Feedback
  • Implementing additional measures to support schools after they receive results feedback

SHu 23.02.21