Evaluation reports are official. Before they are published, each of the institutions evaluated is entitled to read through the descriptions and appraisals and to correct errors of fact. Individual assessments of specific programmes or institutions are not generally included in the evaluation reports on the primary and lower-secondary or upper-secondary sectors; instead, the conclusions and recommendations are expressed in general terms. The idea is that this will give the reports greater general usefulness. In addition, it is intended to avoid “making an exhibition” of individual institutions.45
However, the expert panel feels that without individual feedback it can be difficult for specific institutions to see the benefits of participating in an evaluation, and this may well affect interest in the evaluations as well as their impact.46 In addition, it becomes easier for individual institutions to explain away the results of an evaluation: criticism expressed in general terms can always be taken as referring to some other institution.
EVA itself does not express its own institutional opinions or values in the reports. It does not consider that the EVA Act gives it any mandate to do so. The reports present solely the opinions of the expert panel, and the project team’s task includes ensuring that no conclusions are presented other than those for which support can be found in the material gathered in the project.
In the view of the expert panel, this means that each evaluation stands, as it were, on its own feet: its conclusions and results relate only to the specific evaluation and the circumstances that apply to it. The panel regards this as a failure to derive full benefit from the extensive knowledge that EVA has accumulated over the years. The project teams and EVA’s management could well add their own analysis, based on the institute’s collective experience from different evaluations, as a supplement to the panel’s.
In analyses like these, synergies could be developed between evaluations, as well as between evaluations in different educational sectors. What common trends can be seen in the teaching of languages in higher education in Denmark? How is the administration of the primary and lower-secondary school organised in the different Danish municipalities? How well does mathematics teaching in the upper-secondary school prepare pupils for more advanced study? How well do teacher-training schemes match the more advanced subject knowledge in core subjects that is demanded of teachers in the Danish school system and which, in view for instance of the PISA survey, should also be demanded of their pupils?