Review of the Danish Evaluation Institute



Dahler-Larsen, P. and Larsen, F. 2001. Anvendelse af evaluering – Historien om et begreb, der udvider sig. In Dahler-Larsen, P. and Krogstrup, H. (eds.). Tendenser i evaluering. 2nd ed. Odense Universitetsforlag.

Danmarks Evalueringsinstitut. 2004. Bestyrelsesmøde i Danmarks Evalueringsinstitut. Minutes of meetings held on 3 Feb. 2004, 29 March 2004, 10 June 2004, 23 Aug. 2004, 28 Sept. 2004, 28 Oct. 2004 and 29 Nov. 2004.

Danmarks Evalueringsinstitut. 2004. Effektundersøgelse. Redegørelse 2004.

Danmarks Evalueringsinstitut. EVA’s strategi 2004–2006. In Bilag EVA’s selvevaluering 2005. Danmarks Evalueringsinstitut.

Danmarks Evalueringsinstitut. 2005. Fokus på output i uddannelsessystemet. Videns- og erfaringsopsamling.

Danmarks Evalueringsinstitut. 2005. Handlingsplan 2006. Draft.

Danmarks Evalueringsinstitut. 2005. Med egne øjne. EVA’s selvevaluering 2005.

Danmarks Evalueringsinstitut. 2005. Vision, mål og strategi 2005–2007. Metodeenheden.

ENQA. 2005. Standards and Guidelines for Quality Assurance in the European Higher Education Area.

Ginsburg, A. and Rhett, N. 2003. Building a Better Body of Evidence: New Opportunities to Strengthen Evaluation Utilization. In American Journal of Evaluation. Vol. 24, No. 4, pp. 489–498.

Grasso, P. 2003. What Makes an Evaluation Useful? Reflections from Experience in Large Organizations. In American Journal of Evaluation. Vol. 24, No. 4, pp. 507–514.

Henry, G. 2003. Influential Evaluations. In American Journal of Evaluation. Vol. 24, No. 4, pp. 515–524.

Henry, G. and Mark, M. 2003. Beyond Use: Understanding Evaluation’s Influence on Attitudes and Actions. In American Journal of Evaluation. Vol. 24, No. 3, pp. 293–314.

Karlsson, O. 1999. Utvärdering – Mer än metod. En översikt. A´JOUR, En serie kunskapsöversikter från Svenska Kommunförbundet, No. 3.

Kirkhart, K. 2000. Reconceptualizing Evaluation Use: An Integrated Theory of Influence. In New Directions for Evaluation. No. 88. Winter 2000.

Leviton, L. 2003. Evaluation Use: Advances, Challenges and Applications. In American Journal of Evaluation. Vol. 24, No. 4, pp. 525–535.

Ministeriet for Videnskab, Teknologi og Udvikling. 2002. Tid for forandring for Danmarks universiteter. 22 October 2002.

Ministeriet for Videnskab, Teknologi og Udvikling. 2003. Lov om universiteter (universitetsloven). LOV Nr. 403 af 28/05/2003.

Preskill, H., Zuckerman, B. and Matthews, B. 2003. An Exploratory Study of Process Use: Findings and Implications for Future Research. In American Journal of Evaluation. Vol. 24, No. 4, pp. 423–442.

Rothstein, B. 2005. Ämbetsverken står för ideologin. In Dagens Nyheter, 6 May 2005.

Scott, G. and Hawke, I. 2003. Using an External Quality Audit as a Lever for Institutional Change. In Assessment & Evaluation in Higher Education. Vol. 28, No. 3.

SOU 2004:27. En Ny Doktorsutbildning – kraftsamling för excellens och tillväxt. Betänkande av Forskarutbildningsutredningen.

Statsministeriet. 2005. Nye mål. Regeringsgrundlag. February 2005.

Undervisningsministeriet. 2000. Bekendtgørelse af lov om Danmarks Evalueringsinstitut. In Bilag EVA’s selvevaluering 2005. Danmarks Evalueringsinstitut.

Undervisningsministeriet. 2000. Vedtækt for Danmarks Evalueringsinstitut (EVA). In Bilag EVA’s selvevaluering 2005. Danmarks Evalueringsinstitut.

Undervisningsministeriet. 2001. Bekendtgørelse om opfølgning på evaluering ved Danmarks Evalueringsinstitut m.v. In Bilag EVA’s selvevaluering 2005. Danmarks Evalueringsinstitut.

Vedung, E. 1998. Utvärdering i politik och förvaltning. 2nd ed. Studentlitteratur.

1 In addition to this review, a number of other activities were carried out that focused mainly on quality assurance of EVA’s external cooperative relationships.

2 The aspects listed in ENQA’s report are:

  • Use of external quality assurance procedures for higher education

  • Official status

  • Activities

  • Resources

  • Mission statement

  • Independence

  • External quality assurance criteria and processes used by the agencies

  • Accountability procedures (pp. 23–26)

3 The panel is to consist of a chair with managerial experience from the public sector in Denmark, a Nordic expert on the school system, a Nordic evaluation expert on education at more advanced levels and an individual who is an expert on the collection and dissemination of information. One member of the panel also has to be an acknowledged expert researcher or teacher.

4 For instance, Evert Vedung claims that evaluations for the purpose of control, intended to provide a basis for decisive strategic decisions, are often more comprehensive and probe more deeply than evaluations intended merely to stimulate improvement (p. 103).

5 Standards and Guidelines for Quality Assurance in the European Higher Education Area, p. 25.

6 “But citizens need evaluation to be able to adopt a standpoint on how politicians at various levels behave and call them to account in elections” (p. 100). Vedung maintains that information about the outcomes of political decisions in the field “must be central for citizens”.

7 In an article in New Directions for Evaluation (2000) Karen E. Kirkhart, for instance, presents a model for the analysis of the impact of evaluations. In it she asks whether the impact can be attributed to the evaluation process and also whether the impact is intended or not. She stresses the time perspective, i.e. that impact can be immediate, can be directly linked to the conclusion of the evaluation, or can occur later, in the long term. Henry & Mark (2003), Henry (2003), Grasso (2003), Leviton (2003) as well as Ginsburg & Rhett (2003) have published articles in the American Journal of Evaluation on the use and impact of evaluations. In Sweden evaluation researchers such as Evert Vedung (Utvärdering i politik och förvaltning) and Ove Karlsson (Utvärdering – mer än metod) have looked at the problems relating to the use of evaluations, and attention has been drawn to this in Denmark, for instance, by Peter Dahler-Larsen and Flemming Larsen (Anvendelse af evaluering – Historien om et begreb, der udvider sig).

8 Act on the Danish Evaluation Institute, Act No. 290, May 12th, 1999.

9 Knowledge is generated in the course of EVA’s evaluations that can be circulated to and used by others in internal and external contexts. Knowledge is also acquired during other studies and in connection with conferences, seminars and network meetings. This knowledge is circulated with the help of reports – Nøgler til Forandring from 2001, Educational Evaluation around the World, Skoleudvikling i Chicago – and also through courses, conference papers, etc.

10 According to Section 3 of the act, evaluations may be commissioned by the Ministry of Education, other ministries, general educational councils, county councils, borough councils, public educational institutions, approved educational institutions or those entitled to government support, as well as educational institutions that offer private courses.

11 The Ministry of Education can decide on follow-up of the evaluations made by EVA in primary and secondary education and also on sanctions based on these evaluations. According to the EVA Act, the ministry may decide to withdraw the institutions’ entitlement to offer certain programmes. The development contracts between the universities and the Ministry of Science, Technology and Innovation contain provisions about following up evaluations at university level. During the site visit neither EVA nor the institutions evaluated were certain that these follow-up measures were carried out effectively.

12 Minutes of the meeting of the board on August 23rd, 2004.

13 How sensitive the issue of media attention can be was confirmed during the site visit by several of the representatives of secondary education. They criticised both the formulation of EVA’s press releases and the way the media dealt with EVA’s evaluation results, claiming that there is a tendency in the press to present results as more negative than they really are.

14 During the spring of 2005 there was discussion in the Swedish media of what have been referred to as “public agencies that generate ideology”, i.e. government agencies and boards that adopt standpoints on ideological/political issues and are not objective and impartial as required by the Swedish constitution. For instance, Bo Rothstein, Professor of Political Science, claims that this is a problem for democracy (Ämbetsverken står för ideologin (Ideology generated by civil servants), Dagens Nyheter, May 6th, 2005).

15 Second paragraph of Section 8: “Evalueringsinstituttet offentliggør alle evalueringsrapporter og orienterer offentligheden om sin aktivitet.” (The Evaluation Institute publishes all evaluation reports and informs the public about its activities.)

16 In a survey made by EVA among “the 87 organisations entitled to nominate members of the Committee of Representatives” one question concerned how many knew about the evaluations published by EVA between 2001 and June 2004. Among the respondents there was greatest awareness in the higher education sector (79 per cent) followed by the upper-secondary school system (68 per cent), adult and further education (59 per cent) and primary and lower-secondary education (50 per cent).

17 But the relatively limited circulation of evaluation results in various educational sectors can scarcely be blamed entirely on their failure to reach their potential readers, whether because of lack of attention in the press or for other reasons. In his book Utvärdering i politik och förvaltning Evert Vedung describes four alternative strategies for increasing the usefulness of evaluations. Only one of these involves improving the circulation of their results. He calls the others production focused, end-user focused and measure focused strategies. These assume, for instance, that if evaluations have greater relevance and there is greater willingness among end-users to benefit from their results, their usefulness will increase. In Ove Karlsson’s survey as well, shortcomings in the communication between the evaluator and those evaluated can only be seen as one of several explanations for the failure to use evaluation results. Other reasons are said to be related to weaknesses in the evaluations themselves and the end-users’ own inadequacies.

18 The mandate includes compulsory education, i.e. the primary and lower-secondary schools run by the state and also private schools that receive public funding.

19 The Danish term is “ungdomsuddannelse”, which corresponds to upper-secondary education in Sweden.

20 The Danish term “videregående uddannelse” refers to all post-secondary education; in this report the terms “tertiary education” and “higher education” are used synonymously.

21 Two additional ministries are responsible for programmes subject to EVA’s evaluations: the Ministry of Culture is responsible for programmes in the fine arts and architecture and the Ministry of Economic and Business Affairs is responsible for the maritime programmes.

22 Vedtægt for Danmarks Evalueringsinstitut (EVA)

23 One element in the action plan for 2002 involved a survey initiated by EVA into the effects of four of EVA’s evaluations (Effektundersøgelse, 2004). One of its findings was that the institutions that participated were uncertain about who EVA was acting on behalf of and what the purpose of the evaluations was: “Is this being done for the educational institutions, ministry/ministries, is it perhaps for some obscure but very concrete objective or something entirely different?” (p. 29).

24 By ‘action-plan based evaluations’ is meant the evaluations undertaken by EVA by virtue of the EVA Act and which are determined in the institute’s annual action plan.

25 For instance, this Board will be given the task of developing national tests, responsibility for documenting the results of primary and lower-secondary education, conducting other evaluations of primary and lower-secondary education, and administering tests and examinations in primary and lower-secondary education, as well as supervising municipalities’ quality and compliance with the Act on Primary Education and offering them advice.

26 Of these 130 institutions, 115 are organised in 21 larger organisational units, known as Centre for Videregående Uddannelse (CVU’s).

27 An internal survey revealed for instance that awareness of EVA’s evaluations is considerably greater in higher education than in the other educational sectors (p. 88 in EVA’s self-evaluation).

28 The proposals were issued in a publication entitled Tid for forandring for Danmarks universiteter (A time of change for Denmark’s universities).

29 The University Act: Lov om universiteter, LOV No.403 May 28th, 2003.

30 During 2004 the Swedish National Agency for Higher Education completed 339 quality evaluations of undergraduate and postgraduate programmes in the framework of 15 national evaluation projects. In addition a number of evaluations were made in parallel with the national evaluations. The Swedish Board of Education – whose task is to inspect all the local authorities in Sweden and their pre-school and after-school facilities, all compulsory and upper-secondary schools and adult education – inspected 32 local authorities during the same year, which involved a total of more than 1,000 schools (according to the Board’s own website). In Finland the Council for the Evaluation of Education has drawn up a long-term evaluation programme for compulsory schools, upper-secondary schools offering general educational programmes, vocational programmes and adult education. The Ministry of Education decides which of the proposed evaluations are to be carried out. The Board of Education undertakes the evaluations that involve learning outcomes. During 2004 the Board of Education carried out or prepared 11 evaluation projects covering a total of 1,200 compulsory schools. The responsibility for the evaluation of higher education programmes rests with the Council for the Evaluation of the Higher Education Institutions. During 2004 a total of 23 evaluations of varying extent were prepared or carried out.

31 The effects of participation and the development of internal capacity are discussed, for instance, in articles by Scott & Hawke in Assessment & Evaluation in Higher Education (2003) and by Preskill, Zuckerman & Matthews in the American Journal of Evaluation (2003). In the survey undertaken by EVA to gauge the effects of its own evaluations, dialogue and reflection were said to be the most noticeable outcomes. These evaluations had to a lesser extent resulted in concrete actions (Effektundersøgelse, 2004).

32 In her article Evaluation Use: Advances, Challenges and Applications, for instance, Laura C. Leviton argues for the significance of a planned and accumulated basis of knowledge on which decisions can be based. This article was published in the American Journal of Evaluation, Vol. 24, No 4, 2003.

33 The Bologna process is described for instance in SOU 2004:27: En Ny Doktorsutbildning – kraftsamling för excellens och tillväxt (A new doctoral programme – pooling resources for excellence and growth).

34 In its report Fokus på output i uddannelsessystemet EVA for instance indicates that there is greater focus on output in the Danish educational system.

35 The strategic plan for 2004–2006 can be summarised as meaning that several educational institutions are to be involved in EVA’s activities, that projects are to focus on special/priority issues, and that evaluations and surveys are to pay attention to process, outcomes and horizontal issues in the educational system as well as to the development of tools. In addition, evaluations and knowledge centre activities are to be made visible, EVA’s expertise and skills are to be offered to the educational sector in the form of a predetermined range of surveys and other tasks, and it must be possible to follow up evaluations. The plan also stipulates that EVA is to participate in the development of methods for educational evaluation.

36 According to the survey of the effects of the evaluations referred to earlier, a number of the institutions involved also wonder what the aim of EVA’s evaluations is and why certain institutions are chosen to take part, but not others.

37 Two senior consultants are acting as heads of the methodology unit and the unit for tertiary/higher education. They have managerial responsibility for most of the projects in these areas and for most of the consultants employed by the units. EVA’s executive director has managerial responsibility for a couple of projects in the university sector. The senior consultants are not members of the management team. Nor are the unit coordinators, who coordinate what goes on in the units and between them and other units.

38 The power to appoint members of the institute’s staff may also be delegated to the Executive Director.

39 The Danish term is “kommissorium”.

40 The expert panel has had access to the minutes of all the board meetings held in 2004.

41 The Ministry of Education’s representatives claimed, for example, that the knowledge centre function needed to be developed through the establishment by EVA of relationships with national and international research environments, through the development of contacts, information activities and exchanges of experience with ministries and institutions as well as through enhancement of the general level of expertise.

42 Vedtægt for Danmarks Evalueringsinstitut (EVA).

43 In higher education the evaluations of programmes include all, or virtually all, of the higher education institutions offering the relevant subjects.

44 Some visits take only half a day, while audit visits extend over several days.

45 In the evaluations of higher education and adult education individual judgements are most often, but not always, expressed.

46 In the survey into the effects of the evaluations referred to earlier, the institutions said that they expected a more individual response to the information they had submitted in their self-evaluations and to the effort they had made: it should be possible to identify their own institution in the reports if the conclusions and recommendations were to be of any use.

47 Bekendtgørelse om opfølgning på evaluering ved Danmarks Evalueringsinstitut m.v.

48 At the same time, the same individuals felt that EVA “should not become a public authority,” which could be the consequence if the institutions were obliged to comply with EVA’s recommendations and account for this compliance.

49 Vision, mål og strategi 2005–2007

50 At their Berlin meeting in September 2003, the EU’s ministers of education gave ENQA the authority to report to the meeting of ministers in Bergen in May 2005 on quality assurance and evaluation of evaluation institutes and on the relevant standards and criteria. ENQA’s work in this respect is approaching its conclusion, and the report will include proposals on the establishment of a European register of evaluation institutes. In future, membership of ENQA and admission to the register will depend on a positive review of the evaluation agencies in terms of current European Standards. Consequently, these standards must also be included in a review of EVA.
