Parallel to this, the project also developed a methodology for conducting innovation surveys covering public sector activities (data collection). This stage involved identifying the statistical unit, activity classifications, target population and measurement of concepts. For example, the ‘public sector’ was identified as the target population. However, conflicting definitions of this sector made it difficult to identify which organisations are within scope; both the type of ownership and the type of product need to be considered.
The pilot study only included governmental institutions in the population (central, regional and local government). Some countries, however, may also have included some publicly owned non-profit institutions (NPIs). In the guidelines (to be developed at a later stage), both non-governmental NPIs and private enterprises involved in non-market services should be included, along with some enterprises involved in market services if the government is involved in their provision.
The services and goods classified as public services have to be specified in detail, so that the definition becomes operational and makes it possible to set up a classification of public services. The project uses the EU version of the UN classification of all economic activities, NACE21. NACE is the classification recommended in the Oslo Manual for the business enterprise sector, and all government institutions and NPIs are obliged by EU regulation to assign a NACE class to their legal units and establishments.
The NACE classifications used in the pilot questionnaire were categorised into seven core service groups:
Administration Services (e.g. security, defence and justice).
With respect to enterprise units and establishment units, the study found that neither could be used as the sole type of statistical unit. Enterprise units are too heterogeneous, as a single organisation may provide many different types of public services. Establishment units, in turn, vary widely in size, making comparisons difficult.
Instead, a possibility would be to use so-called kind-of-activity units (KAUs), defined as “an enterprise or part of an enterprise which engages in one kind of economic activity without being restricted to the geographic area in which that activity is carried out”. An example would be all human health activities by a municipality.
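The KAU idea can be sketched as a simple grouping operation: each combination of enterprise and activity code forms one unit, regardless of where the activity takes place. A minimal illustration in Python, with made-up enterprise names and NACE codes (not actual register data):

```python
from collections import defaultdict

# Illustrative register of establishments; the enterprise names and
# NACE codes below are invented for the example.
establishments = [
    {"enterprise": "Municipality A", "nace": "86", "name": "Hospital North"},
    {"enterprise": "Municipality A", "nace": "86", "name": "Health centre South"},
    {"enterprise": "Municipality A", "nace": "85", "name": "Primary school"},
    {"enterprise": "Municipality B", "nace": "86", "name": "Care clinic"},
]

def kind_of_activity_units(establishments):
    """Group establishments into KAUs: one unit per (enterprise, activity),
    ignoring the geographic location of the individual establishments."""
    kaus = defaultdict(list)
    for e in establishments:
        kaus[(e["enterprise"], e["nace"])].append(e["name"])
    return dict(kaus)

kaus = kind_of_activity_units(establishments)
# All human health activities (NACE 86) of Municipality A form one KAU,
# even though they are carried out at two separate establishments.
```

The grouping key deliberately excludes location, which is what distinguishes a KAU from a local KAU.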
The problems with identifying statistical units mean that reporting units are needed. These units would have to determine the statistical units within their fields of operation, guided by instructions from the survey conductors. Municipalities and regional units could probably be used as reporting units, and associations of municipalities might also serve this role. The enterprises classified as “Central government” would need to be examined in more detail before reporting units (which in some cases might be ministries) and statistical units could be established. In all, this will require extensive work when the survey is conducted at full scale for the first time.
The pilot study would need to be rather experimental regarding the statistical unit, while the reporting units could consist of a number of municipalities, regional units and ministries. For these reporting units, all KAUs in the seven core groups could be identified, and even the local KAUs as well; some would then be included in the survey through sampling. An experimental approach is also needed regarding whom to ask. The guidelines should include the discussion above on reporting units, KAUs, additional levels and respondent(s), qualified by the results of the pilot study.
The number of units in the target population is not expected to be too large for a census in each of the Nordic countries, though this may depend on the final coverage and scope of the survey and the degree to which privately owned units are to be included. However, if smaller surveys on public innovation are to be conducted, or the surveys are to be conducted in larger countries, sampling might be the option.
If this is the case, the study suggests stratifying the population according to the public service groups classed by NACE codes, the size of the statistical unit, the institutional type and geography/urbanisation. A random sample will then be drawn from each stratum. Other issues to consider are the frequency of data collection, whether it is voluntary or mandatory, and the data collection mode.
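The stratified design described above can be sketched as follows. This is a minimal illustration, not the actual MEPIN sampling frame: the field names, stratification variables and sampling fraction are all assumptions for the example.

```python
import random
from collections import defaultdict

# Hypothetical frame of statistical units; values are illustrative only.
frame = [
    {"id": 1, "nace_group": "health", "size": "large", "inst_type": "municipal"},
    {"id": 2, "nace_group": "health", "size": "small", "inst_type": "municipal"},
    {"id": 3, "nace_group": "education", "size": "large", "inst_type": "regional"},
    {"id": 4, "nace_group": "education", "size": "small", "inst_type": "central"},
    {"id": 5, "nace_group": "education", "size": "small", "inst_type": "municipal"},
]

def stratified_sample(units, keys, fraction, seed=0):
    """Draw a simple random sample of roughly `fraction` of each stratum,
    where a stratum is one combination of the given key variables."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    strata = defaultdict(list)
    for u in units:
        strata[tuple(u[k] for k in keys)].append(u)
    sample = []
    for members in strata.values():
        n = max(1, round(len(members) * fraction))  # at least one unit per stratum
        sample.extend(rng.sample(members, n))
    return sample

# Stratify by service group and unit size, sampling half of each stratum.
picked = stratified_sample(frame, ["nace_group", "size"], fraction=0.5)
```

Forcing at least one unit per stratum guarantees that every service group/size combination is represented, which matters when some strata are very small.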
The study aims to distribute the pilot questionnaire to 300-500 units (with 50 in Iceland). This means that samples need to be drawn in each country. A discussion of this will be provided in the draft guidelines.
Further information on the survey methodology can be found in the publication Survey methodology for measuring public innovation.
Interviews and focus groups were conducted with representatives of public sector organisations in all the Nordic countries. This ensured the surveys would reflect how public sector organisations understand innovation and how innovation actually takes place.
The Feasibility Study was completed in two phases. First, approximately 60 representatives from the public sector were interviewed to obtain their views on public innovation (between March and May 2009); then a questionnaire was sent to potential respondents and follow-up interviews were conducted.
Results were used as input to other project modules and to draft a questionnaire. The results showed:
The concept of innovation is not well known, and is not understood in the same way by all within the public sector.
Barriers to innovation were identified as risk aversion, scarce economic resources, bureaucracy and regulation, and lack of incentives.
Drivers of innovation included being a learning organisation, having resources (funding and HR) for innovation activities, offering staff incentives (monetary or other) and developing competences through seminars, courses etc. Additionally, budget constraints were mentioned by some as a driver: with a shrinking budget, the organisation is forced to think in new ways and find new solutions to work more efficiently.
Some respondents stressed the difficulty of accounting for all resources, in-house expenditure and manpower used in innovation activities (particularly for smaller innovation activities). The process is time-consuming and there was doubt about the accuracy of the estimates.
Most respondents agreed that it is very important to measure the effects of innovations. However, all realised that it is difficult to make valid measurements.
There were different views with respect to who the respondent for the survey should be (both which organisation and within the organisation).
Respondents identified the benefits of the survey as the dissemination of knowhow, benchmarking and the promotion of innovation in the public sector.
The second phase consisted of testing and discussing the draft questionnaire with a selection of potential respondents in each country22. In this phase, 32 respondents filled out the questionnaire and 37 participated in the interviews, either face-to-face or via telephone. Due to time constraints, not all questions could be discussed during the interview sessions. The respondents were selected to reflect different levels of the public sector and different areas of operation.
The draft questionnaire was translated into the languages of each Nordic country23. The questionnaire was sent by mail to the respondents, who were given 1 to 3 weeks to complete it. The questionnaire used the definitions outlined in Box 1 and covered the following topics:
Results from the questionnaire indicated that almost all respondents reported introducing at least one innovation during 2008-09. When asked to give examples of implemented innovations, all four types of innovation (product, process, organisational and communication innovations) were present in the responses.
For most innovations, the respondent’s own organisation was the main developer. Of the innovations developed in collaboration, half of the respondents answered that the innovation(s) were developed together with business firms.
Regarding the novelty of the innovations, 5 innovations were considered to be new to the world and 11 new to the sector. Many of the innovations were related to ICT. Thirteen out of 17 answered that their innovation(s) had either high or medium ICT content (6 high, 7 medium).
Most of the organisations engage in in-house innovation activities. Nearly nine out of ten answered that their organisation engages in In-house R&D, Other in-house innovation activities and Internal or external training of staff.
Common acquisitions made for developing, implementing and introducing innovations were the hiring of private consultants and the purchase of machinery, equipment and software.
No single effect of innovation appeared markedly more often than others. Only a few answers pointed to a negative effect of the innovations; these fell within the categories Efficiency and Quality effects.
Approximately half of the respondents attempt, by some means, to measure the impact of an introduced innovation. However, most measurement activities seem to occur ad hoc rather than on a systematic basis.
There seems to be a wide range of information sources and co-operation partners for the innovation activities of the responding organisations. None of the reporting alternatives in the questionnaire stood out, but every one of them was selected as of high importance by at least one organisation. Nearly half of the respondents had co-operation partners located in other countries. Most co-operation partners were public organisations, but a few were enterprises.
Roughly half of the respondents used procurement practices to promote innovation in their own organisation. Procurement was less frequently used to promote innovation in other organisations.
Among the driving forces of innovation, the Internal driving forces (i.e. management and staff) seemed to be important. The responses for other categories were very mixed.
The overall responses to the questionnaire on innovation strategy and capacity were rather mixed. However, there were some similarities in the answers among respondents in Denmark. Statements respondents fully agreed with were “the organisation has specific goals/targets for innovation activities”; “managers give high priority to developing new ways of working”; “top management takes an active role in leading the implementation of innovation”; and “innovation activities are mainly organised as projects, steered by a dedicated group”. Some of these categories were also apparent among the Icelandic respondents. Only one category of answers showed the same results in Sweden.
The responses with regard to hampering factors differ quite a lot between Sweden and the other two countries. Among the Swedish respondents, many categories were given high importance, for example “uncertainty about political environment”; “budgetary rules”; “lack of tools to measure expected benefits”; and “resistance of users to changes”. In Denmark and Iceland only three and two items, respectively, were marked as of high importance by the respondents; the other answers were mostly graded as of low or no importance.
Respondents indicated the survey was too long and comprehensive, the definition of innovation was too vague, and measuring expenditure was difficult. During testing, respondents found the definition of innovation activities to be understandable but abstract, making it difficult to have a clear conception in mind when answering questions that are based on innovation activities (such as linkages and barriers).
Some respondents also did not appear to understand the concept of R&D in the same way as defined in the Frascati Manual24. This suggested that public sector innovation surveys should include a lengthier explanation of what R&D is and how it is defined, or alternatively should not include R&D activities in the questionnaire.
It was also found that respondents had difficulty discerning the difference between information sources and co-operation partners. Respondents thought that the questions under the ‘barriers to innovation’ section were constructed in a leading way.
Finally, most respondents seemed to be able to report for the whole organisation, although some expressed concerns that it would require extensive work on their part to gather all the information needed for an actual survey. For large organisations, conducting a survey on innovation could be complicated and it seems necessary for respondents to use more than one source within the organisation to gather the relevant information needed.
The project was completed in February 2011. The final report25 gives the structure of the project as follows:
Module 1 – conceptual framework: background research, design of overall conceptual framework, indicators, incorporate insights from user needs and feasibility study
Module 2 – survey methodology: statistical unit, activity classifications, target populations, measurement of concepts
Module 3 – mapping user needs: form expert/stakeholder group in each country
Module 4 – feasibility study: interviews, testing and study of potential respondents
Module 5 – draft of pilot questionnaire: developing one or more pilot questionnaires, including experimental modules
Module 6 – pilot testing of questionnaire: each country will conduct large scale pilot survey of public sector institutions
1 In publications such as Powering Ideas: An Innovation Agenda for the 21st Century (2009) and Empowering Change: Fostering Innovation in the Australian Public Service (2010).
2 Oslo Manual: Guidelines for collecting and interpreting innovation data (2005) is the foremost international source of guidelines for the collection and use of data on innovation activities in industry. It provides a strong conceptual and statistical framework for measuring innovation in firms.
3 Rather than using Likert scale type questions. This is to avoid the social desirability problem. A Likert Scale Question is a survey question that allows the user to choose the response that best represents his or her opinion relative to a series of statements eg. Selection of strongly agree to strongly disagree.
4 Hughes, A., Moore, K., and Kataria, N. (2011). Innovation in Public Sector Organisations: A pilot survey for measuring innovation across the public sector. NESTA. www.nesta.org.uk
The Gallup Organisation. Innobarometer 2010 Analytical Report: Innovation in Public Administration (2011)
5 OECD Innovation Strategy: Getting a head start on tomorrow (May, 2010)
6 APSC State of the Service Reports, see: http://www.apsc.gov.au/stateoftheservice/index.html
7 For example see Innovation and the ATO: Opening address by Commissioner Michael D’Ascenzo, Public Service Innovation Network Meeting, Canberra, 17 March 2011. http://www.ato.gov.au/youth/content.aspx?menuid=42807&doc=/content/00274417.htm&page=1
8 Department of Premier and Cabinet, Victoria, VPS Innovation Action Plan, (Nov, 2009). Available at: www.dpc.vic.gov.au.
9 Department of Agriculture & Food and Department of Commerce, Western Australia, 2010, Thinking outside the box…Innovation in the WA Public Sector – A report from the WA Public Sector Innovation Workshop. http://www.agric.wa.gov.au/objtwr/imported_assets/content/amt/dg_forum_booklet_new_version_landscapev3.pdf
10 Martin J, (1999) Innovation Strategies in Australian Local Government. www.ahuri.edu/downloads/publications/occasional_Paper_4.pdf
11 http://govdex.gov.au/ or from the Department of Innovation, Industry, Science and Research, Innovation Division.
17 Public Sector Innovation Index: Exploratory Project. (October, 2009)
18 An Innovation Index for the Public Sector (October, 2009)
19 An Innovation Index for the Public Sector (October, 2009)
20 Towards a conceptual framework for measuring public sector innovation: Module1 – Conceptual Framework.
21 Nomenclature Générale des Activités Économiques
22 See Annerstedt, P and Björkbacka, R. (2010). Feasibility study of public sector organizations, Measuring Public Innovation (MEPIN project), Statistics Sweden.
23 Some variation in questions did exist.
24 The Frascati Manual is a document setting forth the methodology for collecting statistics about research and development. The Manual was prepared and published by the Organization for Economic Co-operation and Development.
25 Bloch, C. (2010). Measuring Public Innovation in the Nordic Countries: Final report