Measuring Innovation in the Public Sector: a literature review




Project 3 – Damvad


Damvad15 suggested the innovation index should fulfil three broad objectives:

  • improve understanding of public sector innovation and provide needed measures for policymaking;

  • function as a tool to aid individual public sector organisations in their efforts to promote innovation; and

  • promote awareness of public sector innovation both in general and among individual public organisations.

Damvad’s conceptual framework for measuring public sector innovation consisted of five main elements: inputs to innovation, innovation processes within the organisation, outputs of the innovation process, general outcomes of innovation, and external factors or framework conditions that affect innovation in public sector organisations. The framework is shown below:

Inputs

  • Investment in R&D and innovation

  • Staff (education, diversity, etc.)

  • Sources of innovation (managers, staff, users, etc.)

  • Technological infrastructure for innovation (e.g. ICT)

Process

  • Innovation strategy

  • Collaborations and learning activities

  • Diffusion of innovations

  • Organisational culture

Outputs

  • Innovation productivity

  • Types of innovation

  • Degree of novelty and scope of innovations

  • Intangible outputs (e.g. trademarks, copyright)

Outcomes

  • Societal impacts

  • Improved employee satisfaction

  • Benefits for users

  • Other intangible effects (e.g. trust and legitimacy)

Framework conditions

  • User and supplier demands

  • Public sector organisation & incentive structures

  • Policy priorities

  • Enablers/barriers for innovation

The proposed index consisted of a series of survey indicators based directly on data collected from public sector organisations (see below).

Themes

Survey Indicators

Input

  • Innovation expenditures (staff, funding, consulting expenditures and other knowledge purchases etc.)

  • Staff (education, experience, diversity, etc.)

  • Sources of information (e.g. management/senior staff versus employees/frontline staff, users, suppliers, collaboration)

  • Technological infrastructure for innovation (including access to and use of ICT)

Process

  • Explicit innovation strategy and targets

  • Systematic, internal measurement and evaluation of innovation

  • Role of management in innovation (active involvement, risk management, support/commitment to innovation and implementation)

  • Incentive and reward structures

  • Practices for learning and diffusing knowledge and innovations

  • Innovation collaboration and alliances

  • Perception of enablers and barriers to innovation

Output

  • Types of innovations (product, services, processes, delivery models, organisational design and practices, etc.)

  • Degree of novelty and scope of innovations (e.g. incremental versus radical innovation, autonomous versus systemic innovation)

  • Related, intangible outputs (e.g. patents, copyright, trademarks)

  • Effects of innovations

Outcomes

  • Organisational performance (both productivity and quality measures)

  • Employee satisfaction

  • User satisfaction

  • Other intangible effects, (e.g. increased trust and legitimacy)

Source: CFA

Damvad also proposed a set of seven diagnostic indicators to be constructed from the data gathered for the survey indicators and, potentially, other data sources. These indicators would be used as a diagnostic tool to assess the innovation performance of public agencies. The diagnostic indicators were:

  • Implementation

  • Openness

  • Innovation culture

  • Innovation performance

  • Demand and user involvement

  • Innovative procurement

  • ICT and innovation

The diagnostic indicators provide additional analysis of organisations’ own responses to the survey. These results can then be benchmarked against other groups that have been specified for comparison. There are a number of possibilities here, such as:

  • Comparison with averages for comparable groups

  • Comparison with top performers for comparable groups

  • Comparison with results for other groups (that are perhaps less comparable, but where benchmarking may still be useful).

In all cases, this comparison can also include an analysis of strengths and weaknesses of an organisation within selected areas.
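To make the benchmarking idea concrete, the comparison of an organisation against group averages and top performers can be sketched as follows. This is a minimal illustration, not Damvad's tool: the indicator names, the 0–100 scale, and the simple above/below-average assessment of strengths and weaknesses are all assumptions made for the example.

```python
from statistics import mean

# Hypothetical diagnostic indicator scores (0-100) for one organisation
# and a small comparison group; names and values are illustrative only.
org = {"openness": 62, "innovation_culture": 55, "ict_and_innovation": 71}
group = [
    {"openness": 58, "innovation_culture": 60, "ict_and_innovation": 65},
    {"openness": 70, "innovation_culture": 48, "ict_and_innovation": 80},
    {"openness": 66, "innovation_culture": 52, "ict_and_innovation": 74},
]

def benchmark(org_scores, group_scores):
    """Compare an organisation's score on each indicator against the
    group average and the top performer, flagging each indicator as a
    relative strength or weakness."""
    report = {}
    for indicator, score in org_scores.items():
        peers = [g[indicator] for g in group_scores]
        avg = mean(peers)
        report[indicator] = {
            "score": score,
            "group_average": round(avg, 1),
            "top_performer": max(peers),
            "assessment": "strength" if score >= avg else "weakness",
        }
    return report

for indicator, row in benchmark(org, group).items():
    print(indicator, row)
```

The same function could be run against different comparison groups (averages for comparable groups, top performers, or less comparable groups), which is the kind of analysis the interactive tool is meant to deliver.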

Damvad proposed the construction of an interactive tool that can be utilised to analyse and benchmark innovation in individual organisations. Damvad’s approach is similar to the Korean GII but takes a broader view than just organisational change. It includes issues such as procurement strategies, demand side factors, and mechanisms for knowledge sourcing, learning and implementation.

Damvad proposed a broad classification of organisations, dividing by level of government and by whether the organisation is mainly involved in general administration or the delivery of services to the public:


  • Central government (including Devolved Administrations)

  • Local authorities

  • Frontline services (where organisations that deliver services to the public may either be a part of, or administered by, central or local government).

While stressing the importance of maintaining a common core of indicators across organisations, Damvad also suggested introducing a small set of questions, or a separate module, focusing on aspects specific to individual sectors such as health and education. These could concern the effects of innovations, barriers, specific organisational issues or types of collaboration, and could take the form of either a separate module or a small set of additional or modified questions throughout the survey. These sector-specific indicators would be developed in dialogue with stakeholders from the respective sectors.

The stages of development of the proposed index are shown below:



  • Stage 1 - Consultations with users. Gain feedback on user needs, promote the overall idea (across departments and countries, etc.), and initiate dialogue and collaborations with key departments. Users to be provided with a fairly detailed proposal.

  • Stage 2 - Development work - survey, data and online tool. Developing a survey questionnaire, detailed examination of options for using other data and other data-related issues (e.g. access and administration of data), and the development of a plan for the online system.

  • Stage 3 – Testing. Testing of the questionnaire (potentially via the web-based tool), with feedback sought from users, followed by finalisation of the pilot survey questionnaire. Note that the pilot survey is needed to generate indicators for use in benchmarking within the diagnostic tool, which means that eventual users of the diagnostic tool should answer the same questions as in this pilot study. Hence, this is a very important step: essentially, final decisions will be made here concerning the survey tool (though not necessarily the diagnostic indicators).

  • Stage 4 - Pilot survey. Work on constructing a population and sample may take some time, though this preliminary work can be initiated in parallel with the development and testing work. The pilot survey needs to be completed before the remaining steps can be undertaken.

  • Stage 5 - Constructing the diagnostic indicators. Careful analysis of the results will be needed, including the development and assessment of various methods for constructing the diagnostic indicators. This stage also requires matching in any other data sources to be used for the diagnostic indicators; the matching process can begin as soon as the sample is determined for the pilot study. A proposal for the diagnostic indicators should be presented to user groups for feedback, after which work should be finalised on all three strands: the diagnostic indicators, the data set and the web-based diagnostic tool.

  • Stage 6 - Final preparations. Tabulations should then be made (for both survey and diagnostic indicators) for a publication that can be released when the diagnostic tool is launched. The web-based tool should also be tested for technical functioning.

  • Stage 7 – Launch. Launch of the diagnostic tool, perhaps through a promotional event, with a publication released at the same time reporting the results of the pilot survey and showing examples of the types of benchmarking analysis that can be conducted with the data.


