How is Infocentros Evaluated?

Evaluations of Infocentros have followed the standard cycle of impact assessment, which includes pre-operations planning or ‘start-up’ (Genesis), ‘early impacts’ (UNESCO), ‘mid-life’ (IDC) and ‘late-life’ (future evaluation) assessments (NTCA, 2000).


A quantitative approach has generally been used, with the most important measures being the nature of client demand for services and user satisfaction. The sources give priority to surveys, although interviews with operators and technical staff have been included. Techniques such as story-telling and focus groups have not yet been considered. Qualitative approaches, which provide ‘information, knowledge and even wisdom about the impacts’ (NTCA, 2000), were used on fewer occasions and focused mainly on interviews and observation.
As with other telecentre experiences, only direct impacts (those empirically observed) have been assessed in Infocentros. In the developing world, few evaluation and impact assessment methodologies have been identified which account for indirect impacts (those attributable to the telecentre) and induced impacts (changes produced through sequential causation) (NTCA, 2000).
When examining the object of evaluation, a progressive incorporation of more IS elements, such as context, people in context and interactions with Infocentros, can be observed. Content created for Infocentros has not been considered, as it has been in evaluations in other developing countries that have introduced ICTs for development (OneWorld et al, 2004). Similarly, users’ problems when accessing content in different languages have not been contemplated.
In terms of the people who initiated, were involved in, designed and carried out evaluations, there were significant differences. Evaluations were mainly initiated by NCIT directors following very specific political requirements. Processes not initiated by senior officials (the 2004 evaluation) have not had the necessary support. Although external consultants have been hired to plan, design and carry out evaluations, NCIT staff have been permanently involved in these tasks. Participatory approaches have been ignored so far, so not all stakeholders have been actively involved in the early stages of evaluations. The readership of evaluation reports has been defined depending on the outcomes of the evaluations.

Although the search for sustainability formulas, the need for a business plan, and the necessity of linking Infocentros to communities have frequently been cited as the purposes of evaluations, there is a strong political ingredient in the assessments. This is made evident by the uses to which evaluation results have been put and the ‘strategic’ moments at which reports have been released.



Infocentros in Light of the Guiding Principles

Participatory

Although the evaluations have considered a ‘wider’ IS as the project has evolved, participation has increased only in terms of the application of surveys. In the IDC evaluation, for example, a specific survey focusing on context was designed and applied. The viewpoints and opinions of telecentre operators, users and non-users have not been taken into account when planning evaluations, and target groups have not been identified. Only during survey testing in pilot telecentres were operators’ and users’ answers and reactions considered in refining the surveys. Local and national stakeholders’ viewpoints and needs are important and should be included through a participatory approach that focuses mainly on the local context (Whyte, 2000; Reilly et al, 2001).


In the particular case of the pre-operations evaluation, participatory needs assessments were not performed. This confirms the position that such assessments are rarely performed prior to the installation or formation of telecentres (Michiels et al, 2001). The priorities of the project, even in terms of evaluation, tended to be influenced by the interests of outsiders (in this case the President of Venezuela and the Directors of NCIT) rather than community-based organisations (Michiels et al, 2001). For example, the implementation strategy produced by the pre-operations evaluation did not include significant and fundamental aspects derived from global telecentre experience, such as direct participation and discussion with communities and the assessment or self-evaluation of the progress of the Infocentros in the short and medium terms (Whyte, 2000).

Socially Inclusive

Given the non-participatory approach used in Infocentros evaluations, it could be argued that they have not been socially inclusive either. The IDC evaluation, however, attempted to design samples that considered different social groups in terms of gender, age and education level. This was an improvement, because earlier evaluations collected information from randomly chosen users. Differentiation of social groups did appear when evaluation results were presented in reports, which in each evaluation were broken down by age, gender and occupation.


Locally Grounded

Consideration of local context has been applied only in certain areas. It is evident in the characterisation of each Infocentro’s context, but not in the participation of local users, non-users and operators. Although evaluations have evolved and started to incorporate more IS ‘elements’ when collecting data (from the context, people in context, operators and users), surveys have not been designed to adapt to the local contexts of Infocentros, and the same surveys were applied to the total sample. Evaluation reports have, however, distinguished between regional results, although these were presented in tables together with national information. UNESCO’s evaluation, for example, attempted to use local expertise by placing operators in charge of data collection tasks; however, this ‘step forward’ was not continued, and the IDC evaluation used non-local ‘professional pollsters’ to gather the information.



Public and Transparent

The public character of the evaluations can be questioned, given that the UNESCO evaluation was withheld from all stakeholders. The results of other evaluations have been publicised because they helped the administration gain credit and prestige. It is not a coincidence that only positive results have been made public to stakeholders. In the same vein, even the publicised positive results could be questioned, because evaluation reports are the result of interpretations made by their authors and the immediate recipients of results (Menou et al, 2001). It is important to note that, during the preparation of this study, reports and material related to the UNESCO evaluation were provided by people in the Venezuelan ICT and development sector. There remain questions about the transparency of the evaluation processes.

Methodologically Appropriate

Evaluation exercises in Infocentros have evolved methodologically. While the first attempts involved a limited number of Infocentros, IS elements, indicators and evaluation team members, later ones have diversified. The 2004 evaluation, for instance, is considering more complex impact assessment indicators and is examining numerous existing methodologies. Despite this evolution, it could be argued that the methodologies utilised are inappropriate. Mainly quantitative measures have been used, which can offer ‘scientific’ results; however, exclusive reliance on this type of measurement is not appropriate for assessing impacts (Menou et al, 2001). This approach may be replicable across samples, but it has not proved especially useful. Similarly, more participatory evaluations are needed.


Sustainability Enhancing

Sustainability issues are always present in telecentre discussions, because telecentres offering access and training to poor communities tend to have difficulty surviving (Whyte, 1999b; OneWorld et al, 2004). Due to the irrevocable charge-free character of Infocentro services, the problem of financial sustainability is never-ending. Infocentros evaluations have given special significance to the financial aspects of the project, although sustainability aspects related to staff capability (Baark et al, 1998), community acceptance (Whyte, 1999b) and service delivery (Colle et al, 2002) have recently begun to be considered. Although other countries’ experiences have shown that supporting telecentres exclusively through government funding is not financially and politically sustainable (Colle et al, 2002), Infocentros still operates a charge-free scheme. Sustainability recommendations in evaluation reports are limited as a result of this presidential resolution. One could argue that financial sustainability is not a concern of government: as a populist tool with total presidential support, and one which can potentially produce results for this government, the programme will be provided with the funding required to ensure its survival.


Capacity Building

Evaluation practices in Infocentros are weak in terms of learning and capacity building. Although the findings of earlier evaluations have been incorporated into the objectives of later ones (for example, recommendations for linking Infocentros to communities and financial sustainability issues), more effort is needed in terms of learning from evaluations. The fact that earlier evaluations are ‘intentionally’ hidden, for example, shows that negative programme results or failures in evaluation methodology are not used as lessons to avoid future mistakes in evaluation. With appropriate documentation, information would always be available, even in cases of high staff turnover, a frequent factor in government organisations.

Similarly, because there is no association grouping the Infocentros, the opportunities for sharing, comparing and documenting experiences among them are limited. Using participatory approaches, Infocentros would be actively evaluated by their operators and users, producing ‘learning evaluations’ (Reilly et al, 2001). Training would have to be provided to operators so they can help determine community needs and opinions. Additionally, simple mechanisms (e.g. suggestion boxes) would contribute to the collection of users’ complaints and suggestions.

An expressed intention to learn from ‘experts’ was found, however, during the planning of the IDC evaluation. NCIT staff participated in telecentre forums and mailing lists in order to obtain specific recommendations on conducting an impact evaluation. Nonetheless, there is no evidence to establish whether the recommendations given were followed.


Reflective of Shared Visions

Not only are there no defined roles for the different stakeholders in Infocentros evaluation, but stakeholders have also been given no clear idea of the purpose of each evaluation, its needs, or how the resulting information was going to be used. To illustrate, users, non-users and operators have not participated in planning and designing evaluations, and this has limited their understanding of any evaluation ‘vision’.


Strategically Oriented

Infocentros evaluations have relied principally on survey questions to gather information, which allowed evaluators to control the strategic orientation of the evaluations. There is not sufficient evidence, however, to determine to what extent the defined evaluation strategies were incorporated into and reflected in other areas of the project.


Gender Sensitive

Infocentros evaluations have not considered gender issues in terms of strategy, process, methodology or the tools utilised. Although recent exercises, specifically the IDC evaluation, applied sampling methodologies intended to secure a well-balanced sample, the participation of users and particular social groups has been limited and, as a consequence, so has the specific participation of women in the processes. Some studies have shown that the use of ICTs is equally divided between men and women (OneWorld et al, 2004); however, modes of use differ, with the more technical skills dominated by men. It is important to establish, with the right participation, methodology and tools, how each gender is using or failing to use Infocentro services, because computer skills can have a significant empowering effect on women (Holmes, 1999; Menou et al, 2001; GEM, 2003).


