A guide for field projects on adaptive strategies
The nature of evaluation
Types of evaluation models (this section)
Thinking of evaluation as a process
Context-specific evaluations
Balance between quantitative and qualitative measures
Participatory evaluations
Project processes impacting an evaluation framework
A hypothetical evaluation

Evaluation models

There is a wide variety of evaluation models, which "prescribe what evaluators ought to do and explain how to conduct a particular type of evaluation" (Patton 1982: 37). House (1978, as cited in Patton 1982) created a taxonomy of evaluation models, distinguishable by the audiences they address, the outcomes they examine, the typical questions they ask and the methods they employ. The categories within this taxonomy include systems analysis, the behavioral objectives approach, goal-free evaluation, the art criticism approach, the accreditation model, the transaction approach and decision-making models.

It is not necessary to evaluate only with regard to goals and objectives; "goal-free" and "eolithic" evaluations are also possible. In fact, Patton (1982) states that goals and objectives often constrain an evaluation unnecessarily. This is echoed by Buhrs and Bartlett (1993), who state that there are inherent biases in approaching an evaluation using preset goals and objectives; rather, the evaluator should establish multiple values and criteria on which to base the analysis.

Eolithic evaluation is more of a "process" evaluation. Instead of looking at whether and how goals are fulfilled, the investigator is directed to "consider how ends can flow from means. One begins by seeing what exists in the natural setting and then attains whatever outcomes one can with the resources at hand" (Patton 1981: 113). Goals are discovered, or emerge, as the process unfolds: "they [the participants] look around them to see what's available and then do whatever they can with whatever they find. What they do moves them towards emerging goals that are discovered in and grow out of the environment in which they find themselves..." (114).

Richards (1985) describes illuminative evaluation in The Evaluation of Cultural Action. He states that an illuminative evaluation is a custom-built research strategy that lacks formal statements of objectives, avoids (but does not exclude) statistical procedures, employs subjective methods, and is primarily interested in the informing function of an evaluation rather than the more usual inspectoral or grading functions.

He states that illuminative evaluations are:

holistic - evaluators attend closely to the various contexts of the program being evaluated and seek to portray it as a working whole, an individual organizational construction that needs to be examined simultaneously from different perspectives.
responsive - researchers work closely with all those concerned with the program to provide a genuinely helpful report, one that may take many different forms and draw on diverse sources and methods, but is designed to interest, to inform, and to add to their understanding (p. xiv).
