Authors: Drake, William D. | Miller, Roy I. | Schon, Donald A.
Article Type: Research Article
Abstract:
After analyzing longitudinal anthropometric data from eight community-level nutrition programs to determine their impact, it was concluded that conventional approaches to analysis do not eliminate indeterminacy because: (1) the data were inaccurate or inconsistent, (2) the measures or measurement methods produced misleading results, and, most importantly, (3) a lack of information about the local context of the interventions precluded the elimination of competing explanations of observed outcomes. In that analysis, as with most similar analyses, the traditional experimental approach (applying a predesigned experiment using controls in a presumably constant environment) failed because the experimental context was unstable, unpredictable, and unique in each case. Furthermore, the instability, unpredictability, and uniqueness of each case called for a flexible intervention strategy to cope with the changing context. As an alternative approach to both analysis and intervention, reflection-in-action (R-I-A) is proposed. Six features of this model are: explicit specification of the framework underlying the intervention strategy; continuous monitoring of both data-gathering procedures and intervention strategies; periodic redesign of those procedures and strategies; collaboration among researchers, practitioners, and subjects throughout; use of on-the-spot experimentation to test particular hypotheses; and explicit enumeration of, and accounting for, potential factors confounding both the analysis and the intervention itself. By actively using the data for continuous monitoring, field practitioners working with analytic specialists are more likely to reduce or eliminate indeterminacy due to inaccurate data and/or contextual changes than are traditional researchers or evaluators who maintain distance between themselves and the intervention. Reflection-in-action is illustrated, in part, in the context of a recent evaluation conducted in Sri Lanka, where a return visit to the field with preliminary quantitative results led to modification of the interpretation of those results. Problems remain, however, in achieving full implementation of this approach. Practitioners and scientists will have to change their attitudes and behavior to accommodate R-I-A, the role of quantitative analysis in program management and evaluation will have to be placed in proper perspective, and institutions supporting intervention activities will have to modify their approach to both funding and evaluation.
Keywords: Intervention, nutrition, program evaluation, reflection-in-action, program management, experimental design, planning
DOI: 10.3233/HSM-1983-4204
Citation: Human Systems Management, vol. 4, no. 2, pp. 82-97, 1983